timbram

Reputation: 1865

How to redirect entire output of spark-submit to a file

So, I am trying to redirect the output of an Apache spark-submit command to a text file, but some of the output never makes it into the file. Here is the command I am using:

spark-submit something.py > results.txt

I can see the output in the terminal but I do not see it in the file. What am I forgetting or doing wrong here?

Edit:

If I use

spark-submit something.py | less

I can see all the output being piped into less

Upvotes: 13

Views: 38469

Answers (2)

philantrovert

Reputation: 10082

spark-submit prints most of its output to STDERR.

To redirect the entire output to one file, you can use:

spark-submit something.py > results.txt 2>&1

Or

spark-submit something.py &> results.txt
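To see why the plain `>` misses output, here is a minimal sketch (not Spark-specific) of a program that writes to both streams, showing what each redirection captures:

```shell
#!/bin/sh
# Stand-in for spark-submit: writes one line to stdout and one to stderr.
emit() {
    printf 'stdout line\n'
    printf 'stderr line\n' >&2
}

# '>' alone captures only stdout; stderr still goes to the terminal.
emit > stdout_only.txt

# Appending '2>&1' sends stderr to the same file as stdout.
emit > both.txt 2>&1
```

After running this, `stdout_only.txt` contains only the stdout line, while `both.txt` contains both lines; this mirrors why `results.txt` was missing most of spark-submit's logging.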

Upvotes: 37

Avishek Bhattacharya

Reputation: 6974

If you are running spark-submit on a cluster, the logs are stored under the application ID. You can retrieve them once the application finishes:

yarn logs --applicationId <your applicationId> > myfile.txt

This should fetch the logs of your job.

The applicationId of your job is printed when you submit the Spark job. You can see it in the console where you submitted the job, or in the Hadoop UI.
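If you want to capture the applicationId programmatically, one approach is to grep the submit output for the YARN application ID pattern. This is a sketch; the exact log line format varies across Spark/YARN versions, and the sample line below is illustrative:

```shell
#!/bin/sh
# spark-submit in YARN mode logs a line (on stderr) like the sample below;
# the ID always matches application_<clusterTimestamp>_<sequence>.
# With a real job you would pipe spark-submit's combined output instead.
line='INFO yarn.Client: Submitted application application_1517228278718_0009'

# Extract the application ID with a simple pattern match.
app_id=$(printf '%s\n' "$line" | grep -o 'application_[0-9]*_[0-9]*')
echo "$app_id"
```

The extracted ID can then be passed to `yarn logs --applicationId` as shown above.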

Upvotes: 5

Related Questions