Joshua Fox

Reputation: 19685

How do I set Python's optimized mode (-O) on spark executor?

How do I set Python's optimized mode (the -O interpreter flag) on an executor running on a Spark worker?

(Apparently the Python interpreter for the executor is launched by this line in org/apache/spark/api/python/PythonWorkerFactory.scala:

 val pb = new ProcessBuilder(Arrays.asList(pythonExec, "-m", "pyspark.worker"))

But I don't see a way to pass the -O flag there.)

Upvotes: 2

Views: 323

Answers (2)

Daniel Darabos

Reputation: 27455

The Python executable is set by the PYSPARK_DRIVER_PYTHON or PYSPARK_PYTHON environment variable (the latter sets it for both the executors and the driver). You could create a wrapper script that runs python -O:

#!/bin/sh
exec python -O "$@"

And use this wrapper by setting PYSPARK_PYTHON=/home/daniel/python_opt.sh.
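To sanity-check that launching through such a wrapper really turns on optimized mode, you can run the same kind of command locally and inspect sys.flags.optimize in the child interpreter. This is a hypothetical local check, not part of Spark itself:

```python
import subprocess
import sys

# Launch a child interpreter with -O, mirroring what the wrapper script does,
# and report whether optimized mode is active via sys.flags.optimize.
out = subprocess.check_output(
    [sys.executable, "-O", "-c", "import sys; print(sys.flags.optimize)"]
)
print(out.decode().strip())  # "1" when -O is in effect
```

On a real cluster you would run the equivalent check inside a task (e.g. in a function passed to rdd.map) to confirm the executors picked up PYSPARK_PYTHON.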

Upvotes: 4

Daniel Darabos

Reputation: 27455

You cannot set -O on the Spark worker processes. This option is mostly useless anyway. (See What is the use of Python's basic optimizations mode? (python -O).)
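For context on why the flag matters so little: -O mainly strips assert statements and sets __debug__ to False, and little else. A small illustration, run through a child interpreter:

```python
import subprocess
import sys

# Under -O, assert statements are compiled away and __debug__ is False.
# This is essentially all the flag does, which is why it rarely matters.
code = "assert False, 'stripped under -O'; print(__debug__)"
out = subprocess.check_output([sys.executable, "-O", "-c", code])
print(out.decode().strip())  # "False": the assert never fired
```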

Upvotes: -1
