Berlin

Reputation: 1464

Submit PySpark job with virtual environment using Livy to AWS EMR

I have created an EMR cluster with the configuration below, following the AWS documentation:

https://aws.amazon.com/premiumsupport/knowledge-center/emr-pyspark-python-3x/

```
{
    "Classification": "livy-conf",
    "Properties": {
      "livy.spark.deploy-mode": "cluster",
      "livy.impersonation.enabled": "true",
      "livy.spark.yarn.appMasterEnv.PYSPARK_PYTHON": "/usr/bin/python3"
    }
  },
```
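
For context, this classification is one entry in the cluster's configurations list. Below is a minimal sketch of assembling that list and writing it to a file that can be passed to `aws emr create-cluster --configurations file://configurations.json`; the file name and the omitted sibling classifications are assumptions on my part, not taken from the question.

```
import json

# Configuration classifications applied when the cluster is created; the
# livy-conf entry below is the one shown above.
configurations = [
    {
        "Classification": "livy-conf",
        "Properties": {
            "livy.spark.deploy-mode": "cluster",
            "livy.impersonation.enabled": "true",
            "livy.spark.yarn.appMasterEnv.PYSPARK_PYTHON": "/usr/bin/python3",
        },
    },
    # ... any other classifications (e.g. spark-env) would go here ...
]

# Written to disk so it can be referenced as
#   aws emr create-cluster --configurations file://configurations.json ...
with open("configurations.json", "w") as f:
    json.dump(configurations, f, indent=2)
```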

When I submit the PySpark job through Livy with a POST request carrying the following payload:

```
payload = {
    'file': self.py_file,
    'pyFiles': self.py_files,
    'name': self.job_name,
    'archives': ['s3://test.test.bucket/venv.zip#venv', 's3://test.test.bucket/requirements.pip'],
    'proxyUser': 'hadoop',
    "conf": {
      "PYSPARK_PYTHON": "./venv/bin/python",
      "spark.yarn.appMasterEnv.PYSPARK_PYTHON": "./venv/bin/python",
      "spark.yarn.executorEnv.PYSPARK_PYTHON": "./venv/bin/python",
      "spark.yarn.appMasterEnv.VIRTUAL_ENV": "./venv/bin/python",
      "spark.yarn.executorEnv.VIRTUAL_ENV": "./venv/bin/python",
      "livy.spark.yarn.appMasterEnv.PYSPARK_PYTHON": "./venv/bin/python",
      "livy.spark.yarn.appMasterEnv.PYSPARK_PYTHON": "./venv/bin/python",
      "spark.pyspark.virtualenv.enabled": "true",
      "spark.pyspark.virtualenv.type": "native",
      "spark.pyspark.virtualenv.requirements": "s3://test.test.bucket/requirements.pip",
      "spark.pyspark.virtualenv.path": "./venv/bin/python"
     }
}

```
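
For completeness, this is roughly how that payload gets sent to Livy's batches endpoint. It is a minimal sketch assuming the `requests` library, a hypothetical Livy host, and placeholder S3 paths; the real payload fields are the ones shown above.

```
import json
import requests

# Hypothetical Livy endpoint on the EMR master node; host and port are assumptions.
LIVY_URL = "http://emr-master-node:8998"

payload = {
    "file": "s3://test.test.bucket/main.py",   # placeholder for self.py_file
    "archives": ["s3://test.test.bucket/venv.zip#venv"],
    "proxyUser": "hadoop",
    "conf": {
        "spark.yarn.appMasterEnv.PYSPARK_PYTHON": "./venv/bin/python",
        "spark.yarn.executorEnv.PYSPARK_PYTHON": "./venv/bin/python",
    },
}

# POST /batches creates the batch session; the response contains its id and state.
resp = requests.post(
    LIVY_URL + "/batches",
    data=json.dumps(payload),
    headers={"Content-Type": "application/json"},
)
resp.raise_for_status()
batch = resp.json()
print("batch id:", batch["id"], "state:", batch["state"])

# Progress can then be polled with GET {LIVY_URL}/batches/<id>/state.
```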

I get the following error message:

```
Log Type: stdout
Could not find platform independent libraries <prefix>
Could not find platform dependent libraries <exec_prefix>
Consider setting $PYTHONHOME to <prefix>[:<exec_prefix>]
Fatal Python error: Py_Initialize: Unable to get the locale encoding
ImportError: No module named 'encodings'
Current thread 0x00007efc72b57740 (most recent call first)
```

I also tried setting PYTHONHOME and PYTHONPATH to the parent folder of the python binary inside the virtual environment, but nothing worked:

```
"spark.yarn.appMasterEnv.PYTHONPATH": "./venv/bin/",
"spark.yarn.executorEnv.PYTHONPATH": "./venv/bin/",
"livy.spark.yarn.appMasterEnv.PYTHONPATH": "./venv/bin/",
"livy.spark.yarn.executorEnv.PYTHONPATH": "./venv/bin/",
#
"spark.yarn.appMasterEnv.PYTHONHOME": "./venv/bin/",
"spark.yarn.executorEnv.PYTHONHOME": "./venv/bin/",
"livy.spark.yarn.appMasterEnv.PYTHONHOME": "./venv/bin/",
"livy.spark.yarn.executorEnv.PYTHONHOME": "./venv/bin/",
```

Error:

```
Fatal Python error: Py_Initialize: Unable to get the locale encoding
ImportError: No module named 'encodings'
Current thread 0x00007f7351d53740 (most recent call first):
```

This is how I created the virtual environment:

```
# create and populate the virtual environment
python3 -m venv venv/
source venv/bin/activate
python3 -m pip install -r requirements.pip
deactivate

# zip the environment contents so bin/, lib/, ... sit at the archive root
pushd venv/
zip -rq ../venv.zip *
popd
```
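
As a sanity check (a minimal sketch, assuming `venv.zip` is in the current directory), the interpreter and site-packages should sit at the root of the archive, so that the `#venv` fragment in the archives URI makes them resolve under `./venv/` in the YARN containers:

```
import zipfile

# List the archive entries; zipping from inside venv/ puts bin/, lib/, ... at the root.
with zipfile.ZipFile("venv.zip") as zf:
    names = zf.namelist()

print("bin/python present:", "bin/python" in names)
print("site-packages entries:",
      sum(1 for n in names if n.startswith("lib/python3.5/site-packages/")))
```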

virtual environment structure:

```
drwxrwxr-x  2   4096 Oct 15 12:37 bin/
drwxrwxr-x  2   4096 Oct 15 12:37 include/
drwxrwxr-x  3   4096 Oct 15 12:37 lib/
lrwxrwxrwx  1      3 Oct 15 12:37 lib64 -> lib/
-rw-rw-r--  1     59 Oct 15 12:37 pip-selfcheck.json
-rw-rw-r--  1     69 Oct 15 12:37 pyvenv.cfg
drwxrwxr-x  3   4096 Oct 15 12:37 share/
```

bin dir:

```
activate  activate.csh  activate.fish  chardetect  easy_install  easy_install-3.5  pip  pip3  pip3.5  python  python3
```

lib dir:

```
python3.5/site-packages/
```

AWS support says it's an ongoing bug:

https://issues.apache.org/jira/browse/SPARK-13587

https://issues.apache.org/jira/browse/ZEPPELIN-2233

Any suggestions?

Thanks!

Upvotes: 4

Views: 4396

Answers (1)

Matteo

Reputation: 11

I also needed to submit a PySpark job with a virtual environment. To use a virtualenv on EMR with the 5.x distribution, I did this:

Go to the root of your code folder (for example, /home/hadoop) and run:

```
virtualenv -p /usr/bin/python3 <your-venv_name>
source <your-venv_name>/bin/activate
```

Go into <your-venv_name>/bin and run:

```
./pip3 freeze          # ensure that it is empty
sudo ./pip3 install -r <CODE FOLDER PATH>/requirements.txt
./pip3 freeze          # ensure that it is now populated
```

To submit my job, I used (with a basic config) this command:

```
spark-submit \
  --conf spark.pyspark.virtualenv.bin.path=<path-to-your-venv_name> \
  --conf spark.pyspark.python=<path-to-your-venv_name>/bin/python3 \
  --conf spark.pyspark.driver.python=<path-to-your-venv_name>/bin/python3 \
  <path-to-your-main.py>
```

In the main.py code you also have to set PYSPARK_PYTHON in the environment:

```
import os

# tell Spark which Python interpreter the workers should use
os.environ["PYSPARK_PYTHON"] = "/usr/bin/python3"
```

Upvotes: 1
