Reputation: 12254
I've built a docker image containing a number of environment variables, including one called SPARK_HOME. Here is the line from the Dockerfile that declares that env var:
ENV SPARK_HOME="/opt/spark"
When I issue docker run I can see that the env var exists, but any reference to it doesn't return anything, as demonstrated by a simple echo:
$ docker run --rm myimage /bin/bash -c "env | grep SPARK_HOME ; echo SPARK_HOME=$SPARK_HOME"
SPARK_HOME=/opt/spark
SPARK_HOME=
$
Am I missing something obvious here? Why can I not refer to the value of an existing env var?
EDIT 1: As requested in the comments, the Dockerfile content is included below the break.
EDIT 2: Discovered that the var can be referred to if I run the container interactively:
$ docker run --rm -it myimage /bin/bash
root@419dd5f13a6f:/tmp# echo $SPARK_HOME
/opt/spark
FROM our.internal.artifact.store/python:3.7-stretch
WORKDIR /tmp
ENV SPARK_VERSION=2.2.1
ENV HADOOP_VERSION=2.8.4
ARG ARTIFACTORY_USER
ARG ARTIFACTORY_ENCRYPTED_PASSWORD
ARG ARTIFACTORY_PATH=our.internal.artifact.store/artifactory/generic-dev/ceng/external-dependencies
ARG SPARK_BINARY_PATH=https://${ARTIFACTORY_PATH}/spark-${SPARK_VERSION}-bin-hadoop2.7.tgz
ARG HADOOP_BINARY_PATH=https://${ARTIFACTORY_PATH}/hadoop-${HADOOP_VERSION}.tar.gz
ADD files/apt-transport-https_1.4.8_amd64.deb /tmp
RUN echo "deb https://username:[email protected]/artifactory/debian-main-remote stretch main" >/etc/apt/sources.list.d/main.list &&\
echo "deb https://username:[email protected]/artifactory/maria-db-debian stretch main" >>/etc/apt/sources.list.d/main.list &&\
echo 'Acquire::CompressionTypes::Order:: "gz";' > /etc/apt/apt.conf.d/02update &&\
echo 'Acquire::http::Timeout "10";' > /etc/apt/apt.conf.d/99timeout &&\
echo 'Acquire::ftp::Timeout "10";' >> /etc/apt/apt.conf.d/99timeout &&\
dpkg -i /tmp/apt-transport-https_1.4.8_amd64.deb &&\
apt-get install --allow-unauthenticated -y /tmp/apt-transport-https_1.4.8_amd64.deb &&\
apt-get update --allow-unauthenticated -y -o Dir::Etc::sourcelist="sources.list.d/main.list" -o Dir::Etc::sourceparts="-" -o APT::Get::List-Cleanup="0"
RUN apt-get update && \
apt-get -y install default-jdk
# Detect JAVA_HOME and export in bashrc.
# This will result in something like this being added to /etc/bash.bashrc
# export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
RUN echo export JAVA_HOME="$(readlink -f /usr/bin/java | sed "s:/jre/bin/java::")" >> /etc/bash.bashrc
# Configure Spark-${SPARK_VERSION}
RUN curl --fail -u "${ARTIFACTORY_USER}:${ARTIFACTORY_ENCRYPTED_PASSWORD}" -X GET "${SPARK_BINARY_PATH}" -o /opt/spark-${SPARK_VERSION}-bin-hadoop2.7.tgz \
&& cd /opt \
&& tar -xvzf /opt/spark-${SPARK_VERSION}-bin-hadoop2.7.tgz \
&& rm spark-${SPARK_VERSION}-bin-hadoop2.7.tgz \
&& ln -s spark-${SPARK_VERSION}-bin-hadoop2.7 spark \
&& sed -i '/log4j.rootCategory=INFO, console/c\log4j.rootCategory=CRITICAL, console' /opt/spark/conf/log4j.properties.template \
&& mv /opt/spark/conf/log4j.properties.template /opt/spark/conf/log4j.properties \
&& mkdir /opt/spark-optional-jars/ \
&& mv /opt/spark/conf/spark-defaults.conf.template /opt/spark/conf/spark-defaults.conf \
&& printf "spark.driver.extraClassPath /opt/spark-optional-jars/*\nspark.executor.extraClassPath /opt/spark-optional-jars/*\n">>/opt/spark/conf/spark-defaults.conf \
&& printf "spark.driver.extraJavaOptions -Dderby.system.home=/tmp/derby" >> /opt/spark/conf/spark-defaults.conf
# Configure Hadoop-${HADOOP_VERSION}
RUN curl --fail -u "${ARTIFACTORY_USER}:${ARTIFACTORY_ENCRYPTED_PASSWORD}" -X GET "${HADOOP_BINARY_PATH}" -o /opt/hadoop-${HADOOP_VERSION}.tar.gz \
&& tar -xvzf /opt/hadoop-${HADOOP_VERSION}.tar.gz \
&& rm /opt/hadoop-${HADOOP_VERSION}.tar.gz \
&& ln -s hadoop-${HADOOP_VERSION} hadoop
# Set Environment Variables.
ENV SPARK_HOME="/opt/spark" \
HADOOP_HOME="/opt/hadoop" \
PYSPARK_SUBMIT_ARGS="--master=local[*] pyspark-shell --executor-memory 1g --driver-memory 1g --conf spark.ui.enabled=false spark.executor.extrajavaoptions=-Xmx=1024m" \
PYTHONPATH="/opt/spark/python:/opt/spark/python/lib/py4j-0.10.7-src.zip:$PYTHONPATH" \
PATH="$PATH:/opt/spark/bin:/opt/hadoop/bin" \
PYSPARK_DRIVER_PYTHON="/usr/local/bin/python" \
PYSPARK_PYTHON="/usr/local/bin/python"
# Upgrade pip and setuptools
RUN pip install --index-url https://username:[email protected]/artifactory/api/pypi/pypi-virtual-all/simple --upgrade pip setuptools
# Install core python packages
RUN pip install --index-url https://username:[email protected]/artifactory/api/pypi/pypi-virtual-all/simple pipenv
ADD Pipfile /tmp
ADD pysparkdf_helloworld.py /tmp
Upvotes: 0
Views: 126
Reputation: 4327
Ok, contrary to my comment, that's not weird at all.
The issue is just that your local shell already interpolates $SPARK_HOME before sending the command to the container, and since that variable is (presumably) unset on the host, what actually runs inside the container is echo SPARK_HOME= with an empty value.
To fix it, escape the env var in the command: $SPARK_HOME -> \$SPARK_HOME
Demo:
$ export SPARK_HOME=foo
$ docker run ... /bin/bash -c "env | grep SPARK_HOME ; echo SPARK_HOME=$SPARK_HOME"
> SPARK_HOME=/opt/spark
> SPARK_HOME=foo
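Applied to your original command, escaping the dollar sign (or single-quoting the whole command string so the local shell leaves it alone) should give something like this, assuming the same myimage tag:
$ docker run --rm myimage /bin/bash -c "env | grep SPARK_HOME ; echo SPARK_HOME=\$SPARK_HOME"
SPARK_HOME=/opt/spark
SPARK_HOME=/opt/spark
$ docker run --rm myimage /bin/bash -c 'env | grep SPARK_HOME ; echo SPARK_HOME=$SPARK_HOME'
SPARK_HOME=/opt/spark
SPARK_HOME=/opt/spark
With \$ (or single quotes) the literal string $SPARK_HOME reaches bash inside the container, which then expands it using the container's environment.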
Upvotes: 1