Reputation: 41
I'm trying to run a Spark jar on Kubernetes. I built my own Docker image from the spark-2.4.4-bin-hadoop2.7 distribution and tried to run my YAML file with this image.
I got the error below in the driver log:
/opt/entrypoint.sh: line 133: /sbin/tini: No such file or directory
I understand that /sbin/tini
is not present in the Docker image. Could anybody help me solve this issue? How do I get tini onto that path inside the Docker image?
Command entered:
kubectl create -f spark.yaml
Driver log:
kubectl logs spark-wordcount-7-driver
++ id -u
+ myuid=0
++ id -g
+ mygid=0
+ set +e
++ getent passwd 0
+ uidentry=root:x:0:0:root:/root:/bin/bash
+ set -e
+ '[' -z root:x:0:0:root:/root:/bin/bash ']'
+ SPARK_K8S_CMD=driver
+ case "$SPARK_K8S_CMD" in
+ shift 1
+ SPARK_CLASSPATH=':/opt/spark/jars/*'
+ grep SPARK_JAVA_OPT_
+ env
+ sort -t_ -k4 -n
+ sed 's/[^=]*=\(.*\)/\1/g'
+ readarray -t SPARK_EXECUTOR_JAVA_OPTS
+ '[' -n '' ']'
+ '[' -n '' ']'
+ PYSPARK_ARGS=
+ '[' -n '' ']'
+ R_ARGS=
+ '[' -n '' ']'
+ '[' '' == 2 ']'
+ '[' '' == 3 ']'
+ case "$SPARK_K8S_CMD" in
+ CMD=("$SPARK_HOME/bin/spark-submit" --conf "spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@")
+ exec /sbin/tini -s -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=172.17.0.6 --deploy-mode client --properties-file /opt/spark/conf/spark.properties --class com.walmart.WordCount spark-internal
/opt/entrypoint.sh: line 133: /sbin/tini: No such file or directory
Upvotes: 2
Views: 4800
Reputation: 31
Since you are using an alpine image, you have to use /sbin/tini.
NOTE: alpine has moved tini to /sbin/tini.
In the file entrypoint.sh
, make the following change:
exec /usr/bin/tini -s -- /usr/bin/spark-operator "$@"
to
exec /sbin/tini -s -- /usr/bin/spark-operator "$@"
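Conversely, if your base image does not ship tini at the path the entrypoint expects, you can install it (or symlink it) when building the image. This is only a sketch, assuming a Debian/Ubuntu-based base image; the base image name `my-spark-base:2.4.4` is hypothetical, and on alpine `apk add tini` already places the binary at /sbin/tini:

```dockerfile
# Hypothetical Dockerfile fragment: make tini available at /sbin/tini,
# where entrypoint.sh expects it.
FROM my-spark-base:2.4.4

# Debian/Ubuntu install tini at /usr/bin/tini, so add a symlink at /sbin/tini.
RUN apt-get update && \
    apt-get install -y --no-install-recommends tini && \
    rm -rf /var/lib/apt/lists/* && \
    ln -s /usr/bin/tini /sbin/tini
```

After rebuilding and pushing the image, re-run `kubectl create -f spark.yaml` and check the driver log again.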
Upvotes: 3