A.GARMES

Reputation: 61

spark kafka security kerberos

I am trying to use Kafka (0.9.1) in secure mode. I want to read the data with Spark, so I must pass the JAAS config file to the JVM. I use this command to start my job:

    /opt/spark/bin/spark-submit -v --master spark://master1:7077    \
    --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.conf=kafka_client_jaas.conf" \
    --files "./conf/kafka_client_jaas.conf,./conf/kafka.client.1.keytab" \
    --class kafka.ConsumerSasl  ./kafka.jar --topics test
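
The JAAS file follows the usual Kafka client layout; roughly this (the principal below is a placeholder):

    KafkaClient {
      com.sun.security.auth.module.Krb5LoginModule required
      useKeyTab=true
      storeKey=true
      keyTab="./kafka.client.1.keytab"
      principal="kafka-client@EXAMPLE.COM";
    };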

I still get the same error:

Caused by: java.lang.IllegalArgumentException: You must pass java.security.auth.login.config in secure mode.
    at org.apache.kafka.common.security.kerberos.Login.login(Login.java:289)
    at org.apache.kafka.common.security.kerberos.Login.<init>(Login.java:104)
    at org.apache.kafka.common.security.kerberos.LoginManager.<init>(LoginManager.java:44)
    at org.apache.kafka.common.security.kerberos.LoginManager.acquireLoginManager(LoginManager.java:85)
    at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:55)

I think Spark does not inject the parameter -Djava.security.auth.login.conf into the JVM!

Upvotes: 3

Views: 3060

Answers (1)

Harry

Reputation: 41

The main cause of this issue is that you have used the wrong property name: it should be -Djava.security.auth.login.config, not -Djava.security.auth.login.conf. Moreover, if you are using a keytab file, make sure it is available on all executors via the --files argument of spark-submit. If you are using a Kerberos ticket instead, make sure to set KRB5CCNAME on all executors using the property SPARK_YARN_USER_ENV.
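
For example, a corrected submit command could look roughly like this (a sketch based on the command in the question: --files puts both files in each executor's working directory, so the JAAS file is referenced there by its bare name; the driver option is only needed if the consumer is also created on the driver):

    /opt/spark/bin/spark-submit -v --master spark://master1:7077 \
    --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=kafka_client_jaas.conf" \
    --conf "spark.driver.extraJavaOptions=-Djava.security.auth.login.config=./conf/kafka_client_jaas.conf" \
    --files "./conf/kafka_client_jaas.conf,./conf/kafka.client.1.keytab" \
    --class kafka.ConsumerSasl ./kafka.jar --topics test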

If you are using an older version of Spark (1.6.x or earlier), there are some known issues with Spark and this integration will not work; in that case you have to write a custom receiver.

For Spark 1.8 and later, you can see the configuration here

In case you need to create a custom receiver, you can see this

Upvotes: 1
