Apeksha Agnihotri

Reputation: 11

Apache Kafka Secure and non secure connections with spark 1.6.3

I am getting an error when trying to use Kerberos-enabled Apache Kafka (0.9) with Apache Spark 1.6.3. The ZooKeeper version is 3.4.5. I have to connect to two Kafka clusters: one is Kerberos enabled and the other is not, so I am not setting the java.security.auth.login.config property in the Spark executor's extra Java opts.

Kafka Initialization failed: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:648)
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:542)
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:524)
    at com.spark.receiver.helper.KafkaChannelHelper.initializeConnection(KafkaChannelHelper.java:277)
    at com.spark.receiver.helper.KafkaChannelHelper$2.run(KafkaChannelHelper.java:240)
Caused by: org.apache.kafka.common.KafkaException: java.lang.IllegalArgumentException: Could not find a 'KafkaClient' entry in `/home/user/kafka_client.conf`.
    at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:74)
    at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:60)
    at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:79)
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:577)
    ... 4 more
Caused by: java.lang.IllegalArgumentException: Could not find a 'KafkaClient' entry in `/home/user/kafka_client.conf`.
    at org.apache.kafka.common.security.kerberos.Login.login(Login.java:294)
    at org.apache.kafka.common.security.kerberos.Login.<init>(Login.java:104)
    at org.apache.kafka.common.security.kerberos.LoginManager.<init>(LoginManager.java:44)
    at org.apache.kafka.common.security.kerberos.LoginManager.acquireLoginManager(LoginManager.java:85)
    at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:55)
    ... 7 more

java.security.auth.login.config is set in the consumer itself. The code that creates the KafkaConsumer is:

public void initializeConnection() {
    props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_PLAINTEXT");
    System.setProperty("java.security.auth.login.config", jassFilePath);
    try {
        this.consumer = new KafkaConsumer<String, byte[]>(props);
    } catch (Exception e) {
        LOGGER.error("Kafka Initialization failed: ", e);
    }
}

kafka_client.conf contains only the section below:

KafkaClient{
    com.sun.security.auth.module.Krb5LoginModule required
    debug=true
    useKeyTab=true
    keyTab="/etc/security/keytabs/user.keytab"
    storeKey=true
    principal="user@REALM"
    serviceName="kafka";
};

Upvotes: 0

Views: 1279

Answers (2)

Samt

Reputation: 194

I have a similar problem with Kafka 1.11.0.

A monitoring program in the same JVM accesses multiple brokers; some of the brokers use SASL/Kerberos and others are insecure.

The program adds the following argument itself when accessing secure clusters:

-Djava.security.auth.login.config=/home/kafka-user/kafka-jaas.conf

But the program throws an exception:


Could not find a 'KafkaClient' entry in the JAAS configuration. System property 'java.security.auth.login.config' is /path/to/jaas/kafka_client_jaas_usekeytab.conf

It's weird because java.security.auth.login.config really is set correctly, and the content of that file is fine.

Another program that talks to a single cluster works fine.

The official Kafka documentation, JAAS configuration for Kafka clients, says:

Clients may specify JAAS configuration as a producer or consumer property without creating a physical configuration file. 

This mode also enables different producers and consumers within the same JVM to use different credentials by specifying different properties for each client. 

If both static JAAS configuration system property java.security.auth.login.config and client property sasl.jaas.config are specified, the client property will be used.

Another question here says the author faced some issues when relying only on java.security.auth.login.config.

Maybe the solution is to provide sasl.jaas.config in addition to java.security.auth.login.config in your program.

I will try to verify it for this case.
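
A rough sketch of what that could look like with two consumers in one JVM (unverified for this exact case; sasl.jaas.config needs Kafka clients 0.10.2 or newer, and the broker addresses, group ids and keytab path below are placeholders):

import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class DualClusterConsumers {

    // Consumer for the Kerberized cluster: the JAAS entry is given per client via
    // sasl.jaas.config, so no global java.security.auth.login.config is needed.
    static KafkaConsumer<String, byte[]> secureConsumer() {
        Properties props = baseProps("secure-broker:6667", "secure-group"); // placeholder values
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.kerberos.service.name", "kafka");
        props.put("sasl.jaas.config",
                "com.sun.security.auth.module.Krb5LoginModule required "
              + "useKeyTab=true storeKey=true "
              + "keyTab=\"/etc/security/keytabs/user.keytab\" "
              + "principal=\"user@REALM\";");
        return new KafkaConsumer<>(props);
    }

    // Consumer for the insecure cluster: no SASL settings at all.
    static KafkaConsumer<String, byte[]> plainConsumer() {
        return new KafkaConsumer<>(baseProps("plain-broker:9092", "plain-group")); // placeholder values
    }

    private static Properties baseProps(String bootstrapServers, String groupId) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        return props;
    }
}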

Upvotes: 0

Ashish Singh

Reputation: 275

Two things should be considered before publishing/consuming data to/from a secured environment:

  • Configure security.protocol on the client:

    Properties props = new Properties();
    props.put("security.protocol", "PLAINTEXTSASL");

  • Pass the JAAS configuration along with the Java VM options:

    java -Djava.security.auth.login.config=/home/kafka-user/kafka-jaas.conf \
    -Djava.security.krb5.conf=/etc/krb5.conf \
    -Djavax.security.auth.useSubjectCredsOnly=false \
    -cp hdp-kafka-sample-1.0-SNAPSHOT.jar:/usr/hdp/current/kafka-broker/libs/* \
    hdp.sample.KafkaProducer one.hdp:6667 test

See secure-kafka-java-producer-with-kerberos for a full explanation.
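
Putting the two steps together for a consumer, a rough sketch under the same assumptions (the broker address one.hdp:6667 and topic test come from the command above; the group id is a placeholder; PLAINTEXTSASL is the HDP-specific name, stock Kafka clients use SASL_PLAINTEXT):

import java.util.Arrays;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class SecureConsumerSketch {
    public static void main(String[] args) {
        // Step 1: set security.protocol on the client properties.
        Properties props = new Properties();
        props.put("bootstrap.servers", "one.hdp:6667");
        props.put("group.id", "sample-group"); // placeholder group id
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        props.put("security.protocol", "PLAINTEXTSASL"); // HDP naming; stock clients use SASL_PLAINTEXT

        // Step 2: the JVM itself has to be started with
        //   -Djava.security.auth.login.config=/home/kafka-user/kafka-jaas.conf
        //   -Djava.security.krb5.conf=/etc/krb5.conf
        // as in the java command above; nothing more is set in code.
        try (KafkaConsumer<String, byte[]> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Arrays.asList("test"));
            ConsumerRecords<String, byte[]> records = consumer.poll(1000);
            for (ConsumerRecord<String, byte[]> record : records) {
                System.out.println(record.offset() + ": " + record.key());
            }
        }
    }
}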

Upvotes: 1
