Hi All,

Apologies for cross-posting; I posted this on the Kafka user group as well. Is there
any way to use the Kerberos ticket cache of the Spark executor for
reading from secured Kafka? I am not 100% sure that the executor does a
kinit, but I presume it does, so that it can run code / read HDFS etc. as the user
that submitted the job.

I am connecting to a secured Kafka cluster from Spark. My jaas.conf looks
like this:
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useTicketCache=true
  keyTab="./user.keytab"
  principal="u...@example.com";
};
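For reference, this is the ticket-cache-only variant I would expect to pick up the cache from a prior kinit (no keytab at all; renewTGT is my assumption, taken from the standard Krb5LoginModule options):

```
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useTicketCache=true
  renewTGT=true;
};
```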

export KAFKA_OPTS="-Djava.security.auth.login.config=/home/user/jaas.conf"

I tested connectivity using kafka-console-consumer and I am able to read
data from the Kafka topic. However, when I use the same config in spark-submit
with the options below, I get a Kerberos error:

spark-submit .... --files jaas.conf#jaas.conf \
  --driver-java-options "-Djava.security.auth.login.config=./jaas.conf" \
  --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=./jaas.conf" ....
Could not login: the client is being asked for a password, but the Kafka
client code does not currently support obtaining a password from the user.
not available to garner authentication information from the user

My question: can we not use the Spark executor's ticket cache (Spark is running
the job as "user")? Do we always need to ship the keytab file as well
via --files? I also tested with --principal u...@example.com --keytab
<file>, but still got the same error. Is there any way to use the
ticket cache from the Spark executor for Kafka?

PS - I read this link - https://docs.confluent.io/2.0.0/kafka/sasl.html#kerberos -
which says: "For command-line utilities like kafka-console-consumer or
kafka-console-producer, kinit can be used along with useTicketCache=true"
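The flow that doc describes is, as I understand it, roughly the following (broker address, topic, and principal here are placeholders, not my actual setup):

```
# Obtain a TGT into the default ticket cache as the submitting user
kinit user@EXAMPLE.COM

# Point the console consumer at a jaas.conf that has useTicketCache=true
export KAFKA_OPTS="-Djava.security.auth.login.config=/home/user/jaas.conf"
kafka-console-consumer --bootstrap-server broker:9092 --topic test-topic \
  --consumer.config consumer.properties
```

This is exactly what works for me outside of Spark; the question is how to get the executors into the same state.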

Not sure if this is by design or if I am missing something.

Thanks,
Hugo
