Issue:

Unable to broadcast or place files locally on the Spark worker nodes from a
Spark application in cluster deploy mode. The Spark job always throws
FileNotFoundException.


Issue Description:

We are trying to access a Kafka cluster configured with SSL encryption from
a Spark Streaming job that acts as a consumer application. In the Spark
Streaming application, we have to configure SSL-related properties
(jaas.conf, keystore, and truststore files) to access the Kafka cluster. We
also have to set the system property below, since the Kafka client reads
the jaas.conf location from a JVM system property. So we had to place the
jaas.conf file locally on all the worker nodes so that the Spark
application can find the file while running in cluster mode. The same
applies to the keystore and truststore files, so we placed all of these
files on all the worker nodes at the same path.

System.setProperty("java.security.auth.login.config", "jaas.conf")
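For context, the SSL-related consumer settings described above can be sketched as follows. The property names (security.protocol, ssl.keystore.location, etc.) are standard Kafka consumer configs; the paths and passwords are placeholders, and the class name is made up for illustration:

```java
import java.util.Properties;

public class KafkaSslProps {

    public static Properties build() {
        // Point the JVM at the JAAS file. This path must exist locally on
        // the node where this JVM runs (placeholder path shown), which is
        // exactly why the file has to be present on every worker node.
        System.setProperty("java.security.auth.login.config", "/etc/kafka/jaas.conf");

        Properties props = new Properties();
        // Standard Kafka consumer SSL settings; locations and passwords
        // below are placeholders for illustration only.
        props.put("security.protocol", "SASL_SSL");
        props.put("ssl.truststore.location", "/etc/kafka/kafka.client.truststore.jks");
        props.put("ssl.truststore.password", "changeit");
        props.put("ssl.keystore.location", "/etc/kafka/kafka.client.keystore.jks");
        props.put("ssl.keystore.password", "changeit");
        return props;
    }

    public static void main(String[] args) {
        Properties props = build();
        System.out.println(props.getProperty("security.protocol"));
    }
}
```

Note that these properties are passed to the Kafka consumer, whereas the JAAS file location is read from the JVM system property, so the two are configured through different channels.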


I created a SparkContext from the StreamingContext and tried
SparkContext.addFile("jaas.conf") to distribute the file, then
SparkFiles.get("jaas.conf") to retrieve it. But it still throws
FileNotFoundException.

I also tried putting the file in HDFS, but System.setProperty at the JVM
level does not accept an HDFS path. So in the Spark application I tried
downloading the file from HDFS to the local current working directory where
the application runs, but that did not work either.
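For what it's worth, an alternative to copying files by hand is to ship them with spark-submit's --files option, which places each file in every executor's working directory, and then point the driver and executor JVMs at the local copy via the extraJavaOptions configs. A sketch of the submit command, with placeholder jar and class names:

```shell
# Ship jaas.conf and the stores to each executor's working directory,
# then set the JAAS system property on both driver and executor JVMs.
# The application class and jar names below are placeholders.
spark-submit \
  --deploy-mode cluster \
  --files jaas.conf,kafka.client.keystore.jks,kafka.client.truststore.jks \
  --conf "spark.driver.extraJavaOptions=-Djava.security.auth.login.config=./jaas.conf" \
  --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=./jaas.conf" \
  --class com.example.StreamingConsumer \
  streaming-consumer.jar
```

Setting the property through extraJavaOptions sidesteps the timing problem with System.setProperty in application code: the JVM flag is in place before any Kafka client classes load on the executors.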

Solution:

We placed the jaas.conf, keystore, and truststore files locally on all the
worker nodes and referenced those local paths in the Spark application;
then it worked.
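For completeness, a minimal sketch of the jaas.conf that would live at the same local path on every worker node. This assumes SASL/Kerberos authentication alongside SSL (which is the usual reason a jaas.conf is needed); the principal, keytab path, and target path are all placeholders:

```shell
# Write a placeholder jaas.conf; in a real deployment this file would be
# provisioned to the same path on every worker node.
cat > /tmp/jaas.conf <<'EOF'
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/etc/security/keytabs/kafka_client.keytab"
  principal="kafkaclient@EXAMPLE.COM";
};
EOF
```

The path given to java.security.auth.login.config must match wherever this file actually lands on each node.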


Thanks



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
