Thanks for the reply, RK.
Using the first option, my application doesn't recognize
spark.driver.extraJavaOptions.
With the second option, the issue remains the same:
2016-07-21 12:59:41 ERROR SparkContext:95 - Error initializing SparkContext.
org.apache.spark.SparkException: Found
This has worked for me (note the "=" between the key and the -D option):

--conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/some/path/search-spark-service-log4j-Driver.properties" \

You may want to try it.
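For completeness, here is a sketch of a full invocation covering both the driver and the executors. The executor-side properties file name, the --files usage, the deploy mode, and the jar path are my assumptions, not from this thread; with --files, the configs are shipped to each container's working directory, so a relative file: URL can reference them there:

```shell
# Ship both log4j configs alongside the job, then point driver and
# executors at their local copies via -Dlog4j.configuration.
# Paths, the Executor properties file, and the jar are placeholders.
spark-submit \
  --class xyx.search.spark.Boot \
  --deploy-mode cluster \
  --conf "spark.cores.max=6" \
  --conf "spark.eventLog.enabled=true" \
  --files /some/path/search-spark-service-log4j-Driver.properties,/some/path/search-spark-service-log4j-Executor.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:search-spark-service-log4j-Driver.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:search-spark-service-log4j-Executor.properties" \
  /some/path/your-app.jar
```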
If that doesn't work, you can use --properties-file instead.
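A sketch of the --properties-file route, in case it helps. The file name and the executor entry are my assumptions; the file uses the same whitespace-separated key/value format as conf/spark-defaults.conf:

```shell
# /some/path/custom-spark.conf might contain (key  value, one per line):
#   spark.cores.max                6
#   spark.eventLog.enabled         true
#   spark.driver.extraJavaOptions  -Dlog4j.configuration=file:/some/path/search-spark-service-log4j-Driver.properties

# Then pass it instead of the individual --conf flags:
spark-submit \
  --class xyx.search.spark.Boot \
  --properties-file /some/path/custom-spark.conf \
  /some/path/your-app.jar
```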
> While starting the application in CLUSTER mode, I want to pass a
> custom log4j.properties file to both driver & executor.
>
> *I have the below command :-*
>
> spark-submit \
> --class xyx.search.spark.Boot \
> --conf "spark.cores.max=6" \
> --conf "spark.eventLog.enabled=true" \
> *--conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/some/path/search-spark-service-log4j-Driver.properties" \*
> -