Hello there,

I am running a Java Spark application. Most of the modules write to a log
file (separate from the Spark log file). I can run the application either
with "java -jar" or with "spark-submit".

If I use "java -jar myApp.jar", the log file is generated in the directory
$LOG_DIR, or in a default directory if the environment variable $LOG_DIR is
not set. The problem is that this way the application doesn't take Spark
config options such as "--master local[3]" on the command line; I have to
hardcode master("local[3]") and the like in the code. Apparently I can't
hardcode that for running on a cluster.
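
For reference, this is roughly what the hardcoded setup looks like (the
class name, logger, and appName here are just placeholders, not my actual
code; the module logger is configured via log4j to write under $LOG_DIR):

    import org.apache.log4j.Logger;
    import org.apache.spark.sql.SparkSession;

    public class MyApp {
        // Module logger; log4j is configured to write to a file under $LOG_DIR
        private static final Logger LOG = Logger.getLogger(MyApp.class);

        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("MyApp")
                    .master("local[3]")  // hardcoded; what I'd rather pass on the command line
                    .getOrCreate();

            LOG.info("application started");
            // ... job logic ...

            spark.stop();
        }
    }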

If I use "spark-submit myApp.jar --master local[3] --xxx ", no custom log
files are generated anywhere, neither in $LOG_DIR nor in the default directory.

My question is: how can I use "spark-submit" and still have the custom log
files generated?
Does anyone know what is happening and how to fix it?

Thanks in advance!

Best,

Mann
