For YARN, you need to upload your log4j.properties separately from your app's jar, because of some internal issues that are too boring to explain here. :-)
Basically:

  spark-submit --master yarn --files log4j.properties blah blah blah

Having to keep it outside your app jar is sub-optimal, and I think there's a bug filed to fix this, but so far no one has really spent time looking at it.

On Wed, Feb 11, 2015 at 4:29 AM, Emre Sevinc <emre.sev...@gmail.com> wrote:
> Hello,
>
> I'm building an Apache Spark Streaming application and cannot make it log
> to a file on the local filesystem when running it on YARN. How can I
> achieve this?
>
> I've set up a log4j.properties file so that it can successfully write to a
> log file in the /tmp directory on the local file system (shown below
> partially):
>
> log4j.appender.file=org.apache.log4j.FileAppender
> log4j.appender.file.File=/tmp/application.log
> log4j.appender.file.append=false
> log4j.appender.file.layout=org.apache.log4j.PatternLayout
> log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n
>
> When I run my Spark application locally with the following command:
>
> spark-submit --class myModule.myClass --master local[2] --deploy-mode client myApp.jar
>
> it runs fine and I can see that log messages are written to
> /tmp/application.log on my local file system.
>
> But when I run the same application via YARN, e.g.
>
> spark-submit --class myModule.myClass --master yarn-client --name "myModule" --total-executor-cores 1 --executor-memory 1g myApp.jar
>
> or
>
> spark-submit --class myModule.myClass --master yarn-cluster --name "myModule" --total-executor-cores 1 --executor-memory 1g myApp.jar
>
> I cannot see any /tmp/application.log on the local file system of the
> machine that runs YARN.
>
> What am I missing?
>
> --
> Emre Sevinç

--
Marcelo
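Putting the pieces of the thread together, a fuller sketch of the submit command (class and jar names are taken from Emre's examples; the two extraJavaOptions lines are a common pattern for making the driver and executor JVMs pick up the shipped file by name, offered here as an assumption rather than a verified fix):

```shell
# Ship log4j.properties into each YARN container's working directory
# via --files, then point both JVMs at it with -Dlog4j.configuration.
spark-submit \
  --class myModule.myClass \
  --master yarn-cluster \
  --files log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  myApp.jar
```

Note that with a FileAppender writing to /tmp/application.log, the file is created on whichever cluster node each container happens to run on, not on the machine you submit from; `yarn logs -applicationId <appId>` is the usual way to collect per-container logs afterwards.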