Hi,

I would like to report back on the first option Lan proposed - putting the
log4j.properties file under the root of my application jar.
It does not appear to work in my case: submitting the application to Spark
from the application code (not with spark-submit). It seems that in this
case the executor does not see the log4j.properties located in the
application jar and uses the default properties file instead.
I conclude this from the fact that my log file is never created, and from
what I see in the executor console:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
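
I suspect this happens because the executor JVM initializes log4j at
startup, before the application jar has been fetched, so a file inside the
jar is not yet on the classpath when logging is configured. For reference,
this is roughly how the application creates the context (a sketch only; the
app name, master URL, and jar path are placeholders, not my real values):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

SparkConf sparkConf = new SparkConf()
        .setAppName("my-app")                  // placeholder name
        .setMaster("spark://master-host:7077") // placeholder master URL
        // application jar with log4j.properties at its root (placeholder path)
        .setJars(new String[] { "target/my-app.jar" });
JavaSparkContext sc = new JavaSparkContext(sparkConf);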

I have also tried adding the configuration below to the application code,
which did not have any apparent effect:

sparkConf.set("spark.executor.extraJavaOptions",
"-Dlog4j.configuration=log4j.properties");

On Mon, Apr 20, 2015 at 5:59 PM, Lan Jiang <ljia...@gmail.com> wrote:

> Rename your log4j_special.properties file as log4j.properties and place it
> under the root of your jar file, and you should be fine.
>
> If you are using Maven to build your jar, please place the log4j.properties
> in the src/main/resources folder.
>
> However, please note that if another dependency jar on the classpath
> contains its own log4j.properties file, this might not work, since the
> first log4j.properties file that is loaded will be used.
>
> You can also do spark-submit --files log4j_special.properties ..., which
> should transfer your log4j properties file to the worker nodes
> automatically without you copying it manually.
>
> Lan
>
>
> > On Apr 20, 2015, at 9:26 AM, Michael Ryabtsev <michael...@gmail.com>
> > wrote:
> >
> > Hi all,
> >
> > I need to configure the spark executor's log4j.properties on a standalone
> > cluster. It looks like placing the relevant properties file in the spark
> > configuration folder and setting spark.executor.extraJavaOptions from my
> > application code:
> >
> > sparkConf.set("spark.executor.extraJavaOptions",
> > "-Dlog4j.configuration=log4j_special.properties");
> >
> > does the job, and the executor logs are written at the required place and
> > level. As far as I understand, this works because the spark configuration
> > folder is on the classpath, so passing the file name without a path
> > resolves here. However, I would like to avoid deploying these properties
> > to each worker's spark configuration folder.
> > I wonder, if I put the properties in my application jar, is there any way
> > of telling the executor to load them?
> >
> > Thanks,
