Prithish,

I saw you posted this on SO, so I responded there just now. See
http://stackoverflow.com/questions/42452622/custom-log4j-properties-on-aws-emr/42516161#42516161

In short, an hdfs:// path can't be used to configure log4j because log4j
knows nothing about HDFS. Instead, since you are using EMR, you should use
the Configuration API when creating your cluster to configure the
spark-log4j configuration classification. See
http://docs.aws.amazon.com/emr/latest/ReleaseGuide/emr-configure-apps.html for
more info.
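
For example (a sketch only; the property names and values below are
illustrative, so substitute whatever your custom log4j.properties sets), a
spark-log4j classification supplied at cluster creation could look like:

    [
      {
        "Classification": "spark-log4j",
        "Properties": {
          "log4j.rootCategory": "INFO, console",
          "log4j.logger.com.example": "DEBUG"
        }
      }
    ]

If you create the cluster from the CLI, that JSON can be passed with something
like (other options omitted, release label is just an example):

    aws emr create-cluster --release-label emr-5.4.0 \
      --applications Name=Spark \
      --configurations file://./spark-log4j.json \
      ...

EMR then applies those properties to Spark's log4j.properties on the cluster
nodes, so there is no need to point -Dlog4j.configuration at an hdfs:// path.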

~ Jonathan

On Sun, Feb 26, 2017 at 8:17 PM Prithish <prith...@gmail.com> wrote:

> Steve, I tried that, but didn't work. Any other ideas?
>
> On Mon, Feb 27, 2017 at 1:42 AM, Steve Loughran <ste...@hortonworks.com>
> wrote:
>
> try giving a resource of a file in the JAR, e.g. add a file
> "log4j-debugging.properties" to the jar, and give a config option of
> -Dlog4j.configuration=/log4j-debugging.properties (maybe also try without
> the "/")
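>
> As a rough sketch (the class and jar names here are placeholders), with
> log4j-debugging.properties packaged at the root of the application jar, the
> submit would look something like:
>
>     spark-submit \
>       --class com.example.MyApp \
>       --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j-debugging.properties" \
>       --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j-debugging.properties" \
>       myapp.jar
>
> When the value has no URL scheme, log4j 1.x falls back to resolving it as a
> classpath resource, which is why the in-jar file can be picked up.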
>
>
> On 26 Feb 2017, at 16:31, Prithish <prith...@gmail.com> wrote:
>
> Hoping someone can answer this.
>
> I am unable to override and use a custom log4j.properties on Amazon EMR. I
> am running Spark on EMR (YARN) and have tried all of the combinations below
> with spark-submit to use the custom log4j configuration.
>
> In client mode:
> --driver-java-options "-Dlog4j.configuration=hdfs://host:port/user/hadoop/log4j.properties"
>
> In cluster mode:
> --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=hdfs://host:port/user/hadoop/log4j.properties"
>
> I have also tried reading it from the local filesystem using file:// instead
> of hdfs. None of these seems to work. However, I can get this working when
> running on my local YARN setup.
>
> Any ideas?
>
> I have also posted this on Stack Overflow (link below):
>
> http://stackoverflow.com/questions/42452622/custom-log4j-properties-on-aws-emr
>
>
>
>
