Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-25 Thread Utkarsh Sengar
Looks like I'm stuck then, I am using Mesos. Adding these 2 jars to all executors might be a problem for me, so I will probably try to remove the dependency on the otj-logging lib and just use log4j. On Tue, Aug 25, 2015 at 2:15 PM, Marcelo Vanzin wrote: > On Tue, Aug 25, 2015 at 1:50 PM, Utkars

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-25 Thread Marcelo Vanzin
On Tue, Aug 25, 2015 at 1:50 PM, Utkarsh Sengar wrote: > So do I need to manually copy these 2 jars on my spark executors? Yes. I can think of a way to work around that if you're using YARN, but not with other cluster managers. > On Tue, Aug 25, 2015 at 10:51 AM, Marcelo Vanzin > wrote: >> >> O
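
Marcelo's answer implies physically placing the two logback jars on every executor host when a YARN-style file distribution is not available. A minimal sketch of what that manual copy could look like on a Mesos cluster; the host names and the target directory /opt/spark/extra-jars are hypothetical, and the jar paths follow the ones shown elsewhere in the thread:

    M2=/.m2/repository/ch/qos/logback
    for host in mesos-slave-1 mesos-slave-2; do   # hypothetical executor hosts
      scp "$M2/logback-core/1.1.2/logback-core-1.1.2.jar" \
          "$M2/logback-classic/1.1.2/logback-classic-1.1.2.jar" \
          "$host":/opt/spark/extra-jars/
    done
    # The executors can then be pointed at that location, e.g.:
    #   --conf spark.executor.extraClassPath=/opt/spark/extra-jars/logback-core-1.1.2.jar:/opt/spark/extra-jars/logback-classic-1.1.2.jar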

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-25 Thread Utkarsh Sengar
So do I need to manually copy these 2 jars on my spark executors? On Tue, Aug 25, 2015 at 10:51 AM, Marcelo Vanzin wrote: > On Tue, Aug 25, 2015 at 10:48 AM, Utkarsh Sengar > wrote: > > Now I am going to try it out on our mesos cluster. > > I assumed "spark.executor.extraClassPath" takes csv

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-25 Thread Marcelo Vanzin
On Tue, Aug 25, 2015 at 10:48 AM, Utkarsh Sengar wrote: > Now I am going to try it out on our mesos cluster. > I assumed "spark.executor.extraClassPath" takes csv as jars the way "--jars" > takes it but it should be ":" separated like a regular classpath jar. Ah, yes, those options are just raw c
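
The distinction being drawn here, illustrated with hypothetical jar paths: --jars takes a comma-separated list, while the extraClassPath options are raw JVM classpath strings, so entries are ":"-separated on Linux/macOS:

    # --jars: comma-separated list handed to Spark
    spark-submit --jars /opt/libs/a.jar,/opt/libs/b.jar ...
    # extraClassPath: a plain classpath string, ":"-separated
    spark-submit --conf spark.executor.extraClassPath=/opt/libs/a.jar:/opt/libs/b.jar ...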

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-25 Thread Utkarsh Sengar
This worked for me locally: spark-1.4.1-bin-hadoop2.4/bin/spark-submit --conf spark.executor.extraClassPath=/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar:/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar --conf spark.driver.extraClassPath=/.m2
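
The archived command is cut off above; the same invocation is easier to read with shell variables. The jar locations are taken from the snippet, the driver-side value is assumed (not confirmed by the truncated text) to mirror the executor-side one, and the application class and jar are placeholders:

    LOGBACK=/.m2/repository/ch/qos/logback
    CP="$LOGBACK/logback-core/1.1.2/logback-core-1.1.2.jar:$LOGBACK/logback-classic/1.1.2/logback-classic-1.1.2.jar"
    spark-1.4.1-bin-hadoop2.4/bin/spark-submit \
      --conf spark.executor.extraClassPath="$CP" \
      --conf spark.driver.extraClassPath="$CP" \
      --class com.example.SparkRunner target/app.jar   # hypothetical application jar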

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Utkarsh Sengar
I get the same error even when I set the SPARK_CLASSPATH: export SPARK_CLASSPATH=/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.1.jar:/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar And I run the job like this: /spark-1.4.1-bin-hadoop2.4/bin/spark-
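
One thing worth noting in the snippet above: the SPARK_CLASSPATH entry points at the logback-classic 1.1.2 directory but names a 1.1.1 jar file. That may just be a typo in the email, but the JVM silently ignores classpath entries that do not exist, so a quick check is worthwhile:

    # Confirm the jar file named in SPARK_CLASSPATH actually exists.
    ls /.m2/repository/ch/qos/logback/logback-classic/1.1.2/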

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Utkarsh Sengar
I assumed that's the case because of the error I got and the documentation, which says: "Extra classpath entries to append to the classpath of the driver." This is where I stand now: <dependency> <groupId>org.apache.spark</groupId> <artifactId>spark-core_2.10</artifactId> <version>1.4.1</version>

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Marcelo Vanzin
On Mon, Aug 24, 2015 at 3:58 PM, Utkarsh Sengar wrote: > That didn't work since "extraClassPath" flag was still appending the jars at > the end, so its still picking the slf4j jar provided by spark. Out of curiosity, how did you verify this? The "extraClassPath" options are supposed to prepend en
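
One way to answer the verification question is to rely on the warning slf4j itself emits when it sees more than one binding: it lists every StaticLoggerBinder location it found and then reports which one it actually bound to. A sketch, with the application class and jar as placeholders:

    # slf4j prints "Class path contains multiple SLF4J bindings" and
    # "Actual binding is of type [...]" to stderr when several bindings are present.
    spark-submit --class com.example.SparkRunner target/app.jar 2>&1 | grep -i slf4j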

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Utkarsh Sengar
That didn't work since the "extraClassPath" flag was still appending the jars at the end, so it's still picking the slf4j jar provided by Spark. Although I found this flag: --conf "spark.executor.userClassPathFirst=true" (http://spark.apache.org/docs/latest/configuration.html) and tried this: ➜ simspa
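
For reference, the executor-side flag has a driver-side counterpart, and both are marked experimental in the 1.4 docs; they give user-added jars (e.g. those passed via --jars) precedence over Spark's own classes. A sketch with hypothetical paths and application jar:

    spark-submit \
      --jars /opt/libs/logback-core-1.1.2.jar,/opt/libs/logback-classic-1.1.2.jar \
      --conf spark.driver.userClassPathFirst=true \
      --conf spark.executor.userClassPathFirst=true \
      --class com.example.SparkRunner target/app.jar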

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Marcelo Vanzin
Hi Utkarsh, A quick look at slf4j's source shows it loads the first "StaticLoggerBinder" in your classpath. How are you adding the logback jar file to spark-submit? If you use "spark.driver.extraClassPath" and "spark.executor.extraClassPath" to add the jar, it should take precedence over the log4
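
Marcelo's point about slf4j picking the first StaticLoggerBinder on the classpath can be checked against the Spark distribution itself: the binder contributed by slf4j-log4j12 lives at org/slf4j/impl/StaticLoggerBinder.class inside the assembly jar. A sketch; the exact assembly file name depends on the build:

    # Show the log4j binder bundled into the Spark assembly jar.
    unzip -l spark-1.4.1-bin-hadoop2.4/lib/spark-assembly-*.jar | grep StaticLoggerBinder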

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Utkarsh Sengar
Hi Marcelo, When I add this exclusion rule to my pom: <dependency> <groupId>org.apache.spark</groupId> <artifactId>spark-core_2.10</artifactId> <version>1.4.1</version> <exclusions> <exclusion> <groupId>org.slf4j</groupId> <artifactId>slf4j-log4j12</artifactId> </exclusion> </exclusions> </dependency> The SparkRunner class work
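
A quick way to confirm the exclusion really removes slf4j-log4j12 from the application's own dependency graph (it cannot affect what spark-submit itself puts on the classpath) is Maven's dependency tree:

    # Empty output means nothing in the application build still pulls in slf4j-log4j12.
    mvn dependency:tree -Dincludes=org.slf4j:slf4j-log4j12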

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Marcelo Vanzin
Hi Utkarsh, Unfortunately that's not going to be easy. Since Spark bundles all dependent classes into a single fat jar file, to remove that dependency you'd need to modify Spark's assembly jar (potentially in all your nodes). Doing that per-job is even trickier, because you'd probably need some ki
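
For completeness, "modifying Spark's assembly jar" would mean stripping the log4j binding out of the fat jar on every node, for example with zip -d. This is a sketch only, since it has to be repeated on each node and redone after every Spark upgrade; the assembly file name may differ:

    # Remove the slf4j-log4j12 binder classes from a copy of the assembly jar.
    cp lib/spark-assembly-1.4.1-hadoop2.4.0.jar lib/spark-assembly-patched.jar
    zip -d lib/spark-assembly-patched.jar 'org/slf4j/impl/*'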