Looks like I'm stuck then, since I am using Mesos.
Adding these 2 jars to all executors might be a problem for me, so I will
probably try to remove the dependency on the otj-logging lib and just
use log4j.
On Tue, Aug 25, 2015 at 2:15 PM, Marcelo Vanzin wrote:
On Tue, Aug 25, 2015 at 1:50 PM, Utkarsh Sengar wrote:
> So do I need to manually copy these 2 jars on my spark executors?
Yes. I can think of a way to work around that if you're using YARN,
but not with other cluster managers.
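One YARN-style trick (a sketch only, not necessarily what Marcelo has in mind) relies on YARN copying "--jars" files into each container's working directory, so the executor classpath entries can be bare file names; "/local/path" below is a placeholder:

spark-submit --master yarn \
  --jars /local/path/logback-core-1.1.2.jar,/local/path/logback-classic-1.1.2.jar \
  --conf spark.executor.extraClassPath=logback-core-1.1.2.jar:logback-classic-1.1.2.jar \
  ...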
So do I need to manually copy these 2 jars on my spark executors?
On Tue, Aug 25, 2015 at 10:51 AM, Marcelo Vanzin wrote:
On Tue, Aug 25, 2015 at 10:48 AM, Utkarsh Sengar wrote:
> Now I am going to try it out on our mesos cluster.
> I assumed "spark.executor.extraClassPath" takes a CSV of jars the way "--jars"
> does, but it should be ":"-separated like a regular classpath.
Ah, yes, those options are just raw classpath strings, so the entries need to be ":"-separated.
This worked for me locally:

spark-1.4.1-bin-hadoop2.4/bin/spark-submit \
  --conf spark.executor.extraClassPath=/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar:/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar \
  --conf spark.driver.extraClassPath=/.m2
I get the same error even when I set SPARK_CLASSPATH:

export SPARK_CLASSPATH=/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar:/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar

And I run the job like this: /spark-1.4.1-bin-hadoop2.4/bin/spark-
I assumed that's the case because of the error I got and the documentation,
which says: "Extra classpath entries to append to the classpath of the
driver."
This is where I stand now:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.4.1</version>
    ...
On Mon, Aug 24, 2015 at 3:58 PM, Utkarsh Sengar wrote:
> That didn't work since the "extraClassPath" flag was still appending the jars at
> the end, so it's still picking up the slf4j jar provided by Spark.
Out of curiosity, how did you verify this? The "extraClassPath"
options are supposed to prepend entries to the classpath.
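One way to verify the ordering is to list every copy of the binder class the classloader can see; slf4j uses the first one. A minimal sketch (the class name is made up):

import java.net.URL;
import java.util.Enumeration;

public class BindingOrder {
    public static void main(String[] args) throws Exception {
        // Resources come back in classpath order, so the first URL printed
        // is the StaticLoggerBinder that slf4j will actually load.
        Enumeration<URL> urls = BindingOrder.class.getClassLoader()
                .getResources("org/slf4j/impl/StaticLoggerBinder.class");
        while (urls.hasMoreElements()) {
            System.out.println(urls.nextElement());
        }
    }
}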
That didn't work since the "extraClassPath" flag was still appending the jars
at the end, so it's still picking up the slf4j jar provided by Spark.
Although I found this flag: --conf "spark.executor.userClassPathFirst=true"
(http://spark.apache.org/docs/latest/configuration.html) and tried this:
➜ simspa
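There is a driver-side twin of that flag as well; a sketch of setting both (jar configuration elided, flag names as listed in the Spark 1.4 configuration docs):

spark-submit \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  ...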
Hi Utkarsh,
A quick look at slf4j's source shows it loads the first
"StaticLoggerBinder" in your classpath. How are you adding the logback
jar file to spark-submit?
If you use "spark.driver.extraClassPath" and
"spark.executor.extraClassPath" to add the jar, it should take
precedence over the log4j binding bundled with Spark.
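A quick way to see which binding actually won inside the driver or an executor is to log the concrete logger factory; a minimal sketch (the class name is made up, works with slf4j 1.x):

import org.slf4j.LoggerFactory;

public class WhichBinding {
    public static void main(String[] args) {
        // Prints the ILoggerFactory implementation slf4j bound to, e.g.
        // ch.qos.logback.classic.LoggerContext if logback won, or
        // org.slf4j.impl.Log4jLoggerFactory if Spark's log4j binding won.
        System.out.println(LoggerFactory.getILoggerFactory().getClass().getName());
    }
}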
Hi Marcelo,
When I add this exclusion rule to my pom:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.4.1</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
    </exclusions>
</dependency>

The SparkRunner class works
Hi Utkarsh,
Unfortunately that's not going to be easy. Since Spark bundles all
dependent classes into a single fat jar file, to remove that
dependency you'd need to modify Spark's assembly jar (potentially in
all your nodes). Doing that per-job is even trickier, because you'd
probably need some kind of per-job modification of the assembly.
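A quick way to confirm the binding really is baked into the assembly (jar path as shipped in the 1.4.1/hadoop2.4 binary distribution; adjust if yours differs):

unzip -l spark-1.4.1-bin-hadoop2.4/lib/spark-assembly-1.4.1-hadoop2.4.0.jar | grep StaticLoggerBinder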