Hi guys,
This has been solved. These emails are from last week when the mailing list 
didn’t work.

From: Tathagata Das [mailto:tathagata.das1...@gmail.com]
Sent: May-15-14 4:50 PM
To: user@spark.apache.org
Cc: u...@spark.incubator.apache.org
Subject: Re: same log4j slf4j error in spark 9.1

Spark 0.9.1 does not depend on log4j-over-slf4j (the SBT build file for 0.9.1 is 
here: https://git-wip-us.apache.org/repos/asf?p=spark.git;a=blob;f=project/SparkBuild.scala;h=52e894e25635a2029464f35a87d65fc108975350;hb=4c43182b6d1b0b7717423f386c0214fe93073208). 
Are you sure that no other dependency in your project is bringing it onto the 
classpath? Alternatively, if you don't want slf4j-log4j12 from Spark, you can 
safely exclude it in your dependencies.
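
For example, a minimal build.sbt sketch of such an exclusion (the artifact name 
and version are illustrative, not taken from any particular project):

    // build.sbt -- sketch: keep Spark but drop its slf4j-log4j12 binding,
    // so it cannot collide with log4j-over-slf4j elsewhere on the classpath
    libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1" exclude("org.slf4j", "slf4j-log4j12")

    // or, to strip the binding from every dependency at once:
    // libraryDependencies ~= { _.map(_.exclude("org.slf4j", "slf4j-log4j12")) }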

TD

On Thu, May 8, 2014 at 12:56 PM, Adrian Mocanu 
<amoc...@verticalscope.com> wrote:
I recall someone from the Spark team (TD?) saying that Spark 0.9.1 would change 
the logger so that the circular-loop error between slf4j and log4j wouldn't show up.

Yet on Spark 0.9.1 I still get:
SLF4J: Detected both log4j-over-slf4j.jar AND slf4j-log4j12.jar on the class 
path, preempting StackOverflowError.
SLF4J: See also http://www.slf4j.org/codes.html#log4jDelegationLoop for more 
details.

Any solutions?

-Adrian

