This isn't released yet, but we're planning to cut a 0.9.1 release very soon (most likely this week). In the meantime you'll have to check out branch-0.9 of Spark, publish it locally, and then depend on the snapshot version. Or just wait it out...
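Roughly: clone Spark, check out branch-0.9, run the publish-local task via Spark's sbt so the artifacts land in your local Ivy cache, and then point your build at the snapshot. A sketch of the dependency lines, assuming the version on branch-0.9 is 0.9.1-SNAPSHOT (check the branch's build files first; the exact snapshot version may differ):

val spark = "org.apache.spark" % "spark-core_2.10" % "0.9.1-SNAPSHOT"                 // version is an assumption
val sparkStreaming = "org.apache.spark" % "spark-streaming_2.10" % "0.9.1-SNAPSHOT"   // match whatever branch-0.9 actually publishes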
On Fri, Mar 14, 2014 at 2:01 PM, Adrian Mocanu <amoc...@verticalscope.com> wrote:
> That's great!
>
> How would I pull that with sbt?
>
> I currently use these two (mvnrepository.com/artifact/org.spark-project seems
> to be down at the moment):
>
> val spark = "org.apache.spark" % "spark-core_2.10" % "0.9.0-incubating"
> val sparkStreaming = "org.apache.spark" % "spark-streaming_2.10" % "0.9.0-incubating"
>
> Thanks!
> -A
>
> From: Sean Owen [mailto:so...@cloudera.com]
> Sent: March-14-14 4:33 PM
> To: user@spark.apache.org
> Cc: u...@spark.incubator.apache.org
> Subject: Re: slf4j and log4j loop
>
> Yes, I think you are interested in this issue and fix:
> https://github.com/apache/spark/pull/107
>
> --
> Sean Owen | Director, Data Science | London
>
> On Fri, Mar 14, 2014 at 1:04 PM, Adrian Mocanu <amoc...@verticalscope.com> wrote:
>
> Hi
>
> Have you encountered an slf4j/log4j loop when using Spark? I pull a few
> packages via sbt.
>
> The Spark package uses slf4j-log4j12.jar and another package uses
> log4j-over-slf4j.jar, which creates the circular loop between the two loggers
> and thus the exception below. Do you know of a fix for this?
>
> SLF4J: Detected both log4j-over-slf4j.jar AND slf4j-log4j12.jar on the class
> path, preempting StackOverflowError.
> SLF4J: See also http://www.slf4j.org/codes.html#log4jDelegationLoop for more
> details.
> An exception or error caused a run to abort.
> java.lang.ExceptionInInitializerError
>         at org.apache.log4j.Logger.getLogger(Logger.java:40)
>         at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:58)
>         at org.apache.spark.SparkEnv$.create(SparkEnv.scala:126)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:139)
>         at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:500)
>         at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:76)
>         ...
> Caused by: java.lang.IllegalStateException: Detected both
> log4j-over-slf4j.jar AND slf4j-log4j12.jar on the class path, preempting
> StackOverflowError. See also
> http://www.slf4j.org/codes.html#log4jDelegationLoop for more details.
>         at org.apache.log4j.Log4jLoggerFactory.<clinit>(Log4jLoggerFactory.java:51)
>         ... 54 more
>
> Thanks
> -Adrian
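As for the logging loop itself: until the fix referenced in the quoted thread is in a release, a common sbt-level workaround is to make sure only one slf4j binding ends up on the classpath, for example by excluding slf4j-log4j12 from the Spark artifacts. A sketch, not tested against your exact dependency set:

// Keep slf4j-log4j12 off the classpath so log4j-over-slf4j is the only log4j bridge left;
// you then need some other slf4j backend (e.g. logback) for output.
val spark = "org.apache.spark" % "spark-core_2.10" % "0.9.0-incubating" exclude("org.slf4j", "slf4j-log4j12")
val sparkStreaming = "org.apache.spark" % "spark-streaming_2.10" % "0.9.0-incubating" exclude("org.slf4j", "slf4j-log4j12")

Alternatively, exclude log4j-over-slf4j from whichever other dependency pulls it in; either way, the two jars can't coexist.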