Scala 2.11 external dependencies

2014-08-02 Thread Anand Avati
We are currently blocked on the unavailability of the following external dependencies for porting Spark to Scala 2.11 [SPARK-1812 Jira]: - akka-*_2.11 (2.3.4-shaded-protobuf from org.spark-project). The shaded protobuf needs to be 2.5.0, and the shading is needed because Hadoop1 specifically needs p
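
For context, a hedged sketch of what the blocked artifact's coordinates would look like in an sbt build, reusing the group and version named in this message (the %% cross-build suffix is exactly the part that has no _2.11 publication yet):

```scala
// Illustrative only: coordinates follow the org.spark-project shaded-akka naming
// mentioned in this thread; with scalaVersion := "2.11.x" this resolves to
// akka-actor_2.11, which is the artifact that does not exist yet.
libraryDependencies += "org.spark-project" %% "akka-actor" % "2.3.4-shaded-protobuf"
```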

Re: Using mllib-1.1.0-SNAPSHOT on Spark 1.0.1

2014-08-02 Thread Xiangrui Meng
Yes, that should work. spark-mllib-1.1.0 should be compatible with spark-core-1.0.1. On Sat, Aug 2, 2014 at 10:54 AM, Debasish Das wrote: > Let me try it... > > Will this be fixed if I generate an assembly file with the mllib-1.1.0-SNAPSHOT > jar and other dependencies with the rest of the application

Re: Using mllib-1.1.0-SNAPSHOT on Spark 1.0.1

2014-08-02 Thread Debasish Das
Let me try it... Will this be fixed if I generate an assembly file with the mllib-1.1.0-SNAPSHOT jar and other dependencies with the rest of the application code? On Sat, Aug 2, 2014 at 10:46 AM, Xiangrui Meng wrote: > You can try enabling "spark.files.userClassPathFirst". But I'm not > sure whet
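
A hedged sketch of the build wiring this suggests (the "provided" scoping and versions are assumptions for the sketch, not from the thread): keep the cluster-supplied spark-core out of the assembly and bundle the locally published mllib snapshot with the application code:

```scala
// Illustrative build.sbt fragment for building the application assembly.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.0.1" % "provided",  // already deployed on the cluster
  "org.apache.spark" %% "spark-mllib" % "1.1.0-SNAPSHOT"       // locally built snapshot, bundled
)
```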

Re: Using mllib-1.1.0-SNAPSHOT on Spark 1.0.1

2014-08-02 Thread Xiangrui Meng
You can try enabling "spark.files.userClassPathFirst". But I'm not sure whether it could solve your problem. -Xiangrui On Sat, Aug 2, 2014 at 10:13 AM, Debasish Das wrote: > Hi, > > I have deployed stable Spark 1.0.1 on the cluster but I have new code that > I added in mllib-1.1.0-SNAPSHOT. > > I
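
A minimal sketch of enabling that property programmatically (the property name is from the reply above; the application skeleton and app name are assumptions):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Ask executors to prefer user-supplied classes over Spark's own copies.
val conf = new SparkConf()
  .setAppName("mllib-snapshot-test")              // illustrative app name
  .set("spark.files.userClassPathFirst", "true")  // property suggested in the reply
val sc = new SparkContext(conf)
```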

Using mllib-1.1.0-SNAPSHOT on Spark 1.0.1

2014-08-02 Thread Debasish Das
Hi, I have deployed stable Spark 1.0.1 on the cluster, but I have new code that I added in mllib-1.1.0-SNAPSHOT. I am trying to access the new code using spark-submit as follows: spark-job --class com.verizon.bda.mllib.recommendation.ALSDriver --executor-memory 16g --total-executor-cores 16 --jar
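
For comparison, a hedged sketch of a plain spark-submit invocation that ships the snapshot mllib jar alongside the application jar; only the class name and resource flags come from the message above, and the jar paths and arguments are placeholders:

```bash
spark-submit \
  --class com.verizon.bda.mllib.recommendation.ALSDriver \
  --executor-memory 16g \
  --total-executor-cores 16 \
  --jars /path/to/spark-mllib_2.10-1.1.0-SNAPSHOT.jar \
  /path/to/als-driver-app.jar <app args>
```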

Low Level Kafka Consumer for Spark

2014-08-02 Thread Dibyendu Bhattacharya
Hi, I have implemented a Low Level Kafka Consumer for Spark Streaming using the Kafka SimpleConsumer API. This API gives better control over Kafka offset management and recovery from failures. As the current Spark KafkaUtils uses the high-level Kafka consumer API, I wanted to have better control
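
As a rough illustration of what "low level" means here, below is a minimal sketch of a Kafka 0.8 SimpleConsumer fetch loop (broker, topic, partition and offsets are placeholders; real code must also handle leader discovery and error codes, which is exactly the control this approach buys):

```scala
import scala.collection.JavaConverters._
import kafka.api.FetchRequestBuilder
import kafka.javaapi.consumer.SimpleConsumer

// Connect directly to the partition leader (host/port are placeholders).
val consumer = new SimpleConsumer("broker-host", 9092, 10000, 64 * 1024, "low-level-client")

val request = new FetchRequestBuilder()
  .clientId("low-level-client")
  .addFetch("my-topic", 0, 0L, 100000)  // topic, partition, start offset, fetch size
  .build()

val response = consumer.fetch(request)

// Iterate messages and track offsets explicitly; persisting nextOffset yourself is
// what enables custom offset management and recovery after failures.
for (mo <- response.messageSet("my-topic", 0).asScala) {
  val payload = mo.message.payload  // java.nio.ByteBuffer with the message bytes
  val nextOffset = mo.nextOffset    // checkpoint this to durable storage
  // ... process payload ...
}

consumer.close()
```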

Re: ASF JIRA is down for maintenance

2014-08-02 Thread Nicholas Chammas
Seems to be back up now. On Sat, Aug 2, 2014 at 2:06 AM, Patrick Wendell wrote: > Please don't let this prevent you from merging patches, just keep a list > and we can update the JIRA later. > > - Patrick >

branch-1.1 of Spark has been cut

2014-08-02 Thread Patrick Wendell
Hey All, I'm happy to announce branch-1.1 of Spark [1] - this branch will eventually become the 1.1 release. Committers: new patches will need to be explicitly back-ported into this branch in order to appear in the 1.1 release. Thanks so much to all the committers and contributors who were extrem
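
For committers unfamiliar with the back-port step, a hedged sketch of one common workflow (the remote name and commit SHA are placeholders, and individual committers' workflows may differ):

```bash
git checkout branch-1.1
git cherry-pick -x <sha-of-commit-on-master>  # -x records the original commit id in the message
git push apache branch-1.1                    # "apache" is a placeholder remote name
```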