Yes, that should work. spark-mllib-1.1.0 should be compatible with
spark-core-1.0.1.
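
For what it's worth, a rough sketch of that assembly route, assuming an sbt
build with the sbt-assembly plugin (the project name, Scala version, and
coordinates below are illustrative, not your actual build):

    // build.sbt (sketch): bundle the snapshot MLlib into the application
    // assembly, but leave Spark core itself to the cluster installation.
    name := "sag-core"

    version := "0.0.1-SNAPSHOT"

    scalaVersion := "2.10.4"

    // assumes the MLlib snapshot was published locally
    // (sbt publish-local or mvn install)
    resolvers += Resolver.mavenLocal

    libraryDependencies ++= Seq(
      // provided: compiled against the cluster's Spark 1.0.1, not shipped
      // in the fat jar
      "org.apache.spark" %% "spark-core"  % "1.0.1" % "provided",
      // the locally built MLlib snapshot is bundled into the assembly
      "org.apache.spark" %% "spark-mllib" % "1.1.0-SNAPSHOT",
      "com.github.scopt" %% "scopt"       % "3.2.0"
      // depending on resolution you may also need to exclude mllib's
      // transitive spark-core so the cluster's 1.0.1 core is not shadowed
    )

Then build the fat jar with "sbt assembly" and submit that single jar instead
of listing the snapshot mllib jar via --jars.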

On Sat, Aug 2, 2014 at 10:54 AM, Debasish Das <debasish.da...@gmail.com> wrote:
> Let me try it...
>
> Will this be fixed if I generate an assembly file with the mllib-1.1.0-SNAPSHOT
> jar and other dependencies alongside the rest of the application code?
>
>
>
> On Sat, Aug 2, 2014 at 10:46 AM, Xiangrui Meng <men...@gmail.com> wrote:
>>
>> You can try enabling "spark.files.userClassPathFirst". But I'm not
>> sure whether it could solve your problem. -Xiangrui
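>>
>> A minimal sketch of that route, assuming the flag is set from your driver
>> code when the SparkContext is created (the object name and app name below
>> are illustrative, not your actual driver):
>>
>>   import org.apache.spark.{SparkConf, SparkContext}
>>
>>   object ALSDriverSketch {
>>     def main(args: Array[String]): Unit = {
>>       // Ask executors to prefer the jars passed via --jars over the
>>       // classes baked into the cluster's Spark assembly. The setting is
>>       // experimental and affects executor classloading only.
>>       val conf = new SparkConf()
>>         .setAppName("ALSDriver")
>>         .set("spark.files.userClassPathFirst", "true")
>>       val sc = new SparkContext(conf)
>>       // ... run the ALS job against the snapshot MLlib here ...
>>       sc.stop()
>>     }
>>   }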
>>
>> On Sat, Aug 2, 2014 at 10:13 AM, Debasish Das <debasish.da...@gmail.com>
>> wrote:
>> > Hi,
>> >
>> > I have deployed the stable Spark 1.0.1 on the cluster, but I have new
>> > code that I added to mllib-1.1.0-SNAPSHOT.
>> >
>> > I am trying to access the new code using spark-submit as follows:
>> >
>> > spark-job --class com.verizon.bda.mllib.recommendation.ALSDriver
>> > --executor-memory 16g --total-executor-cores 16 --jars
>> > spark-mllib_2.10-1.1.0-SNAPSHOT.jar,scopt_2.10-3.2.0.jar
>> > sag-core-0.0.1-SNAPSHOT.jar --rank 25 --numIterations 10 --lambda 1.0
>> > --qpProblem 2 inputPath outputPath
>> >
>> > I can see the jars getting added to the HttpServer as expected:
>> >
>> > 14/08/02 12:50:04 INFO SparkContext: Added JAR
>> > file:/vzhome/v606014/spark-glm/spark-mllib_2.10-1.1.0-SNAPSHOT.jar at
>> > http://10.145.84.20:37798/jars/spark-mllib_2.10-1.1.0-SNAPSHOT.jar with
>> > timestamp 1406998204236
>> >
>> > 14/08/02 12:50:04 INFO SparkContext: Added JAR
>> > file:/vzhome/v606014/spark-glm/scopt_2.10-3.2.0.jar at
>> > http://10.145.84.20:37798/jars/scopt_2.10-3.2.0.jar with timestamp
>> > 1406998204237
>> >
>> > 14/08/02 12:50:04 INFO SparkContext: Added JAR
>> > file:/vzhome/v606014/spark-glm/sag-core-0.0.1-SNAPSHOT.jar at
>> > http://10.145.84.20:37798/jars/sag-core-0.0.1-SNAPSHOT.jar with
>> > timestamp
>> > 1406998204238
>> >
>> > But the job still can't access the code from the mllib-1.1.0-SNAPSHOT
>> > jar... I think it's picking up the mllib from the cluster, which is at
>> > 1.0.1...
>> >
>> > Please help. I will open a PR tomorrow, but internally we want to
>> > generate results from the new code.
>> >
>> > Thanks.
>> >
>> > Deb
>
>

