Re: Double hbase dependency in Spark 0.9.1

2014-04-17 Thread Tathagata Das
Aaah, this should have been ported to Spark 0.9.1!

TD

On Thu, Apr 17, 2014 at 12:08 PM, Sean Owen wrote:
> I remember that too, and it has been fixed already in master, but
> maybe it was not included in 0.9.1:
>
> https://github.com/apache/spark/blob/master/project/SparkBuild.scala#L367

Re: Spark REPL question

2014-04-17 Thread Zhan Zhang
Clear to me now. Thanks.

Re: Spark REPL question

2014-04-17 Thread Michael Armbrust
Yeah, I think that is correct.

On Thu, Apr 17, 2014 at 2:47 PM, Zhan Zhang wrote:
> Thanks a lot.
>
> By "spins up", do you mean using the same directory, specified by the
> following?
>
> /** Local directory to save .class files to */
> val outputDir = {
>   val tmp = System.getProperty("java.io.tmpdir")
>   val rootDir = new SparkConf().get("spark.repl.classdir", tmp)

Re: Spark REPL question

2014-04-17 Thread Zhan Zhang
Thanks a lot.

By "spins up", do you mean using the same directory, specified by the following?

/** Local directory to save .class files to */
val outputDir = {
  val tmp = System.getProperty("java.io.tmpdir")
  val rootDir = new SparkConf().get("spark.repl.classdir", tmp)
}
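
[The snippet above is cut off by the archive. A hedged sketch of how the block plausibly continues: pick a root directory (spark.repl.classdir, falling back to java.io.tmpdir) and create a unique subdirectory for the generated .class files. The final step is an assumption, and createTempDir below is a stand-in for Spark's internal helper, not a quote from this thread.]

    import java.io.File
    import java.nio.file.Files
    import org.apache.spark.SparkConf

    object OutputDirSketch {
      // Stand-in for Spark's internal temp-dir helper (assumption).
      def createTempDir(root: String): File =
        Files.createTempDirectory(new File(root).toPath, "spark-repl").toFile

      /** Local directory to save .class files to. */
      val outputDir: File = {
        val tmp = System.getProperty("java.io.tmpdir")
        val rootDir = new SparkConf().get("spark.repl.classdir", tmp)
        createTempDir(rootDir) // assumed: a fresh per-session dir under rootDir
      }
    }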

Re: Spark REPL question

2014-04-17 Thread Michael Armbrust
The REPL spins up an org.apache.spark.HttpServer, which serves the classes generated by the REPL as well as the jars added via addJar.

Michael

On Thu, Apr 17, 2014 at 1:58 PM, Zhan Zhang wrote:
> Please help, I am new to both Spark and Scala.
>
> I am trying to figure out how Spark distributes tasks to workers in the REPL.
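
[To make the class-serving side concrete, here is a minimal sketch, not Spark's actual implementation: an executor-side classloader resolving a class by name from an HTTP endpoint, the way ExecutorClassLoader resolves REPL-generated classes against the driver's HttpServer. The server URI and the class name below are hypothetical.]

    import java.net.{URL, URLClassLoader}

    object ReplClassFetchSketch {
      def main(args: Array[String]): Unit = {
        // Assumed endpoint published by the driver (hypothetical URI).
        val classServerUri = "http://driver-host:12345/"
        val loader = new URLClassLoader(Array(new URL(classServerUri)),
          getClass.getClassLoader)
        // Resolve a REPL-generated class by its (hypothetical) wrapper name.
        val cls = loader.loadClass("$line3.$read")
        println(cls.getName)
      }
    }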

Spark REPL question

2014-04-17 Thread Zhan Zhang
Please help, I am new to both Spark and Scala.

I am trying to figure out how Spark distributes tasks to workers in the REPL. I only found the place where the task is serialized and sent, and where workers deserialize the task and load it by class name via ExecutorClassLoader. But I didn't find how the dri...
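
[For reference, the round trip the question describes can be sketched without Spark at all: one side serializes a task object, the other deserializes it and resolves its class by name through a classloader, which is the slot Spark fills with ExecutorClassLoader. A minimal, self-contained sketch:]

    import java.io._

    object TaskShippingSketch {
      @SerialVersionUID(1L)
      class Task extends Serializable { def run(): Int = 21 * 2 }

      def main(args: Array[String]): Unit = {
        // "Driver" side: serialize the task to bytes.
        val bos = new ByteArrayOutputStream()
        val oos = new ObjectOutputStream(bos)
        oos.writeObject(new Task)
        oos.close()

        // "Worker" side: deserialize; ObjectInputStream resolves the class
        // by name via a classloader (Spark plugs ExecutorClassLoader in here).
        val ois = new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray))
        val task = ois.readObject().asInstanceOf[Task]
        println(task.run()) // prints 42
      }
    }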

Re: Double hbase dependency in Spark 0.9.1

2014-04-17 Thread Sean Owen
I remember that too, and it has been fixed already in master, but maybe it was not included in 0.9.1:

https://github.com/apache/spark/blob/master/project/SparkBuild.scala#L367

--
Sean Owen | Director, Data Science | London

On Thu, Apr 17, 2014 at 8:03 PM, Dmitriy Lyubimov wrote:
> Not sure if I am seeing double. SparkBuild.scala for 0.9.1 has a double
> hbase declaration.

Double hbase dependency in Spark 0.9.1

2014-04-17 Thread Dmitriy Lyubimov
Not sure if I am seeing double. SparkBuild.scala for 0.9.1 has a double hbase declaration:

  "org.apache.hbase" % "hbase" % "0.94.6" excludeAll(excludeNetty, excludeAsm),
  "org.apache.hbase" % "hbase" % HBASE_VERSION excludeAll(excludeNetty, excludeAsm),

As a result I am no...
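
[A hedged sketch of the deduplicated declaration: the fix referenced upthread keeps a single entry keyed off HBASE_VERSION. The exclusion-rule definitions below are assumptions modeled on Spark's build file, not quoted from it.]

    // build.sbt fragment: one HBase dependency, one version constant.
    // excludeNetty/excludeAsm are assumed definitions, not verbatim.
    val HBASE_VERSION = "0.94.6"
    val excludeNetty = ExclusionRule(organization = "org.jboss.netty")
    val excludeAsm = ExclusionRule(organization = "asm")

    libraryDependencies +=
      "org.apache.hbase" % "hbase" % HBASE_VERSION excludeAll(excludeNetty, excludeAsm)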