Re: can't sc.paralellize in Spark 0.7.3 spark-shell

2014-04-15 Thread Walrus theCat
Thank you!

On Tue, Apr 15, 2014 at 11:29 AM, Aaron Davidson wrote:
> This is probably related to the Scala bug that :cp does not work:
> https://issues.scala-lang.org/browse/SI-6502
>
> On Tue, Apr 15, 2014 at 11:21 AM, Walrus theCat wrote:
>> Actually altering the classpath in the REPL causes the provided
>> SparkContext to disappear:

Re: can't sc.paralellize in Spark 0.7.3 spark-shell

2014-04-15 Thread Aaron Davidson
This is probably related to the Scala bug that :cp does not work:
https://issues.scala-lang.org/browse/SI-6502

On Tue, Apr 15, 2014 at 11:21 AM, Walrus theCat wrote:
> Actually altering the classpath in the REPL causes the provided
> SparkContext to disappear:
>
> scala> sc.parallelize(List(1,2,3))
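Given the transcript later in the thread, the practical effect of :cp here is that bindings made before it, including the shell-provided sc, are no longer usable afterwards. A minimal sketch of recovering inside the same session, assuming the Spark 0.7.x spark.SparkContext package and constructor; the "local" master URL and job name below are placeholders, not details from the thread:

// If :cp has wiped the shell-provided sc, a replacement context can be
// built by hand (Spark 0.7.x package and two-argument constructor assumed).
import spark.SparkContext

val sc = new SparkContext("local", "shell-recovery")  // placeholder master and name
val rdd = sc.parallelize(List(1, 2, 3))
println(rdd.count())  // expected: 3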

Re: can't sc.paralellize in Spark 0.7.3 spark-shell

2014-04-15 Thread Walrus theCat
Actually altering the classpath in the REPL causes the provided SparkContext to disappear:

scala> sc.parallelize(List(1,2,3))
res0: spark.RDD[Int] = ParallelCollectionRDD[0] at parallelize at <console>:13

scala> :cp /root
Added '/root'. Your new classpath is:
":/root/jars/aspectjrt.jar:/root/jars/aspectj
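An alternative to changing the classpath from inside the REPL is to hand the extra jars to the context when it is created. A sketch, assuming the 0.7.x constructor signature (master, jobName, sparkHome, jars); the jar path is hypothetical:

// Hypothetical: pass extra jars at context-creation time instead of using :cp.
import spark.SparkContext

val extraJars = Seq("/root/jars/my-extra.jar")  // hypothetical jar path
val sc = new SparkContext("local", "shell-with-jars", null, extraJars)
sc.parallelize(List(1, 2, 3)).map(_ * 2).collect().foreach(println)

If the 0.7.x shell also honors the ADD_JARS environment variable (as some later Spark shells do), setting it before launching spark-shell is another way to avoid :cp entirely.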

Re: can't sc.paralellize in Spark 0.7.3 spark-shell

2014-04-14 Thread Walrus theCat
Nevermind -- I'm like 90% sure the problem is that I'm importing stuff that declares a SparkContext as sc. If it's not, I'll report back.

On Mon, Apr 14, 2014 at 2:55 PM, Walrus theCat wrote:
> Hi,
>
> Using the spark-shell, I can't sc.parallelize to get an RDD.
>
> Looks like a bug.
>
> scala>
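If that diagnosis is right, a wildcard import is bringing in another binding named sc that hides the shell's SparkContext. A sketch of the clash and one way around it with an import exclusion; the myhelpers object is hypothetical, not something from the thread:

// Hypothetical helper object whose wildcard import introduces another `sc`.
object myhelpers {
  val sc: String = "not a SparkContext"
}

// In spark-shell:
//   import myhelpers._             // `sc` now refers to myhelpers.sc
//   sc.parallelize(List(1, 2, 3))  // error: parallelize is not a member of String
//
// Excluding the clashing name keeps the shell-provided sc usable:
//   import myhelpers.{sc => _, _}  // wildcard import minus sc
//   sc.parallelize(List(1, 2, 3))  // works again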