Never mind -- I'm about 90% sure the problem is that I'm importing something
that declares its own SparkContext as sc, shadowing the shell's. If it's not,
I'll report back.
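
For anyone hitting the same NPE: a minimal sketch of how an imported `sc` can
shadow the shell's live SparkContext. The names (Helpers, the null field) are
hypothetical, not from my actual code -- it just illustrates the shadowing
mechanism in plain Scala:

```scala
// Hypothetical helper object that declares its own `sc`, never initialized.
object Helpers {
  var sc: String = null // stands in for an uninitialized SparkContext
}

object ShadowDemo {
  def main(args: Array[String]): Unit = {
    // `import Helpers._` brings this `sc` into scope, shadowing any
    // REPL-bound `sc` -- every call through it then hits a null reference.
    import Helpers._
    try {
      sc.length // NullPointerException: the imported `sc` is null
    } catch {
      case _: NullPointerException =>
        println("NPE came from the shadowing import, not from Spark itself")
    }
  }
}
```

In the spark-shell the fix would be to rename the conflicting field or avoid
the wildcard import, so the shell's own `sc` stays visible.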


On Mon, Apr 14, 2014 at 2:55 PM, Walrus theCat <walrusthe...@gmail.com> wrote:

> Hi,
>
> Using the spark-shell, I can't sc.parallelize to get an RDD.
>
> Looks like a bug.
>
> scala> sc.parallelize(Array("a","s","d"))
> java.lang.NullPointerException
>     at <init>(<console>:17)
>     at <init>(<console>:22)
>     at <init>(<console>:24)
>     at <init>(<console>:26)
>     at <init>(<console>:28)
>     at <init>(<console>:30)
>     at <init>(<console>:32)
>     at <init>(<console>:34)
>     at <init>(<console>:36)
>     at .<init>(<console>:40)
>     at .<clinit>(<console>)
>     at .<init>(<console>:11)
>     at .<clinit>(<console>)
>     at $export(<console>)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:606)
>     at spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:629)
>     at spark.repl.SparkIMain$Request$$anonfun$10.apply(SparkIMain.scala:890)
>     at scala.tools.nsc.interpreter.Line$$anonfun$1.apply$mcV$sp(Line.scala:43)
>     at scala.tools.nsc.io.package$$anon$2.run(package.scala:25)
>     at java.lang.Thread.run(Thread.java:744)
>
