I am a newbie to Spark. When I use IntelliJ IDEA to write some Scala code, I
found it reports an error when using Spark's implicit conversions, e.g. when
using an RDD as a pair RDD to get the reduceByKey function. However, the
project runs normally on the cluster.
As somebody suggested, it needs import org.apache.spark.SparkContext._
(http://stackoverflow.com/questions/24084335/reducebykey-method-not-being-found-in-intellij).
I did that, but it still gets the error.
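To be concrete, here is a minimal sketch of the kind of code IDEA flags for me
(the app name, master setting, and sample data are just placeholders for
illustration):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._  // the import suggested in the linked answer

object ReduceByKeyExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("ReduceByKeyExample").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // reduceByKey is not defined on RDD itself; it comes from
    // PairRDDFunctions via an implicit conversion on RDD[(K, V)].
    val pairs = sc.parallelize(Seq(("a", 1), ("b", 2), ("a", 3)))
    val counts = pairs.reduceByKey(_ + _)  // IDEA marks this red, but it compiles and runs

    counts.collect().foreach(println)
    sc.stop()
  }
}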
Has anybody encountered this problem, and how did you solve it?
BTW, I have tried both sbt and Maven; the IDEA version is 14.0.3 and the Spark
version is 1.6.0.
