Hi Wenchao,
I used the steps described on the page below and it works great; you can give it a try :)
http://danielnee.com/2015/01/setting-up-intellij-for-spark/
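
In case it helps, below is a minimal, self-contained version of the word count (the object name, app name, master setting, and input path are just placeholders). With a Scala SDK that matches your Spark build it should compile in IDEA without the reduceByKey error, since in Spark 1.3+ the pair-RDD implicits are picked up from the RDD companion object automatically. A matching build.sbt sketch is at the end of this message.

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    // Local master and input path are placeholders for a quick test in the IDE.
    val conf = new SparkConf().setAppName("WordCount").setMaster("local[*]")
    val sc = new SparkContext(conf)

    val textFile = sc.textFile("input.txt")
    val wordCounts = textFile
      .flatMap(line => line.split(" "))
      .map(word => (word, 1))
      .reduceByKey((a, b) => a + b) // resolved via the implicits on object RDD (Spark 1.3+)

    wordCounts.collect().foreach(println)
    sc.stop()
  }
}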


On Mon, Mar 28, 2016 at 9:38 AM, 吴文超 <wuwenc...@miaozhen.com> wrote:

> For the simplest word count,
> val wordCounts = textFile.flatMap(line => line.split(" ")).map(word =>
> (word, 1)).reduceByKey((a, b) => a + b)
> the IDEA editor shows that reduceByKey cannot be resolved.
> Maybe it is a version incompatibility problem. What do you use to
> construct the project?
>
> -----Original Message-----
> *From:* "Eugene Morozov" <evgeny.a.moro...@gmail.com>
> *Sent:* 2016-03-28 07:46:03 (Monday)
> *To:* "吴文超" <wuwenc...@miaozhen.com>
> *Cc:* user <user@spark.apache.org>
> *Subject:* Re: IntelliJ idea not work well with spark
>
>
> Could you please share your code, so that I can try it?
>
> --
> Be well!
> Jean Morozov
>
> On Sun, Mar 27, 2016 at 5:20 PM, 吴文超 <wuwenc...@miaozhen.com> wrote:
>
>> I am a newbie to Spark. When I use IntelliJ IDEA to write some Scala
>> code, I found that it reports an error when using Spark's implicit conversions,
>> e.g. when using an RDD as a pair RDD to get the reduceByKey function. However,
>> the project runs normally on the cluster.
>> As somebody says, it needs import org.apache.spark.SparkContext._ ,
>> http://stackoverflow.com/questions/24084335/reducebykey-method-not-being-found-in-intellij
>> I did that, but it still gets the error.
>> Has anybody encountered this problem, and how did you solve it?
>> BTW, I have tried both sbt and Maven; the IDEA version is 14.0.3 and the
>> Spark version is 1.6.0.
>>
>
>
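
For reference, a build.sbt along these lines should be enough for the snippet above; the project name is a placeholder, and Scala 2.10.x is assumed because Spark 1.6.0's default artifacts are built against 2.10:

// Sketch only: Spark 1.6.0 default artifacts target Scala 2.10
name := "spark-wordcount"

version := "0.1"

scalaVersion := "2.10.6"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0"

If IDEA still highlights reduceByKey after re-importing the project, it is worth checking that the module's Scala SDK in Project Structure is also 2.10.x; a 2.10/2.11 mismatch tends to produce exactly this kind of phantom "cannot resolve" error.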
