Yeah, I hit this too. IntelliJ picks this compiler option up from the Maven build, but then its own scalac can't run with that plugin option added.
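For context, a sketch of what scalac expects (based on its documented plugin flags, nothing Spark-specific): the plugin jar goes to -Xplugin:, while -P: carries per-plugin options in the form -P:<pluginname>:<option>. The imported setting ends up being a bare jar path after -P:, which scalac rejects:

    # valid: load the plugin jar itself (path shortened here for illustration)
    scalac -Xplugin:paradise_2.10.4-2.0.1.jar Foo.scala

    # valid: pass an option to an already-loaded plugin
    scalac -Xplugin:paradise_2.10.4-2.0.1.jar -P:<pluginname>:<option> Foo.scala

    # what IntelliJ ends up passing: a bare jar path after -P:, hence "bad option"
    scalac -P:/home/jakub/.m2/repository/org/scalamacros/paradise_2.10.4/2.0.1/paradise_2.10.4-2.0.1.jar Foo.scala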
Go to Preferences > Build, Execution, Deployment > Scala Compiler and clear the "Additional compiler options" field. Compilation will then work, although the option will come back when the project reimports. Right now I don't know of a better fix.

There's another recent open question about updating the IntelliJ docs: https://issues.apache.org/jira/browse/SPARK-5136 Should this stuff go in the site docs, or the wiki? I vote for the wiki, I suppose, and making the site docs point to the wiki. I'd be happy to make the wiki edits if I can get permission, or propose this text along with other new text on the JIRA.

On Thu, Jan 8, 2015 at 10:00 AM, Jakub Dubovsky <spark.dubovsky.ja...@seznam.cz> wrote:
> Hi devs,
>
> I'd like to ask if anybody has experience with using IntelliJ 14 to step
> into Spark code. Whatever I try I get a compilation error:
>
> Error:scalac: bad option: -P:/home/jakub/.m2/repository/org/scalamacros/paradise_2.10.4/2.0.1/paradise_2.10.4-2.0.1.jar
>
> The project is set up per Patrick's instructions [1] and packaged by mvn
> -DskipTests clean install. Compilation works fine. Then I just created a
> breakpoint in test code and ran debug, hitting the error.
>
> Thanks for any hints
>
> Jakub
>
> [1] https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-BuildingSparkinIntelliJIDEA