Hi Ron,
   One recommendation is to use Maven for the entire process
(avoiding the sbt artifacts/processing altogether). IJ is pretty solid
in its Maven support.

a) mvn -DskipTests -Pyarn -Phive -Phadoop-2.3 compile package
b) Inside IJ: Open the parent/root pom.xml as a new maven project
c) Inside IJ: Build | Project
d) Enjoy running/debugging the individual scalatest classes (see the
command-line example after this list)
e) (No unnecessary recompilation pain after the initial build)
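
If you also want to run a single suite outside of IJ, the scalatest
maven plugin lets you select suites by class name. Something along these
lines should work (BroadcastSuite is just an illustrative target; swap in
whichever module and suite you care about):

  # run one scalatest suite in the core module (suite name is an example)
  mvn -Pyarn -Phive -Phadoop-2.3 -pl core \
    -DwildcardSuites=org.apache.spark.broadcast.BroadcastSuite test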

Also, I found that when using Maven there are no duplicate class
directories (whereas with sbt everything is doubled up under the
project/target directory for some reason).
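
As a quick sanity check that the IJ scalatest runner and classpath are
healthy after the Maven import, a throwaway suite like the one below can
be dropped onto any module's test source path and run directly from IJ
(the class name and test body are purely placeholders; delete it
afterwards):

  import org.scalatest.FunSuite

  // Trivial suite: verifies only that IJ can compile and run a
  // scalatest class against the Maven-imported project.
  class SetupSanitySuite extends FunSuite {
    test("IJ scalatest runner is wired up") {
      assert(1 + 1 === 2)
    }
  }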



2014-08-11 12:57 GMT-07:00 Ron's Yahoo! <zlgonza...@yahoo.com.invalid>:

> Hi,
>   I’ve been able to get things compiled in my environment, but I’m
> noticing that it’s been quite difficult in IntelliJ. It always recompiles
> everything when I try to run a single test like BroadcastTest, for
> example, despite having previously built via make-distribution. In
> Eclipse, I have no such recompilation issues. IntelliJ unfortunately does
> not support auto compilation for Scala. It also doesn’t seem as if
> IntelliJ knows that there are classes that have already been compiled,
> since it always opts to recompile everything. I’m new to IntelliJ, so it
> might really just be a lack of knowledge on my part.
>   Can anyone share any tips on how they stay productive compiling against
> the Spark code base using IntelliJ?
>
> Thanks,
> Ron
