You say you did the Maven package, but did you also do a `mvn install` and
point SBT at your local Maven repo?
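
For example (a rough sketch, assuming the build publishes the snapshot
artifacts to the default ~/.m2/repository): run `mvn -DskipTests install`
from the Spark source root, then in the examples' build.sbt add

    // resolve the locally installed Spark snapshot artifacts
    // from the local Maven repository (~/.m2/repository)
    resolvers += Resolver.mavenLocal

so SBT sees the same jars you just built, rather than whatever it last
resolved from the remote repos.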

-Paul

Sent from my iPhone

> On Oct 11, 2017, at 5:48 PM, Stephen Boesch <java...@gmail.com> wrote:
> 
> When attempting to run any example program with IntelliJ I am running into 
> Guava versioning issues:
> 
> Exception in thread "main" java.lang.NoClassDefFoundError: 
> com/google/common/cache/CacheLoader
>       at 
> org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:73)
>       at org.apache.spark.SparkConf.<init>(SparkConf.scala:68)
>       at org.apache.spark.SparkConf.<init>(SparkConf.scala:55)
>       at 
> org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:919)
>       at 
> org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:918)
>       at scala.Option.getOrElse(Option.scala:121)
>       at 
> org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:918)
>       at 
> org.apache.spark.examples.ml.KMeansExample$.main(KMeansExample.scala:40)
>       at org.apache.spark.examples.ml.KMeansExample.main(KMeansExample.scala)
> Caused by: java.lang.ClassNotFoundException: 
> com.google.common.cache.CacheLoader
>       at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>       ... 9 more
> 
> The *scope*s for the Spark dependencies were already changed from *provided* 
> to *compile*.  Both `sbt assembly` and `mvn package` had already been run 
> (successfully) from the command line, and the (mvn) project was completely 
> rebuilt inside IntelliJ.
> 
> The Spark test cases run fine: this is a problem only in the examples module.  
> Anyone running these successfully in IJ?  I have tried 2.1.0-SNAPSHOT and 
> 2.3.0-SNAPSHOT, with the same outcome.
> 
> 
