Please also see this thread:
http://search-hadoop.com/m/JW1q5De7pU1
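
As Sean notes below, Guava is shaded in Spark 1.2+, so the usual fix is to stop
bundling Spark (and any stray older Spark/Guava jars) with the application and
let a single Spark 1.2.0 assembly on the classpath provide those classes. A
minimal sketch of what that looks like, assuming the project is (or can be)
built with sbt rather than hand-linked jars in Eclipse; the artifact names are
the standard Spark ones and the version numbers are only illustrative:

// build.sbt -- sketch only: mark Spark as "provided" so it is not packaged
// into the application jar; the spark-assembly on the runtime classpath
// supplies it instead.
name := "spark-mllib-app"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.2.0" % "provided",
  "org.apache.spark" %% "spark-mllib" % "1.2.0" % "provided"
)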

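If it is not obvious which jar is leaking an unshaded Guava or an older Spark
onto the Eclipse run configuration's classpath, a small diagnostic along these
lines can help (a sketch only; the class names are taken from the exception and
the jar -tf listing quoted below):

// ClasspathCheck.scala -- print which jar each class is loaded from, to spot
// a stray unshaded Spark or Guava on the classpath.
object ClasspathCheck {
  def locate(className: String): Unit =
    try {
      val cls = Class.forName(className)
      val src = Option(cls.getProtectionDomain.getCodeSource)
        .map(_.getLocation.toString)
        .getOrElse("<bootstrap or unknown source>")
      println(s"$className -> $src")
    } catch {
      case _: ClassNotFoundException => println(s"$className -> NOT FOUND")
    }

  def main(args: Array[String]): Unit = {
    locate("com.google.common.base.Preconditions") // unshaded Guava, per the stack trace below
    locate("org.apache.spark.SparkContext")        // shows which Spark jar is actually being used
  }
}

If org.apache.spark.SparkContext resolves to anything other than the single
Spark 1.2.0 assembly, that is the older jar Sean is referring to.
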
On Tue, Jan 20, 2015 at 3:58 PM, Sean Owen <so...@cloudera.com> wrote:

> Guava is shaded in Spark 1.2+. It looks like you are mixing versions
> of Spark then, with some that still refer to unshaded Guava. Make sure
> you are not packaging Spark with your app and that you don't have
> other versions lying around.
>
> On Tue, Jan 20, 2015 at 11:55 PM, Shailesh Birari <sbirar...@gmail.com>
> wrote:
> > Hello,
> >
> > I recently upgraded my setup from Spark 1.1 to Spark 1.2.
> > My existing applications are working fine on the Ubuntu cluster.
> > But when I try to execute a Spark MLlib application from Eclipse (on a
> > Windows node), it throws a java.lang.NoClassDefFoundError:
> > com/google/common/base/Preconditions exception.
> >
> > Note,
> >    1. With Spark 1.1 this was working fine.
> >    2. The Spark 1.2 jar files are linked in the Eclipse project.
> >    3. Checked the jar -tf output and found that the above
> >       com.google.common.base.Preconditions class is not present.
> >
> >
> > -----------------------------------------------------------------------------------------------------------------
> > Exception log:
> >
> > Exception in thread "main" java.lang.NoClassDefFoundError:
> > com/google/common/base/Preconditions
> >         at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:94)
> >         at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:77)
> >         at org.apache.spark.network.netty.NettyBlockTransferService.init(NettyBlockTransferService.scala:62)
> >         at org.apache.spark.storage.BlockManager.initialize(BlockManager.scala:194)
> >         at org.apache.spark.SparkContext.<init>(SparkContext.scala:340)
> >         at org.apache.spark.examples.mllib.TallSkinnySVD$.main(TallSkinnySVD.scala:74)
> >         at org.apache.spark.examples.mllib.TallSkinnySVD.main(TallSkinnySVD.scala)
> > Caused by: java.lang.ClassNotFoundException: com.google.common.base.Preconditions
> >         at java.net.URLClassLoader$1.run(Unknown Source)
> >         at java.net.URLClassLoader$1.run(Unknown Source)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at java.net.URLClassLoader.findClass(Unknown Source)
> >         at java.lang.ClassLoader.loadClass(Unknown Source)
> >         at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
> >         at java.lang.ClassLoader.loadClass(Unknown Source)
> >         ... 7 more
> >
> >
> > -----------------------------------------------------------------------------------------------------------------
> >
> > jar -tf output:
> >
> >
> > consb2@CONSB2A /cygdrive/c/SB/spark-1.2.0-bin-hadoop2.4/spark-1.2.0-bin-hadoop2.4/lib
> > $ jar -tf spark-assembly-1.2.0-hadoop2.4.0.jar | grep Preconditions
> > org/spark-project/guava/common/base/Preconditions.class
> > org/spark-project/guava/common/math/MathPreconditions.class
> > com/clearspring/analytics/util/Preconditions.class
> > parquet/Preconditions.class
> > com/google/inject/internal/util/$Preconditions.class
> >
> >
> > ---------------------------------------------------------------------------------------------------------------
> >
> > Please help me in resolving this.
> >
> > Thanks,
> >   Shailesh
> >
> >
> >
> >
> >