Thanks Aaron.
Adding the Guava jar resolves the issue.
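For sbt builds the equivalent fix would be to declare Guava explicitly, roughly:

libraryDependencies += "com.google.guava" % "guava" % "14.0.1"  // example version only; match it to your Spark build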
Shailesh
On Wed, Jan 21, 2015 at 3:26 PM, Aaron Davidson wrote:
> Spark's network-common package depends on guava as a "provided" dependency
> in order to avoid conflicting with other libraries (e.g., Hadoop) that
> depend on specific versions
Subject: Re: Spark 1.2 - com/google/common/base/Preconditions java.lang.NoClassDefFoundError
Spark's network-common package depends on guava as a "provided" dependency
in order to avoid conflicting with other libraries (e.g., Hadoop) that
depend on specific versions. com/google/common/base/Preconditions has been
present in Guava since version 2, so this is likely a "dependency not
found" rather than a version-conflict issue.
Hi Frank,
It's a normal Eclipse project where I added the Scala and Spark libraries as
user libraries.
Though I am not attaching any Hadoop libraries, in my application code I
have the following line:
System.setProperty("hadoop.home.dir", "C:\\SB\\HadoopWin")
This Hadoop home dir contains "winutils.exe".
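The property has to be set before the SparkContext is created; a minimal sketch of where it
sits in the driver (the object name, app name and master below are placeholders, not the real
project):

import org.apache.spark.{SparkConf, SparkContext}

object MLlibApp {
  def main(args: Array[String]): Unit = {
    // Point Hadoop at the Windows helper binaries before Spark starts.
    System.setProperty("hadoop.home.dir", "C:\\SB\\HadoopWin")
    val sc = new SparkContext(new SparkConf().setAppName("mllib-example").setMaster("local[*]"))
    // ... MLlib code ...
    sc.stop()
  }
}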
Shailesh,
To add, are you packaging Hadoop in your app? Hadoop will pull in Guava. Not
sure if you are using Maven (or what) to build, but if you can pull up your
build's dependency tree, you will likely find com.google.guava being brought in
by one of your dependencies.
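With Maven, for example,
mvn dependency:tree -Dincludes=com.google.guava
will show which of them is pulling Guava in; other build tools have an equivalent.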
Regards,
Frank Austin
Hello,
I double-checked the libraries. I am linking only with Spark 1.2.
Along with Spark 1.2 jars I have Scala 2.10 jars and JRE 7 jars linked and
nothing else.
Thanks,
Shailesh
On Wed, Jan 21, 2015 at 12:58 PM, Sean Owen wrote:
> Guava is shaded in Spark 1.2+. It looks like you are mixing
Please also see this thread:
http://search-hadoop.com/m/JW1q5De7pU1
On Tue, Jan 20, 2015 at 3:58 PM, Sean Owen wrote:
> Guava is shaded in Spark 1.2+. It looks like you are mixing versions
> of Spark then, with some that still refer to unshaded Guava. Make sure
> you are not packaging Spark with
Guava is shaded in Spark 1.2+. It looks like you are mixing versions
of Spark then, with some that still refer to unshaded Guava. Make sure
you are not packaging Spark with your app and that you don't have
other versions lying around.
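With sbt, for instance, that usually means marking Spark itself as provided so it is not
bundled into your jar (version shown is only an example):

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" % "provided"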
On Tue, Jan 20, 2015 at 11:55 PM, Shailesh Birari wrote:
> Hello,
Hello,
I recently upgraded my setup from Spark 1.1 to Spark 1.2.
My existing applications are working fine on the Ubuntu cluster.
But when I try to execute a Spark MLlib application from Eclipse (a Windows
node), it gives a java.lang.NoClassDefFoundError:
com/google/common/base/Preconditions exception.
Not