Figured it out and reported it:
https://issues.apache.org/jira/browse/SPARK-12736. A fix is coming...
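Until the fix lands, a quick way to check whether the relocated Guava classes made it into the assembly is to list the jar's entries. A minimal sketch; the jar path is taken from the launch command quoted below and is an assumption about your checkout, so adjust it to your build:

```shell
# Path as in the reported launch command; adjust to your checkout/build.
JAR=assembly/target/scala-2.11/spark-assembly-2.0.0-SNAPSHOT-hadoop2.7.1.jar

# Count jar entries under the relocated Guava package.
# grep -c prints 0 when nothing matches; unzip errors (e.g. missing jar) are silenced.
HITS=$(unzip -l "$JAR" 2>/dev/null | grep -c 'org/spark-project/guava' || true)

if [ "$HITS" -eq 0 ]; then
  echo "relocated Guava classes missing"
else
  echo "relocated Guava classes present: $HITS entries"
fi
```

If it reports the classes missing, the NoClassDefFoundError below is expected at runtime.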

Regards,
Jacek

Jacek Laskowski | https://medium.com/@jaceklaskowski/
Mastering Apache Spark
==> https://jaceklaskowski.gitbooks.io/mastering-apache-spark/
Follow me at https://twitter.com/jaceklaskowski


On Sat, Jan 9, 2016 at 11:17 AM, Jacek Laskowski <ja...@japila.pl> wrote:
> Hi,
>
> I think the change is related:
> https://github.com/apache/spark/commit/659fd9d04b988d48960eac4f352ca37066f43f5c
> as it touches the dependency in pom.xml.
>
> Regards,
> Jacek
>
> Jacek Laskowski | https://medium.com/@jaceklaskowski/
> Mastering Apache Spark
> ==> https://jaceklaskowski.gitbooks.io/mastering-apache-spark/
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Sat, Jan 9, 2016 at 10:59 AM, Jacek Laskowski <ja...@japila.pl> wrote:
>> Hi,
>>
>> With today's sources I'm facing "NoClassDefFoundError:
>> org/spark-project/guava/collect/Maps" while starting standalone Master
>> using ./sbin/start-master.sh.
>>
>> Is anyone working on it? Should I file an issue?
>>
>> Spark Command: /Library/Java/JavaVirtualMachines/Current/Contents/Home/bin/java
>>   -cp /Users/jacek/dev/oss/spark/conf/:/Users/jacek/dev/oss/spark/assembly/target/scala-2.11/spark-assembly-2.0.0-SNAPSHOT-hadoop2.7.1.jar:/Users/jacek/dev/oss/spark/lib_managed/jars/datanucleus-api-jdo-3.2.6.jar:/Users/jacek/dev/oss/spark/lib_managed/jars/datanucleus-core-3.2.10.jar:/Users/jacek/dev/oss/spark/lib_managed/jars/datanucleus-rdbms-3.2.9.jar
>>   -Xms1g -Xmx1g org.apache.spark.deploy.master.Master --ip japila.local --port 7077 --webui-port 8080
>> ========================================
>> Setting default log level to "WARN".
>> To adjust logging level use sc.setLogLevel(newLevel).
>> Exception in thread "main" java.lang.NoClassDefFoundError: org/spark-project/guava/collect/Maps
>>         at org.apache.hadoop.metrics2.lib.MetricsRegistry.<init>(MetricsRegistry.java:42)
>>         at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.<init>(MetricsSystemImpl.java:94)
>>         at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.<init>(MetricsSystemImpl.java:141)
>>         at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:38)
>>         at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:36)
>>         at org.apache.hadoop.security.UserGroupInformation$UgiMetrics.create(UserGroupInformation.java:120)
>>         at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:236)
>>         at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2156)
>>         at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2156)
>>         at scala.Option.getOrElse(Option.scala:121)
>>         at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2156)
>>         at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:214)
>>         at org.apache.spark.deploy.master.Master$.startRpcEnvAndEndpoint(Master.scala:1108)
>>         at org.apache.spark.deploy.master.Master$.main(Master.scala:1093)
>>         at org.apache.spark.deploy.master.Master.main(Master.scala)
>> Caused by: java.lang.ClassNotFoundException: org.spark-project.guava.collect.Maps
>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>         ... 15 more
>>
>> Regards,
>> Jacek
>>
>> Jacek Laskowski | https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark
>> ==> https://jaceklaskowski.gitbooks.io/mastering-apache-spark/
>> Follow me at https://twitter.com/jaceklaskowski
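
For context on where the org.spark-project.guava package comes from: the assembly relocates Guava under that prefix via the maven-shade-plugin, which is why a commit touching the Guava dependency in pom.xml is a plausible culprit. A minimal sketch of such a relocation; the element names are standard shade-plugin config, but the exact wiring in Spark's pom is an assumption, not a quote from it:

```xml
<!-- Sketch of a maven-shade-plugin relocation (not the exact Spark pom). -->
<relocation>
  <pattern>com.google.common</pattern>
  <shadedPattern>org.spark-project.guava</shadedPattern>
</relocation>
```

If the relocation (or the dependency it rewrites) is dropped, references compiled against the relocated name fail at runtime exactly as above.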

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org