Oh, so you're using Windows. What command are you using to start the Thrift server, then?
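
For reference, sbin/start-thriftserver.sh is a Bash script, so on Windows I'd expect the server to be launched through the spark-class batch file instead. A rough, untested sketch (the HiveThriftServer2 main class below is the one that appears in your stack trace):

bin\spark-class.cmd org.apache.spark.sql.hive.thriftserver.HiveThriftServer2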

On 11/25/14 4:25 PM, Judy Nash wrote:

Made progress but still blocked.

After recompiling the code in cmd instead of PowerShell, I can now see all 5 classes you mentioned.

However, I am still seeing the same error as before. Anything else I can check for?

*From:* Judy Nash [mailto:judyn...@exchange.microsoft.com]
*Sent:* Monday, November 24, 2014 11:50 PM
*To:* Cheng Lian; u...@spark.incubator.apache.org
*Subject:* RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

This is what I got from jar tf:

org/spark-project/guava/common/base/Preconditions.class

org/spark-project/guava/common/math/MathPreconditions.class

com/clearspring/analytics/util/Preconditions.class

parquet/Preconditions.class

I seem to have the line that was reported as missing, but I am missing this file:

com/google/inject/internal/util/$Preconditions.class

Any suggestion on how to fix this?

Very much appreciate the help as I am very new to Spark and open source technologies.

*From:* Cheng Lian [mailto:lian.cs....@gmail.com]
*Sent:* Monday, November 24, 2014 8:24 PM
*To:* Judy Nash; u...@spark.incubator.apache.org
*Subject:* Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

Hm, I tried exactly the same commit and the build command locally, but couldn’t reproduce this.

Usually this kind of error is caused by classpath misconfiguration. Could you please try the following to ensure the corresponding Guava classes are included in the assembly jar you built?

jar tf assembly/target/scala-2.10/spark-assembly-1.2.1-SNAPSHOT-hadoop2.4.0.jar | grep Preconditions
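
Since you're building on Windows: if grep isn't available in your cmd session, the built-in findstr should work as a drop-in replacement here:

jar tf assembly/target/scala-2.10/spark-assembly-1.2.1-SNAPSHOT-hadoop2.4.0.jar | findstr Preconditions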

On my machine I got these lines (the first line is the one reported as missing in your case):

org/spark-project/guava/common/base/Preconditions.class
org/spark-project/guava/common/math/MathPreconditions.class
com/clearspring/analytics/util/Preconditions.class
parquet/Preconditions.class
com/google/inject/internal/util/$Preconditions.class
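
If the assembly does contain all of these, my next guess (untested) would be that some other jar earlier on the classpath, e.g. an unshaded Hadoop or Guava jar from a local Hadoop installation, is shadowing the assembly. Assuming your build still ships this script, the classpath Spark actually computes on Windows can be printed with:

bin\compute-classpath.cmd

Failing that, the standard JVM flag -verbose:class makes the JVM log which jar each class is loaded from, which should reveal where the offending org.apache.hadoop.conf.Configuration is coming from.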

On 11/25/14 6:25 AM, Judy Nash wrote:

    Thank you, Cheng, for responding.


    Here is the commit SHA1 on the 1.2 branch I saw this failure in:

    commit 6f70e0295572e3037660004797040e026e440dbd

    Author: zsxwing <zsxw...@gmail.com>

    Date: Fri Nov 21 00:42:43 2014 -0800

    [SPARK-4472][Shell] Print "Spark context available as sc." only
    when SparkContext is created successfully

        It's weird that printing "Spark context available as sc" when
        creating SparkContext unsuccessfully.

    Let me know if you need anything else.

    *From:* Cheng Lian [mailto:lian.cs....@gmail.com]
    *Sent:* Friday, November 21, 2014 8:02 PM
    *To:* Judy Nash; u...@spark.incubator.apache.org
    *Subject:* Re: latest Spark 1.2 thrift server fail with
    NoClassDefFoundError on Guava

    Hi Judy, could you please provide the commit SHA1 of the version
    you're using? Thanks!
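
    (For example, git rev-parse HEAD run in your checkout will print it.)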

    On 11/22/14 11:05 AM, Judy Nash wrote:

        Hi,

        The Thrift server is failing to start for me on the latest
        Spark 1.2 branch.

        I get the error below when I start the Thrift server.

        Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
                at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)….

        Here is my setup:

        1) Latest Spark 1.2 branch build

        2) Build command used:
           mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver -DskipTests clean package

        3) Added hive-site.xml to \conf

        4) Versions on the box: Hive 0.13, Hadoop 2.4

        Is this a real bug or am I doing something wrong?

        -----------------------------------

        Full Stacktrace:

        Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
                at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)
                at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:327)
                at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:409)
                at org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:82)
                at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:42)
                at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:202)
                at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
                at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1784)
                at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:105)
                at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:180)
                at org.apache.spark.SparkEnv$.create(SparkEnv.scala:292)
                at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
                at org.apache.spark.SparkContext.<init>(SparkContext.scala:230)
                at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:38)
                at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:56)
                at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
                at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
                at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                at java.lang.reflect.Method.invoke(Method.java:606)
                at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:353)
                at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
                at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
        Caused by: java.lang.ClassNotFoundException: com.google.common.base.Preconditions
                at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
                at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
                at java.security.AccessController.doPrivileged(Native Method)
                at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
                at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
                at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
                at java.lang.ClassLoader.loadClass(ClassLoader.java:358)

