I think [1] might be a related issue.
Could you use 4.7.0 instead of 4.4.0?

[1]
https://community.hortonworks.com/questions/17861/error-starting-spark-shell-with-phoenix-client-jar.html
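
For example, the same two dependencies with the 4.7.0 release (assuming your cluster's HBase is still 1.1, so the HBase-1.1 builds apply) would be:

    org.apache.phoenix:phoenix-spark:4.7.0-HBase-1.1
    org.apache.phoenix:phoenix-core:4.7.0-HBase-1.1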


2016-09-06 14:21 GMT+09:00 Vikash Kumar <vikash.ku...@resilinc.com>:

> Hi,
>
> I am loading the library through the UI in the Spark interpreter as:
>
>
>
> 1. org.apache.phoenix:phoenix-spark:4.4.0-HBase-1.1
>
> Excluded: org.scala-lang:scala-library, org.scala-lang:scala-compiler,
> org.scala-lang:scala-reflect, org.apache.phoenix:phoenix-core
>
> 2. org.apache.phoenix:phoenix-core:4.4.0-HBase-1.1
>
> Excluded: com.sun.jersey:jersey-core, com.sun.jersey:jersey-server,
> com.sun.jersey:jersey-client, org.ow2.asm:asm, io.netty:netty
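>
> For reference, the same loading can also be done from a note with Zeppelin's %dep interpreter instead of the UI; a rough sketch (the excludes() syntax here is from memory of the %dep docs, so treat it as an assumption to verify):
>
>     %dep
>     z.reset()
>     // phoenix-spark, minus the Scala artifacts that clash with Zeppelin's own
>     z.load("org.apache.phoenix:phoenix-spark:4.4.0-HBase-1.1").excludes("org.scala-lang:scala-library,org.scala-lang:scala-compiler,org.scala-lang:scala-reflect,org.apache.phoenix:phoenix-core")
>     // phoenix-core, minus the jersey/asm/netty artifacts listed above
>     z.load("org.apache.phoenix:phoenix-core:4.4.0-HBase-1.1").excludes("com.sun.jersey:jersey-core,com.sun.jersey:jersey-server,com.sun.jersey:jersey-client,org.ow2.asm:asm,io.netty:netty")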
>
>
>
> Thanks and Regards,
>
> Vikash Kumar
>
>
>
> *From:* astros...@gmail.com [mailto:astros...@gmail.com] *On Behalf Of *Hyung
> Sung Shim
> *Sent:* Tuesday, September 6, 2016 10:47 AM
> *To:* users <users@zeppelin.apache.org>
> *Subject:* Re: Spark error when loading phoenix-spark dependency
>
>
>
> Hello.
>
> How did you load the library?
>
>
>
>
>
> 2016-09-06 13:49 GMT+09:00 Vikash Kumar <vikash.ku...@resilinc.com>:
>
> Hi,
>
> Is there anyone who is getting the same errors?
>
>
>
> Thanks and Regards,
>
> Vikash Kumar
>
>
>
> *From:* Vikash Kumar [mailto:vikash.ku...@resilinc.com]
> *Sent:* Thursday, September 1, 2016 11:08 AM
> *To:* users@zeppelin.apache.org
> *Subject:* Spark error when loading phoenix-spark dependency
>
>
>
> Hi all,
>
>                 I am getting the following error when loading the
> org.apache.phoenix:phoenix-spark:4.4.0-HBase-1.1 dependency from the Spark
> interpreter. I am using Zeppelin *Version 0.6.2-SNAPSHOT* with Spark 1.6.1
> and hdp 2.7.1.
>
>
>
> The packages that I am importing are:
>
>     import org.apache.phoenix.spark._
>     import org.apache.phoenix.spark.PhoenixRDD._
>     import java.sql.{Date, Timestamp}
>
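> For context, a minimal sketch of how these imports get used once the dependency loads (the table name, columns, and ZooKeeper quorum below are placeholders, not my actual schema):
>
>     import org.apache.phoenix.spark._
>
>     // Zeppelin's Spark interpreter already provides sc and sqlContext;
>     // the phoenix-spark implicits add phoenixTableAsDataFrame to sqlContext.
>     val df = sqlContext.phoenixTableAsDataFrame(
>       "MY_TABLE", Seq("ID", "COL1"), zkUrl = Some("zk-host:2181"))
>     df.show()
>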
> My build command is:
>
>     mvn clean package -DskipTests -Drat.ignoreErrors=true -Dcheckstyle.skip=true -Pspark-1.6 -Dspark.version=1.6.1 -Phadoop-2.6 -Pyarn
>
>
> java.lang.NoSuchMethodError: org.apache.spark.util.Utils$.resolveURIs(Ljava/lang/String;)Ljava/lang/String;
>     at org.apache.spark.repl.SparkILoop$.getAddedJars(SparkILoop.scala:1079)
>     at org.apache.spark.repl.SparkILoop.createInterpreter(SparkILoop.scala:210)
>     at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:698)
>     at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
>     at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:93)
>     at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
>     at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
>     at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>     at java.lang.Thread.run(Thread.java:745)
>
>
> Thanks and Regards,
>
> Vikash Kumar
>
>
>
