This appears to be an issue with the Spark to DocumentDB connector, specifically version 0.0.1. Could you run the 0.0.3 version of the jar and see if you still get the same error? For example:

    spark-shell --master yarn \
      --jars azure-documentdb-spark-0.0.3-SNAPSHOT.jar,azure-documentdb-1.10.0.jar
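
If the 0.0.3 jars still fail to load, it's worth a quick sanity check from inside spark-shell that both jar files are readable and actually on the driver classpath. A minimal sketch, assuming the jars sit in the directory you launch from, and using the DocumentClient class from the public azure-documentdb Java SDK (adjust paths and class name to your build):

    // Verify each jar is a readable zip archive; the REPL's "error while
    // loading <root>, Error accessing ..." message usually points at a jar
    // that could not be opened (e.g. a truncated or corrupted download).
    new java.util.zip.ZipFile("azure-documentdb-spark-0.0.3-SNAPSHOT.jar").close()
    new java.util.zip.ZipFile("azure-documentdb-1.10.0.jar").close()

    // Confirm the SDK is visible on the classpath; this should return the
    // Class object rather than throw ClassNotFoundException.
    Class.forName("com.microsoft.azure.documentdb.DocumentClient")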


On Mon, May 8, 2017 at 5:01 AM ayan guha <guha.a...@gmail.com> wrote:

> Hi
>
> I am facing an issue while trying to use the azure-documentdb-spark connector from
> Microsoft. Instructions/GitHub:
> <https://github.com/Azure/azure-documentdb-spark/wiki/Azure-DocumentDB-Spark-Connector-User-Guide>
>
> Error while trying to add the jars in spark-shell:
>
> spark-shell --jars azure-documentdb-spark-0.0.1.jar,azure-documentdb-1.9.6.jar
> SPARK_MAJOR_VERSION is set to 2, using Spark2
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel).
> [init] error: error while loading <root>, Error accessing /home/sshuser/azure-spark-docdb-test/v1/azure-documentdb-spark-0.0.1.jar
>
> Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
> ** Note that as of 2.8 scala does not assume use of the java classpath.
> ** For the old behavior pass -usejavacp to scala, or if using a Settings
> ** object programmatically, settings.usejavacp.value = true.
>
> Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
> ** Note that as of 2.8 scala does not assume use of the java classpath.
> ** For the old behavior pass -usejavacp to scala, or if using a Settings
> ** object programmatically, settings.usejavacp.value = true.
> Exception in thread "main" java.lang.NullPointerException
>         at scala.reflect.internal.SymbolTable.exitingPhase(SymbolTable.scala:256)
>         at scala.tools.nsc.interpreter.IMain$Request.x$20$lzycompute(IMain.scala:896)
>         at scala.tools.nsc.interpreter.IMain$Request.x$20(IMain.scala:895)
>         at scala.tools.nsc.interpreter.IMain$Request.headerPreamble$lzycompute(IMain.scala:895)
>         at scala.tools.nsc.interpreter.IMain$Request.headerPreamble(IMain.scala:895)
>         at scala.tools.nsc.interpreter.IMain$Request$Wrapper.preamble(IMain.scala:918)
>         at scala.tools.nsc.interpreter.IMain$CodeAssembler$$anonfun$apply$23.apply(IMain.scala:1337)
>         at scala.tools.nsc.interpreter.IMain$CodeAssembler$$anonfun$apply$23.apply(IMain.scala:1336)
>         at scala.tools.nsc.util.package$.stringFromWriter(package.scala:64)
>         at scala.tools.nsc.interpreter.IMain$CodeAssembler$class.apply(IMain.scala:1336)
>         at scala.tools.nsc.interpreter.IMain$Request$Wrapper.apply(IMain.scala:908)
>         at scala.tools.nsc.interpreter.IMain$Request.compile$lzycompute(IMain.scala:1002)
>         at scala.tools.nsc.interpreter.IMain$Request.compile(IMain.scala:997)
>         at scala.tools.nsc.interpreter.IMain.compile(IMain.scala:579)
>         at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:567)
>         at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
>         at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
>         at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
>         at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
>         at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
>         at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
>         at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
>         at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
>         at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
>         at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:94)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
>         at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
>         at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
>         at org.apache.spark.repl.Main$.doMain(Main.scala:68)
>         at org.apache.spark.repl.Main$.main(Main.scala:51)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> sshuser@ed0-svochd:~/azure-spark-docdb-test/v1$
>
> I think I am missing some basic configuration here, or there is a classpath-related issue. Can anyone help?
>
> Additional info:
> Environment: HDInsight 3.5, based on HDP 2.5
>
> sshuser@ed0-svochd:~/azure-spark-docdb-test/v1$ echo $JAVA_HOME
> /usr/lib/jvm/java-8-openjdk-amd64
>
> sshuser@ed0-svochd:~/azure-spark-docdb-test/v1$ echo $SPARK_HOME
> /usr/hdp/current/spark2-client
>
> sshuser@ed0-svochd:~/azure-spark-docdb-test/v1$ java -version
> openjdk version "1.8.0_121"
> OpenJDK Runtime Environment (build 1.8.0_121-8u121-b13-0ubuntu1.16.04.2-b13)
> OpenJDK 64-Bit Server VM (build 25.121-b13, mixed mode)
>
> sshuser@ed0-svochd:~/azure-spark-docdb-test/v1$ uname -a
> Linux ed0-svochd 4.4.0-72-generic #93-Ubuntu SMP Fri Mar 31 14:07:41 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
> sshuser@ed0-svochd:~/azure-spark-docdb-test/v1$
>
> --
> Best Regards,
> Ayan Guha
>
