Do you have JAVA_HOME set to a Java 7 JDK?
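
If it isn't, one way to check and point it at a 1.7 JDK on OS X (a sketch
using the standard java_home helper; install paths vary):

  # show what java is currently picked up
  echo $JAVA_HOME
  java -version

  # point JAVA_HOME at an installed 1.7 JDK
  export JAVA_HOME=$(/usr/libexec/java_home -v 1.7)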

2015-10-23 7:12 GMT-04:00 emlyn <em...@swiftkey.com>:

> xjlin0 wrote
> > I cannot enter the REPL shell in 1.4.0/1.4.1/1.5.0/1.5.1 (pre-built with
> > or without Hadoop, or compiled at home with Ant or Maven). In v1.4.x
> > there was no error message; the prompt just returned. On v1.5.x, once I
> > enter $SPARK_HOME/bin/pyspark or spark-shell, I get
> >
> > Error: Could not find or load main class org.apache.spark.launcher.Main
>
> I have the same problem (on Mac OS X Yosemite, all Spark versions since
> 1.4, installed both with Homebrew and downloaded manually). I've been
> trying to start the pyspark shell, but spark-shell, spark-sql, and
> spark-submit all fail in the same way. I've narrowed it down to the
> following line in the spark-class script:
>
> done < <("$RUNNER" -cp "$LAUNCH_CLASSPATH" org.apache.spark.launcher.Main "$@")
>
> (where $RUNNER is "java" and $LAUNCH_CLASSPATH is
> "/usr/local/Cellar/apache-spark/1.5.1/libexec/lib/spark-assembly-1.5.1-hadoop2.6.0.jar",
> which does exist and does contain the org.apache.spark.launcher.Main
> class, despite the message that it can't be found)
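>
> (One way to double-check that the class really is in the assembly jar,
> assuming the JDK's jar tool is on the PATH and the path above:
>
> jar tf /usr/local/Cellar/apache-spark/1.5.1/libexec/lib/spark-assembly-1.5.1-hadoop2.6.0.jar \
>   | grep 'org/apache/spark/launcher/Main'
>
> which should list org/apache/spark/launcher/Main.class.)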
>
> If I run it manually, using:
>
> SPARK_HOME=/usr/local/Cellar/apache-spark/1.5.1/libexec java \
>   -cp /usr/local/Cellar/apache-spark/1.5.1/libexec/lib/spark-assembly-1.5.1-hadoop2.6.0.jar \
>   org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit \
>   pyspark-shell-main --name PySparkShell
>
> It runs without that error, and instead prints out (where "\0" is a nul
> character):
>
> env\0PYSPARK_SUBMIT_ARGS="--name" "PySparkShell" "pyspark-shell"\0python\0
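>
> That is the launcher handing back the command for spark-class to exec, one
> nul-delimited token per argument. To make the tokens visible when running
> it by hand, you can pipe the same command through tr:
>
> SPARK_HOME=/usr/local/Cellar/apache-spark/1.5.1/libexec java \
>   -cp /usr/local/Cellar/apache-spark/1.5.1/libexec/lib/spark-assembly-1.5.1-hadoop2.6.0.jar \
>   org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit \
>   pyspark-shell-main --name PySparkShell | tr '\0' '\n'   # one token per line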
>
> I'm not really sure what to try next; maybe with this extra information
> someone has an idea of what's going wrong and how to fix it.
>
