Is your Spark working? Can you try running the Spark shell?
http://spark.apache.org/docs/0.9.1/quick-start.html
If Spark is working, we can move this to the Shark user list (copied here).
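As a quick sanity check, something like the following should work, assuming Spark 0.9.1 is installed at /root/spark-0.9.1 (the path from your shark-env.sh) -- adjust to your actual install location:

```shell
# Go to the Spark install directory (path taken from your shark-env.sh).
cd /root/spark-0.9.1

# Launch the interactive Spark shell. Try local mode first; if that works,
# retry against your standalone master:
#   MASTER=spark://192.168.10.220:7077 ./bin/spark-shell
./bin/spark-shell
```

Once the shell comes up, a trivial job such as `sc.parallelize(1 to 100).count()` should confirm the cluster is executing tasks. If that fails, the problem is in Spark itself rather than Shark.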
Also I am anything but a sir :)

Regards
Mayur

Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi <https://twitter.com/mayur_rustagi>



On Wed, May 14, 2014 at 12:49 PM, Sophia <sln-1...@163.com> wrote:

> My configuration is as follows; the slave nodes have been
> configured, but I
> don't know what has happened to Shark. Can you help me, Sir?
> shark-env.sh
> export SPARK_USER_HOME=/root
> export SPARK_MEM=2g
> export SCALA_HOME="/root/scala-2.11.0-RC4"
> export SHARK_MASTER_MEM=1g
> export HIVE_CONF_DIR="/usr/lib/hive/conf"
> export HIVE_HOME="/usr/lib/hive"
> export HADOOP_HOME="/usr/lib/hadoop"
> export SPARK_HOME="/root/spark-0.9.1"
> export MASTER="spark://192.168.10.220:7077"
> export SHARK_EXEC_MODE=yarn
>
> SPARK_JAVA_OPTS=" -Dspark.local.dir=/tmp "
> SPARK_JAVA_OPTS+="-Dspark.kryoserializer.buffer.mb=10 "
> SPARK_JAVA_OPTS+="-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps "
> export SPARK_JAVA_OPTS
> export SPARK_ASSEMBLY_JAR="/root/spark-0.9.1/assembly/target/scala-2.10/spark-assembly_2.10-0.9.1-hadoop2.2.0.jar"
> export SHARK_ASSEMBLY_JAR="/root/shark-0.9.1-bin-hadoop2/target/scala-2.10/shark_2.10-0.9.1.jar"
>
> Best regards,
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/How-to-run-shark-tp5581p5688.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
