I run Spark in local mode.

Command line (added some debug info):
hduser@hadoop7:~/spark-terasort$ ./bin/run-example SparkPi 10
Jar:
/home/hduser/spark-terasort/examples/target/scala-2.10/spark-examples-1.3.0-SNAPSHOT-hadoop2.4.0.jar
/home/hduser/spark-terasort/bin/spark-submit --master local[*] --class org.apache.spark.examples.SparkPi /home/hduser/spark-terasort/examples/target/scala-2.10/spark-examples-1.3.0-SNAPSHOT-hadoop2.4.0.jar 10
15/03/30 17:37:28 INFO spark.SparkContext: Running Spark version 1.3.0-SNAPSHOT
15/03/30 17:37:28 WARN spark.SparkConf: In Spark 1.0 and later spark.local.dir will be overridden by the value set by the cluster manager (via SPARK_LOCAL_DIRS in mesos/standalone and LOCAL_DIRS in YARN).
15/03/30 17:37:28 INFO spark.SparkContext: Spark configuration:
spark.app.name=Spark Pi
spark.default.parallelism=8
spark.driver.extraJavaOptions=-Dos.arch=ppc64le
spark.driver.memory=4G
spark.eventLog.dir=/home/hduser/spark/spark-events
spark.eventLog.enabled=true
spark.executor.extraJavaOptions=-Dos.arch=ppc64le
spark.executor.memory=32G
spark.jars=file:/home/hduser/spark-terasort/examples/target/scala-2.10/spark-examples-1.3.0-SNAPSHOT-hadoop2.4.0.jar
spark.local.dir=/home/hduser/spark
spark.logConf=true
spark.master=local[*]

Stack trace:
15/03/30 17:37:30 INFO storage.BlockManagerMaster: Registered BlockManager
Exception in thread "main" java.lang.IllegalArgumentException: Log directory ~/spark/spark-events does not exist.
    at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:90)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:363)
    at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:28)
    at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:55)
    at java.lang.reflect.Method.invoke(Method.java:495)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:365)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
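For reference, the workaround discussed in the quoted replies below can be sketched like this. It assumes the cause is that spark-defaults.conf does no shell expansion of "~": the event-log directory must exist before the SparkContext starts, and spark.eventLog.dir must be an absolute path. A mktemp directory stands in here for the thread's /home/hduser/spark/spark-events:

```shell
# Hedged sketch: create the event-log directory up front and write the
# config entries with an absolute path (no "~", which Spark does not expand
# in spark-defaults.conf).
EVENT_DIR="$(mktemp -d)/spark-events"   # stand-in for /home/hduser/spark/spark-events
mkdir -p "$EVENT_DIR"

# spark-defaults.conf entries are whitespace-separated key/value pairs;
# this snippet would be appended to $SPARK_HOME/conf/spark-defaults.conf.
printf 'spark.eventLog.enabled\ttrue\nspark.eventLog.dir\t%s\n' "$EVENT_DIR" > defaults-snippet.conf
cat defaults-snippet.conf
```

With the directory created and the tilde replaced by an absolute path, the EventLoggingListener check that throws the IllegalArgumentException above should pass.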



On 30 March 2015 at 18:28, Marcelo Vanzin <van...@cloudera.com> wrote:

> Are you running Spark in cluster mode by any chance?
>
> (It always helps to show the command line you're actually running, and
> if there's an exception, the first few frames of the stack trace.)
>
> On Mon, Mar 30, 2015 at 4:11 PM, Tom Hubregtsen <thubregt...@gmail.com>
> wrote:
> > Updated spark-defaults and spark-env:
> > "Log directory /home/hduser/spark/spark-events does not exist."
> > (Also, in the default /tmp/spark-events it also did not work)
> >
> > On 30 March 2015 at 18:03, Marcelo Vanzin <van...@cloudera.com> wrote:
> >>
> >> Are those config values in spark-defaults.conf? I don't think you can
> >> use "~" there - IIRC it does not do any kind of variable expansion.
> >>
> >> On Mon, Mar 30, 2015 at 3:50 PM, Tom <thubregt...@gmail.com> wrote:
> >> > I have set
> >> > spark.eventLog.enabled true
> >> > as I try to preserve log files. When I run, I get
> >> > "Log directory /tmp/spark-events does not exist."
> >> > I set
> >> > spark.local.dir ~/spark
> >> > spark.eventLog.dir ~/spark/spark-events
> >> > and
> >> > SPARK_LOCAL_DIRS=~/spark
> >> > Now I get:
> >> > "Log directory ~/spark/spark-events does not exist."
> >> > I am running spark as "hduser", which I also use on the cmd (as
> >> > verified in the stdout "Set(hduser); users with modify permissions:
> >> > Set(hduser)"). I am able to cd into this directory. I can also
> >> > create, view and delete files in this directory, logged in as
> >> > hduser. I checked the folder, it is owned by hduser. I even
> >> > performed chmod 777, but Spark keeps on crashing when I run with
> >> > spark.eventLog.enabled. It works without. Any hints?
> >> >
> >> > Thanks,
> >> >
> >> > Tom
> >> >
> >> >
> >> >
> >> > --
> >> > View this message in context:
> >> > http://apache-spark-user-list.1001560.n3.nabble.com/Spark-events-does-not-exist-error-while-it-does-with-all-the-req-rights-tp22308.html
> >> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
> >> >
> >> > ---------------------------------------------------------------------
> >> > To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> >> > For additional commands, e-mail: user-h...@spark.apache.org
> >> >
> >>
> >>
> >>
> >> --
> >> Marcelo
> >
> >
>
>
>
> --
> Marcelo
>
