Hi,

I'm trying to run Zeppelin over an existing Spark cluster.

My zeppelin-env.sh has the entry:
export MASTER=spark://spark:7077

In the first paragraph, I executed a shell command:
%sh
hadoop fs -ls /user/root

This returned:
drwxr-xr-x   - root supergroup          0 2015-01-15 09:05 /user/root/input
-rw-r--r--   3 root supergroup   29966462 2015-03-23 01:06 /user/root/product.txt

In the next paragraph, I executed the following:
%spark
val prodRaw = sc.textFile("hdfs://user/root/product.txt")
prodRaw.count
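
(As an aside, I suspect the URI above may also be malformed: as far as I understand, a fully qualified HDFS URI needs the namenode host and port between the scheme and the absolute path, so `hdfs://user/...` would be parsed as namenode host "user". Something like the following, where `namenode:9000` is a placeholder for the actual namenode address, is what I believe the intended form looks like:)

```scala
// Fully qualified HDFS URI: scheme, namenode host:port (placeholder here),
// then the absolute path. `sc` is the SparkContext Zeppelin provides.
val prodRaw = sc.textFile("hdfs://namenode:9000/user/root/product.txt")
prodRaw.count
```
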

This doesn't return any result or any errors on the console. Instead, I
see a new context created every time I execute something:
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
------ Create new SparkContext spark://spark:7077 -------
------ Create new SparkContext spark://spark:7077 -------
------ Create new SparkContext spark://spark:7077 -------

Is this expected behavior? It seems like Zeppelin should be holding on to
a single context across executions.

The same issue occurs when executing the sample notebook.

Appreciate any help!
