Hi,

Are you able to access 'sc' or 'sqlContext'? And do you see any particular errors in the log files? Since they are all created in a similar way [1], I would guess that if one is available, the others are available too, and vice versa.
Thanks,
moon

[1] https://github.com/apache/incubator-zeppelin/blob/master/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java#L549

On Mon, Apr 25, 2016 at 5:24 PM Ydalia Delgado <ydelgad...@gmail.com> wrote:
> Hi,
>
> I have set in conf/zeppelin-env.sh
>
> export SPARK_HOME="/home/username/libs/spark-1.6.1-bin-hadoop2.6/"
> export SPARK_SUBMIT_OPTIONS="--driver-memory 2G --executor-memory 4G
> --driver-class-path
> /home/username/libs/mysql-spark/mysql-connector-java-5.1.38-bin.jar"
>
> I have tried everything from
> https://mail-archives.apache.org/mod_mbox/incubator-zeppelin-users/201509.mbox/%3cCALf24saa=TW_uapCKju4Z=j+C=ey6nnvrshvjdhsy+fgzto...@mail.gmail.com%3e
> but it doesn't work.
>
> I need to connect to a MySQL DB using JDBC. The only way I get it working
> is setting SPARK_HOME and SPARK_SUBMIT_OPTIONS. But then I get the
> following error:
>
> val inputValue = z.input("value")
> <console>:27: error: not found: value z
>
> Thank you a lot for your help!
> Ydalia
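As a side note on the JDBC part: a minimal sketch of conf/zeppelin-env.sh, reusing the paths quoted above. Adding the connector jar via --jars as well as --driver-class-path is an assumption on my side (not something confirmed in this thread); --driver-class-path alone only puts the jar on the driver JVM's classpath, while --jars also ships it to the executors:

```shell
# conf/zeppelin-env.sh -- a sketch, assuming the poster's paths
export SPARK_HOME="/home/username/libs/spark-1.6.1-bin-hadoop2.6/"

# --jars ships the MySQL connector to the executors as well;
# --driver-class-path alone only affects the driver JVM.
export SPARK_SUBMIT_OPTIONS="--driver-memory 2G --executor-memory 4G \
  --jars /home/username/libs/mysql-spark/mysql-connector-java-5.1.38-bin.jar \
  --driver-class-path /home/username/libs/mysql-spark/mysql-connector-java-5.1.38-bin.jar"
```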