I am trying to use the Databricks CSV reader and have tried multiple ways
to make this package available to PySpark. I have modified both
spark-defaults.conf and zeppelin-env.sh (as described below). I've included
the spark-interpreter log from Zeppelin, which seems to show the jar being
added properly.
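For context, the kind of change I'm attempting looks roughly like the lines
below; the package coordinate (Scala suffix and version) is just the one I
happen to be trying, so treat it as an example rather than a known-good value:

    # zeppelin-env.sh -- ask Zeppelin to pass the package to spark-submit
    export SPARK_SUBMIT_OPTIONS="--packages com.databricks:spark-csv_2.10:1.4.0"

    # spark-defaults.conf -- the equivalent setting on the Spark side
    spark.jars.packages  com.databricks:spark-csv_2.10:1.4.0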
Hello.
I think you may have missed setting SPARK_HOME in zeppelin-env.sh; you can
refer to [1], and there is a rough sketch below the link.
[1]
http://zeppelin.incubator.apache.org/docs/0.5.6-incubating/interpreter/spark.html
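A minimal zeppelin-env.sh along those lines might look like this; the Spark
path and the package version are only placeholders, so adjust them for your
own installation:

    # zeppelin-env.sh
    export SPARK_HOME=/usr/local/spark   # point Zeppelin at your Spark installation
    export SPARK_SUBMIT_OPTIONS="--packages com.databricks:spark-csv_2.10:1.4.0"   # extra packages passed to spark-submit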
I hope this helps.