For the Oracle JDBC driver we had to feed ojdbc7.jar
into SPARK_SUBMIT_OPTIONS through the --jars parameter
and into ZEPPELIN_INTP_CLASSPATH_OVERRIDES, like:

zeppelin-env.sh:

export SPARK_SUBMIT_OPTIONS=". . . --jars /var/lib/sqoop/ojdbc7.jar"
export ZEPPELIN_INTP_CLASSPATH_OVERRIDES=/etc/hive/conf:/var/lib/sqoop/ojdbc7.jar
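
With the jar on both classpaths (and the Spark interpreter restarted), the
driver is usable straight from pyspark's sqlContext, no custom interpreter
needed. A minimal sketch of a %pyspark paragraph; the host, SID, table name
and credentials below are placeholders, so adjust them for your database:

  %pyspark
  # sqlContext is provided by Zeppelin's Spark interpreter.
  # The URL, dbtable, user and password here are placeholders for illustration.
  df = sqlContext.read \
      .format("jdbc") \
      .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCL") \
      .option("driver", "oracle.jdbc.OracleDriver") \
      .option("dbtable", "SOME_SCHEMA.SOME_TABLE") \
      .option("user", "some_user") \
      .option("password", "some_password") \
      .load()
  df.show()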

-- 
Ruslan Dautkhanov

On Mon, Jul 10, 2017 at 12:10 PM, <dar...@ontrenet.com> wrote:

> Hi
>
> We want to use a JDBC driver with pyspark through Zeppelin, not via the
> custom interpreter but from sqlContext, where we can read into a dataframe.
>
> I added the JDBC driver jar to Zeppelin's spark-submit options ("--jars"),
> but it still says the driver class was not found.
>
> Does it have to reside somewhere else?
>
> Thanks in advance!
>
