The --jars flag should be available for PySpark as well (I could be wrong,
I've only used Spark 1.4 onward). Take, for example, the command I'm using
to start a PySpark shell for a Jupyter Notebook:
"--jars hdfs://{our namenode}/tmp/postgresql-9.4-1204.jdbc42.jar
--driver-class-path
/usr/local/sha
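Roughly, a launch of that shape might look like the sketch below; the notebook
environment variables and the local jar path are placeholders for illustration,
not the exact command quoted above:

    export PYSPARK_DRIVER_PYTHON=jupyter
    export PYSPARK_DRIVER_PYTHON_OPTS="notebook"
    pyspark \
      --jars hdfs://{our namenode}/tmp/postgresql-9.4-1204.jdbc42.jar \
      --driver-class-path /path/to/postgresql-9.4-1204.jdbc42.jar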
Hi,
I have been trying to do this today at work with Impala as the data source.
I have been getting the same error as well.
I am using the PySpark API with Spark 1.3 and I was wondering if there is
any workaround for PySpark. I don't think we can use the --jars option in
PySpark.
Cheers,
I recently had this same issue. Though I didn't find the cause, I was able
to work around it by loading the JAR into HDFS. Once in HDFS, I used the
--jars flag with the full HDFS path:
--jars hdfs://{our namenode}/tmp/postgresql-9.4-1204-jdbc42.jar
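A rough sketch of that workaround, assuming the jar starts out on the local
filesystem and that the local path below is just a placeholder:

    # copy the JDBC driver jar into HDFS so every node can fetch it
    hdfs dfs -put /path/to/postgresql-9.4-1204-jdbc42.jar /tmp/
    # launch PySpark, pointing --jars at the full HDFS URI of the jar
    pyspark --jars hdfs://{our namenode}/tmp/postgresql-9.4-1204-jdbc42.jar

Inside the shell the driver is then used as usual, e.g. via
sqlContext.read.format("jdbc") on Spark 1.4+.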
James
On Fri, Nov 13, 2015 at 10:14 AM satish ch