> ...to do this today at work with Impala as the data source.
> I have been getting the same error as well.
>
> I am using the PySpark APIs with Spark 1.3 and I was wondering if
> there is any workaround for PySpark. I don't think we can use the
> --jars option in PySpark.
>
>
I recently had this same issue. Though I didn't find the cause, I was able
to work around it by loading the JAR into HDFS. Once it was in HDFS, I used
the --jars flag with the full HDFS path: --jars hdfs://{our
namenode}/tmp/postgresql-9.4-1204-jdbc42.jar
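
For anyone else hitting this, here is a rough sketch of the full
workaround. The namenode, host names, table name, and credentials below
are placeholders, not the exact values from my setup:

    # upload the JDBC driver jar to a location in HDFS
    hdfs dfs -put postgresql-9.4-1204-jdbc42.jar /tmp/

    # launch PySpark, pointing --jars at the full HDFS path to the jar
    pyspark --jars hdfs://<namenode>/tmp/postgresql-9.4-1204-jdbc42.jar

Then inside PySpark, using the Spark 1.3-era load() API:

    # load a table over JDBC into a DataFrame;
    # url and dbtable are illustrative placeholders
    df = sqlContext.load(
        source="jdbc",
        url="jdbc:postgresql://<db-host>:5432/<db-name>?user=<user>&password=<pass>",
        dbtable="my_table")
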
James
On Fri, Nov 13, 2015 at 10:14 AM satish ch...