> Hi Chanh,
>
> I had the same issue with the Oracle JDBC driver. So I have this in spark-submit:
>
> --jars /home/hduser/jars/ojdbc6.jar,/home/hduser/jars/jconn4.jar
>
> Still getting errors; it happened after the 1.6.1 upgrade. So I had to add these
> to conf/spark-defaults.conf:
>
> spark.driver.extraClassPath    /home/hduser/jars/ojdbc6.jar:/home/hduser/jars/jconn4.jar
> spark.executor.extraClassPath  /home/hduser/jars/ojdbc6.jar:/home/hduser/jars/jconn4.jar
>
> and it works.
> HTH
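For completeness, the same classpath fix can also be passed on the spark-submit command line: --driver-class-path is the CLI shorthand for spark.driver.extraClassPath, and the executor setting can go through --conf. A sketch using the jar paths from Mich's message; com.example.MyJob and myjob.jar are hypothetical placeholders:

```
./bin/spark-submit \
  --driver-class-path /home/hduser/jars/ojdbc6.jar:/home/hduser/jars/jconn4.jar \
  --conf spark.executor.extraClassPath=/home/hduser/jars/ojdbc6.jar:/home/hduser/jars/jconn4.jar \
  --class com.example.MyJob \
  myjob.jar
```

Unlike --jars, these options prepend the jars to the JVM's own classpath, which is why the JDBC driver becomes visible.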
Thank you Mich for the hint.

> Many (all?) JDBC drivers need to be in the system classpath. --jars
> places them in an app-specific class loader, so it doesn't work.

Thanks for your explanation. Now I totally understand.

Regards,
Chanh

> On Oct 5, 2016, at 11:20 PM, Marcelo Vanzin <van...@cloudera.com> wrote:
>
> Many (all?) JDBC drivers need to be in the system classpath. --jars
> places them in an app-specific class loader, so it doesn't work.
>
> On Wed, Oct 5, 2016 at 3:32 AM, Chanh Le <giaosu...@gmail.com> wrote:
>> Hi everyone,
>> I was just wondering why, when I run my program, I need to add jdbc.jar to
>> --driver-class-path instead of treating it like a dependency with --jars.
>>
>> My program works with this config:
>>
>> ./bin/spark-submit --packages org.apache.spark:spark-streaming-kafka_2.10:1.6.1
>> --master "local[4]" --class com.ants.util.kafka.PersistenceData
>> --driver-class-path /Users/giaosudau/Downloads/postgresql-9.3-1102.jdbc41.jar
>> /Users/giaosudau/workspace/KafkaJobs/target/scala-2.10/kafkajobs-prod.jar
>>
>> According to http://stackoverflow.com/a/30947090/523075 and
>> http://stackoverflow.com/a/31012955/523075, this is a bug related to the classloader.
>>
>> I checked that https://github.com/apache/spark/pull/6900 was merged.
>>
>> I am using Spark 1.6.1, and the issue says it was already fixed in 1.4.1 and 1.5.
>>
>> Regards,
>> Chanh
>
> --
> Marcelo
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
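Marcelo's point can be demonstrated outside Spark with plain Java: a class that is visible only to a child URLClassLoader (roughly what --jars gives the application) cannot be found through the caller's system class loader, which is how java.sql.DriverManager locates drivers. A minimal sketch; FakeDriver is a hypothetical stand-in compiled on the fly, not a real JDBC driver:

```java
import javax.tools.ToolProvider;
import java.io.File;
import java.io.FileWriter;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;

public class ChildLoaderDemo {
    /** Compiles a throwaway FakeDriver class into a temp dir (standing in for a jar). */
    public static File compileFakeDriver() throws Exception {
        File dir = Files.createTempDirectory("fakejar").toFile();
        File src = new File(dir, "FakeDriver.java");
        try (FileWriter w = new FileWriter(src)) {
            w.write("public class FakeDriver {}");
        }
        // Requires a JDK (not a bare JRE) so the system compiler is available.
        ToolProvider.getSystemJavaCompiler()
                .run(null, null, null, src.getPath(), "-d", dir.getPath());
        return dir;
    }

    public static void main(String[] args) throws Exception {
        File dir = compileFakeDriver();
        // --jars hands the application a child class loader much like this one:
        URLClassLoader appLoader = new URLClassLoader(new URL[]{dir.toURI().toURL()});

        // Visible through the child loader: this is what application code sees.
        Class.forName("FakeDriver", true, appLoader);
        System.out.println("child loader: found");

        // But a lookup through the caller's (system) class loader, as
        // DriverManager performs, knows nothing about the child and fails.
        try {
            Class.forName("FakeDriver");
            System.out.println("system loader: found");
        } catch (ClassNotFoundException e) {
            System.out.println("system loader: not found");
        }
    }
}
```

This is why putting the driver jar on spark.driver.extraClassPath / spark.executor.extraClassPath (or --driver-class-path) works where --jars does not: those options extend the system classpath itself.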