Hi Dhimant,

I believe it will work if you change your spark-shell invocation to pass
"--driver-class-path /usr/local/spark/lib/mysql-connector-java-5.1.34-bin.jar"
instead of putting the jar in "--jars".

-Todd

On Wed, Feb 18, 2015 at 10:41 PM, Dhimant <dhimant84.jays...@gmail.com>
wrote:

> Found a solution in one of the posts I found on the internet.
> I updated spark/bin/compute-classpath.sh and added the database connector jar
> to the classpath:
> CLASSPATH="$CLASSPATH:/data/mysql-connector-java-5.1.14-bin.jar"
