Well, this just won't work when you are running Spark on Hadoop...
On 12/10/20 9:14 PM, lec ssmi wrote:
If you cannot assemble the JDBC driver jar into your application jar
package, you can put the JDBC driver jar on the Spark classpath,
generally $SPARK_HOME/jars or $SPARK_HOME/lib.
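As a minimal sketch of that approach (the jar file name and source path below are illustrative, not taken from this thread), the driver jar just needs to land in Spark's jars directory on every node before the application starts:

```shell
# Copy the JDBC driver jar onto the Spark classpath.
# The jar name and /path/to/ prefix are placeholders -- substitute
# the actual MySQL Connector/J jar you downloaded.
cp /path/to/mysql-connector-java-8.0.28.jar "$SPARK_HOME/jars/"
```

Note that on a multi-node cluster this copy has to happen on each worker as well, since $SPARK_HOME/jars is a local directory.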
On Fri, Dec 11, 2020 at 5:21 AM, Artemis User wrote:
What happened was that you made the MySQL jar file available only to the
Spark driver, not the executors. Use the --jars parameter instead of
--driver-class-path to specify your third-party jar files, or copy the
third-party jar files to Spark's jars directory in your HDFS, and
specify the
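A sketch of the --jars suggestion, assuming a YARN deployment (the jar paths, class name, and application jar below are hypothetical placeholders):

```shell
# --jars ships the listed jars to the driver AND the executors;
# --driver-class-path alone only puts them on the driver's classpath.
spark-submit \
  --master yarn \
  --jars /path/to/mysql-connector-java-8.0.28.jar \
  --class com.example.MyApp \
  my-application.jar
```

Multiple third-party jars can be passed to --jars as a comma-separated list.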