Hi,

I am hitting an error in Spark 3.1.1 with Scala.

The following JDBC read used to work fine in spark-2.4.3:

val s = HiveContext.read.format("jdbc").options(
       Map("url" -> url,
       "dbtable" -> _dbtable,
       "user" -> _username,
       "password" -> _password)).load
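
For comparison, this is the usual Spark 3.x form of the same read, written against a SparkSession rather than a HiveContext (a minimal sketch; the connection details below are placeholders, not my actual values):

```scala
import org.apache.spark.sql.SparkSession

object JdbcReadSketch {
  def main(args: Array[String]): Unit = {
    // In Spark 3.x the SparkSession replaces HiveContext as the entry point.
    val spark = SparkSession.builder()
      .appName("jdbc-read")
      .getOrCreate()

    val df = spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://host:5432/db") // placeholder URL
      .option("dbtable", "some_schema.some_table")     // placeholder table
      .option("user", "some_user")                     // placeholder user
      .option("password", "some_password")             // placeholder password
      .load()

    df.printSchema()
  }
}
```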

However, in 3.1.1 it fails with a strange error:

scala> val s = HiveContext.read.format("jdbc").options(
     |        Map("url" -> url,
     |        "dbtable" -> _dbtable,
     |        "user" -> _username,
     |        "password" -> _password)).load
java.util.ServiceConfigurationError: org.apache.spark.sql.sources.DataSourceRegister: Provider org.apache.spark.sql.delta.sources.DeltaDataSource could not be instantiated
  at java.util.ServiceLoader.fail(ServiceLoader.java:232)
  at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
  at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
  at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
  at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
  at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:44)
  at scala.collection.Iterator.foreach(Iterator.scala:941)
  at scala.collection.Iterator.foreach$(Iterator.scala:941)
  at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
  at scala.collection.IterableLike.foreach(IterableLike.scala:74)
  at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
  at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
  at scala.collection.TraversableLike.filterImpl(TraversableLike.scala:255)
  at scala.collection.TraversableLike.filterImpl$(TraversableLike.scala:249)
  at scala.collection.AbstractTraversable.filterImpl(Traversable.scala:108)
  at scala.collection.TraversableLike.filter(TraversableLike.scala:347)
  at scala.collection.TraversableLike.filter$(TraversableLike.scala:347)
  at scala.collection.AbstractTraversable.filter(Traversable.scala:108)
  at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:659)
  at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSourceV2(DataSource.scala:743)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:266)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:226)
  ... 53 elided
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
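
In case it is relevant: the `Logging$class` name is the Scala 2.11 encoding of a trait implementation class, so my suspicion is that a jar built for Scala 2.11 (for example a Delta Lake build for Spark 2.x, given the `DeltaDataSource` provider named above) is still on the classpath, while Spark 3.1.1 ships with Scala 2.12. A quick way to check the versions from spark-shell:

```scala
// Confirm the running Spark and Scala versions; Spark 3.1.1 should report
// a Scala 2.12.x build, so any *_2.11 jar on the classpath would be suspect.
scala> spark.version
scala> scala.util.Properties.versionString
```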

Thanks


View my LinkedIn profile:
<https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>



*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.
