Has anyone tried creating a HiveContext only if the class is available?

I tried this:
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

// load HiveContext reflectively (no compile-time spark-hive dependency),
// falling back to a plain SQLContext if the class is absent
implicit lazy val sqlc: SQLContext = try {
  Class.forName("org.apache.spark.sql.hive.HiveContext", true,
      Thread.currentThread.getContextClassLoader)
    .getConstructor(classOf[SparkContext])
    .newInstance(sc)
    .asInstanceOf[SQLContext]
} catch { case e: ClassNotFoundException => new SQLContext(sc) }
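
One caveat with the reflective construction itself (an assumption on my part,
not something the trace below proves): Class.forName can find HiveContext but
still fail while linking or initializing it if the transitive Hive dependencies
are missing from the classpath, and that surfaces as a LinkageError (e.g.
NoClassDefFoundError), which slips past a catch limited to
ClassNotFoundException. A minimal sketch with a wider catch, using a
hypothetical helper name sqlContextFor:

import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

// minimal sketch, assuming the failure mode is a missing transitive Hive
// dependency: loading then fails with a LinkageError rather than a
// ClassNotFoundException, so the fallback has to catch both
def sqlContextFor(sc: SparkContext): SQLContext =
  try {
    Class.forName("org.apache.spark.sql.hive.HiveContext", true,
        Thread.currentThread.getContextClassLoader)
      .getConstructor(classOf[SparkContext])
      .newInstance(sc)
      .asInstanceOf[SQLContext]
  } catch {
    case _: ClassNotFoundException | _: LinkageError => new SQLContext(sc)
  }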

The lazy val compiles fine, but I get classloader issues when I actually use
it on a cluster. For example:

Exception in thread "main" java.lang.RuntimeException: Failed to load class for data source: com.databricks.spark.csv
    at scala.sys.package$.error(package.scala:27)
    at org.apache.spark.sql.sources.ResolvedDataSource$.lookupDataSource(ddl.scala:216)
    at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:229)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:104)
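
In that trace, lookupDataSource is resolving the "com.databricks.spark.csv"
provider against a classloader, so one way to narrow down which loader is
missing the jar is to probe them directly. A hypothetical diagnostic sketch
(visibleFrom is my name, and I'm assuming the provider class spark-csv
registers is com.databricks.spark.csv.DefaultSource):

// probe whether a class is visible from a given classloader (no initialization)
def visibleFrom(className: String, cl: ClassLoader): Boolean =
  try { Class.forName(className, false, cl); true }
  catch { case _: ClassNotFoundException => false }

val provider = "com.databricks.spark.csv.DefaultSource" // assumed provider class
println("context loader: " + visibleFrom(provider, Thread.currentThread.getContextClassLoader))
println("app loader:     " + visibleFrom(provider, getClass.getClassLoader))

If one loader sees the class and the other does not, the jar shipped via
--jars or --packages ended up on a different loader than the one the data
source lookup resolves against.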
