I'm running Spark in local mode and trying to get it to talk to Alluxio. I'm
getting the error: java.lang.ClassNotFoundException: Class
alluxio.hadoop.FileSystem not found
The cause of this error is apparently that Spark cannot find the Alluxio
client jar on its classpath.

I have looked at the page here:
https://www.alluxio.org/docs/master/en/Debugging-Guide.html#q-why-do-i-see-exceptions-like-javalangruntimeexception-javalangclassnotfoundexception-class-alluxiohadoopfilesystem-not-found

That page details the steps to take in this situation, but I'm not having
any success.

According to the Spark documentation, I can instantiate a local SparkSession like so:

import org.apache.spark.sql.SparkSession

val sparkSession = SparkSession.builder
  .appName("App")
  .getOrCreate()

Then I can add the Alluxio client library like so:
sparkSession.conf.set("spark.driver.extraClassPath", ALLUXIO_SPARK_CLIENT)
sparkSession.conf.set("spark.executor.extraClassPath", ALLUXIO_SPARK_CLIENT)
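
For reference, the same two settings could also be passed when the session is
built, rather than after the fact; a minimal sketch of that variant
(ALLUXIO_SPARK_CLIENT being the local path to the client jar on my machine):

```
// Sketch: supplying the classpath settings to the builder instead of via
// sparkSession.conf.set afterwards. ALLUXIO_SPARK_CLIENT is the path to the
// Alluxio client jar on the local filesystem.
val sparkSession = SparkSession.builder
  .appName("App")
  .config("spark.driver.extraClassPath", ALLUXIO_SPARK_CLIENT)
  .config("spark.executor.extraClassPath", ALLUXIO_SPARK_CLIENT)
  .getOrCreate()
```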

I have verified that the proper jar file exists in the right location on my
local machine, and that the settings actually took effect, by logging them:
logger.error(sparkSession.conf.get("spark.driver.extraClassPath"))
logger.error(sparkSession.conf.get("spark.executor.extraClassPath"))
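
To rule out a problem with the jar itself (as opposed to the classpath
configuration), a quick check I can run from inside the driver is to ask the
JVM directly whether the class is loadable; `isLoadable` here is just a
helper name I made up for this sketch:

```scala
// Returns true if the named class can be loaded by the current classloader.
def isLoadable(className: String): Boolean =
  try {
    Class.forName(className)
    true
  } catch {
    case _: ClassNotFoundException => false
  }

// java.lang.String is always present; alluxio.hadoop.FileSystem is the
// class Spark reports as missing.
println(isLoadable("java.lang.String"))          // true
println(isLoadable("alluxio.hadoop.FileSystem")) // false in my case
```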

But I still get the error. Is there anything else I can do to figure out why
Spark is not picking the library up?

Please note I am not using spark-submit - I am aware of the methods for
adding the client jar to a spark-submit job. My Spark instance is created in
local mode within my application, and that is the use case I want to solve.

As an FYI, there is another application in the cluster which connects to my
Alluxio using the fs client, and that all works fine. In that case, though,
the fs client is packaged as part of the application through standard sbt
dependencies.
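
That working application pulls the client in with an ordinary sbt dependency,
along these lines (the exact artifact name and version depend on the Alluxio
release, so treat these coordinates as a placeholder rather than the ones I
actually use):

```
// build.sbt (sketch): bundling the Alluxio FS client with the application.
// Artifact name and version are illustrative only.
libraryDependencies += "org.alluxio" % "alluxio-core-client-fs" % "<alluxio-version>"
```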

--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/