andygrove commented on code in PR #435:
URL: https://github.com/apache/datafusion-comet/pull/435#discussion_r1602126972
##########
docs/source/user-guide/installation.md:
##########
@@ -57,13 +57,17 @@ Note that the project builds for Scala 2.12 by default but can be built for Scal
 make release PROFILES="-Pspark-3.4 -Pscala-2.13"
 ```
 
-## Run Spark with Comet enabled
+## Run Spark Shell with Comet enabled
 
 Make sure `SPARK_HOME` points to the same Spark version as Comet was built for.
 
 ```console
+export COMET_JAR=spark/target/comet-spark-spark3.4_2.12-0.1.0-SNAPSHOT.jar
+
 $SPARK_HOME/bin/spark-shell \
-    --jars spark/target/comet-spark-spark3.4_2.12-0.1.0-SNAPSHOT.jar \
+    --jars $COMET_JAR \
+    --conf spark.driver.extraClassPath=$COMET_JAR \

Review Comment:
   Yes, we need to specify extraClassPath to resolve the `ClassNotFoundException` issues described in https://github.com/apache/datafusion-comet/issues/182. This is specific to `spark-shell` and not needed for `spark-submit`.
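
For context, a minimal sketch of the `spark-submit` case mentioned in the comment: per the remark above, `--jars` alone should be enough there and `spark.driver.extraClassPath` can be omitted. The Comet jar path is taken from the diff; the application class and jar names below are hypothetical placeholders, not from the Comet docs.

```console
# Sketch only: per the review comment, spark-submit does not need
# spark.driver.extraClassPath for Comet; --jars is sufficient.
# com.example.MyApp and my-app.jar are hypothetical placeholders.
export COMET_JAR=spark/target/comet-spark-spark3.4_2.12-0.1.0-SNAPSHOT.jar

$SPARK_HOME/bin/spark-submit \
    --jars $COMET_JAR \
    --class com.example.MyApp \
    my-app.jar
```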
