Hi Julius,

You can add those external jars to Spark by calling
sc.addJar("/path/to/the/jar") on your SparkContext once it is created.
If you are submitting the job with spark-submit, you can instead pass
the --jars option and have those jars shipped to the executors.
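
For example, something like this (a minimal sketch; the class name,
app name, and jar paths are placeholders, not from your project):

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class MyJob {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("MyJob");
            JavaSparkContext sc = new JavaSparkContext(conf);

            // Ship the external jar to the executors so its classes
            // are on their classpath at runtime.
            sc.addJar("/path/to/the/jar");

            // ... your job logic ...

            sc.stop();
        }
    }

And with spark-submit, something along these lines (again, paths are
placeholders; --jars takes a comma-separated list):

    spark-submit --class com.example.MyJob \
      --jars /path/to/external1.jar,/path/to/external2.jar \
      target/my-job.jar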

Thanks
Best Regards

On Sun, Dec 7, 2014 at 11:05 PM, Julius K <fooliuskool...@gmail.com> wrote:

> Hi everyone,
> I am new to Spark and encountered a problem.
> I want to use an external library in a Java project. It compiles
> fine with Maven, but at runtime (locally) I get a
> NoClassDefFoundError.
> Do I have to put the jars somewhere, or tell spark where they are?
>
> I can send the pom.xml and my imports or source code, if this helps you.
>
> Best regards
> Julius Kolbe
>
