On Mon, Dec 29, 2014 at 7:39 PM, Jeremy Freeman <[email protected]> wrote:
> Hi Stephen, it should be enough to include
>
>> --jars /path/to/file.jar
>
> in the command line call to either pyspark or spark-submit, as in
>
>> spark-submit --master local --jars /path/to/file.jar myfile.py
Unfortunately, you also need '--driver-class-path /path/to/file.jar' to make it accessible in the driver. (This may be fixed in 1.3.)

> and you can check the bottom of the Web UI's "Environment" tab to make sure
> the jar gets on your classpath. Let me know if you still see errors related
> to this.
>
> -- Jeremy
>
> -------------------------
> jeremyfreeman.net
> @thefreemanlab
>
> On Dec 29, 2014, at 7:55 PM, Stephen Boesch <[email protected]> wrote:
>
>> What is the recommended way to do this? We have some native database
>> client libraries for which we are adding pyspark bindings.
>>
>> pyspark invokes spark-submit. Do we add our libraries to
>> the SPARK_SUBMIT_LIBRARY_PATH?
>>
>> This issue relates back to an error we have been seeing, "Py4JError: Trying
>> to call a package" - the suspicion being that the third-party libraries may
>> not be available on the JVM side.
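Putting the two suggestions in this thread together, the full invocation would look something like this (the jar path and script name are the placeholders used above):

    spark-submit --master local \
      --jars /path/to/file.jar \
      --driver-class-path /path/to/file.jar \
      myfile.py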
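Once the jar is on both classpaths, you can sanity-check it from the PySpark driver through the Py4J gateway. A minimal sketch, assuming a hypothetical class com.example.NativeClient packaged in the jar (sc._jvm is an internal but commonly used handle to the driver's JVM):

    from py4j.java_gateway import java_import
    from pyspark import SparkContext

    sc = SparkContext(appName="jar-check")

    # Import the class onto the gateway's JVM view. If the jar is missing
    # from the driver classpath, Py4J resolves the name to a package instead
    # of a class, and calling it is typically where "Py4JError: Trying to
    # call a package" surfaces.
    java_import(sc._jvm, "com.example.NativeClient")

    client = sc._jvm.NativeClient()  # construct the Java object via Py4J
    print(client)                    # prints the JavaObject handle if the class resolved

Run this with the spark-submit line above so the jar is visible to both the executors (--jars) and the driver (--driver-class-path).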
