On Mon, Dec 29, 2014 at 7:39 PM, Jeremy Freeman wrote:
> Hi Stephen, it should be enough to include
>
>> --jars /path/to/file.jar
>
> in the command line call to either pyspark or spark-submit, as in
>
>> spark-submit --master local --jars /path/to/file.jar myfile.py
Unfortunately, you also need

> and you can check the bottom of the Web UI’s “Environment” tab to make
> sure the jar gets on the classpath
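Putting that advice together, the full invocation might look like the sketch below. This is a hedged example, not confirmed by this thread: the jar path and myfile.py are the placeholders used above, and the extra --driver-class-path flag is one commonly suggested addition for older Spark 1.x releases, where --jars shipped the jar to executors but did not always place it on the driver JVM's classpath.

```shell
# Sketch only: /path/to/file.jar and myfile.py are placeholders
# from the thread, not real files.
#
# --jars distributes the jar to the executors; adding
# --driver-class-path as well puts it on the driver's classpath,
# which some Spark 1.x setups also require.
spark-submit \
  --master local \
  --jars /path/to/file.jar \
  --driver-class-path /path/to/file.jar \
  myfile.py
```

After submitting, the "Environment" tab of the Web UI (port 4040 by default for a local driver) should list the jar under the classpath entries.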
What is the recommended way to do this? We have some native database
client libraries for which we are adding pyspark bindings.
The pyspark script invokes spark-submit. Do we add our libraries to
SPARK_SUBMIT_LIBRARY_PATH?
This issue relates back to an error we have been seeing "Py4jError: Tryin