Use the --py-files option of spark-submit (or pyspark) to ship your Python dependencies (.py, .zip, or .egg files) along with the job.
See
https://spark.apache.org/docs/latest/submitting-applications.html#bundling-your-applications-dependencies
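A quick sketch (the archive path and script name below are placeholders, not from this thread):

```shell
# Package the dependency (e.g. the Spark Deep Learning library) as a
# zip/egg, then pass it to spark-submit with --py-files so it is
# distributed to the executors:
spark-submit \
  --master yarn \
  --py-files /path/to/sparkdl.zip \
  my_job.py

# The same flag works for an interactive pyspark shell:
pyspark --py-files /path/to/sparkdl.zip

# Or add the file to an already-running session from Python:
#   sc.addPyFile("/path/to/sparkdl.zip")
```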
I hope that helps.
On Tue, 28 Jan 2020, 9:46 am Tharindu Mathew, wrote:
Hi,
Newbie to pyspark/spark here.
I'm trying to submit a job to pyspark with a dependency, Spark DL in this
case. While my local environment has this dependency, pyspark does not see
it. How do I correctly start pyspark so that it picks up this dependency?
Using Spark 2.3.0 in a cloudera setup.
--
Regards,