OK, so I am wondering: calling this outside of the driver

appName = config['common']['appName']
spark_session = s.spark_session(appName)
from pyspark.sql import SparkSession

def spark_session(appName):
    return SparkSession.builder \
        .appName(appName) \
        .enableHiveSupport() \
        .getOrCreate()
It says
Yep, you can never use Spark inside Spark.
You could run N jobs in parallel from the driver using Spark, however.
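A minimal sketch of that pattern: the driver holds the single SparkSession, and worker threads submit independent jobs through it. The table names and the `run_job` helper are hypothetical, and a plain string computation stands in for a real Spark action so the sketch is self-contained; nothing here creates a second SparkSession on an executor.

```python
from concurrent.futures import ThreadPoolExecutor

def run_job(table_name):
    # In real code this would trigger an action on the shared driver-side
    # session, e.g. spark.read.table(table_name).count(); a plain string
    # computation stands in here so the sketch runs without a cluster.
    return f"processed {table_name}"

# Hypothetical list of independent workloads to run in parallel.
tables = ["sales", "orders", "customers"]

with ThreadPoolExecutor(max_workers=3) as pool:
    # pool.map preserves input order, so results line up with tables.
    results = list(pool.map(run_job, tables))
```

Each thread shares the one driver session; Spark's scheduler interleaves the jobs.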
On Mon, Mar 8, 2021 at 3:14 PM Mich Talebzadeh wrote:
>
> In structured streaming with PySpark, I need to do some work on each row
> via foreach(process_row), as below:
>
> def proces
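The quoted message is cut off, but the constraint from the reply applies directly to it: a foreach(process_row) callback runs on the executors, so it must use only plain Python and external clients, never a SparkSession or the DataFrame API. A hypothetical sketch of such a callback, with an in-memory list standing in for a real sink (database, REST endpoint, etc.):

```python
# Stand-in for a real external sink the callback would write to.
sink = []

def process_row(row):
    # Runs on an executor: no SparkSession, no DataFrame API here.
    # "row" is a pyspark.sql.Row in a real job; convert it to a plain
    # dict before handing it to the sink. The hasattr check lets the
    # sketch also accept a plain dict for local testing.
    record = row.asDict() if hasattr(row, "asDict") else dict(row)
    sink.append(record)
```

In the streaming job this would be wired up with something like `df.writeStream.foreach(process_row).start()` (the DataFrame name `df` is assumed).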