Hi Jack,

My use case is a bit different: I created a subprocess instead of a thread, and
I can't pass the args to the subprocess.
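
For reference, if the child is created with the multiprocessing module,
args= works the same way as with Thread, but the arguments are pickled
before being sent to the child, and a live SparkSession can't be pickled,
so each process has to build its own session. A minimal sketch, with
my_func and the app name as placeholders rather than your actual code:

import multiprocessing

def my_func(app_name):
    # A live SparkSession can't cross the process boundary, so the
    # child builds its own session instead of receiving one via args.
    import pyspark
    spark = pyspark.sql.SparkSession.builder.appName(app_name).getOrCreate()
    print(spark.range(5).count())

if __name__ == "__main__":
    # args are pickled and sent to the child, so they must be picklable
    # values (strings, numbers, etc.), not the session object itself
    p = multiprocessing.Process(target=my_func, args=("spark",))
    p.start()
    p.join()

If the child is launched with the subprocess module instead, there is no
args= hook at all; values can only be passed as command-line strings (or
over stdin) and parsed in the child script.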

Jack Goodson <jackagood...@gmail.com> wrote on Mon, Dec 12, 2022 at 8:03 PM:

> Apologies, the code should read as below:
>
> import pyspark
> from threading import Thread
>
> # one SparkSession, created once and shared by both threads
> context = pyspark.sql.SparkSession.builder.appName("spark").getOrCreate()
>
> # my_func is the worker function from earlier in the thread; each
> # Thread receives the same session object through args
> t1 = Thread(target=my_func, args=(context,))
> t1.start()
>
> t2 = Thread(target=my_func, args=(context,))
> t2.start()
>
> On Tue, Dec 13, 2022 at 4:10 PM Jack Goodson <jackagood...@gmail.com>
> wrote:
>
>> Hi Kevin,
>>
>> I had a similar use case (see the code below), but with something that
>> wasn't Spark related. I think the below should work for you; you may need
>> to edit the context variable to suit your needs, but hopefully it gives
>> the general idea of sharing a single object between multiple threads.
>>
>> Thanks
>>
>>
>> from threading import Thread
>>
>> context = pyspark.sql.SparkSession.builder.appName("spark").getOrCreate()
>>
>> # note: Thread.start() takes no arguments; this is the mistake
>> # corrected in the follow-up above
>> t1 = Thread(target=order_creator, args=(app_id, sleep_time,))
>> t1.start(target=my_func, args=(context,))
>>
>> t2 = Thread(target=order_creator, args=(app_id, sleep_time,))
>> t2.start(target=my_func, args=(context,))
>>
>
