Kevin Su wrote:

Hi Jack,

My use case is a bit different: I created a subprocess instead of a thread, and I can't pass the args to the subprocess.
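
A minimal sketch of the limitation Kevin is hitting here (worker.py and the names are illustrative, not from the thread): a live SparkSession cannot cross a process boundary, because a subprocess only receives strings.

import subprocess
import pyspark.sql

context = pyspark.sql.SparkSession.builder.appName("spark").getOrCreate()

# subprocess passes only strings (argv) and environment variables to the
# child, so str(context) delivers a textual repr, not the live object.
subprocess.Popen(["python", "worker.py", str(context)])

# Serializing doesn't help either: a SparkSession wraps sockets and a JVM
# handle, so pickling it raises an error instead of producing a copy.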
Jack Goodson wrote on Mon, Dec 12, 2022 at 8:03 PM:
Apologies, the code should read as below:

import pyspark.sql
from threading import Thread

# my_func is assumed to be defined elsewhere; both threads receive the
# same SparkSession object.
context = pyspark.sql.SparkSession.builder.appName("spark").getOrCreate()

t1 = Thread(target=my_func, args=(context,))
t1.start()
t2 = Thread(target=my_func, args=(context,))
t2.start()
Jack Goodson wrote on Tue, Dec 13, 2022 at 4:10 PM:
Hi Kevin,

I had a similar use case (see the code below) but with something that wasn't Spark related. I think the below should work for you; you may need to edit the context variable to suit your needs, but hopefully it gives the general idea of sharing a single object between multiple threads.

Thanks
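
A minimal sketch of what my_func could look like in this pattern (the body is illustrative; the thread never defines it):

def my_func(context):
    # Threads within one Python process share the driver's memory, so
    # each thread can use the same SparkSession object directly.
    df = context.range(5)
    print(df.count())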
Kevin Su wrote on Mon, Dec 12, 2022 at 2:42 PM:

Thanks for the quick response! Do we have any PR or Jira ticket for it?
Reynold Xin wrote on Mon, Dec 12, 2022 at 2:39 PM:
Spark Connect :)
(It’s work in progress)
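
A minimal sketch of how Spark Connect addresses this, assuming the client API as it later shipped in Spark 3.4 (sc://localhost:15002 is the default server address; the Spark Connect server is started separately, e.g. with sbin/start-connect-server.sh):

from pyspark.sql import SparkSession

# Process A and subprocess B can each run this and talk to the same
# Spark Connect server over gRPC, instead of owning an in-process context.
spark = SparkSession.builder.remote("sc://localhost:15002").getOrCreate()
spark.range(10).count()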
Kevin Su <pings...@gmail.com> wrote on Mon, Dec 12, 2022 at 2:29 PM:
Hey there, how can I get the same Spark context in two different Python processes? Let's say I create a context in Process A, and then I want to use Python subprocess B to get the Spark context created by Process A. How can I achieve that?

I've tried pyspark.sql.SparkSession.builder.appName("spark").getOrCreate()...
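
For reference, a sketch of why that call doesn't attach across processes (the comments describe standard PySpark behavior, not text from the thread):

import pyspark.sql

# In Process A, this creates a session backed by a driver JVM owned by A.
spark_a = pyspark.sql.SparkSession.builder.appName("spark").getOrCreate()

# Executed in the same process, this returns the very same session object;
# executed in a separate subprocess, it cannot see A's session and instead
# starts a second driver JVM with its own SparkContext.
spark_b = pyspark.sql.SparkSession.builder.appName("spark").getOrCreate()
assert spark_b is spark_a  # true only within a single process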