Date: Wednesday, April 3, 2019 at 13:14
To: Vinoo Ganesh
Cc: Sean Owen, Arun Mahadevan, "dev@spark.apache.org"
Subject: Re: Closing a SparkSession stops the SparkContext
For #1, do we agree on the behavior? I think that closing a SparkSession should
not close the SparkContext unless it is the last session. … you would
separately stop the SparkContext then.
From: Vinoo Ganesh
Date: Tuesday, April 2, 2019 at 13:24
To: Arun Mahadevan, Ryan Blue
Cc: Sean Owen, "dev@spark.apache.org"
Subject: Re: Closing a SparkSession stops the SparkContext
// Merging threads
Thanks everyone for your thoughts. I’m very much in sync with Ryan here.
@Sean – To the point that Ryan made, it feels wrong that stopping a session
force stops the global context. Building in the logic to only stop the
context when the last session is stopped also feels like a solution, but the
best way I …
From: Arun Mahadevan
Date: …:31
To: Ryan Blue
Cc: Vinoo Ganesh, Sean Owen, "dev@spark.apache.org"
Subject: Re: Closing a SparkSession stops the SparkContext
I am not sure how it would cause a leak, though. When a Spark session or the
underlying context is stopped, it should clean up everything. getOrCreate is
supposed to return the active thread-local or the global session. Maybe if you
keep creating new sessions after explicitly clearing the default …
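The resolution order Arun describes (thread-local active session first, then the global default, then a newly built session) can be sketched with plain-Scala stand-ins. SessionRegistry and its string "sessions" below are hypothetical illustrations, not Spark's actual classes:

```scala
// Stand-in for the getOrCreate resolution order described above:
// 1) the thread-local "active" session, 2) the global default, 3) a fresh one.
object SessionRegistry {
  private val active = new ThreadLocal[Option[String]] {
    override def initialValue: Option[String] = None
  }
  @volatile private var default: Option[String] = None
  private var counter = 0

  def setActiveSession(s: String): Unit = active.set(Some(s))
  def clearActiveSession(): Unit = active.remove()
  def setDefaultSession(s: String): Unit = default = Some(s)
  def clearDefaultSession(): Unit = default = None

  def getOrCreate(): String = synchronized {
    active.get.orElse(default).getOrElse {
      counter += 1
      val fresh = s"session-$counter"
      default = Some(fresh) // a freshly built session becomes the new default
      fresh
    }
  }
}
```

Under this model, the scenario Arun hints at falls out directly: after clearDefaultSession(), every getOrCreate() on a thread with no active session builds a brand-new session.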
I think Vinoo is right about the intended behavior. If we support multiple
sessions in one context, then stopping any one session shouldn't stop the
shared context. The last session to be stopped should stop the context, but
not any before that. We don't typically run multiple sessions in the same …
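The semantics Ryan proposes (only the last session stopped shuts down the shared context) amount to reference counting. A minimal sketch with illustrative stand-in classes, not Spark's implementation:

```scala
import java.util.concurrent.atomic.AtomicInteger

// Hypothetical sketch: the shared context shuts down only when the
// last session holding it is stopped.
class SharedContext {
  private val refs = new AtomicInteger(0)
  @volatile var stopped = false

  private def release(): Unit =
    if (refs.decrementAndGet() == 0) stopped = true // last session out

  def newSession(): Session = { refs.incrementAndGet(); new Session }

  class Session {
    private var closed = false
    // stop() is idempotent so a double-stop cannot drive the count negative
    def stop(): Unit = synchronized { if (!closed) { closed = true; release() } }
  }
}
```

Stopping one of two sessions leaves the context running; stopping the second shuts it down.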
Yeah there's one global default session, but it's possible to create
others and set them as the thread's active session, to allow for
different configurations in the SparkSession within one app. I think
you're asking why closing one of them would effectively shut all of
them down by stopping the SparkContext.
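Sean's point about several differently configured sessions over one context can be sketched as sessions that copy a base config while sharing the context object. The class and key names here are illustrative stand-ins, not Spark's API:

```scala
// Stand-in for "several sessions, one context, different configs".
final class AppContext(val appName: String)

final class ConfSession(val ctx: AppContext, conf: Map[String, String]) {
  def get(key: String): Option[String] = conf.get(key)

  // A derived session shares the context but takes an independent
  // copy of the configuration, so overrides stay local to it.
  def newSession(overrides: (String, String)*): ConfSession =
    new ConfSession(ctx, conf ++ overrides)
}
```

Two sessions built this way share the same context instance while each sees its own config values, which is why stopping one of them shutting down the shared context is surprising.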
Hey Sean - Cool, maybe I'm misunderstanding the intent of clearing a session
vs. stopping it.
The cause of the leak looks to be because of this line here
https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/util/QueryExecutionListener.scala#L131.
The Execution…
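The general leak pattern Vinoo is pointing at, independent of the exact Spark internals, is a listener registered on the long-lived context's bus that captures the session. A hypothetical illustration (these classes are not the actual Spark code at the linked line):

```scala
import scala.collection.mutable.ListBuffer

// Hypothetical illustration of the leak pattern under discussion:
// a listener registered on a context-scoped bus captures the session,
// so the bus pins the session in memory even after callers drop it.
final class ListenerBus {
  val listeners = ListBuffer.empty[AnyRef]
  def register(l: AnyRef): Unit = listeners += l
}

final class LongLivedContext { val bus = new ListenerBus }

final class LeakySession(ctx: LongLivedContext) {
  // the closure captures `this`, keeping the session reachable via the bus
  ctx.bus.register(() => this)
}
```

Each session created this way leaves a listener behind on the context's bus, so sessions accumulate for the lifetime of the context unless something deregisters them.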
What are you expecting there... that sounds correct? Something else
needs to be closed?
On Tue, Apr 2, 2019 at 9:45 AM Vinoo Ganesh wrote:
>
> Hi All -
>
> I’ve been digging into the code and looking into what appears to be a
> memory leak (https://jira.apache.org/jira/browse/SPARK-27337) and …