Thanks for the response. I am wondering how you achieve that, because I want to
implement such a mechanism in my project. Is there any source code I could
refer to?
From: Jeff Zhang
Date: Monday, June 27, 2022 at 7:30 PM
To: users
Subject: Re: Inquiry about how spark session is shared in
You can use per-note scoped mode, so that there will be multiple Python
processes that share the same Spark session.
Check this doc for more details
https://zeppelin.apache.org/docs/0.10.1/usage/interpreter/interpreter_binding_mode.html
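
Roughly, the idea is that the JVM driver keeps the one SparkSession, and each
Python process attaches to that JVM through a Py4J gateway instead of starting
its own context. The real implementation lives in Zeppelin's Spark/Python
interpreter modules; the snippet below is only a rough sketch of the idea, and
the port, auth token, and the getJavaSparkContext/getSparkSession accessors are
placeholders I made up for illustration, not Zeppelin's actual entry point.

    # Sketch only: assumes the JVM driver already holds a live SparkSession and
    # runs a Py4J GatewayServer whose entry point exposes it to Python clients.
    from py4j.java_gateway import JavaGateway, GatewayParameters
    from pyspark import SparkConf, SparkContext
    from pyspark.sql import SparkSession

    gateway = JavaGateway(
        gateway_parameters=GatewayParameters(
            port=25333,              # placeholder: port published by the JVM side
            auth_token="change-me",  # placeholder: shared secret
            auto_convert=True,
        )
    )

    entry = gateway.entry_point            # JVM-side entry point object
    jsc = entry.getJavaSparkContext()      # hypothetical accessor: shared JavaSparkContext
    jspark = entry.getSparkSession()       # hypothetical accessor: shared SparkSession

    # Wrap the existing JVM context instead of creating a new one.
    conf = SparkConf(_jconf=jsc.getConf())
    sc = SparkContext(conf=conf, gateway=gateway, jsc=jsc)
    spark = SparkSession(sc, jspark)       # Python-side view of the same session

    spark.sql("SELECT 1").show()           # runs against the shared session

Every Python process that attaches this way sees the same tables, temp views,
and cached data, which is what per-note scoped mode gives you.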
On Tue, Jun 28, 2022 at 1:08 AM Chenyang Zhang wrote:
Hi,
This is Chenyang. I am working on a project using PySpark, and I am blocked
because I want to share data between different Spark applications. The
situation is that we have a running Java server which handles incoming
requests with a thread pool, and each thread has a corresponding Python