Thanks for doing this!
I'd love to review these. I'm not an expert in Spark, but I have some
historical context and understanding of the SparkRunner codebase. Feel free
to ping me any time if I take too long to get to one. Sometimes I lose
track of things :-)
Kenn
On Wed, Dec 4, 2024 at 9:50 AM L
This is your daily summary of Beam's current high priority issues that may need
attention.
See https://beam.apache.org/contribute/issue-priorities for the meaning and
expectations around issue priorities.
Unassigned P1 Issues:
https://github.com/apache/beam/issues/33254 The PreCommit Java
Is there any kind of multiprocess parallelism built into the Python sdk
worker? In other words, is there a way for my runner to start a worker and
have it use multiple cores instead of having one worker per core? I thought
I saw this capability somewhere but now that I look I can only see the
pipel
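For reference, a minimal sketch of the pipeline options that are commonly pointed to for multi-core Python execution. This is an assumption about what the question is after, not a statement of what the SDK worker itself does: sdk_worker_parallelism applies to portable runners (it requests multiple SDK worker processes per worker node), while the DirectRunner's multi_processing mode runs bundles in separate processes.

    # Sketch only: two real Beam pipeline options that give process-level
    # parallelism; whether either applies depends on the runner in use.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Portable runners: ask for several SDK worker processes per worker node.
    portable_opts = PipelineOptions([
        '--sdk_worker_parallelism=4',  # 0 (the default) lets the runner decide
    ])

    # DirectRunner: run bundles in separate processes instead of threads.
    direct_opts = PipelineOptions([
        '--direct_running_mode=multi_processing',
        '--direct_num_workers=4',
    ])

    with beam.Pipeline(options=direct_opts) as p:
        (p
         | beam.Create(range(8))
         | beam.Map(lambda x: x * x)
         | beam.Map(print))

Whether a custom runner can get one harness process to saturate multiple cores is a separate question; within a single Python worker process the harness uses threads, so CPU-bound work is still limited by the GIL.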