Adaptive Query Execution (AQE) adjusts the post shuffle partition number on each
exchange, which can help your use case. This feature is available from Spark 3.
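For example, a minimal sketch of the relevant settings (assuming Spark 3.x in Scala;
the app name and the threshold values below are only illustrative, not recommendations):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("aqe-coalesce-example")
  // Let AQE re-optimize the plan at runtime using shuffle statistics
  .config("spark.sql.adaptive.enabled", "true")
  // Coalesce small post-shuffle partitions on each exchange
  .config("spark.sql.adaptive.coalescePartitions.enabled", "true")
  // Target size of a coalesced partition (illustrative value)
  .config("spark.sql.adaptive.advisoryPartitionSizeInBytes", "64MB")
  // Keep this high for the big-table joins; AQE shrinks it per exchange
  .config("spark.sql.shuffle.partitions", "2000")
  .getOrCreate()

The idea is that the large joins still start from the high partition count, while
exchanges over small inputs get coalesced down automatically at runtime.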
From: Anupam Singh
Sent: Saturday, September 10, 2022 10:23 PM
To: Vibhor Gupta
Cc: user@spark.apache.org
Subject: [EXTERNAL] Re: Dynamic shuffle partitions in a single job
Commenting for better reach :)
On Thu, Sep 8, 2022, 11:56 AM Vibhor Gupta wrote:
Hi Community,
Is it possible to set the number of shuffle partitions per exchange?
My Spark query contains a lot of joins/aggregations involving big tables and
small tables, so keeping a high value of spark.sql.shuffle.partitions helps
with big tables, but for small tables it creates a lot of overhead on