Stage Level Scheduling - https://issues.apache.org/jira/browse/SPARK-27495

Tom

On Monday, June 29, 2020, 11:07:18 AM CDT, Dongjoon Hyun <dongjoon.h...@gmail.com> wrote:
 Hi, All.
After a short celebration of Apache Spark 3.0, I'd like to ask for the
community's opinion on Apache Spark 3.1 feature expectations.
First of all, Apache Spark 3.1 is scheduled for December 2020.
- https://spark.apache.org/versioning-policy.html
I'm expecting the following items:
1. Support Scala 2.13
2. Use Apache Hadoop 3.2 by default for better cloud support
3. Declaring Kubernetes Scheduler GA
    In my perspective, the last main missing piece was Dynamic allocation, and
    - Dynamic allocation with shuffle tracking already shipped in 3.0.
    - Dynamic allocation with worker decommission/data migration is targeting 3.1. (Thanks, Holden)
4. DSv2 Stabilization
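For context on item 3, the shuffle-tracking flavor of dynamic allocation that shipped in 3.0 can be enabled with the following configuration (a minimal sketch; the placeholder application name and jar are illustrative, not from the thread):

```
spark-submit \
  --master k8s://https://<k8s-apiserver>:6443 \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.shuffleTracking.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=1 \
  --conf spark.dynamicAllocation.maxExecutors=10 \
  my-app.jar
```

With shuffle tracking on, executors holding shuffle data are kept alive instead of requiring an external shuffle service, which is what made dynamic allocation usable on Kubernetes in 3.0.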
I'm aware of some more features currently on the way, but I'd love to
hear opinions from the main developers and, moreover, from the main users
who need those features.
Thank you in advance. Any comments are welcome.
Bests,
Dongjoon.
