+1. Thanks, Yi ~

Bests,
Kent Yao 
@ Data Science Center, Hangzhou Research Institute, NetEase Corp.
a spark enthusiast
kyuubi is a unified multi-tenant JDBC interface for large-scale data processing and analytics, built on top of Apache Spark.

spark-authorizer A Spark SQL extension which provides SQL Standard Authorization for Apache Spark.
spark-postgres A library for reading data from and transferring data to Postgres / Greenplum with Spark SQL and DataFrames, 10~100x faster.
itatchi A library that brings useful functions from various modern database management systems to Apache Spark.




On 06/9/2021 11:54, Takeshi Yamamuro <linguin....@gmail.com> wrote:
+1. Thank you, Yi ~

Bests,
Takeshi

On Wed, Jun 9, 2021 at 12:18 PM Mridul Muralidharan <mri...@gmail.com> wrote:

+1

Regards,
Mridul 

On Tue, Jun 8, 2021 at 10:11 PM Hyukjin Kwon <gurwls...@gmail.com> wrote:
Yeah, +1

On Wed, Jun 9, 2021 at 12:06 PM, Yi Wu <yi...@databricks.com> wrote:
Hi, All.

Since the Apache Spark 3.0.2 tag was created (Feb 16),
119 new patches (resolving 92 issues) have arrived on branch-3.0.

Shall we make a new release, Apache Spark 3.0.3, as the third release on the 3.0 line?
I'd like to volunteer as the release manager for Apache Spark 3.0.3.
I'm thinking about starting the first RC at the end of this week.

$ git log --oneline v3.0.2..HEAD | wc -l
     119
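
(As a rough cross-check, assuming each commit subject carries its SPARK-XXXXX id, which is the usual convention but not guaranteed, the resolved-issue count can be approximated by de-duplicating those ids:)

$ git log --oneline v3.0.2..HEAD | grep -oE 'SPARK-[0-9]+' | sort -u | wc -l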

# Known correctness issues
SPARK-34534     New protocol FetchShuffleBlocks in OneForOneBlockFetcher leads to data loss or correctness issues
SPARK-34545     PySpark Python UDF return inconsistent results when applying 2 UDFs with different return type to 2 columns together
SPARK-34719     fail if the view query has duplicated column names
SPARK-34794     Nested higher-order functions broken in DSL

# Notable user-facing changes
SPARK-32924     Web UI sort on duration is wrong
SPARK-35405     Submitting Applications documentation has outdated information about K8s client mode support

Thanks,
Yi


--
---
Takeshi Yamamuro