Hi dev,

After looking into the details of this and discussing with the authors of the previous PRs for SPARK-20384, I have created the following PR: https://github.com/apache/spark/pull/33205

The core of the change (excluding new test cases) is quite small: only `24 insertions(+), 9 deletions(-)` in `sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/ScalaReflection.scala`. The change is backward compatible and will not break any currently working code.

Our main motivation for this change is that, without working support for value classes, we cannot incrementally adopt these Spark features without reworking much of our existing domain modeling.
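To make the limitation concrete, here is a minimal sketch of the kind of modeling this affects (the `UserId`/`User` names are hypothetical, not from the PR): a value class used as a field in a case class that one would like to use as a Dataset schema.

```scala
// A Scala value class: wraps a single field, erased to the underlying
// primitive at runtime, so there is normally no allocation overhead.
final case class UserId(id: Long) extends AnyVal

// A typical domain model that uses the value class as a field.
final case class User(userId: UserId, name: String)

object ValueClassExample extends App {
  // Plain Scala usage works fine:
  val u = User(UserId(1L), "alice")
  println(u.userId.id)

  // With Spark, deriving an encoder for User is what SPARK-20384 is
  // about; without the fix in ScalaReflection, something like
  //   import spark.implicits._
  //   spark.createDataset(Seq(u))
  // fails because the value-class field is not handled.
}
```

The point is that teams already modeling their domain with value classes cannot use such types in Datasets at all until the encoder derivation handles them.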

The fact that this change has been brought up multiple times before also shows that there is general interest from the community in fixing this.

I encourage anyone interested in this feature to leave a review/comments on that PR.

It would also be great if an admin could review it, and any tips on the steps I should take to get the PR reviewed would be appreciated.

/ Emil


On 25/05/2021 16:33, Emil Ejbyfeldt wrote:
Hi dev,

I am interested in getting support for value classes in Dataset schemas merged, and I am willing to work on it.

There are two previous PRs for this JIRA (SPARK-20384): first https://github.com/apache/spark/pull/22309 and more recently https://github.com/apache/spark/pull/27153 (marked stale about a year ago). It does not seem to me that the PRs were met with any resistance; the activity just died out, and therefore the changes were never merged.

Before spending more time on this, I would like to know whether there are any known problems with supporting this that caused the previous PRs not to be merged.

I think the changes proposed in the later PR are still valid and a good approach for adding this support. Should I ask to have that PR reopened, or should I create a new one, since I am not the original author?

/ Emil



---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
