Hi Spark devs,

Thank you to all who have started working on the new data type. I added 10 new sub-tasks to the umbrella JIRA: https://issues.apache.org/jira/browse/SPARK-51162, including these "starter" tasks:
- SPARK-51455: Time literal generator in tests
- SPARK-51403: Test TimeType as ordered and atomic type
- SPARK-51557: Add tests for TIME data type in Java API
- SPARK-51556: Add the try_to_time() function
- SPARK-51562: Add the time() function
- SPARK-51563: Support fully qualified type name TIME(n) WITHOUT TIME ZONE

and these slightly more complex tasks:
- SPARK-51413: Support TIME in Arrow
- SPARK-51415: Support the time type by make_timestamp()
- SPARK-51516: Support the TIME data type by Thrift Server
- SPARK-51553: Modify EXTRACT to support TIME data type
- SPARK-51554: Add the time_trunc() function
- SPARK-51555: Add the timediff() function
- SPARK-51564: TIME literals in the 12hr clock format

(A rough, purely illustrative SQL sketch of some of these appears at the end of this thread.)

If you are working on one of these tasks, or plan to, please leave a comment in the corresponding JIRA.

Yours faithfully,
Max Gekk

On Thu, Mar 6, 2025 at 8:44 PM Sakthi <sak...@apache.org> wrote:
>
> This is great, Max!
>
> I would like to pick up the following two tasks:
> - https://issues.apache.org/jira/browse/SPARK-51420 (Get Minutes of time) [Starter]
> - https://issues.apache.org/jira/browse/SPARK-51423 (Add the current_time() function)
>
> Regards,
> Sakthi
>
> On Thu, Mar 6, 2025 at 8:33 AM Rob Reeves <robert.p.ree...@gmail.com> wrote:
>>
>> Hi Max,
>>
>> I'll work on "Add the make_time() function".
>>
>> Thanks,
>> Rob
>>
>> On Thu, Mar 6, 2025 at 3:16 AM Max Gekk <max.g...@gmail.com> wrote:
>>>
>>> Hi Spark devs,
>>>
>>> I would like to invite you to develop the new data type TIME in Spark SQL.
>>> At the moment, there are more than 10 sub-tasks in the umbrella JIRA:
>>> https://issues.apache.org/jira/browse/SPARK-51162, including some "starter"
>>> tasks. If you have ideas or proposals for where we need to support the new
>>> type in Spark SQL, please open additional sub-tasks in the umbrella JIRA.
>>>
>>> If you start working on a sub-task, please leave a comment like "I am
>>> working on this task". That way, we can avoid conflicts and parallelize
>>> the work.
>>>
>>> Thank you in advance,
>>> Max Gekk

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org