What's your use case for comparing intervals? It's tricky in Spark because there is only one interval type, and you can't really compare one month with 30 days.
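The ambiguity can be sketched in plain Scala. This is a minimal, Spark-free approximation that assumes 30-day months and 24-hour days; the names `Interval` and `approxMicros` are hypothetical, merely mirroring the months/days/microseconds fields of org.apache.spark.unsafe.types.CalendarInterval:

```scala
// Hypothetical sketch, not Spark's API: mirror CalendarInterval's fields
// (months, days, microseconds) without a Spark dependency.
case class Interval(months: Int, days: Int, microseconds: Long)

object IntervalCompare {
  val MicrosPerDay: Long = 24L * 60 * 60 * 1000 * 1000
  val DaysPerMonth: Long = 30L // assumption: this is exactly the ambiguity

  // Collapse an interval to an approximate duration in microseconds.
  def approxMicros(i: Interval): Long =
    i.months * DaysPerMonth * MicrosPerDay + i.days * MicrosPerDay + i.microseconds

  // Under the 30-day-month assumption, "1 month" and "30 days" compare equal,
  // which is wrong for e.g. February — hence the reluctance to expose this.
  def compare(a: Interval, b: Interval): Int =
    java.lang.Long.compare(approxMicros(a), approxMicros(b))
}
```

Any fixed conversion like this silently equates intervals that calendar arithmetic treats differently, which is why a single built-in ordering is problematic.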
On Wed, Feb 12, 2020 at 12:01 AM Enrico Minack <m...@enrico.minack.dev> wrote:

> Hi Devs,
>
> I would like to know what the current roadmap is for making
> CalendarInterval comparable and orderable again (SPARK-29679,
> SPARK-29385, #26337).
>
> With #27262 this got reverted, but SPARK-30551 does not mention how to
> go forward in this matter. I have found SPARK-28494, but it seems to
> be stale.
>
> While I find it useful to compare such intervals, I cannot find a way to
> work around the missing comparability. Is there a way to get, e.g., the
> seconds that an interval represents, in order to compare intervals? In
> org.apache.spark.sql.catalyst.util.IntervalUtils there are methods like
> getEpoch or getDuration, but I cannot see that they are exposed to SQL or
> in the org.apache.spark.sql.functions package.
>
> Thanks for the insights,
> Enrico