Re: Unable to get to_timestamp with Timezone Information

2020-04-28 Thread Chetan Khatri
Thanks Enrico, Magnus. On Thu, Apr 2, 2020 at 11:49 AM Enrico Minack wrote: > Once parsed into a Timestamp, the timestamp is stored internally as UTC and > printed in your local timezone (e.g. as defined by > spark.sql.session.timeZone). Spark is good at hiding timezone information > from you.

Re: Unable to get to_timestamp with Timezone Information

2020-04-02 Thread Enrico Minack
Once parsed into a Timestamp, the timestamp is stored internally as UTC and printed in your local timezone (e.g. as defined by spark.sql.session.timeZone). Spark is good at hiding timezone information from you. You can get the timezone information via date_format(column, format): import org.apa
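
A minimal sketch of the suggestion above, assuming the standard functions import and illustrative column names; note that date_format renders the stored (UTC) instant in the session timezone, so the offset it prints is that of spark.sql.session.timeZone, not the offset from the original string:

scala> import org.apache.spark.sql.functions.{col, to_timestamp, date_format}
scala> val df = Seq("2020-04-11T20:40:00-05:00").toDF("value")
scala> df.select(date_format(to_timestamp(col("value")), "yyyy-MM-dd'T'HH:mm:ssXXX").as("rendered")).show(false)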

Re: Unable to get to_timestamp with Timezone Information

2020-03-31 Thread Chetan Khatri
Sorry, I misrepresented the question. Thanks for your great help. What I want is to keep the time zone information as it is, 2020-04-11T20:40:00-05:00, in a timestamp datatype, so I can write it to the downstream application unchanged. I can correct the lacking UTC offset info. On Tue, Mar 31, 2020 at 1:15 PM
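
Since Spark's TimestampType stores only the instant (normalized to UTC) and drops the source offset, one possible way to hand the original -05:00 downstream is to carry the raw string alongside the parsed value; a sketch with illustrative column names, not something suggested elsewhere in the thread:

scala> import org.apache.spark.sql.functions.{col, to_timestamp}
scala> val df = Seq("2020-04-11T20:40:00-05:00").toDF("value")
scala> val out = df.select(col("value").as("original_with_offset"), to_timestamp(col("value")).as("ts"))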

Re: Unable to get to_timestamp with Timezone Information

2020-03-31 Thread Magnus Nilsson
And to answer your question (sorry, read too fast). The string is not proper ISO 8601. The extended form must be used throughout, i.e. 2020-04-11T20:40:00-05:00; there's a colon (:) lacking in the UTC offset info. br, Magnus On Tue, Mar 31, 2020 at 7:11 PM Magnus Nilsson wrote: > Timestamps aren't
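
As an alternative to fixing the string itself, the non-extended offset can usually be parsed with an explicit format, assuming the Java datetime patterns Spark delegates to, where 'Z' matches an offset written without a colon such as -0500; a sketch:

scala> import org.apache.spark.sql.functions.{col, to_timestamp}
scala> val df = Seq("2020-04-11T20:40:00-0500").toDF("value")
scala> df.select(to_timestamp(col("value"), "yyyy-MM-dd'T'HH:mm:ssZ").as("ts")).show(false)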

Re: Unable to get to_timestamp with Timezone Information

2020-03-31 Thread Magnus Nilsson
Timestamps aren't timezoned. If you parse ISO 8601 strings they will be converted to UTC automatically. If you parse timestamps without a timezone they will be converted using the timezone of the server Spark is running on. You can change the timezone Spark uses with spark.conf.set("spark.sql.session
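
The truncated config call presumably completes along these lines; the zone ID here is only an example, and the setting changes how timestamps are parsed and displayed, not the UTC instant that is stored:

scala> spark.conf.set("spark.sql.session.timeZone", "America/Chicago")
scala> spark.conf.get("spark.sql.session.timeZone")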

Unable to get to_timestamp with Timezone Information

2020-03-31 Thread Chetan Khatri
Hi Spark Users, I am losing the timezone value from the format below; I tried a couple of formats but was not able to make it work. Can someone shed some light? scala> val sampleDF = Seq("2020-04-11T20:40:00-0500").toDF("value") sampleDF: org.apache.spark.sql.DataFrame = [value: string] scala> sampleDF.select('va
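
The preview cuts off mid-select; a sketch of what the attempt presumably looked like, with an assumed format pattern. Even when the string parses, show() renders the value in the session timezone, so the original -0500 offset appears to be lost (TimestampType does not keep it):

scala> import org.apache.spark.sql.functions.to_timestamp
scala> sampleDF.select(to_timestamp('value, "yyyy-MM-dd'T'HH:mm:ssZ").as("ts")).show(false)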