Did you re-create your df when you update the timezone conf?
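I mean something along these lines (just a sketch; the time zone values and the cast to timestamp are illustrative):

scala> spark.conf.set("spark.sql.session.timeZone", "UTC")
scala> val df1 = sc.parallelize(Seq("2019-04-23 09:15:04.0")).toDF("ts").withColumn("ts", $"ts".cast("timestamp"))
scala> df1.show(false)

scala> spark.conf.set("spark.sql.session.timeZone", "Asia/Kolkata")
scala> val df2 = sc.parallelize(Seq("2019-04-23 09:15:04.0")).toDF("ts").withColumn("ts", $"ts".cast("timestamp"))  // rebuilt after the conf change so the cast picks up the new session time zone
scala> df2.show(false)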
On Wed, Apr 24, 2019 at 9:18 PM Shubham Chaurasia wrote:
Writing:
scala> df.write.orc("")
To look into the contents, I used orc-tools-X.Y.Z-uber.jar (https://orc.apache.org/docs/java-tools.html).
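The invocation was something like this (the tool version and the file name are placeholders):

java -jar orc-tools-X.Y.Z-uber.jar meta /path/to/part-00000.orc   # file metadata: schema, stripes, statistics
java -jar orc-tools-X.Y.Z-uber.jar data /path/to/part-00000.orc   # row contents printed as JSON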
On Wed, Apr 24, 2019 at 6:24 PM Wenchen Fan wrote:
How did you read/write the timestamp value from/to ORC file?
On Wed, Apr 24, 2019 at 6:30 PM Shubham Chaurasia wrote:
Hi All,
Consider the following (Spark v2.4.0):
Basically, I change the value of `spark.sql.session.timeZone` and perform an ORC write. Here are 3 samples:
1)
scala> spark.conf.set("spark.sql.session.timeZone", "Asia/Kolkata")
scala> val df = sc.parallelize(Seq("2019-04-23
09:15:04.0")).toDF("ts").w