Hi all,
I encountered some weird behavior with timestamps. It seems that when using lit to add one to a column, the timestamp goes from a milliseconds representation to a seconds representation:


scala> spark.range(1).withColumn("a", 
lit(new java.sql.Timestamp(1485503350000L)).cast("long")).show()
+---+----------+
| id|         a|
+---+----------+
|  0|1485503350|
+---+----------+
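
For reference, showing the literal without the cast should help isolate where the seconds value comes from (I'd expect the column to display the original instant, since java.sql.Timestamp takes milliseconds in its constructor):

scala> spark.range(1).withColumn("a", 
lit(new java.sql.Timestamp(1485503350000L))).show(false)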


scala> spark.range(1).withColumn("a", 
lit(1485503350000L).cast(org.apache.spark.sql.types.TimestampType).cast(org.apache.spark.sql.types.LongType)).show()
+---+-------------+
| id|            a|
+---+-------------+
|  0|1485503350000|
+---+-------------+
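
Another thing I plan to try is casting the Timestamp literal to double instead of long, using a value with a non-zero millisecond part (1485503350123L below is just the same instant plus 123 ms, for illustration), to see whether the fractional seconds survive the literal:

scala> spark.range(1).withColumn("a", 
lit(new java.sql.Timestamp(1485503350123L)).cast("double")).show(false)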


Is this a bug or am I missing something here?

Thanks,
        Assaf




