Can you try a query like “SELECT timestamp, CAST(timestamp AS string) FROM logs 
LIMIT 5”? I guess you probably ran into a timestamp precision or timezone 
shifting problem.

(And it’s not mandatory, but you’d better rename the field from 
“timestamp” to something else, since “timestamp” is a data type keyword in 
Hive/Spark SQL.)
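
For reference, a sketch of the diagnostic query, plus a possible workaround that casts the string literal to a timestamp instead of casting the column (the table name `logs` and column name follow the thread; whether the literal cast fixes the comparison depends on the actual precision/timezone issue):

```sql
-- Inspect how stored values round-trip through a string cast,
-- to spot fractional-second truncation or timezone shifts:
SELECT timestamp, CAST(timestamp AS string) FROM logs LIMIT 5;

-- Alternative workaround: cast the literal to a timestamp rather
-- than casting the column to a string:
SELECT * FROM logs l
WHERE l.timestamp = CAST('2012-10-08 16:10:36.0' AS timestamp);
```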

From: Alessandro Panebianco [mailto:ale.panebia...@me.com]
Sent: Monday, November 24, 2014 11:12 AM
To: Wang, Daoyuan
Cc: u...@spark.incubator.apache.org
Subject: Re: SparkSQL Timestamp query failure

Hey Daoyuan,

following your suggestion I obtain the same result as when I do:

where l.timestamp = '2012-10-08 16:10:36.0'

What happens, using either your suggestion or simply single quotes as in the 
example above, is that the query does not fail, but it doesn’t return anything 
either, even though it should.

If I do a simple:

sqlContext.sql("SELECT timestamp FROM Logs LIMIT 5").collect.foreach(println)

I get:

[2012-10-08 16:10:36.0]
[2012-10-08 16:10:36.0]
[2012-10-08 16:10:36.0]
[2012-10-08 16:10:41.0]
[2012-10-08 16:10:41.0]

which is why I am sure that querying for one of those timestamps should not 
return an empty array.

I’d really love to find a solution to this problem. Since Spark supports 
Timestamp, it should, in my opinion, support simple comparisons on them.

Any other help would be greatly appreciated.

Alessandro




On Nov 23, 2014, at 8:10 PM, Wang, Daoyuan 
<daoyuan.w...@intel.com> wrote:

Hi,

I think you can try
cast(l.timestamp as string)='2012-10-08 16:10:36.0'
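
In context, a sketch of what the full query with this predicate might look like (the table alias and column names are taken from the thread; the rest of the SELECT is an assumption):

```sql
-- Compare by casting the timestamp column to a string, so the
-- predicate is a plain string equality:
SELECT * FROM Logs l
WHERE CAST(l.timestamp AS string) = '2012-10-08 16:10:36.0';
```

Note that the string on the right must match the cast output exactly, including the fractional seconds, for any row to match.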

Thanks,
Daoyuan

-----Original Message-----
From: whitebread [mailto:ale.panebia...@me.com]
Sent: Sunday, November 23, 2014 12:11 AM
To: u...@spark.incubator.apache.org
Subject: Re: SparkSQL Timestamp query failure

Thanks for your answer Akhil,

I have already tried that, and the query doesn't fail, but it doesn't 
return anything either, even though it should.
With single quotes I think it reads the value as a string rather than as a timestamp.

I don't know how to solve this. Any other hint by any chance?

Thanks,

Alessandro



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/SparkSQL-Timestamp-query-failure-tp19502p19554.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
