Hi -

When I run the following Spark SQL query in spark-shell (version 1.1.0):

val rdd = sqlContext.sql("SELECT a FROM x WHERE ts >= '2012-01-01T00:00:00' AND 
ts <= '2012-03-31T23:59:59' ")

it gives the following error:
rdd: org.apache.spark.sql.SchemaRDD =
SchemaRDD[294] at RDD at SchemaRDD.scala:103
== Query Plan ==
== Physical Plan ==
java.util.NoSuchElementException: head of empty list

The ts column in the WHERE clause is of type timestamp. If I replace the 
string '2012-01-01T00:00:00' in the WHERE clause with its epoch value, 
then the query works fine.
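For reference, here is a minimal sketch of the epoch workaround (my assumptions, not tested beyond my setup: java.sql.Timestamp.valueOf expects "yyyy-MM-dd HH:mm:ss", i.e. a space where the ISO string has 'T', and the resulting epoch values depend on the JVM's default time zone):

```scala
import java.sql.Timestamp

// Convert the boundary strings to epoch seconds. Note the space
// separator instead of 'T' -- Timestamp.valueOf requires it.
val start = Timestamp.valueOf("2012-01-01 00:00:00").getTime / 1000
val end   = Timestamp.valueOf("2012-03-31 23:59:59").getTime / 1000

// Substituting the numeric values avoids the string-vs-timestamp
// comparison that triggers the NoSuchElementException:
val query = s"SELECT a FROM x WHERE ts >= $start AND ts <= $end"
// query can then be passed to sqlContext.sql(query) in spark-shell
```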

It looks like I have run into the issue described in this pull request: 
https://github.com/apache/spark/pull/2084

Is that PR not merged in Spark version 1.1.0? Or am I missing something?

Thanks,
Mohammed
