> ))
> df: org.apache.spark.sql.DataFrame = [_1: int, _2: timestamp]
>
> scala> df.filter($"_2" <= "2014-06-01").show
> +--+--+
> |_1|_2|
> +--+--+
> +--+--+
>
> Not sure if that is intended, but I cannot find any doc mentioning these
> inconsistencies.
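(Not from the original thread; a minimal workaround sketch. It assumes the two-row `df` built in the original message below, a spark-shell with `sqlContext.implicits._` in scope for `$`, and an illustrative `cutoff` name.)

import java.sql.Timestamp
import org.apache.spark.sql.functions.lit

// Cast the string to a timestamp up front, so the comparison is
// timestamp-to-timestamp rather than relying on implicit coercion.
val cutoff = lit("2014-06-01 00:00:00").cast("timestamp")
df.filter($"_2" <= cutoff).show()

// Or compare against a java.sql.Timestamp value directly.
df.filter($"_2" <= lit(Timestamp.valueOf("2014-06-01 00:00:00"))).show()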
Hello,
I am trying out 1.4.0 and noticed some differences in behavior
with Timestamp between 1.3.1 and 1.4.0.
In 1.3.1, I can compare a Timestamp with a string.
scala> val df = sqlContext.createDataFrame(Seq((1,
Timestamp.valueOf("2015-01-01 00:00:00")), (2,
Timestamp.valueOf("2014-01-01 0