I think you are running into a bug that will be fixed by this PR: https://github.com/apache/spark/pull/2850
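
Until that fix is in a release, one workaround is to let the field come in as a plain string instead of declaring TimestampType, and convert it yourself afterwards. Here is a rough, untested sketch against the 1.1 Java API (class and app names are just placeholders, and the 'T'-to-space replacement assumes your ISO-style input):

import java.sql.Timestamp;
import java.util.ArrayList;
import java.util.List;

import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.api.java.DataType;
import org.apache.spark.sql.api.java.JavaSQLContext;
import org.apache.spark.sql.api.java.JavaSchemaRDD;
import org.apache.spark.sql.api.java.Row;
import org.apache.spark.sql.api.java.StructField;
import org.apache.spark.sql.api.java.StructType;

public class TimestampWorkaround {
    public static void main(String[] args) {
        JavaSparkContext ctx = new JavaSparkContext("local", "timestamp-workaround");
        JavaSQLContext sqlCtx = new JavaSQLContext(ctx);

        // Declare the JSON field as StringType; declaring TimestampType is
        // what triggers the MatchError in unpatched versions.
        List<StructField> fields = new ArrayList<StructField>();
        fields.add(DataType.createStructField("timestamp", DataType.StringType, false));
        StructType schema = DataType.createStructType(fields);

        JavaSchemaRDD test = sqlCtx.jsonFile("/hdd/spark/test.json", schema);
        sqlCtx.registerRDDAsTable(test, "test");

        // Query the raw string and convert to java.sql.Timestamp on the driver.
        for (Row row : sqlCtx.sql("SELECT * FROM test").collect()) {
            String raw = row.getString(0);                       // "2014-10-10T01:01:01"
            Timestamp ts = Timestamp.valueOf(raw.replace('T', ' '));
            System.out.println(ts);
        }

        ctx.stop();
    }
}

Once the patch is in, declaring the field as TimestampType (or using a Timestamp bean property) should work as you originally wrote it.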
On Mon, Oct 20, 2014 at 4:34 PM, tridib <tridib.sama...@live.com> wrote:
> Hello Experts,
> After repeated attempts I am unable to run a query on a mapped JSON date
> string. I tried two approaches:
>
> *** Approach 1 ***
> Created a Bean class with a timestamp field. When I try to run it I get
> scala.MatchError: class java.sql.Timestamp (of class java.lang.Class).
> Here is the code:
>
> import java.sql.Timestamp;
>
> public class ComplexClaim {
>     private Timestamp timestamp;
>
>     public Timestamp getTimestamp() {
>         return timestamp;
>     }
>
>     public void setTimestamp(Timestamp timestamp) {
>         this.timestamp = timestamp;
>     }
> }
>
> JavaSparkContext ctx = getCtx(sc);
> JavaSQLContext sqlCtx = getSqlCtx(getCtx(sc));
> String path = "/hdd/spark/test.json";
> JavaSchemaRDD test = sqlCtx.applySchema(ctx.textFile(path), ComplexClaim.class);
> sqlCtx.registerRDDAsTable(test, "test");
> execSql(sqlCtx, "select * from test", 1);
>
> *** Approach 2 ***
> Created a StructType to map the date field. I got scala.MatchError:
> TimestampType (of class org.apache.spark.sql.catalyst.types.TimestampType$).
> Here is the code:
>
> public StructType createStructType() {
>     List<StructField> fields = new ArrayList<StructField>();
>     fields.add(DataType.createStructField("timestamp",
>         DataType.TimestampType, false));
>     return DataType.createStructType(fields);
> }
>
> public void testJsonStruct(SparkContext sc) {
>     JavaSQLContext sqlCtx = getSqlCtx(getCtx(sc));
>     String path = "/hdd/spark/test.json";
>     JavaSchemaRDD test = sqlCtx.jsonFile(path, createStructType());
>     sqlCtx.registerRDDAsTable(test, "test");
>     execSql(sqlCtx, "select * from test", 1);
> }
>
> The input file has a single record:
> {"timestamp":"2014-10-10T01:01:01"}
>
> Thanks
> Tridib
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/spark-sql-timestamp-in-json-fails-tp16864.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org