The complete code is as follows:
JavaHiveContext ctx = new JavaHiveContext(sc);   // sc is the existing JavaSparkContext
JavaSchemaRDD schemas = ctx.jsonRDD(arg0);       // arg0 is a JavaRDD<String> of JSON records
schemas.insertInto("test", true);                // true = overwrite the Hive table "test"
JavaSchemaRDD teenagers = ctx.hql("SELECT a, b FROM test");
// map body reconstructed: formats each row as "a,b" (e.g. "111,222" as described below)
List<String> teenagerNames = teenagers.map(new Function<Row, String>() {
    public String call(Row row) {
        return row.get(0) + "," + row.get(1);
    }
}).collect();
Sorry, that was a mistake on my part.
The above code generates a result exactly like the one seen from Hive.
Now my question is: can a Hive table be used as the target of the insertInto function?
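To make the question concrete, here is a minimal sketch of what I expect to work, assuming the target Hive table already exists with a matching schema (the CREATE TABLE statement and the INT column types are only my illustration, not taken from the code above):

// Sketch only: insertInto writes into an existing Hive table, so the table
// is created up front through HiveQL; table and column names are placeholders.
ctx.hql("CREATE TABLE IF NOT EXISTS test (a INT, b INT)");
JavaSchemaRDD json = ctx.jsonRDD(arg0);   // schema inferred from the JSON records
json.insertInto("test", true);            // true = overwrite the table's current contents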
Why do I keep getting 111,NULL instead of 111,222?
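In case it helps to diagnose this, here is a sketch of how I would check whether the schema that jsonRDD infers lines up with the Hive table; this is only my guess at the cause, and registerTempTable is called registerAsTable on older Spark releases:

// Diagnostic sketch: if column b comes back NULL, the inferred JSON schema may
// not match the Hive table's column names/order. Print the inferred schema and
// try an explicit column-by-column insert to see whether that is the cause.
JavaSchemaRDD schemas = ctx.jsonRDD(arg0);
schemas.printSchema();                      // column names/types inferred from the JSON
schemas.registerTempTable("json_input");    // temp table name is arbitrary
ctx.hql("INSERT OVERWRITE TABLE test SELECT a, b FROM json_input");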
It seems that the second problem was a dependency issue; once that was resolved, it works exactly like the first one.
*This is the complete code:*
JavaSchemaRDD schemas = ctx.jsonRDD(arg0);
schemas.insertInto("test", true);
JavaSchemaRDD teenagers = ctx.hql("SELECT a, b FROM test");
// map body reconstructed as in the first snippet: formats each row as "a,b"
List<String> teenagerNames = teenagers.map(new Function<Row, String>() {
    public String call(Row row) { return row.get(0) + "," + row.get(1); }
}).collect();
I am eager to know about this issue too. Does anyone know how to solve it?