JavaSparkContext ctx = new JavaSparkContext(sparkConf);
SQLContext sqlContext = new SQLContext(ctx);
DataFrame parquetFile = sqlContext.parquetFile(
    "hdfs:/XYZ:8020/user/hdfs/parquet/*.parquet");
parquetFile.registerTempTable("parquetFile");

DataFrame tempDF = sqlContext.sql("SELECT TOUR.CITIES, BUDJET from parquetFile");

JavaRDD<Row> jRDD = tempDF.toJavaRDD();
JavaRDD<String> ones = jRDD.map(new Function<Row, String>() {
    public String call(Row row) throws Exception {
        return row.getString(1);
    }
});

*Thanks*,
<https://in.linkedin.com/in/ramkumarcs31>


On Tue, Apr 26, 2016 at 3:48 PM, Hyukjin Kwon <gurwls...@gmail.com> wrote:

> Could you maybe share your codes?
>
> On 26 Apr 2016 9:51 p.m., "Ramkumar V" <ramkumar.c...@gmail.com> wrote:
>
>> Hi,
>>
>> I have loaded a JSON file in Parquet format into Spark SQL, but I am not
>> able to read the List that is inside the JSON.
>>
>> Sample JSON:
>>
>> {
>>   "TOUR" : {
>>     "CITIES" : ["Paris", "Berlin", "Prague"]
>>   },
>>   "BUDJET" : 1000000
>> }
>>
>> I want to read the value of CITIES.
>>
>> *Thanks*,
>> <https://in.linkedin.com/in/ramkumarcs31>
>>
>>
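For reference, a minimal sketch of one way to pull the CITIES array out of each Row, assuming the same Spark 1.x Java API as in the code above and that Row.getList (available in Spark 1.5+) can be used. The column indices follow the SELECT order, so TOUR.CITIES is column 0 and BUDJET is column 1; CITIES is an array column, so a plain getString call on it will not work:

import java.util.List;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.Row;

// Sketch only: tempDF is assumed to be the result of
// "SELECT TOUR.CITIES, BUDJET from parquetFile" shown above.
JavaRDD<Row> rows = tempDF.toJavaRDD();

// CITIES comes back as a list of strings rather than a single String.
JavaRDD<List<String>> cities = rows.map(new Function<Row, List<String>>() {
    public List<String> call(Row row) throws Exception {
        // Column 0 is TOUR.CITIES in the SELECT above; getList converts the
        // underlying Spark array value into a java.util.List<String>.
        return row.<String>getList(0);
    }
});

If one row per city is needed instead, the explode function in org.apache.spark.sql.functions (Spark 1.4+) can flatten the array on the DataFrame side before calling toJavaRDD().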