From Spark Documentation:

DataFrame parquetFile = sqlContext.read().parquet("people.parquet");
JavaRDD<Row> jRDD = parquetFile.javaRDD();

Note that javaRDD() converts the DataFrame to a JavaRDD of Row objects, not Strings; if you need Strings you can map the Rows afterwards (see the sketch below the quoted message).

On Thu, Mar 31, 2016 at 2:51 PM, Ramkumar V <ramkumar.c...@gmail.com> wrote:

> Hi,
>
> I'm trying to read parquet log files in Java Spark. The parquet log files
> are stored in HDFS, and I want to read them and convert them into a JavaRDD.
> I was able to find the SQLContext DataFrame API. How can I read them if it
> is SparkContext and RDD? What is the best way to read them?
>
> *Thanks*,
> <https://in.linkedin.com/in/ramkumarcs31>
>
>
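For completeness, a minimal self-contained sketch of the same approach against the Spark 1.x API. The class name, app name, and HDFS path are placeholders (not your actual file), and the final map just joins each Row's columns with commas to get a JavaRDD<String>; Java 8 lambda syntax is assumed:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SQLContext;

public class ReadParquetAsRDD {
  public static void main(String[] args) {
    SparkConf conf = new SparkConf().setAppName("ReadParquetAsRDD");
    JavaSparkContext jsc = new JavaSparkContext(conf);
    SQLContext sqlContext = new SQLContext(jsc);

    // Read the parquet file from HDFS into a DataFrame, then drop down to an RDD of Rows.
    DataFrame parquetFile = sqlContext.read().parquet("hdfs:///path/to/people.parquet");
    JavaRDD<Row> rows = parquetFile.javaRDD();

    // If a JavaRDD<String> is really needed, map each Row to a String representation.
    JavaRDD<String> lines = rows.map(row -> row.mkString(","));

    System.out.println(lines.first());
    jsc.stop();
  }
}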