I changed the code to the following:
JavaPairRDD<NullWritable, String> rdd = sc.newAPIHadoopFile(inputFile,
    ParquetInputFormat.class, NullWritable.class, String.class, mrConf);
// Split each record into words for the word count.
JavaRDD<String> words = rdd.values().flatMap(
    new FlatMapFunction<String, String>() {
      public Iterable<String> call(String x) {
        return Arrays.asList(x.split(" "));
      }
    });
You are missing the input. mrConf is not the way to add input files. In Spark,
try the DataFrame read functions or the sc.textFile function.
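For example, here is a minimal sketch of the DataFrame approach (assuming a
Spark 2.x SparkSession; the file path and the string column name "line" are
placeholders, not something from your job):

import java.util.Arrays;

import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

import scala.Tuple2;

public class ParquetWordCount {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder()
        .appName("ParquetWordCount")
        .getOrCreate();

    // Read the parquet file directly; no Hadoop InputFormat or mrConf needed.
    // Path and column name "line" are placeholders for your actual data.
    Dataset<Row> df = spark.read().parquet("hdfs:///path/to/input.parquet");
    JavaRDD<String> lines = df.select("line").javaRDD()
        .map(row -> row.getString(0));

    // Standard word count: split each line, pair every word with 1, sum.
    JavaPairRDD<String, Integer> counts = lines
        .flatMap(line -> Arrays.asList(line.split(" ")).iterator())
        .mapToPair(word -> new Tuple2<>(word, 1))
        .reduceByKey(Integer::sum);

    counts.collect().forEach(t -> System.out.println(t._1() + ": " + t._2()));
    spark.stop();
  }
}

On Spark 1.x the same idea works through SQLContext.read().parquet(...), and
flatMap takes a FlatMapFunction that returns an Iterable instead of an
Iterator.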
Best
Ayan
On 23 Aug 2016 07:12, "shamu" wrote:
> Hi All,
> I am a newbie to Spark/Hadoop.
> I want to read a parquet file and perform a simple word-count. Below is
>