I am using Spark version 1.5.1, and I am getting an error in my first Spark program, i.e., word count. Please help me solve this.

scala> val inputfile = sc.textFile("input.txt")
inputfile: org.apache.spark.rdd.RDD[String] = MapPartitionsRDD[13] at textFile at <console>:21

scala> val counts = inputFile.flatMap(line => line.split(" ")).map(word => (word,1)).reduceByKey(_ + _);
<console>:19: error: not found: value inputFile
       val counts = inputFile.flatMap(line => line.split(" ")).map(word => (word,1)).reduceByKey(_ + _);
                    ^
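For what it's worth, the error looks like a simple case-sensitivity mismatch: the RDD is defined as `inputfile` (lowercase f) but the second line references `inputFile` (capital F), which Scala treats as a different, undefined name. A corrected sequence, assuming `input.txt` is reachable from the shell's working directory, might look like:

```scala
// Define the RDD; note the exact spelling of the value name.
scala> val inputfile = sc.textFile("input.txt")

// Reference the same name, `inputfile`, when building the counts:
// split each line into words, pair each word with 1, and sum per key.
scala> val counts = inputfile.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey(_ + _)

// Inspect the result (collect is fine for a small file).
scala> counts.collect().foreach(println)
```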
