Just use Spark CSV; every other way of splitting and parsing the data is just reinventing the wheel and a monumental waste of time.
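A minimal sketch of what the Spark CSV route could look like, assuming Spark 2.0+, where the CSV reader is built into SparkSession (on Spark 1.x the equivalent is the Databricks spark-csv package); the file path is taken from the question below, and the app name is only illustrative:

import org.apache.spark.sql.SparkSession

// Build (or reuse) a SparkSession; in spark-shell this already exists as `spark`.
val spark = SparkSession.builder().appName("csv-read-sketch").getOrCreate()

// Read the comma-separated file straight into a DataFrame.
// Each line becomes a Row with one column per comma-separated field,
// so no manual split is needed.
val df = spark.read
  .option("inferSchema", "true")   // infer numeric types for the id and value columns
  .csv("/tmp/mytextfile.txt")

df.printSchema()
df.show(5)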
Regards,
Gourav

On Mon, Sep 5, 2016 at 1:48 PM, Ashok Kumar <[email protected]> wrote:
> Hi,
>
> I have a text file as below that I read in
>
> 74,20160905-133143,98.11218069128827594148
> 75,20160905-133143,49.52776998815916807742
> 76,20160905-133143,56.08029957123980984556
> 77,20160905-133143,46.63689526544407522777
> 78,20160905-133143,84.88227141164402181551
> 79,20160905-133143,68.72408602520662115000
>
> val textFile = sc.textFile("/tmp/mytextfile.txt")
>
> Now I want to split the rows separated by ","
>
> scala> textFile.map(x=>x.toString).split(",")
> <console>:27: error: value split is not a member of
> org.apache.spark.rdd.RDD[String]
>        textFile.map(x=>x.toString).split(",")
>
> However, the above throws an error. Any ideas what is wrong, or how I can
> do this while avoiding the conversion to String?
>
> Thanks
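For completeness: the error above occurs because split is called on the RDD itself rather than on each String inside a map. A minimal sketch of the per-line split, using the same path and `sc` from the question (the third-column extraction is only an illustrative assumption):

val textFile = sc.textFile("/tmp/mytextfile.txt")

// Apply split to each line (a String) inside map, not to the RDD.
// Each element of `fields` is then an Array[String] of the three fields.
val fields = textFile.map(line => line.split(","))

// Example use: parse the third field as a Double and print a few values.
fields.map(arr => arr(2).toDouble).take(3).foreach(println)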
