Hi TD,

Thanks for the solutions to my previous post. I am running into another issue: I am reading data from a JSON file and trying to parse it and map it to the record below.
val jsonf = lines.map(JSON.parseFull(_))
  .map(_.get.asInstanceOf[scala.collection.immutable.Map[Any, Any]])
  .map(data => Record(data("ID").toString, data("name").toString,
                      data("score").toInt, data("school").toString))

case class Record(ID: String, name: String, score: Int, school: String)

When I try to compile this I get the following error:

[error] /home/ubuntu/spark-1.0.0/external/jsonfile2/src/main/scala/jsonfile.scala:36: value toInt is not a member of Any
[error] lines.map(JSON.parseFull(_)).map(_.get.asInstanceOf[scala.collection.immutable.Map[Any,Any]]).map(data=>Record(data("ID").toString,data("name").toString,data("score").toInt,data("school").toString))

I tried declaring the map as immutable.Map[Any,Int] and also tried converting the Int to a String. The application then compiled, but it throws an exception at runtime:

14/07/17 17:11:30 ERROR Executor: Exception in task ID 6
java.lang.ClassCastException: java.lang.String cannot be cast to java.lang.Integer
        at scala.runtime.BoxesRunTime.unboxToInt(BoxesRunTime.java:106)

Basically, I am trying to do a max operation in Spark SQL. Please let me know if there is any workaround for this.

Thanks,
-Srinivas.
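P.S. One workaround I was considering, though I am not sure it is the right approach: JSON.parseFull seems to return plain JSON numbers as Double, and in my file the score may even come back as a String, so instead of casting I could convert the value explicitly. A rough sketch (the toInt helper is just something I made up for this example, and it assumes lines and Record are as above):

import scala.util.parsing.json.JSON

// Convert whatever JSON.parseFull returns for "score" into an Int:
// a Double for unquoted numbers, a String if the value is quoted in the JSON.
def toInt(v: Any): Int = v match {
  case d: Double => d.toInt
  case s: String => s.trim.toInt
  case i: Int    => i
  case other     => sys.error("unexpected score value: " + other)
}

val jsonf = lines.map(JSON.parseFull(_))
  .map(_.get.asInstanceOf[Map[String, Any]])
  .map(data => Record(data("ID").toString, data("name").toString,
                      toInt(data("score")), data("school").toString))

Would that be a reasonable way to do it, or is there a better pattern for this?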