This is a basic Scala problem: you cannot call toInt on a value typed as Any. Try toString.toInt instead.
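As a minimal sketch of the mapping (Record and the field names are taken from your snippet; the sample `line` and the `asInt` helper are made up for illustration) -- note that scala.util.parsing.json.JSON.parseFull returns JSON numbers as Double, so a score can arrive as either a String or a Double depending on how it is written in the file, and it is safer to handle both than to cast:

```scala
import scala.util.parsing.json.JSON

case class Record(ID: String, name: String, score: Int, school: String)

// Hypothetical helper: convert a parsed JSON value to Int whether it
// arrived as a String ("12") or as a Double (12.0 -- JSON.parseFull
// parses numeric literals as Double).
def asInt(v: Any): Int = v match {
  case d: Double => d.toInt
  case s: String => s.trim.toDouble.toInt
  case i: Int    => i
  case other     => sys.error(s"cannot convert $other to Int")
}

// Made-up sample line standing in for one record of your JSON input.
val line = """{"ID": "1", "name": "Ann", "score": 12, "school": "X"}"""

val record = JSON.parseFull(line)
  .map(_.asInstanceOf[Map[String, Any]])          // parseFull keys are Strings
  .map(data => Record(data("ID").toString,
                      data("name").toString,
                      asInt(data("score")),
                      data("school").toString))
```

The same `data => Record(...)` function drops into your `lines.map(...)` chain; avoiding asInstanceOf on the individual values is what prevents the ClassCastException you saw at runtime.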
For such Scala issues, I recommend trying things out in the Scala shell. For example, you could have tried this:

[tdas @ Xion streaming] scala
Welcome to Scala version 2.10.3 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_45).
Type in expressions to have them evaluated.
Type :help for more information.

scala> "12".asInstanceOf[Any].toInt
<console>:8: error: value toInt is not a member of Any
              "12".asInstanceOf[Any].toInt
                                     ^

scala> "12".asInstanceOf[Any].toString.toInt
res1: Int = 12

scala>


On Thu, Jul 17, 2014 at 10:32 AM, srinivas <kusamsrini...@gmail.com> wrote:
> Hi TD,
>
> Thanks for the solution to my previous post. I am running into another
> issue: I am reading data from a JSON file and trying to parse it and map
> it to a record, as given below:
>
> val jsonf =
>   lines.map(JSON.parseFull(_))
>        .map(_.get.asInstanceOf[scala.collection.immutable.Map[Any, Any]])
>        .map(data => Record(data("ID").toString, data("name").toString,
>                            data("score").toInt, data("school").toString))
>
> case class Record(ID: String, name: String, score: Int, school: String)
>
> When I try to do this, I get an error:
>
> [error] /home/ubuntu/spark-1.0.0/external/jsonfile2/src/main/scala/jsonfile.scala:36:
>         value toInt is not a member of Any
> [error] lines.map(JSON.parseFull(_)).map(_.get.asInstanceOf[scala.collection.immutable.Map[Any,Any]]).map(data=>Record(data("ID").toString,data("name").toString,data("score").toInt,data("school").toString))
>
> I tried declaring immutable.Map[Any, Int] and converting the Int to a
> String; my application then compiled, but I get an exception when I run it:
>
> 14/07/17 17:11:30 ERROR Executor: Exception in task ID 6
> java.lang.ClassCastException: java.lang.String cannot be cast to java.lang.Integer
>         at scala.runtime.BoxesRunTime.unboxToInt(BoxesRunTime.java:106)
>
> Basically, I am trying to do a max operation in my Spark SQL. Please let
> me know if there is any workaround for this.
>
> Thanks,
> -Srinivas.
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-Json-file-groupby-function-tp9618p10060.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.