I see you have the code to convert to a Record class, but it is commented out.
That is the right way to go. When you convert each element to a 4-tuple with
(data("type"), data("name"), data("score"), data("school")), its type is
(Any, Any, Any, Any), since data("xyz") returns Any. And registerAsTable
probably doesn't work well with Any as the column types.
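To illustrate outside of Spark, here is a minimal sketch of the difference between the two mappings. The field names and sample values are made up for illustration; the point is only that looking up a Map[String, Any] yields Any, while building the case class with explicit conversions gives each field a concrete type that a schema can be inferred from.

```scala
// Hypothetical sketch: parsed JSON objects arrive as Map[String, Any],
// so every value is erased to Any.
case class Record(id: Int, name: String, score: Int, school: String)

val parsed: Seq[Map[String, Any]] = Seq(
  Map("id" -> "1", "name" -> "alice", "score" -> "92", "school" -> "mit")
)

// Tuple version: each element is Any, so no schema can be inferred.
val tuples: Seq[(Any, Any, Any, Any)] =
  parsed.map(m => (m("id"), m("name"), m("score"), m("school")))

// Case-class version: convert each field to its declared type explicitly.
val records: Seq[Record] = parsed.map { m =>
  Record(m("id").toString.toInt,
         m("name").toString,
         m("score").toString.toInt,
         m("school").toString)
}
```

Once the RDD element type is Record rather than (Any, Any, Any, Any), the implicit conversion to a SchemaRDD (via `import sqlc.createSchemaRDD` in Spark 1.0) should kick in and registerAsTable becomes available.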

@michael any insights?

TD


On Mon, Jul 14, 2014 at 10:07 PM, srinivas <kusamsrini...@gmail.com> wrote:

> Hi TD,
>   Thanks for ur help...i am able to convert map to records using case
> class.
> I am left with doing some aggregations. I am trying to do some SQL type
> operations on my records set. My code looks like
>
>  case class Record(ID:Int,name:String,score:Int,school:String)
> //val records = jsonf.map(m => Record(m(0),m(1),m(2),m(3)))
> val fields = jsonf.map(data =>
> (data("type"),data("name"),data("score"),data("school")))
> val results = fields.transform((rdd,time) => {
>  rdd.registerAsTable("table1")
>  sqlc.sql("select * from table1")
> })
>
> When I try to compile my code, it gives me:
> jsonfile.scala:30: value registerAsTable is not a member of
> org.apache.spark.rdd.RDD[(Any, Any, Any, Any)]
>
> Please let me know if I am missing anything.
> And with Spark Streaming, can I really use SQL-style operations on
> DStreams?
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-Json-file-groupby-function-tp9618p9714.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
