Hi spark users,

Do you know how to access the rows nested inside a Row?

I have a SchemaRDD called user and registered it as a table with the
following schema:

root
 |-- user_id: string (nullable = true)
 |-- item: array (nullable = true)
 |    |-- element: struct (containsNull = false)
 |    |    |-- item_id: string (nullable = true)
 |    |    |-- name: string (nullable = true)


val items=sqlContext.sql("select items from user where user_id = 1").first

The type of items is org.apache.spark.sql.Row. I want to iterate through
that array and count how many items user_id = 1 has.

I could not find a method that does this. The furthest I can get is
calling items.toSeq, which gives back:

scala> items.toSeq
res57: Seq[Any] = [WrappedArray([1,orange],[2,apple])]
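For what it's worth, the field seems to come back as a plain Seq once it is
cast, so counting its elements might work. Below is a minimal plain-Scala
sketch (no Spark required) that simulates the Row as a Seq[Any] whose first
field holds the materialized item array; the sample data and the helper name
countItems are my own assumptions, not Spark API:

```scala
object RowCountSketch {
  // Assumption: field 0 of the Row-like value holds the item array
  // (Spark materializes SQL arrays as a Seq subtype), so a cast to
  // Seq[Any] exposes its size.
  def countItems(row: Seq[Any]): Int =
    row(0).asInstanceOf[Seq[Any]].size

  def main(args: Array[String]): Unit = {
    // Stand-in for the Row returned by .first: one column containing
    // the item array; each struct is simulated as a Seq of its fields.
    val row: Seq[Any] = Seq(Seq(Seq("1", "orange"), Seq("2", "apple")))
    println(countItems(row))  // prints 2
  }
}
```

With a real Row the same cast idea would read row(0).asInstanceOf[Seq[Any]].size,
though I have not verified that against this exact Spark version.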

Any suggestion?

Best Regards,

Jerry
