The implicit conversion function mentioned by Hao is createSchemaRDD in
SQLContext/HiveContext.
You can bring it into scope with:
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
// Or new org.apache.spark.sql.hive.HiveContext(sc) for HiveContext
import sqlContext.createSchemaRDD
You needn't do anything else; the implicit conversion should do this for you.
https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/SQLContext.scala#L103
https://github.com/apache/spark/blob/2ac40da3f9fa6d45a59bb45b41606f1931ac5e81/sql/catalyst/src/main/scala/org/apac
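Putting it together, here is a minimal sketch against the Spark 1.x SchemaRDD API (pre-1.3, before createSchemaRDD was replaced by the DataFrame implicits). The Person case class and the people.txt path are made up for illustration, and the file is assumed to hold "name,age" lines:

val sqlContext = new org.apache.spark.sql.SQLContext(sc)
// Brings the implicit RDD[A <: Product] => SchemaRDD conversion into scope
import sqlContext.createSchemaRDD

case class Person(name: String, age: Int)

// Plain RDD of case class instances
val people = sc.textFile("people.txt")
  .map(_.split(","))
  .map(p => Person(p(0), p(1).trim.toInt))

// registerTempTable is defined on SchemaRDD, so the implicit conversion
// from RDD[Person] to SchemaRDD is applied automatically at this call site
people.registerTempTable("people")

val teenagers = sqlContext.sql("SELECT name FROM people WHERE age >= 13 AND age <= 19")
teenagers.collect().foreach(println)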