This is not currently supported. Right now you can only get an RDD[Row], as
Ted suggested.
On Sun, Feb 22, 2015 at 2:52 PM, Ted Yu wrote:
> Haven't found the method in
> http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.SchemaRDD
>
> The new DataFrame has this method:
>
Haven't found the method in
http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.SchemaRDD
The new DataFrame has this method:
  /**
   * Returns the content of the [[DataFrame]] as an [[RDD]] of [[Row]]s.
   * @group rdd
   */
  def rdd: RDD[Row] = {
FYI
On Sun, Feb 22,
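To make the above concrete: `rdd` hands back the rows untyped, so each field comes out as `Any` and has to be accessed by position. A minimal plain-Scala sketch of that access pattern (no Spark needed to run it; `Row` is modeled here as `Seq[Any]`, and `RddOfRows`/`toTuples` are illustrative names, not Spark API):

```scala
// Stand-in sketch: in Spark, df.rdd yields RDD[Row]; here a Row is modeled
// as a plain Seq[Any] to show the positional, untyped access it gives you.
object RddOfRows {
  type Row = Seq[Any]

  // Extract typed fields by position, much as you would from
  // org.apache.spark.sql.Row via getInt/getString.
  def toTuples(rows: Seq[Row]): Seq[(Int, String)] =
    rows.map(r => (r(0).asInstanceOf[Int], r(1).asInstanceOf[String]))

  def main(args: Array[String]): Unit = {
    val rows: Seq[Row] = Seq(Seq(1, "alice"), Seq(2, "bob"))
    println(toTuples(rows))  // List((1,alice), (2,bob))
  }
}
```

The casts are the price of losing the case-class types: nothing checks them until runtime.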
Hi Michael,
I think the feature (converting a SchemaRDD to an RDD of a structured class)
is now available, but from the PR I didn't understand exactly how to do this.
Can you give an example or links to the docs?
Best regards
Cool Thanks Michael!
> Le 8 juil. 2014 à 22:17, Michael Armbrust [via Apache Spark User List]
> a écrit :
>
>> On Tue, Jul 8, 2014 at 12:43 PM, Pierre B <[hidden email]> wrote:
>> 1/ Is there a way to convert a SchemaRDD (for i
On Tue, Jul 8, 2014 at 12:43 PM, Pierre B <
pierre.borckm...@realimpactanalytics.com> wrote:
>
> 1/ Is there a way to convert a SchemaRDD (for instance loaded from a
> parquet
> file) back to an RDD of a given case class?
>
There may be someday, but doing so will either require a lot of reflection
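Lacking built-in support, the usual workaround at this point is a manual map that pattern-matches each row back into the case class. A sketch of the idea, again modeling `Row` as `Seq[Any]` so it runs without Spark (`BackToCaseClass`, `Person`, and `toPeople` are illustrative names; in real Spark code the same match would be written against `org.apache.spark.sql.Row` inside `schemaRdd.map { ... }`):

```scala
// Sketch of the manual workaround: pattern-match each generic row back into
// a typed case class. In Spark this would look like
//   schemaRdd.map { case Row(id: Int, name: String) => Person(id, name) }
// Here Row is modeled as Seq[Any] so the example is self-contained.
object BackToCaseClass {
  type Row = Seq[Any]

  case class Person(id: Int, name: String)

  // Rebuild the typed RDD-like collection from untyped rows; the typed
  // patterns (id: Int, name: String) do the runtime checks for us.
  def toPeople(rows: Seq[Row]): Seq[Person] =
    rows.map { case Seq(id: Int, name: String) => Person(id, name) }

  def main(args: Array[String]): Unit = {
    val rows: Seq[Row] = Seq(Seq(1, "alice"), Seq(2, "bob"))
    println(toPeople(rows))  // List(Person(1,alice), Person(2,bob))
  }
}
```

The match is partial by design: a row whose schema drifts from the case class will throw a MatchError at runtime, which is exactly the kind of fragility automatic reflection-based support would remove.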