OK, it does work.
It might be better to update this usage in the official Spark SQL
tutorial:
http://spark.apache.org/docs/latest/sql-programming-guide.html
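For anyone hitting the same error, here is a minimal, self-contained sketch of the working approach (assuming the Spark 1.3.x API; the Log case class, app name, and output path are illustrative, not taken from my actual build):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Illustrative case class; substitute your own fields.
case class Log(level: String, message: String)

object Main {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("parquet-demo"))
    val sqlContext = new SQLContext(sc)

    val logs = sc.parallelize(Seq(Log("INFO", "started"), Log("WARN", "slow")))

    // RDD[Log] has no saveAsParquetFile method; convert the RDD to a
    // DataFrame first, then write it out as Parquet.
    val log_df = sqlContext.createDataFrame(logs)
    log_df.saveAsParquetFile("/tmp/logs.parquet")
  }
}
```

Note that `import sqlContext.implicits._` followed by `logs.toDF()` is an equivalent way to do the conversion in Spark 1.3.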

Thanks,
pishen


2015-04-14 15:30 GMT+08:00 fightf...@163.com <fightf...@163.com>:

> Hi there,
>
> If you want to use saveAsParquetFile, you first need to convert the RDD
> to a DataFrame:
>     val log_df = sqlContext.createDataFrame(logs)
>
> Then you can call log_df.saveAsParquetFile(path)
>
> Best,
> Sun.
>
> ------------------------------
> fightf...@163.com
>
>
> *From:* pishen <pishe...@gmail.com>
> *Date:* 2015-04-14 15:18
> *To:* user <user@spark.apache.org>
> *Subject:* Cannot saveAsParquetFile from a RDD of case class
> Hello,
>
> I tried to follow the Spark SQL tutorial, but I am not able to call
> saveAsParquetFile on an RDD of a case class.
> Here is my Main.scala and build.sbt
> https://gist.github.com/pishen/939cad3da612ec03249f
>
> At line 34, the compiler says that "value saveAsParquetFile is not a member of
> org.apache.spark.rdd.RDD[core.Log]".
>
> Any suggestion on how to solve this?
>
> Thanks,
> pishen
>
> ------------------------------
> View this message in context: Cannot saveAsParquetFile from a RDD of case
> class
> <http://apache-spark-user-list.1001560.n3.nabble.com/Cannot-saveAsParquetFile-from-a-RDD-of-case-class-tp22488.html>
> Sent from the Apache Spark User List mailing list archive
> <http://apache-spark-user-list.1001560.n3.nabble.com/> at Nabble.com.
>
>
