More info on why toDF is required:
http://spark.apache.org/docs/latest/sql-programming-guide.html#upgrading-from-spark-sql-10-12-to-13
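In short (as I read that section), in 1.3 the automatic conversion from an RDD of case classes to a DataFrame was removed; the implicits now only add a .toDF() method that you must call explicitly. Roughly (the RDD name here is just a placeholder):

import sqlContext.implicits._
val df = yourRdd.toDF()  // explicit since 1.3; in 1.0-1.2 this conversion was applied automatically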
On Tue, Apr 14, 2015 at 6:55 AM, pishen tsai wrote:
I've changed it to
import sqlContext.implicits._
but it still doesn't work. (I've updated the gist)
BTW, using ".toDF()" does work, thanks for this information.
Regards,
pishen
2015-04-14 20:35 GMT+08:00 Todd Nist :
I think the docs are correct. If you follow the example from the docs and add
the import shown below, I believe you will get what you're looking for:
// This is used to implicitly convert an RDD to a DataFrame.
import sqlContext.implicits._
You could also simply take your RDD and call .toDF() on it directly:
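For instance, a rough sketch (the case class, RDD, and path below are just placeholders, assuming a SparkContext named sc is already available):

case class Person(name: String, age: Int)

val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._

// Build an RDD of case-class instances and convert it explicitly.
val peopleRdd = sc.parallelize(Seq(Person("Alice", 30), Person("Bob", 25)))
val peopleDf = peopleRdd.toDF()

// saveAsParquetFile is a DataFrame method, so the .toDF() call above is what makes it available.
peopleDf.saveAsParquetFile("people.parquet")

Without the .toDF() call, the import alone only augments the RDD with a toDF method; it does not let you call DataFrame methods like saveAsParquetFile on the RDD itself.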
OK, it does work.
Maybe it would be better to update this usage in the official Spark SQL
tutorial:
http://spark.apache.org/docs/latest/sql-programming-guide.html
Thanks,
pishen
2015-04-14 15:30 GMT+08:00 fightf...@163.com :
> Hi, there
>
> If you want to use the saveAsParquetFile, you may want to