Looking in the code

https://github.com/apache/spark/blob/master/sql/hive/src/main/scala/org/apache/spark/sql/hive/orc/OrcRelation.scala

I don’t think any of that advanced functionality is supported, sorry. There is
a parameters option, but I don’t think it’s used for much.
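One workaround sometimes suggested on the list is to set the ORC writer properties on the job's Hadoop configuration before saving; this is a hedged sketch only, since whether the underlying OrcOutputFormat actually honours these keys (orc.stripe.size, orc.row.index.stride are Hive ORC table-property names) depends on the Spark/Hive version in use, and the path below is a placeholder:

```scala
// Sketch, not verified against every Spark release: push ORC writer
// settings into the Hadoop configuration and hope the ORC output
// format picks them up. `df` is an existing DataFrame.
val hadoopConf = sqlContext.sparkContext.hadoopConfiguration
hadoopConf.set("orc.stripe.size", "67108864")      // 64 MB stripes (assumed key)
hadoopConf.set("orc.row.index.stride", "10000")    // row-index stride (assumed key)

df.write.format("orc").save("/path/to/output")     // hypothetical output path
```

If that does not take effect, the fallback is to create the table through HiveQL with TBLPROPERTIES and insert into it from Spark SQL, where the table properties are definitely respected.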

Ewan

From: zhangjp [mailto:592426...@qq.com]
Sent: 20 November 2015 07:47
To: Fengdong Yu <fengdo...@everstring.com>
Cc: user <user@spark.apache.org>
Subject: Re: has any spark write orc document

Thanks to Jeff Zhang and Fengdong. When I use Hive I know how to set ORC table
properties, but I don't know how to set ORC file properties when I write an
ORC file using the Spark API, for example stripe.size and index.rows.
------------------ Original Message ------------------
From: "Fengdong Yu" <fengdo...@everstring.com>
Sent: Friday, 20 November 2015, 3:19 PM
To: "zhangjp" <592426...@qq.com>
Cc: "user" <user@spark.apache.org>
Subject: Re: has any spark write orc document
You can use the DataFrame API:

df.write.format("orc").save("xxxx")



On Nov 20, 2015, at 2:59 PM, zhangjp <592426...@qq.com> wrote:

Hi,
Is there any documentation on writing ORC files from Spark, like the Parquet documentation?
http://spark.apache.org/docs/latest/sql-programming-guide.html#parquet-files

Thanks
