It looks like you are using the Avro SerDe to read a Parquet file.
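If that is the case, one possible fix (a sketch only — it assumes the table's metadata was set to Avro during the Sqoop import while the files on disk are actually Parquet) is to point the table at the Parquet format:

```sql
-- Assumption: dimemployee's declared file format is Avro,
-- but the files Sqoop wrote are Parquet.
ALTER TABLE dimemployee SET FILEFORMAT PARQUET;
```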
Sincerely,
Sharath Punreddy
Email:srpunre...@gmail.com
Phone: 918-973-3399
On Tue, Dec 13, 2016 at 9:04 AM, ws wrote:
> Hive: 2.1.0
> Sqoop: 1.4.6
>
> ###
> hive> select * from dimemployee;
> OK
> Failed with exception java.io.IOException:java
You've got it. Welcome to the Hive wiki team, Michael!
-- Lefty
On Sun, Dec 18, 2016 at 1:25 PM, mikey d wrote:
> Requesting write access to Hive wiki
>
> Request already sent to user-subscr...@hive.apache.org
>
> If there is anything further needed, please let me know.
>
>
> User: mdeguzis
>
You can use a plain SQL statement to create an ORC table in Hive. Assuming you have registered a temp table:

val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
// s is the DataFrame holding the source data
s.registerTempTable("tmp")
val sqltext = """
CREATE TABLE test.dummy2
(
ID INT
, CLUSTERED INT
, SCATTERED INT
, RANDOM
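For context, a minimal end-to-end sketch of that pattern (the original message is cut off, so the storage clause and the trailing columns are omitted here; table and column names are taken from the snippet above, everything else is an assumption):

```scala
// Sketch only: assumes a DataFrame df is already registered as "tmp".
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
df.registerTempTable("tmp")

// Create the ORC table, then populate it from the temp table.
hiveContext.sql("""
  CREATE TABLE IF NOT EXISTS test.dummy2 (
    ID INT,
    CLUSTERED INT,
    SCATTERED INT
  )
  STORED AS ORC
""")
hiveContext.sql("INSERT INTO TABLE test.dummy2 SELECT * FROM tmp")
```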
Hi,
When writing a dataframe using:
df.write.orc("/path/to/orc")
How can I specify ORC parameters like orc.stripe.size?
Thank you,
Daniel
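One possible approach (a sketch, not verified against every Spark version): settings such as orc.stripe.size are read by the ORC output format from the Hadoop configuration, so setting them on the SparkContext before the write may take effect:

```scala
// Assumption: the ORC writer picks up orc.* settings from the
// SparkContext's Hadoop configuration.
sc.hadoopConfiguration.set("orc.stripe.size", "67108864") // 64 MB stripes
df.write.orc("/path/to/orc")
```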