cool thanks, will try
2015-09-24 9:32 GMT+01:00 Prasanth Jayachandran <
pjayachand...@hortonworks.com>:
With 650 columns you might need to reduce the compression buffer size to 8KB,
down from the default of 256KB (maybe try decreasing it further if that fails,
or increasing it if it succeeds, to find the right size). You can do that by
setting orc.compress.size in tblproperties.
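As a minimal sketch of the suggestion above (the table name and columns here are hypothetical, and 8192 bytes is just the 8KB starting point mentioned):

```sql
-- Shrink the ORC compression buffer from the 256KB default to 8KB
-- to cope with a very wide (650-column) schema.
ALTER TABLE my_orc_table SET TBLPROPERTIES ('orc.compress.size'='8192');

-- Or set it at creation time:
CREATE TABLE my_orc_table (id INT, val STRING)
STORED AS ORC
TBLPROPERTIES ('orc.compress.size'='8192');
```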
On Sep 24, 2015, at 3:27 AM, Patrick Duin <patd...@gmail.com> wrote:
Thanks for the reply,
My first thought was out of memory as well, but the illegal argument
exception happens before it and is a separate entry in the log; the OOM
exception is not the cause. So I am not sure where that OOM exception fits
in. I've tried running it with more memory and got the same problem.
Looks like you are running out of memory. Try increasing the heap memory or
reducing the stripe size. How many columns are you writing? Any idea how many
record writers are open per map task?
- Prasanth
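The two knobs suggested above could be turned roughly like this (a sketch with a hypothetical table name; the 64MB stripe size and heap figures are illustrative assumptions, not values from the thread — in Hive 0.13 the default ORC stripe size is on the order of 256MB):

```sql
-- Reduce the ORC stripe size (value is in bytes; 67108864 = 64MB)
-- to lower the writer's memory footprint.
ALTER TABLE my_orc_table SET TBLPROPERTIES ('orc.stripe.size'='67108864');

-- Give map tasks more heap (Hadoop 2.x property names), e.g. from
-- within a Hive session:
SET mapreduce.map.memory.mb=4096;
SET mapreduce.map.java.opts=-Xmx3686m;
```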
On Sep 22, 2015, at 4:32 AM, Patrick Duin <patd...@gmail.com> wrote:
Hi all,
I am struggling to understand a stack trace I am getting while trying to
write an ORC file.
I am using hive-0.13.0/hadoop-2.4.0.
2015-09-21 09:15:44,603 INFO [main] org.apache.hadoop.mapred.MapTask:
Ignoring exception during close for
org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector