Hi,
I'm trying to create an external bucketed table, but I'm having trouble
recreating the behavior of the Hive partitioner that is used to create
internal bucketed tables.
My bucket key is a String s. Currently, in my partitioner, I'm using code
based on my findings in the Hive codebase.
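The gist of it is the sketch below (my reading of ObjectInspectorUtils.hashCode
combined with Hadoop's HashPartitioner, with made-up class and method names,
so treat it as an approximation rather than a verbatim Hive excerpt):

    import java.nio.charset.StandardCharsets;

    // Sketch of Hive's bucket computation for string keys.
    public final class HiveStringBucketSketch {

        // Hash over the UTF-8 bytes: ret = ret * 31 + b[i], starting at 0.
        // For pure-ASCII strings this matches java.lang.String#hashCode().
        static int hiveStringHash(String s) {
            int ret = 0;
            for (byte b : s.getBytes(StandardCharsets.UTF_8)) {
                ret = ret * 31 + b;
            }
            return ret;
        }

        // Bucket selection: clear the sign bit, then take the hash modulo
        // the bucket count, as Hadoop's HashPartitioner does.
        static int bucketFor(String s, int numBuckets) {
            return (hiveStringHash(s) & Integer.MAX_VALUE) % numBuckets;
        }

        public static void main(String[] args) {
            // Example: which of 32 buckets the key "example-key" lands in.
            System.out.println(bucketFor("example-key", 32));
        }
    }

Note that this byte-level hash is not the same as Text.hashCode(), so feeding
the key through other hashing utilities may place rows in different buckets.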
Hi Vinod,
Do you use a remote metastore server configuration?
What version of Hive do you use?
We are using 0.7.1, which is where we see this issue.
Sent from my iPhone
On Aug 11, 2012, at 12:39 AM, Vinod Singh wrote:
> We run Hive jobs on 20+ TB of data without any issues.
>
> Thanks,
> Vinod
>
> On Sat, Aug 11, 2012 at 9:07 AM, Anurag Tangri wrote:
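For context, pointing a client at a remote metastore usually looks something
like the following in hive-site.xml. The host name is a placeholder and 9083
is only the conventional Thrift port, so treat this as a sketch rather than a
known-good config:

    <property>
      <name>hive.metastore.uris</name>
      <!-- Placeholder host; 9083 is the conventional metastore port. -->
      <value>thrift://metastore-host.example.com:9083</value>
    </property>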
I see an exception like:
Moving data to: hdfs://../hive/atangri_test_1
FAILED: Error in metadata: org.apache.thrift.transport.TTransportException:
java.net.SocketException: Connection timed out
FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask
There is enough disk space.
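If it is the Thrift connection from the client to the metastore that is
timing out, one setting worth checking is the metastore client socket
timeout in hive-site.xml. The value below is only an illustration (older
Hive releases read it as a number of seconds):

    <property>
      <name>hive.metastore.client.socket.timeout</name>
      <!-- Illustrative value, in seconds; the default is much lower. -->
      <value>600</value>
    </property>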
Hi Raihan,
To fetch an element of an Array, you can just specify the position of the
element in your query.
Say you have a table 'test_table' with an Array field 'arr_clmn'; you can
get the first element of the array as
SELECT arr_clmn[0] FROM test_table;
Or if you are looking at exploding the array into one row per element, have
a look at the explode() UDTF.
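A quick sketch, reusing the same hypothetical table and column names from
above. On its own, explode() emits one row per array element; combined with
LATERAL VIEW, the exploded rows stay joined to the source row so the other
columns remain selectable:

    SELECT explode(arr_clmn) FROM test_table;

    SELECT t.elem
    FROM test_table
    LATERAL VIEW explode(arr_clmn) t AS elem;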
Hi Anurag,
How much space is there for the /user and /tmp directories on the client?
Did you check that part? Anything which might stop the move task from
finishing?
---
Sent from Mobile, short and crisp.
On 11-Aug-2012 1:37 PM, "Anurag Tangri" wrote:
> Hi,
> We are facing this issue where we run a Hive job over huge data, about ~6
> TB of input.
We run Hive jobs on 20+ TB of data without any issues.
Thanks,
Vinod
On Sat, Aug 11, 2012 at 9:07 AM, Anurag Tangri wrote:
> Hi,
> We are facing this issue where we run a Hive job over huge data, about ~6
> TB of input.
>
> We run this from the Hive client, and the Hive metastore server is on
> another machine.