We surely can make it bigger. However, there is a subtle problem: I
have run into maximum column length limitations in the metastore with
heavily nested columns (MySQL varchar maximums, etc.). You should open a
JIRA issue on issues.apache.org/jira/hive
Edward
On Wed, Jul 11, 2012 at 5:10 PM, kulkarni.
Can you share your query and use case?
~Aniket
On Tue, Jul 10, 2012 at 9:39 AM, Harsh J wrote:
> This appears to be a Hive issue (something probably called FS.close()
> too early?). Redirecting to the Hive user lists as they can help
> better with this.
>
> On Tue, Jul 10, 2012 at 9:59 PM, 안의건
Hi Shaik
For the correct syntax of the CREATE TABLE statement, please refer to
https://cwiki.apache.org/Hive/languagemanual-ddl.html#LanguageManualDDL-CreateTable
Please try out this command to avoid the syntax error:
Create external table vender(vender string,supplier string,order_date
string,quanti
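A complete statement along these lines might look as follows. This is a hedged sketch: the original statement was truncated, so the final column name (`quantity`), the field delimiter, and the HDFS location are all assumptions, not from the thread.

```sql
-- Hypothetical completion of the truncated statement above.
-- Column list, delimiter, and LOCATION are placeholders.
CREATE EXTERNAL TABLE vender (
  vender     STRING,
  supplier   STRING,
  order_date STRING,
  quantity   INT
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
LOCATION '/user/hadoop/vender_data';
```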
Thanks for the reply, guys.
I have tried the command below:
usr/local/hive-0.9.0# load data local inpath
‘/usr/local/hadoop_dir/hadoop/big_data/vender_details.txt’ into table
vender;
Can't we load the data using the above command from that Hive path?
There is a syntax error, as shown below:
plz c
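One likely cause of the syntax error is the curly quotes (‘…’) around the path in the pasted command; HiveQL requires straight single quotes. A corrected form, assuming the same local file path, would be:

```sql
-- Same command with straight single quotes around the path;
-- the curly quotes in the pasted version would fail to parse.
LOAD DATA LOCAL INPATH '/usr/local/hadoop_dir/hadoop/big_data/vender_details.txt'
INTO TABLE vender;
```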
You can create an external table to make your data visible in Hive.
Sent from my iPhone
On Jul 11, 2012, at 7:39 AM, shaik ahamed wrote:
> Hi All,
>
> I have 100 GB of data in HDFS, and I want to move or copy this 100 GB
> file to the Hive directory or path. How can I achieve th
Hi Shaik
If you already have the data in HDFS, then just create an external table with
that HDFS location; you'll have the data in your Hive table.
Or, if you want a managed table, it is also fine to use a LOAD DATA
statement. It'd be fast as well, since it is an HDFS move operation under the hood.
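The two options above can be sketched like this. The table names, columns, delimiter, and HDFS paths are placeholders, not from the thread:

```sql
-- Option 1: external table over data already in HDFS. Hive does not
-- copy the files; it reads them in place at the given location.
CREATE EXTERNAL TABLE big_data_ext (
  col1 STRING,
  col2 STRING
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
LOCATION '/user/hadoop/big_data';

-- Option 2: managed table. LOAD DATA (without LOCAL) moves the HDFS
-- files into the table's warehouse directory -- a rename, so it is fast.
LOAD DATA INPATH '/user/hadoop/big_data' INTO TABLE big_data_managed;
```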
Try it out using "distcp" command.
Regards,
Mohammad Tariq
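For context, distcp is Hadoop's tool for copying large datasets in parallel within or between HDFS clusters; a sketch of the suggestion, with both paths as placeholders:

```shell
# Parallel MapReduce-based copy; source and destination are placeholders.
hadoop distcp /user/hadoop/big_data /user/hive/warehouse/vender
```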
On Wed, Jul 11, 2012 at 8:09 PM, shaik ahamed wrote:
> Hi All,
>
> I have 100 GB of data in HDFS, and I want to move or copy this 100 GB
> file to the Hive directory or path. How can I achieve this?
>
> Is there any cm
Hi All,
I have 100 GB of data in HDFS, and I want to move or copy this 100 GB
file to the Hive directory or path. How can I achieve this?
Is there any command to run this?
Please provide me a solution where I can load it fast ...
Thanks in advance
Shaik
Hi Xun,
Here is an answer I gave earlier on the mailing list. See if that helps:
http://mail-archives.apache.org/mod_mbox/hive-user/201204.mbox/%3c350967547.114894.1335230199385.javamail.r...@sms-zimbra-message-store-03.sms.scalar.ca%3E
Mark
- Original Message -
From: "Xun TANG"
To