Re: Separators in struct

2012-07-11 Thread Edward Capriolo
We surely can make it bigger. However, there is a subtle problem: I have run into maximum column length limitations in the metastore with heavily nested columns (MySQL VARCHAR maxes out, etc.). You should open a JIRA issue at issues.apache.org/jira/hive. Edward On Wed, Jul 11, 2012 at 5:10 PM, kulkarni.
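The limitation Edward describes arises because the metastore stores a column's full type string, which grows quickly with nesting depth. A hypothetical HiveQL sketch of the kind of table that can trigger it (table and field names are illustrative, not from this thread):

```sql
-- Illustrative only: a heavily nested struct whose serialized type
-- string can exceed the metastore's VARCHAR column limit.
CREATE TABLE nested_example (
  id INT,
  payload STRUCT<
    outer: STRUCT<
      inner: STRUCT<
        a: STRING,
        b: ARRAY<MAP<STRING, STRING>>
      >
    >
  >
);
```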

Re: FileSystem Closed.

2012-07-11 Thread Aniket Mokashi
Can you share your query and use case? ~Aniket On Tue, Jul 10, 2012 at 9:39 AM, Harsh J wrote: > This appears to be a Hive issue (something probably called FS.close() > too early?). Redirecting to the Hive user lists as they can help > better with this. > > On Tue, Jul 10, 2012 at 9:59 PM, 안의건

Re: unable to create external table, please correct the syntax

2012-07-11 Thread Bejoy Ks
Hi Shaik, For the correct syntax of the CREATE TABLE statement, please refer to https://cwiki.apache.org/Hive/languagemanual-ddl.html#LanguageManualDDL-CreateTable  Please try out this command to avoid the syntax error: Create external table vender(vender string,supplier string,order_date string,quanti
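Bejoy's command is cut off in the archive; a sketch of what a complete CREATE EXTERNAL TABLE of this shape might look like (the remaining columns, delimiter, and location are assumptions, not from the original message):

```sql
-- Sketch only: trailing column, delimiter, and LOCATION are assumed,
-- since the original command is truncated in the archive.
CREATE EXTERNAL TABLE vender (
  vender STRING,
  supplier STRING,
  order_date STRING,
  quantity INT
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
LOCATION '/user/hive/external/vender';
```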

unable to create external table, please correct the syntax

2012-07-11 Thread shaik ahamed
Thanks for the reply guys, I have tried using the below cmd: usr/local/hive-0.9.0# load data local inpath '/usr/local/hadoop_dir/hadoop/big_data/vender_details.txt' into table vender; In the above Hive path, can't we load the data using the above cmd? In the below there is a syntax error, please c
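One likely culprit in the original message is the curly quotes around the path (a common artifact of pasting from email); HiveQL needs plain ASCII single quotes. A sketch of the same statement with straight quotes (path taken from the message):

```sql
-- Same LOAD DATA statement with plain single quotes around the path.
LOAD DATA LOCAL INPATH '/usr/local/hadoop_dir/hadoop/big_data/vender_details.txt'
INTO TABLE vender;
```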

Re: hi all

2012-07-11 Thread Mapred Learn
You can create an external table to make your data visible in Hive. Sent from my iPhone On Jul 11, 2012, at 7:39 AM, shaik ahamed wrote: > Hi All, > > As I have data of 100GB in HDFS, I want this 100 GB file to > move or copy to the Hive directory or path; how can I achieve th

Re: hi all

2012-07-11 Thread Bejoy KS
Hi Shaik, If you already have the data in HDFS, then just create an external table with that HDFS location; you'll have the data in your Hive table. Or, if you want a managed table, it is also good to use a LOAD DATA statement. It'd be faster as well, since it is an HDFS move operation unde
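Bejoy's first suggestion, an external table over data already in HDFS, might look like this (table name, column, and path are illustrative assumptions):

```sql
-- Sketch: point an external table at an existing HDFS directory;
-- no data is copied, Hive reads the files in place.
CREATE EXTERNAL TABLE big_data_table (
  line STRING
)
LOCATION '/user/hadoop/big_data';
```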

Re: hi all

2012-07-11 Thread Mohammad Tariq
Try it out using the "distcp" command. Regards, Mohammad Tariq On Wed, Jul 11, 2012 at 8:09 PM, shaik ahamed wrote: > Hi All, > > As I have data of 100GB in HDFS, I want this 100 GB file to > move or copy to the Hive directory or path; how can I achieve this? > > Is there any cm

hi all

2012-07-11 Thread shaik ahamed
Hi All, I have data of 100GB in HDFS, and I want to move or copy this 100 GB file to the Hive directory or path; how can I achieve this? Is there any cmd to run this? Please provide me a solution where I can load fast... Thanks in advance Shaik

Re: How to Decide How Many Buckets to Use?

2012-07-11 Thread Mark Grover
Hi Xun, Here is an answer I gave earlier on the mailing list. See if that helps. http://mail-archives.apache.org/mod_mbox/hive-user/201204.mbox/%3c350967547.114894.1335230199385.javamail.r...@sms-zimbra-message-store-03.sms.scalar.ca%3E Mark - Original Message - From: "Xun TANG" To
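For readers of the archive, a bucketed table in Hive is declared with CLUSTERED BY; a minimal sketch (table name, bucketing column, and bucket count are illustrative, not a recommendation from the thread):

```sql
-- Sketch: the bucket count is typically chosen so each bucket holds
-- roughly one block/split of data; 32 here is only an illustration.
CREATE TABLE user_events (
  user_id BIGINT,
  event STRING
)
CLUSTERED BY (user_id) INTO 32 BUCKETS;
```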