Large text loading into hive table

2012-09-06 Thread shaik ahamed
Hi Users, please help me out with the queries below so I can go further with Hive. 1. How can I load a large text file into a Hive table, and what is the syntax for that? 2. Is it possible in Hive to process unstructured data, i.e. audio, images and smiley symbols, purely unstru
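
For question 1, a minimal sketch of the usual approach, assuming a delimited text file; the table, column and file names below are made up for illustration:

  -- declare a delimiter that matches the file
  CREATE TABLE big_text (col1 STRING, col2 STRING, col3 INT)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
    STORED AS TEXTFILE;

  -- from the local filesystem (copies the file into the warehouse)
  LOAD DATA LOCAL INPATH '/tmp/big_file.txt' INTO TABLE big_text;

  -- or, if the file already sits in HDFS (moves the file instead of copying)
  LOAD DATA INPATH '/user/hadoop/big_file.txt' INTO TABLE big_text;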

Need Unstructured data sample file

2012-08-14 Thread shaik ahamed
Hi Users, I need clarification on the tasks below. 1. What are the unstructured file types in Hive, with examples? 2. What is the cmd to load unstructured (images & audio) files into a Hive table? Please let me know whether it is possible going through with Hive, i
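
For what it's worth, Hive itself has no audio or image column type; a common pattern (sketched below, all paths and names hypothetical) is to keep the binary files in plain HDFS and store only their paths and metadata in a Hive table:

  # put the raw binary files into HDFS as-is
  hadoop fs -mkdir /data/images
  hadoop fs -put ./photo1.jpg /data/images/

  -- track them from Hive via a small metadata table
  CREATE TABLE image_index (file_path STRING, label STRING)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';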

ftp shell command from local machine to hdfs

2012-08-13 Thread shaik ahamed
Hi users, how can I load a file from the local machine into HDFS through an ftp shell command? Is it possible to do this in the Hadoop file system? If so, please tell me the way to do this and the cmd. Thanks, shaik.
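
One way this is commonly handled (a sketch only; host and paths are made up): fetch the file over FTP to the local machine and then copy it in, or stream it straight into HDFS via stdin, since hadoop fs -put accepts '-' as a source:

  # fetch over ftp, then copy into HDFS
  wget ftp://ftp.example.com/exports/vender_details.txt -O /tmp/vender_details.txt
  hadoop fs -put /tmp/vender_details.txt /user/hive/incoming/

  # or stream without keeping a local copy: '-' tells -put to read stdin
  wget -qO- ftp://ftp.example.com/exports/vender_details.txt | \
      hadoop fs -put - /user/hive/incoming/vender_details.txt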

unable to see the file

2012-07-26 Thread shaik ahamed
Hi Users, before creating the table I enabled the below cmds: *set hive.exec.compress.output=true; set io.seqfile.compression.type=BLOCK;* Then I created an external table with the below syntax: *CREATE EXTERNAL TABLE test_data(vender string,supplier string,order_date st
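
A sketch of how those pieces usually fit together (the column list is completed arbitrarily and the location is hypothetical): the two SET commands only affect the output written by queries such as INSERT, not files added with LOAD DATA, and an external table simply reads whatever sits under its LOCATION:

  SET hive.exec.compress.output=true;
  SET io.seqfile.compression.type=BLOCK;

  -- illustrative completion of the truncated DDL; adjust columns/location as needed
  CREATE EXTERNAL TABLE test_data (
      vender     STRING,
      supplier   STRING,
      order_date STRING,
      quantity   INT)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    STORED AS TEXTFILE
    LOCATION '/user/hive/external/test_data';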

Re: cmd to know the buckets

2012-07-18 Thread shaik ahamed
What is the cmd or syntax to view that? Please reply. Regards, shaik. On Wed, Jul 18, 2012 at 3:40 PM, Navis류승우 wrote: > Currently, configuring bucket num per partition is not allowed. > > If you want to know the bucket num of a table, use 'desc extended' or 'desc > formatted
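
Concretely, the commands Navis mentions look like this; the bucket count shows up as a 'Num Buckets' entry in the output (the table name here is hypothetical):

  DESCRIBE EXTENDED vender_bucketed;   -- bucket info appears inside the detailed table info
  DESCRIBE FORMATTED vender_bucketed;  -- easier to read; look for the 'Num Buckets' row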

cmd to know the buckets

2012-07-18 Thread shaik ahamed
Hi users, I would like to know the syntax or the cmd to see the buckets. For example, for partitions we give the below cmd to list the partitions of a table: *show partitions xyz;* (xyz = table name). Please tell me the cmd to view the buckets created. Regards, shai

error while inserting data into hive dynamic partition table

2012-07-17 Thread shaik ahamed
Hi users, I am inserting 1 GB of data into a Hive partitioned table and the job ended with the below error. Below is my vender1gb data: vender string, supplier string, order_date string, quantity int; sample rows: Vendor_1Supplier_1212012-03-06 2763 Vendor_1
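
Since the error text is cut off, only a general sketch is possible; a common cause with dynamic partitions is strict mode or the per-node partition limits, so the usual pattern looks something like this (table and column names follow the schema quoted above; vender_staging is a made-up source table):

  SET hive.exec.dynamic.partition=true;
  SET hive.exec.dynamic.partition.mode=nonstrict;
  SET hive.exec.max.dynamic.partitions=1000;          -- raise if there are many distinct dates
  SET hive.exec.max.dynamic.partitions.pernode=100;

  -- the partition column (order_date) must come last in the SELECT list
  INSERT OVERWRITE TABLE vender_part PARTITION (order_date)
  SELECT vender, supplier, quantity, order_date
  FROM vender_staging;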

Re: unable to view the data in the hive tables

2012-07-16 Thread shaik ahamed
missions on the directory. If you had made any > permission changes recently, reverting those should resolve the issue. > > Regards > Bejoy KS > > Sent from handheld, please excuse typos. > ------ > *From: *shaik ahamed > *Date: *Mon, 16 Jul 2012
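
A quick way to check what Bejoy describes; the warehouse path shown is the default and may differ in your setup:

  hadoop fs -ls /user/hive/warehouse                    # inspect ownership and permissions
  hadoop fs -ls /user/hive/warehouse/vender
  hadoop fs -chmod -R 755 /user/hive/warehouse/vender   # only if you own the path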

unable to view the data in the hive tables

2012-07-15 Thread shaik ahamed
Hi All, please help me move forward with the below error, as I'm unable to see the table data. hive> show tables; Shutting down due to severe error. FAILED: Error in metadata: javax.jdo.JDODataStoreException: Failed to start database 'metastore_db', see the next exception for details. Ne
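
This particular failure usually comes from the default embedded Derby metastore, which allows only one Hive session at a time; a rough way to check (the metastore_db directory lives wherever the hive CLI was started):

  # another hive/derby process may be holding the metastore lock
  ls metastore_db/*.lck
  ps -ef | grep -i hive

  # closing the other session (or removing a stale db.lck only after all
  # hive processes have exited) normally lets the metastore start again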

Re: unable to create external table plz correct the syntax

2012-07-12 Thread shaik ahamed
ing ? > > > On Thu, Jul 12, 2012 at 2:53 PM, shaik ahamed wrote: > >> I tried, Nitin; still I can't. Previously, before sending to the mailing list, I >> tried all the ways and am still getting the below error >> >> hive> load data local inpath >> './usr/local/ha

Re: hive upload shows NULL from hdfs

2012-07-12 Thread shaik ahamed
Hi Yogesh, I too came across this problem; give the syntax as: *load data local inpath './xyz.txt' into table abc;* Also, Yogesh, when creating the Hive table you have to mention the field delimiter used in the file, e.g. comma, tilde or pipe; see below: *create table vender(vender string,supplier string,order_date st
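
To make the point concrete: NULLs typically show up when the table's declared field delimiter does not match the file, so the DDL and the load have to agree (a sketch; names mirror the snippet above):

  -- the file uses commas, so declare a comma delimiter
  CREATE TABLE vender (
      vender     STRING,
      supplier   STRING,
      order_date STRING,
      quantity   INT)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    STORED AS TEXTFILE;

  LOAD DATA LOCAL INPATH './xyz.txt' INTO TABLE vender;

  SELECT * FROM vender LIMIT 5;   -- all-NULL columns here usually mean a delimiter mismatch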

Re: unable to create external table plz correct the syntax

2012-07-12 Thread shaik ahamed
loading data after into > > it shd be load data local inpath > './usr/local/hadoop_dir/hadoop/big_data/vender_details.txt' into > table vender > > > On Thu, Jul 12, 2012 at 1:35 PM, shaik ahamed wrote: > >> Hi bejoy >> >> hive> Create ext

Re: unable to create external table plz correct the syntax

2012-07-12 Thread shaik ahamed
ution of this command the data files will be > moved to the table's location (specified in the previous step). > > Alternatively you can move or copy files within hdfs using hadoop fs copy > commands. > Regards > Bejoy KS > > Sent from handheld, please excuse typos. > ---

Re: unable to create external table plz correct the syntax

2012-07-12 Thread shaik ahamed
lfs to hdfs if the data volume is large (100G) it'll take some time. > > > Regards > > Bejoy KS > > > > > -- > *From:* shaik ahamed > *To:* user@hive.apache.org > *Sent:* Wednesday, July 11, 2012 8:38 PM > *Subject:* unable to create externa

unable to create external table plz correct the syntax

2012-07-11 Thread shaik ahamed
Thanks for the reply, guys. I have tried using the below cmd: usr/local/hive-0.9.0# load data local inpath ‘/usr/local/hadoop_dir/hadoop/big_data/vender_details.txt’ into table vender; Can't we load the data from the above Hive path using the above cmd? In the below there is a syntax error, plz c
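
Two things stand out in the pasted command and are worth sketching a fix for: it appears to be typed at the OS prompt (usr/local/hive-0.9.0#) rather than inside the hive> shell, and the path is wrapped in curly quotes, which Hive will not parse. A corrected version might look like this:

  # start the CLI first, then issue the statement with plain straight quotes
  hive
  hive> LOAD DATA LOCAL INPATH '/usr/local/hadoop_dir/hadoop/big_data/vender_details.txt'
      > INTO TABLE vender;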

hi all

2012-07-11 Thread shaik ahamed
Hi All, I have 100GB of data in HDFS and I want to move or copy this 100 GB file to the Hive directory or path. How can I achieve this? Is there any cmd to run for this? Please provide me a solution where I can load it fast ... Thanks in advance, Shaik
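
Since the data is already in HDFS, no re-upload is needed; a sketch of the two usual options (paths and names hypothetical): LOAD DATA INPATH moves the existing files into the table's directory, or an external table can simply point at where the data already lives:

  -- option 1: move the existing HDFS files into the table (fast; an HDFS rename, not a copy)
  LOAD DATA INPATH '/user/hadoop/big_data/' INTO TABLE vender;

  -- option 2: leave the files where they are and layer an external table on top
  CREATE EXTERNAL TABLE vender_ext (
      vender STRING, supplier STRING, order_date STRING, quantity INT)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/user/hadoop/big_data/';

  -- a plain HDFS copy also works if a duplicate is wanted
  -- hadoop fs -cp /user/hadoop/big_data/* /user/hive/warehouse/vender/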

hi

2012-07-06 Thread shaik ahamed
Hi all, I am trying to insert data into the Hive table and am getting the below error: Total MapReduce jobs = 2 Launching Job 1 out of 2 Number of reduce tasks is set to 0 since there's no reduce operator org.apache.hadoop.ipc.RemoteException: java.io.IOException: org.apache.hadoop.f

Re: hi all

2012-07-06 Thread shaik ahamed
t from handheld, please excuse typos. > ------ > *From: *shaik ahamed > *Date: *Fri, 6 Jul 2012 17:09:26 +0530 > *To: * > *ReplyTo: *user@hive.apache.org > *Subject: *hi all > > *Hi users,* > ** > * As im selecting the distinct column from the vende

hi all

2012-07-06 Thread shaik ahamed
*Hi users,* *I am selecting the distinct column from the vender Hive table* *and I'm getting the below error, plz help me with this:* *hive> select distinct supplier from vender_sample;* Total MapReduce jobs = 1 Launching Job 1 out of 1 Number of reduce tasks not specified. Estimat

Re: hi users

2012-07-05 Thread shaik ahamed
Thanks for your reply, Nitin. On Thu, Jul 5, 2012 at 12:30 PM, Nitin Pawar wrote: > read up on the hadoop dfs fsck command > > > On Thu, Jul 5, 2012 at 12:29 PM, shaik ahamed wrote: > >> Hi Nitin, >> >> How can I check the dfs health? Could u plz guide me through the st
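
For reference, the health checks Nitin is pointing at look roughly like this (exact output varies by Hadoop version):

  hadoop fsck /                    # overall filesystem health, missing/corrupt blocks
  hadoop fsck /user/hive/warehouse -files -blocks -locations
  hadoop dfsadmin -report          # live/dead datanodes and capacity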

Re: hi users

2012-07-04 Thread shaik ahamed
Hi Nitin, how can I check the dfs health? Could u plz guide me through the steps... On Thu, Jul 5, 2012 at 12:23 PM, Nitin Pawar wrote: > can you check dfs health? > > I think a few of your nodes are down > > > On Thu, Jul 5, 2012 at 12:17 PM, shaik ahamed

Re: hi users

2012-07-04 Thread shaik ahamed
this. On Thu, Jul 5, 2012 at 12:23 PM, Nitin Pawar wrote: > can you check dfs health? > > I think few of your nodes are down > > > On Thu, Jul 5, 2012 at 12:17 PM, shaik ahamed wrote: > >> Hi All, >> >> >>Im not able to fetch the d

hi users

2012-07-04 Thread shaik ahamed
Hi All, I'm not able to fetch the data from the Hive table; I'm getting the below error: FAILED: Error in semantic analysis: hive> select * from vender; OK Failed with exception java.io.IOException:java.io.IOException: Could not obtain block: blk_-3328791500929854839_1178 file=/user

Hi

2012-07-04 Thread shaik ahamed
Hi All, can we update the records in a Hive table? If so, please tell me the syntax in Hive. Regards, shaik.
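
For context, Hive 0.9 (current at the time of this thread) has no UPDATE statement; the usual workaround is to rewrite the affected data with INSERT OVERWRITE, roughly like this (table, column and values are hypothetical):

  -- "update" by rewriting the table with the changed values
  INSERT OVERWRITE TABLE vender
  SELECT vender,
         supplier,
         order_date,
         CASE WHEN vender = 'Vendor_1' THEN 9999 ELSE quantity END AS quantity
  FROM vender;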

hi all

2012-06-26 Thread shaik ahamed
Hi Users, I created a Hive table with the below syntax: CREATE EXTERNAL TABLE vender_part(vender string, supplier string, quantity int) PARTITIONED BY (order_date string) row format delimited fields terminated by ',' stored as textfile; and inserted the 100GB of data with the be
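
Since the insert statement is cut off, here is only a sketch of two common ways to fill a table partitioned by order_date (the staging table name and paths are made up): static partitions name the value explicitly, while dynamic partitions derive it from the data (settings for that are shown in the 2012-07-17 thread above):

  -- static: one partition at a time; the partition column is not in the SELECT list
  INSERT OVERWRITE TABLE vender_part PARTITION (order_date='2012-03-06')
  SELECT vender, supplier, quantity
  FROM vender_staging
  WHERE order_date = '2012-03-06';

  -- external partitions can also just be registered against existing HDFS directories
  ALTER TABLE vender_part ADD PARTITION (order_date='2012-03-06')
    LOCATION '/user/hadoop/big_data/order_date=2012-03-06';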

hive buckets

2012-06-25 Thread shaik ahamed
Hi Users, I am new to Hive and need clarity on Hive buckets: are buckets used only with a partitioned Hive table, or can we create buckets for any Hive table? Please give an example with sample data.
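
Buckets are not tied to partitioned tables; any table can be clustered into buckets, and bucketing can also be combined with partitioning. A small sketch with made-up names and data:

  -- a plain (non-partitioned) bucketed table
  CREATE TABLE vender_bucketed (
      vender STRING, supplier STRING, order_date STRING, quantity INT)
    CLUSTERED BY (supplier) INTO 4 BUCKETS
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';

  -- make Hive write one file per bucket when inserting
  SET hive.enforce.bucketing=true;
  INSERT OVERWRITE TABLE vender_bucketed
  SELECT vender, supplier, order_date, quantity FROM vender;

  -- sample source rows would look like: Vendor_1,Supplier_1,2012-03-06,2763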