Re: hi users

2012-07-04 Thread shaik ahamed
Hi Nitin, how can I check the DFS health? Could you please guide me through the steps... On Thu, Jul 5, 2012 at 12:23 PM, Nitin Pawar wrote: > can you check dfs health? > > I think a few of your nodes are down > > On Thu, Jul 5, 2012 at 12:17 PM, shaik ahamed wrote: >> Hi All, >>
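For reference, on Hadoop 1.x-era clusters (as in this thread) DFS health can be checked from the command line; a minimal sketch, assuming the `hadoop` binary is on the PATH and the paths shown are illustrative:

```shell
# Cluster summary: capacity, live and dead datanodes
hadoop dfsadmin -report

# Check a directory tree for missing or corrupt blocks
hadoop fsck /user -files -blocks -locations
```

The NameNode web UI (port 50070 by default on that era of Hadoop) shows the same live/dead datanode counts.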

Re: hi users

2012-07-04 Thread shaik ahamed
Thanks for the reply, guys. Last night I was able to fetch the data. And my second node is down, in the sense that I am not able to connect to the second machine; I have 3 machines, 1 master and 2 slaves, and I cannot connect to the second one. Is this the problem for not retrieving the data, or other than t

Re: hi users

2012-07-04 Thread Mohammad Tariq
Hello shaik, Were you able to fetch the data earlier? I mean, is it happening for the first time, or were you not able to fetch the data even once? Regards, Mohammad Tariq On Thu, Jul 5, 2012 at 12:17 PM, shaik ahamed wrote: > Hi All, > > >Im not able to fetch the

Re: hi users

2012-07-04 Thread Nitin Pawar
Can you check dfs health? I think a few of your nodes are down. On Thu, Jul 5, 2012 at 12:17 PM, shaik ahamed wrote: > Hi All, > > >Im not able to fetch the data from the hive table ,getting > the below error > >FAILED: Error in semantic analysis: > > hive> select * from vender

hi users

2012-07-04 Thread shaik ahamed
Hi All, I'm not able to fetch the data from the hive table; I'm getting the below error: FAILED: Error in semantic analysis: hive> select * from vender; OK Failed with exception java.io.IOException:java.io.IOException: Could not obtain block: blk_-3328791500929854839_1178 file=/user
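A `Could not obtain block` IOException generally means the HDFS blocks backing the table's file are unreachable (for example, because datanodes are down). A hedged sketch of how to confirm that, assuming the table's data lives under the default warehouse path:

```shell
# Report missing/corrupt blocks under the table's directory
# (the warehouse path for the "vender" table is an assumption)
hadoop fsck /user/hive/warehouse/vender -files -blocks -locations
```

If fsck reports the block as MISSING and a datanode is down, bringing the datanode back (or restoring replication) is the fix; the Hive query itself is fine.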

Re: Hive upload

2012-07-04 Thread Ruslan Al-Fakikh
Hi, Regarding the Sqoop import: I noticed you wrote -table instead of --table (1 dash instead of 2). Ruslan On Wed, Jul 4, 2012 at 12:41 PM, Bejoy Ks wrote: > Hi Yogesh > > To add on, looks like the table definition doesn't match with data as well. > > Your table defn has 4 columns defined, with
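For comparison, a minimal Sqoop import invocation with the correctly spelled two-dash long options; the connection string, credentials, and paths here are placeholders, not values from the thread:

```shell
sqoop import \
  --connect jdbc:mysql://dbhost/mydb \
  --username myuser -P \
  --table letstry \
  --target-dir /user/yogesh/letstry
```

Sqoop's long options all take two dashes; a single dash silently changes how the argument is parsed, which is why the import misbehaves rather than failing with a clear error.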

Re: Hi

2012-07-04 Thread Bejoy KS
Hi Shaik Updates are not supported in Hive. Still, you can accomplish an update by overwriting either a whole table or a partition. In short, updates are not directly supported in Hive, and doing them indirectly is really expensive as well. Regards Bejoy KS Sent from handheld, please excuse typos. -Or
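The overwrite-based workaround Bejoy describes can be sketched in HiveQL; the table name, columns, and the condition are illustrative, not from the thread:

```sql
-- Rewrite the whole table, applying the "update" inside the SELECT
INSERT OVERWRITE TABLE vender
SELECT id,
       CASE WHEN id = 42 THEN 'new_name' ELSE name END AS name
FROM vender;
```

Every row is rewritten, which is why this is expensive; partitioned tables let you overwrite only the affected partition instead.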

Hi

2012-07-04 Thread shaik ahamed
Hi All, Can we update the records in a Hive table? If so, please tell me the syntax in Hive. Regards, shaik.

Re: Hive upload

2012-07-04 Thread Bejoy Ks
Hi Yogesh To add on, it looks like the table definition doesn't match the data either. Your table defn has 4 columns defined, with the 4th column as int: describe formatted letstry; OK # col_name data_type comment rollno int
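A common cause of such a mismatch is that the table's field delimiter doesn't match the imported file. A sketch of a definition with an explicit delimiter; the column names beyond `rollno` and the comma delimiter are assumptions, since the thread only shows the first column:

```sql
CREATE TABLE letstry (
  rollno INT,
  name   STRING,
  city   STRING,
  marks  INT
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE;
```

If the delimiter is wrong, Hive parses each line as one field and the remaining columns come back NULL, which looks like a "definition doesn't match data" problem.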

Re: Hive upload

2012-07-04 Thread Bejoy Ks
Hi Yogesh It looks like the Sqoop import from the RDBMS to HDFS is succeeding but is failing at the Hive create-table step. You are seeing data in the hive warehouse because you specified that as your target dir in the sqoop import (--target-dir /user/hive/warehouse/new). It is recommended to use a different
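Bejoy's recommendation can be sketched as a Sqoop invocation that lets Sqoop create and load the Hive table itself, rather than writing directly into the warehouse directory; connection details and names are placeholders:

```shell
sqoop import \
  --connect jdbc:mysql://dbhost/mydb \
  --username myuser -P \
  --table newhive \
  --hive-import \
  --hive-table newhive
```

With `--hive-import`, Sqoop stages the data in a temporary HDFS directory and then issues the CREATE TABLE and LOAD for you, so the table shows up in `show tables` instead of being an orphaned directory under the warehouse.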

RE: Hive upload

2012-07-04 Thread yogesh dhari
Hi Bejoy, Thank you very much for your response. 1) A) When I run the command show tables, it doesn't show the newhive table. B) Yes, the newhive directory is present in /user/hive/warehouse and also contains the values imported from the RDBMS. Please suggest and give me an example for the sqoop i

Hive 0.8.0 - Add partitions programmatically to non-default db table

2012-07-04 Thread Priya Cheryl Sebastian
I am not using the default database in Hive, and I need to add partitions programmatically. So I've tried the following via the JDBC thrift client, but nothing seems to work; it seems to only look at the default database. Any help appreciated. 1) ALTER TABLE test.webstat add if not exists par
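One approach on Hive 0.8 is to issue a `USE` statement on the same connection before the ALTER, since each statement otherwise runs against the default database. A sketch under that assumption; the partition column and value are illustrative:

```sql
USE test;
ALTER TABLE webstat ADD IF NOT EXISTS PARTITION (dt='2012-07-04');
```

Both statements must go through the same session (the same JDBC/Thrift connection), because `USE` only changes the current database for that session.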