Hi Nitin,
How can I check the DFS health? Could you please guide me through the steps?
On Thu, Jul 5, 2012 at 12:23 PM, Nitin Pawar wrote:
> Can you check DFS health?
>
> I think a few of your nodes are down.
>
>
> On Thu, Jul 5, 2012 at 12:17 PM, shaik ahamed wrote:
>
>> Hi All,
>>
>>
>>
Thanks for the reply guys,
Yesterday night I was able to fetch the data. And my second node is down, in the
sense that I am not able to connect to the second machine; I have 3 machines, 1
master and 2 slaves, and it is the second one I cannot connect to. Is this the
problem behind not retrieving the data, or is it something else?
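If only the DataNode process on that slave has died (rather than the whole box being
unreachable), a quick check-and-restart is usually enough. A sketch, assuming a standard
Hadoop 1.x layout and that the unreachable host is 'slave2' (both names are placeholders):

$ ssh slave2
$ jps                                                 # DataNode (and TaskTracker) should be listed
$ $HADOOP_HOME/bin/hadoop-daemon.sh start datanode    # restart it if it is missing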
Hello shaik,
Were you able to fetch the data earlier? I mean, is this
happening for the first time, or were you never able to fetch the data
even once?
Regards,
Mohammad Tariq
On Thu, Jul 5, 2012 at 12:17 PM, shaik ahamed wrote:
> Hi All,
>
>
> I'm not able to fetch the data from the Hive table.
Can you check DFS health?
I think a few of your nodes are down.
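For example, assuming a standard Hadoop 1.x setup and running as the HDFS user on the
master, these two commands give a quick health summary:

$ hadoop dfsadmin -report    # live/dead DataNodes, per-node capacity and usage
$ hadoop fsck /              # walks the namespace and reports missing or corrupt blocks

The NameNode web UI (port 50070 by default) shows the same live/dead node counts.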
On Thu, Jul 5, 2012 at 12:17 PM, shaik ahamed wrote:
> Hi All,
>
>
> I'm not able to fetch the data from the Hive table; I'm getting
> the below error:
>
>FAILED: Error in semantic analysis:
>
> hive> select * from vender
Hi All,
I'm not able to fetch the data from the Hive table; I'm getting
the below error:
FAILED: Error in semantic analysis:
hive> select * from vender;
OK
Failed with exception java.io.IOException:java.io.IOException: Could not
obtain block: blk_-3328791500929854839_1178
file=/user
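That "Could not obtain block" error usually means no live DataNode is serving a replica of
that block, which fits the down node described above. One way to confirm which file and
blocks are affected (the path here is only a guess at the default warehouse location for
that table):

$ hadoop fsck /user/hive/warehouse/vender -files -blocks -locations

If the block only lives on the dead node, bringing that DataNode back should make the data
readable again.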
Hi,
Regarding the sqoop import: I noticed you wrote -table instead of
--table (1 dash instead of 2)
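For example, with the double dash (the connection string, user, and table name here are
placeholders):

sqoop import --connect jdbc:mysql://dbhost/testdb --username dbuser -P --table letstry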
Ruslan
On Wed, Jul 4, 2012 at 12:41 PM, Bejoy Ks wrote:
> Hi Yogesh
>
> To add on, it looks like the table definition doesn't match the data either.
>
> Your table definition has 4 columns defined, with the 4th column as int.
Hi Shaik
Updates are not supported in Hive. Still, you can accomplish updates by
overwriting either a whole table or a partition.
In short, updates are not directly supported in Hive, and doing them
indirectly is really expensive as well.
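For example, a row-level update can be emulated by rewriting the whole table with the
changed values. A sketch only; the table and columns below (vender with id, name, status)
are assumptions, not your real schema:

-- hypothetical: change the status of id 100, keep every other row unchanged
INSERT OVERWRITE TABLE vender
SELECT id,
       name,
       CASE WHEN id = 100 THEN 'INACTIVE' ELSE status END AS status
FROM vender;

For a partitioned table you would overwrite only the affected partition
(INSERT OVERWRITE TABLE ... PARTITION (...) SELECT ...), which is much cheaper than
rewriting the whole table.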
Regards
Bejoy KS
Sent from handheld, please excuse typos.
----- Original message -----
Hi All,
Can we update the records in the Hive table? If so, please tell me
the syntax in Hive.
Regards,
shaik.
Hi Yogesh
To add on, it looks like the table definition doesn't match the data either.
Your table definition has 4 columns defined, with the 4th column as int:
describe formatted letstry;
OK
# col_name data_type comment
rollno int
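For comparison, a definition whose column types and field delimiter actually match a
comma-separated input file might look like the sketch below; only rollno comes from the
describe output above, the remaining column names and the delimiter are assumptions:

CREATE TABLE letstry (
  rollno INT,
  name   STRING,
  city   STRING,
  marks  INT)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ',';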
Hi Yogesh
It looks like the Sqoop import from the RDBMS to HDFS is succeeding, but it is failing
at the Hive create-table step. You are seeing data in the Hive warehouse because you have
specified that as your target dir in the Sqoop import (--target-dir
/user/hive/warehouse/new). It is recommended to use a different target directory.
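A sketch of that kind of import; the connection string, credentials, and staging path are
placeholders, not taken from your setup:

sqoop import \
  --connect jdbc:mysql://dbhost/testdb \
  --username dbuser -P \
  --table new \
  --target-dir /user/yogesh/staging/new \
  --hive-import \
  --hive-table newhive

With --hive-import, Sqoop stages the files under the target dir and then creates and loads
the Hive table itself, so newhive shows up in 'show tables' instead of only as a directory
under /user/hive/warehouse.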
Hi Bejoy,
Thank you very much for your response,
1)
A) When I run the command 'show tables' it doesn't show the newhive table. B) Yes, the
newhive directory is present in /user/hive/warehouse and it also contains the
values imported from the RDBMS.
Please suggest and give me an example for the sqoop import.
I am not using the default database in Hive, and I need to add partitions
programmatically. So I've tried the following via the JDBC thrift client, but
nothing seems to work; it seems to only look at the default database. Any help
appreciated.
1) ALTER TABLE test.webstat add if not exists par
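If qualifying the table as test.webstat is what the thrift server is choking on, it might
be worth switching the database on the same session first and then running the ALTER
unqualified, as two separate execute() calls. The partition column and location below are
made up for illustration:

USE test;
ALTER TABLE webstat ADD IF NOT EXISTS
  PARTITION (dt='2012-07-05')
  LOCATION '/user/hive/external/webstat/dt=2012-07-05';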