Thanks Mafish.
Can you please point me to which config needs to be set correctly?
Amlan
On Mon, Feb 21, 2011 at 12:45 PM, Mafish Liu wrote:
> It seems you did not configure your HDFS properly.
>
> "Caused by: java.lang.IllegalArgumentException: Wrong FS:
> hdfs://192.168.1.22:54310/tmp/hive-hadoop/h
It seems you did not configure your HDFS properly.
"Caused by: java.lang.IllegalArgumentException: Wrong FS:
hdfs://192.168.1.22:54310/tmp/hive-hadoop/hive_2011-02-21_12-09-42_678_6107747797061030113,
expected: hdfs://amlan-laptop.local:54310 "
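The exception shows Hive building its scratch path against hdfs://192.168.1.22:54310 while the filesystem expects hdfs://amlan-laptop.local:54310, i.e. the authority in fs.default.name does not match what the client is using. A minimal sketch of the relevant core-site.xml, assuming the Hadoop 0.20-era property name; the key point is that every node and every client must use the exact same authority (all the hostname, or all the IP):

```xml
<configuration>
  <!-- Must be identical on every node and in the client's config;
       mixing the IP and the hostname produces the "Wrong FS" error. -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://amlan-laptop.local:54310</value>
  </property>
</configuration>
```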
2011/2/21 Amlan Mandal :
> To give more context my mul
To give more context, my multinode Hadoop is working fine; the fs.default.name
and mapred.job.tracker settings are correct.
I can submit jobs to my multinode Hadoop and see the output. (One node runs the
namenode, datanode, jobtracker, and tasktracker; the other runs a tasktracker
and datanode.)
On Mon, Feb 21
Earlier I had Hive running on a single-node Hadoop cluster, which was working
fine. Now I have made it a 2-node Hadoop cluster. When I run Hive from the CLI
I get the following error:
java.lang.RuntimeException: Error while making MR scratch directory - check
filesystem config (null)
at org.apache.hadoop.hive.ql
Hi all,
I am using UDFRowSequence as follows:
CREATE TEMPORARY FUNCTION rowSequence AS
'org.apache.hadoop.hive.contrib.udf.UDFRowSequence';
SET mapred.reduce.tasks=1;
CREATE TABLE temp_tc1_test
as
SELECT
rowSequence() AS id,
data_resource_id,
local_id,
local_parent_id,
name,
author
FROM n
Hi Tony,
We're still working on the index implementation in Hive, so index
support is very limited. When you use CREATE INDEX in Hive, you must
specify the index type. Currently, the only built-in index type is the
Compact index, though we are working to add bitmap indexes and others.
Suppose you
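As a concrete illustration of the point above, a minimal compact-index sketch (the table and column names here are hypothetical; the handler class named in AS is Hive's built-in compact index handler):

```sql
-- The index type is mandatory in CREATE INDEX; here it is the
-- compact index handler, the only built-in type at this point.
CREATE INDEX idx_sales_date
ON TABLE sales (sale_date)
AS 'org.apache.hadoop.hive.ql.index.compact.CompactIndexHandler'
WITH DEFERRED REBUILD;

-- With DEFERRED REBUILD, the index is empty until rebuilt explicitly.
ALTER INDEX idx_sales_date ON sales REBUILD;
```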
It works perfectly after changing UNION to UNION ALL. Thanks to all.
2011/2/19 Jov
> if you want UNION, you should do it like this:
>
> select distinct ... from
> subquery1
> union all
> subquery2
>
> so, UNION = UNION DISTINCT
> On 2011-2-19 at 12:11 AM, "sangeetha s" wrote:
>
> > Hi,
> >
> > T
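Jov's point — that Hive supports only UNION ALL, so a distinct UNION is written as SELECT DISTINCT over a UNION ALL subquery — can be sketched with hypothetical tables:

```sql
-- Hive has no plain UNION; emulate UNION (distinct) by deduplicating
-- a UNION ALL. Table and column names here are hypothetical.
SELECT DISTINCT id, name
FROM (
  SELECT id, name FROM table_a
  UNION ALL
  SELECT id, name FROM table_b
) u;
```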
I ran into some problems with this; maybe you can help me out.
I have aux jars, and in them I have a custom Writable object.
I put my jars in auxlib. Using Hive interactive mode it works perfectly, but
using TOAD for Hive the jobs fail; looking in the jobtracker I see that my
custom writable class c
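When jobs are submitted from outside the interactive CLI, the auxlib directory is not always picked up automatically, so the jars may need to be declared explicitly via hive.aux.jars.path so they are shipped to the MapReduce jobs. A sketch of that setting in hive-site.xml, with a hypothetical jar path:

```xml
<property>
  <!-- Comma-separated list of jars made available to Hive's
       MapReduce jobs; the path below is a hypothetical example. -->
  <name>hive.aux.jars.path</name>
  <value>file:///opt/hive/auxlib/custom-writables.jar</value>
</property>
```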