Hi,

Can you please help me solve the problem below? I have spent two days on
it and still cannot solve it.

Dynamic-partition insert is not working as expected.



To Test Dynamic-partition Insert

Create source table
------------------
CREATE EXTERNAL TABLE testmove (
    a  string,
    b string
  )
  PARTITIONED BY (cust string, dt string);
Data has been placed in /usr/hive/warehouse/testmove/cust=a/dt=20100102/a.txt
a.txt has one row with the values "a", "b"
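Note: because a.txt was copied into the partition directory by hand rather
than loaded through Hive, the metastore may not know that partition exists
yet. A sketch of registering it (table and partition names as above; MSCK
is the bulk alternative, if the Hive version supports it):

```sql
-- Register the manually created partition with the metastore.
ALTER TABLE testmove ADD PARTITION (cust='a', dt='20100102');
-- Or discover every partition directory under the table at once:
-- MSCK REPAIR TABLE testmove;
```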

Create Destination table
-----------------------
CREATE EXTERNAL TABLE testmove1 (
    a  string,
    b string
  )
  PARTITIONED BY (cust string, dt string);

Run the queries for dynamic-partition insert
--------------------------------------------
set hive.exec.dynamic.partition=true;
set hive.exec.dynamic.partition.mode=nonstrict;

FROM testmove t
    INSERT OVERWRITE TABLE testmove1 PARTITION (cust, dt)
        SELECT t.a, t.b, 'a', '20100102';

FROM testmove t
    INSERT OVERWRITE TABLE testmove1 PARTITION (cust='a', dt)
        SELECT t.a, t.b, t.dt;



    INSERT OVERWRITE TABLE testmove1 PARTITION (cust, dt)
    SELECT * FROM (
        SELECT a, b, cust, dt FROM testmove DISTRIBUTE BY cust, dt
    ) X;

Output
------
Total MapReduce jobs = 2
Launching Job 1 out of 2
Number of reduce tasks is set to 0 since there's no reduce operator
Execution log at:
/tmp/root/root_20101103170404_9e869676-7bb5-4655-b027-5bcb4b7fa2cb.log
Job running in-process (local Hadoop)
2010-11-03 17:04:06,818 null map = 100%,  reduce = 0%
Ended Job = job_local_0001
Ended Job = -645725555, job is filtered out (removed at runtime).
Moving data to:
file:/tmp/hive-root/hive_2010-11-03_17-03-59_979_5901061386316364507/-ext-10000
Loading data to table testmove1 partition (cust=null, dt=null)
[Warning] could not update stats.
OK

If I run it as a static-partition insert, the data is inserted into the
destination table:

FROM testmove t
    INSERT OVERWRITE TABLE testmove1 PARTITION (cust='a', dt='20100102')
        SELECT t.a, t.b;
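For completeness, a quick way to see what actually got written is to list
each table's registered partitions (a sketch, same table names as above):

```sql
-- Compare the partitions Hive knows about for source and destination.
SHOW PARTITIONS testmove;
SHOW PARTITIONS testmove1;
```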

On Tue, Nov 9, 2010 at 2:00 PM, yongqiang he <heyongqiang...@gmail.com>wrote:

> Maybe you can try to run "export HIVE_AUX_JARS_PATH=jar paths" before
> starting hive.
>
> Yongqiang
>
> On Mon, Nov 8, 2010 at 10:03 PM, Stuart Smith <stu24m...@yahoo.com> wrote:
>
>>
>> Hello Ted,
>>
>>   Yes, I saw that mail when I googled. I'm pretty sure I didn't have the
>> same problem. I used the stable release tarball, which only has one bin
>> dir, and I ran the executable from there.
>>
>> Adding $CLASSPATH back onto HADOOP_CLASSPATH worked for me.
>> Was my solution misleading? I wouldn't want to confuse people, but it did
>> work once I added that back on.
>>
>> Thanks!
>>
>> Take care,
>>   -stu
>>
>>
>> --- On Tue, 11/9/10, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>>
>> From: Ted Yu <yuzhih...@gmail.com>
>> Subject: Re: Hive Getting Started Wiki assumes $CLASSPATH at end of
>> HADOOP_CLASSPATH
>> To: user@hive.apache.org
>> Date: Tuesday, November 9, 2010, 12:45 AM
>>
>>
>> Please see Edward's reply to 'Exception in hive startup' on Oct 13th.
>> Try running with <install-dir>/bin/hive
>>
>> On Mon, Nov 8, 2010 at 7:02 PM, Stuart Smith <stu24m...@yahoo.com> wrote:
>>
>> Hello,
>>
>>   I'm just starting with hive, and I ran into a newbie problem that didn't
>> have a solution via google. So I thought I'd record the solution for
>> posterity (and other hapless newbies) :)
>>
>> I've been using hadoop/hbase for a while, and have configured
>> hadoop-env.sh a bit here and there (to work with hbase, etc). At some point,
>> I dropped the $CLASSPATH off the end of the standard line:
>>
>> export
>> HADOOP_CLASSPATH=/home/stu/hbase/hbase-0.20.6.jar:/home/stu/hbase/hbase-0.20.6-test.jar:/home/stu/hbase/conf:/home/stu/hbase/lib/zookeeper-3.2.2.jar:$CLASSPATH
>>
>> So it became:
>>
>> # Extra Java CLASSPATH elements.  Optional.
>> export
>> HADOOP_CLASSPATH=/home/stu/hbase/hbase-0.20.6.jar:/home/stu/hbase/hbase-0.20.6-test.jar:/home/stu/hbase/conf:/home/stu/hbase/lib/zookeeper-3.2.2.jar
>>
>> (probably when I added the hbase stuff or something). My hadoop/hbase set
>> up runs fine, so I never noticed.
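A minimal standalone sketch of the failure mode described above (the jar
paths are the illustrative ones from this message, not anything canonical):
whatever was in $CLASSPATH has to stay at the end of HADOOP_CLASSPATH, or
tools like hive that publish their jars through CLASSPATH lose them.

```shell
# Sketch: keep the inherited $CLASSPATH at the end of HADOOP_CLASSPATH.
# Paths below are illustrative, taken from the hadoop-env.sh line above.
CLASSPATH=${CLASSPATH:-/home/stu/hive-0.6.0/lib/hive-cli-0.6.0.jar}
export HADOOP_CLASSPATH=/home/stu/hbase/hbase-0.20.6.jar:/home/stu/hbase/conf:$CLASSPATH
# The last entry is still the inherited classpath:
echo "${HADOOP_CLASSPATH##*:}"
```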
>>
>> Well, if you do that, and you try to run the hive shell, you get the:
>>
>> s...@ubuntu-update:~/hive-0.6.0/bin/ext$
>> /home/stu/hadoop-0.20.2/bin/hadoop jar
>> /home/stu/hive-0.6.0/lib/hive-cli-0.6.0.jar
>> org.apache.hadoop.hive.cli.CliDriver
>> Exception in thread "main" java.lang.NoClassDefFoundError:
>> org/apache/hadoop/hive/conf/HiveConf
>>        at java.lang.Class.forName0(Native Method)
>>        at java.lang.Class.forName(Class.java:247)
>>        at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>> Caused by: java.lang.ClassNotFoundException:
>> org.apache.hadoop.hive.conf.HiveConf
>>        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>        at java.security.AccessController.doPrivileged(Native Method)
>>        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>        at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>>        at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>>        ... 3 more
>>
>> error, even if you've followed the wiki correctly and set HADOOP_HOME and
>> HIVE_HOME correctly. Note the command line above is a little strange,
>> because I was debugging through the $HIVE_HOME/bin/hive script... (So I
>> printed out the classpath it was forming, set it by hand, ran the
>> instructions by hand, etc).
>>
>> This was installed from the hive tarball (stable), but that doesn't matter.
>>
>> Anyways, hope the answer helps someone..
>>
>> Best,
>>  -stu
>>
