I also ran into the same problem, so if you know how to solve it, please
tell me. Thanks.

On December 21, 2011 at 6:00 PM, 王锋 <wfeng1...@163.com> wrote:
> I just created a table named fool. Its data is:
> sellerId consumerId
> 1000,4000
> 1000,4001
> 1002,4002
> 1003,4000
> 1004,4001
> 1005,4002
> 1000,4000
>
> 8 rows in total.
>
> Then I executed the SQL "select consumerId from fool".
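
(For reference, a minimal HiveQL sketch of how such a table could be set up.
The column names, types, delimiter, and load path here are my assumptions,
pieced together from the sample data above and the input file the map task
reads in the log below; they are not the original DDL.)

    -- hypothetical DDL; the real table may differ
    CREATE TABLE fool (sellerId STRING, consumerId STRING)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';
    -- assumed load path, matching the file shown in the map task log
    LOAD DATA INPATH '/user/hadoop/wftest/test.txt' INTO TABLE fool;
    SELECT consumerId FROM fool;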
>
> While the job was running, I looked at one map task's log through the web
> UI at http://ip:50030/.
>
> The syslog output:
>
> 2011-12-21 17:57:57,013 WARN org.apache.hadoop.conf.Configuration:
> /analyser/hdfs/mapred/local/taskTracker/hadoop/jobcache/job_201112161932_0283/job.xml:a
> attempt to override final parameter: mapred.hosts.exclude;  Ignoring.
> 2011-12-21 17:57:57,074 WARN org.apache.hadoop.util.NativeCodeLoader: Unable
> to load native-hadoop library for your platform... using builtin-java
> classes where applicable
> 2011-12-21 17:57:57,208 INFO org.apache.hadoop.mapred.TaskRunner: Creating
> symlink:
> /analyser/hdfs/mapred/local/taskTracker/distcache/-5775579569808818942_-277615576_1611537495/h-master/tmp/hive-hadoop/hive_2011-12-21_17-57-43_970_69237304724972937/-mr-10003/9bc47070-2f7c-4ad0-8a16-d5687fad2a11
> <-
> /analyser/hdfs/mapred/local/taskTracker/hadoop/jobcache/job_201112161932_0283/attempt_201112161932_0283_m_000000_0/work/HIVE_PLAN9bc47070-2f7c-4ad0-8a16-d5687fad2a11
> 2011-12-21 17:57:57,214 INFO
> org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating
> symlink:
> /analyser/hdfs/mapred/local/taskTracker/hadoop/jobcache/job_201112161932_0283/jars/.job.jar.crc
> <-
> /analyser/hdfs/mapred/local/taskTracker/hadoop/jobcache/job_201112161932_0283/attempt_201112161932_0283_m_000000_0/work/.job.jar.crc
> 2011-12-21 17:57:57,215 INFO
> org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating
> symlink:
> /analyser/hdfs/mapred/local/taskTracker/hadoop/jobcache/job_201112161932_0283/jars/job.jar
> <-
> /analyser/hdfs/mapred/local/taskTracker/hadoop/jobcache/job_201112161932_0283/attempt_201112161932_0283_m_000000_0/work/job.jar
> 2011-12-21 17:57:57,291 INFO org.apache.hadoop.metrics.jvm.JvmMetrics:
> Initializing JVM Metrics with processName=MAP, sessionId=
> 2011-12-21 17:57:57,368 WARN org.apache.hadoop.conf.Configuration:
> /analyser/hdfs/mapred/local/taskTracker/hadoop/jobcache/job_201112161932_0283/job.xml:a
> attempt to override final parameter: mapred.hosts.exclude;  Ignoring.
> 2011-12-21 17:57:57,370 WARN org.apache.hadoop.conf.Configuration:
> /analyser/hdfs/mapred/local/taskTracker/hadoop/jobcache/job_201112161932_0283/job.xml:a
> attempt to override final parameter: dfs.hosts.exclude;  Ignoring.
> 2011-12-21 17:57:57,374 WARN org.apache.hadoop.conf.Configuration:
> /analyser/hdfs/mapred/local/taskTracker/hadoop/jobcache/job_201112161932_0283/job.xml:a
> attempt to override final parameter: dfs.name.dir;  Ignoring.
> 2011-12-21 17:57:57,614 INFO com.hadoop.compression.lzo.GPLNativeCodeLoader:
> Loaded native gpl library
> 2011-12-21 17:57:57,617 INFO com.hadoop.compression.lzo.LzoCodec:
> Successfully loaded & initialized native-lzo library [hadoop-lzo rev
> 4537f94556c9ba71ffca316514e0c0101f76b63b]
> 2011-12-21 17:57:57,632 INFO org.apache.hadoop.mapred.MapTask:
> numReduceTasks: 0
> 2011-12-21 17:57:57,639 INFO ExecMapper: maximum memory = 514523136
> 2011-12-21 17:57:57,640 INFO ExecMapper: conf classpath =
> [file:/usr/home/hadoop/hadoop-0.20.2/conf/,
> file:/usr/java/jdk1.6.0_23/lib/tools.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/,
> file:/usr/home/hadoop/hadoop-0.20.2/hadoop-core-0.20.2-CDH3B4.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/aspectjrt-1.6.5.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/aspectjtools-1.6.5.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/commons-cli-1.2.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/commons-codec-1.4.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/commons-daemon-1.0.1.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/commons-el-1.0.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/commons-httpclient-3.0.1.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/commons-logging-1.0.4.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/commons-logging-api-1.0.4.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/commons-net-1.4.1.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/core-3.1.1.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/hadoop-fairscheduler-0.20.2-CDH3B4.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/hadoop-lzo-0.4.12.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/hsqldb-1.8.0.10.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jackson-core-asl-1.5.2.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jackson-mapper-asl-1.5.2.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jasper-compiler-5.5.12.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jasper-runtime-5.5.12.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jets3t-0.6.1.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jetty-6.1.26.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jetty-servlet-tester-6.1.26.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jetty-util-6.1.26.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jsch-0.1.42.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/junit-4.5.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/kfs-0.2.2.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/log4j-1.2.15.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/mockito-all-1.8.2.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/oro-2.0.8.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/servlet-api-2.5-20081211.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/servlet-api-2.5-6.1.14.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/slf4j-api-1.4.3.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/slf4j-log4j12-1.4.3.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/xmlenc-0.52.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jsp-2.1/jsp-2.1.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jsp-2.1/jsp-api-2.1.jar,
> file:/data1/analyser/hdfs/mapred/local/taskTracker/hadoop/jobcache/job_201112161932_0283/jars/classes,
> file:/data1/analyser/hdfs/mapred/local/taskTracker/hadoop/jobcache/job_201112161932_0283/jars/job.jar,
> file:/data1/analyser/hdfs/mapred/local/taskTracker/hadoop/distcache/4011691731310986177_1050235208_1611538248/h-master/analyser/hdfs/mapred/staging/hadoop/.staging/job_201112161932_0283/libjars/logfunc.jar,
> file:/data1/analyser/hdfs/mapred/local/taskTracker/distcache/5733595343670413510_2108507888_486173279/h-master/user/hive/hbase-0.90.3-cdh3u1.jar,
> file:/data1/analyser/hdfs/mapred/local/taskTracker/distcache/5214408008425213222_-774945002_486208112/h-master/user/hive/hive-hbase-handler-0.7.1-cdh3u1.jar,
> file:/data1/analyser/hdfs/mapred/local/taskTracker/distcache/9089248915585406031_-644048472_1609575637/h-master/usr/lib/hive/lib/hive-contrib-0.7.1-cdh3u1.jar,
> file:/data1/analyser/hdfs/mapred/local/taskTracker/hadoop/jobcache/job_201112161932_0283/attempt_201112161932_0283_m_000000_0/work/]
> 2011-12-21 17:57:57,640 INFO ExecMapper: thread classpath =
> [file:/usr/home/hadoop/hadoop-0.20.2/conf/,
> file:/usr/java/jdk1.6.0_23/lib/tools.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/,
> file:/usr/home/hadoop/hadoop-0.20.2/hadoop-core-0.20.2-CDH3B4.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/aspectjrt-1.6.5.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/aspectjtools-1.6.5.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/commons-cli-1.2.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/commons-codec-1.4.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/commons-daemon-1.0.1.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/commons-el-1.0.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/commons-httpclient-3.0.1.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/commons-logging-1.0.4.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/commons-logging-api-1.0.4.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/commons-net-1.4.1.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/core-3.1.1.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/hadoop-fairscheduler-0.20.2-CDH3B4.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/hadoop-lzo-0.4.12.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/hsqldb-1.8.0.10.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jackson-core-asl-1.5.2.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jackson-mapper-asl-1.5.2.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jasper-compiler-5.5.12.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jasper-runtime-5.5.12.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jets3t-0.6.1.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jetty-6.1.26.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jetty-servlet-tester-6.1.26.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jetty-util-6.1.26.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jsch-0.1.42.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/junit-4.5.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/kfs-0.2.2.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/log4j-1.2.15.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/mockito-all-1.8.2.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/oro-2.0.8.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/servlet-api-2.5-20081211.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/servlet-api-2.5-6.1.14.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/slf4j-api-1.4.3.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/slf4j-log4j12-1.4.3.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/xmlenc-0.52.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jsp-2.1/jsp-2.1.jar,
> file:/usr/home/hadoop/hadoop-0.20.2/lib/jsp-2.1/jsp-api-2.1.jar,
> file:/data1/analyser/hdfs/mapred/local/taskTracker/hadoop/jobcache/job_201112161932_0283/jars/classes,
> file:/data1/analyser/hdfs/mapred/local/taskTracker/hadoop/jobcache/job_201112161932_0283/jars/job.jar,
> file:/data1/analyser/hdfs/mapred/local/taskTracker/hadoop/distcache/4011691731310986177_1050235208_1611538248/h-master/analyser/hdfs/mapred/staging/hadoop/.staging/job_201112161932_0283/libjars/logfunc.jar,
> file:/data1/analyser/hdfs/mapred/local/taskTracker/distcache/5733595343670413510_2108507888_486173279/h-master/user/hive/hbase-0.90.3-cdh3u1.jar,
> file:/data1/analyser/hdfs/mapred/local/taskTracker/distcache/5214408008425213222_-774945002_486208112/h-master/user/hive/hive-hbase-handler-0.7.1-cdh3u1.jar,
> file:/data1/analyser/hdfs/mapred/local/taskTracker/distcache/9089248915585406031_-644048472_1609575637/h-master/usr/lib/hive/lib/hive-contrib-0.7.1-cdh3u1.jar,
> file:/data1/analyser/hdfs/mapred/local/taskTracker/hadoop/jobcache/job_201112161932_0283/attempt_201112161932_0283_m_000000_0/work/]
> 2011-12-21 17:57:57,690 INFO org.apache.hadoop.hive.ql.exec.MapOperator:
> Adding alias testudaf to work list for file
> hdfs://h-master:9000/user/hadoop/wftest
> 2011-12-21 17:57:57,693 INFO org.apache.hadoop.hive.ql.exec.MapOperator:
> dump TS struct<seller:string,cosumer:string>
> 2011-12-21 17:57:57,694 INFO ExecMapper:
> <MAP>Id =3
>   <Children>
>     <TS>Id =0
>       <Children>
>         <SEL>Id =1
>           <Children>
>             <FS>Id =2
>               <Parent>Id = 1 null<\Parent>
>             <\FS>
>           <\Children>
>           <Parent>Id = 0 null<\Parent>
>         <\SEL>
>       <\Children>
>       <Parent>Id = 3 null<\Parent>
>     <\TS>
>   <\Children>
> <\MAP>
> 2011-12-21 17:57:57,694 INFO org.apache.hadoop.hive.ql.exec.MapOperator:
> Initializing Self 3 MAP
> 2011-12-21 17:57:57,694 INFO
> org.apache.hadoop.hive.ql.exec.TableScanOperator: Initializing Self 0 TS
> 2011-12-21 17:57:57,694 INFO
> org.apache.hadoop.hive.ql.exec.TableScanOperator: Operator 0 TS initialized
> 2011-12-21 17:57:57,694 INFO
> org.apache.hadoop.hive.ql.exec.TableScanOperator: Initializing children of 0
> TS
> 2011-12-21 17:57:57,694 INFO org.apache.hadoop.hive.ql.exec.SelectOperator:
> Initializing child 1 SEL
> 2011-12-21 17:57:57,694 INFO org.apache.hadoop.hive.ql.exec.SelectOperator:
> Initializing Self 1 SEL
> 2011-12-21 17:57:57,699 INFO org.apache.hadoop.hive.ql.exec.SelectOperator:
> SELECT struct<seller:string,cosumer:string>
> 2011-12-21 17:57:57,703 INFO org.apache.hadoop.hive.ql.exec.SelectOperator:
> Operator 1 SEL initialized
> 2011-12-21 17:57:57,703 INFO org.apache.hadoop.hive.ql.exec.SelectOperator:
> Initializing children of 1 SEL
> 2011-12-21 17:57:57,703 INFO
> org.apache.hadoop.hive.ql.exec.FileSinkOperator: Initializing child 2 FS
> 2011-12-21 17:57:57,703 INFO
> org.apache.hadoop.hive.ql.exec.FileSinkOperator: Initializing Self 2 FS
> 2011-12-21 17:57:57,707 INFO
> org.apache.hadoop.hive.ql.exec.FileSinkOperator: Operator 2 FS initialized
> 2011-12-21 17:57:57,707 INFO
> org.apache.hadoop.hive.ql.exec.FileSinkOperator: Initialization Done 2 FS
> 2011-12-21 17:57:57,707 INFO org.apache.hadoop.hive.ql.exec.SelectOperator:
> Initialization Done 1 SEL
> 2011-12-21 17:57:57,707 INFO
> org.apache.hadoop.hive.ql.exec.TableScanOperator: Initialization Done 0 TS
> 2011-12-21 17:57:57,707 INFO org.apache.hadoop.hive.ql.exec.MapOperator:
> Initialization Done 3 MAP
> 2011-12-21 17:57:57,711 INFO org.apache.hadoop.hive.ql.exec.MapOperator:
> Processing path hdfs://h-master:9000/user/hadoop/wftest/test.txt
> 2011-12-21 17:57:57,711 INFO org.apache.hadoop.hive.ql.exec.MapOperator:
> Processing alias testudaf for file hdfs://h-master:9000/user/hadoop/wftest
> 2011-12-21 17:57:57,712 INFO org.apache.hadoop.hive.ql.exec.MapOperator: 3
> forwarding 1 rows
> 2011-12-21 17:57:57,712 INFO
> org.apache.hadoop.hive.ql.exec.TableScanOperator: 0 forwarding 1 rows
> 2011-12-21 17:57:57,712 INFO org.apache.hadoop.hive.ql.exec.SelectOperator:
> 1 forwarding 1 rows
> 2011-12-21 17:57:57,713 INFO
> org.apache.hadoop.hive.ql.exec.FileSinkOperator: Final Path: FS
> hdfs://h-master:9000/tmp/hive-hadoop/hive_2011-12-21_17-57-43_970_69237304724972937/_tmp.-ext-10001/000000_0
> 2011-12-21 17:57:57,713 INFO
> org.apache.hadoop.hive.ql.exec.FileSinkOperator: Writing to temp file: FS
> hdfs://h-master:9000/tmp/hive-hadoop/hive_2011-12-21_17-57-43_970_69237304724972937/_tmp.-ext-10001/_tmp.000000_0
> 2011-12-21 17:57:57,714 INFO
> org.apache.hadoop.hive.ql.exec.FileSinkOperator: New Final Path: FS
> hdfs://h-master:9000/tmp/hive-hadoop/hive_2011-12-21_17-57-43_970_69237304724972937/_tmp.-ext-10001/000000_0
> 2011-12-21 17:57:57,785 INFO ExecMapper: ExecMapper: processing 1 rows: used
> memory = 59120320
> 2011-12-21 17:57:57,786 INFO org.apache.hadoop.hive.ql.exec.MapOperator: 3
> finished. closing...
> 2011-12-21 17:57:57,786 INFO org.apache.hadoop.hive.ql.exec.MapOperator: 3
> forwarded 4 rows
> 2011-12-21 17:57:57,786 INFO org.apache.hadoop.hive.ql.exec.MapOperator:
> DESERIALIZE_ERRORS:0
> 2011-12-21 17:57:57,786 INFO
> org.apache.hadoop.hive.ql.exec.TableScanOperator: 0 finished. closing...
> 2011-12-21 17:57:57,786 INFO
> org.apache.hadoop.hive.ql.exec.TableScanOperator: 0 forwarded 4 rows
> 2011-12-21 17:57:57,786 INFO org.apache.hadoop.hive.ql.exec.SelectOperator:
> 1 finished. closing...
> 2011-12-21 17:57:57,786 INFO org.apache.hadoop.hive.ql.exec.SelectOperator:
> 1 forwarded 4 rows
> 2011-12-21 17:57:57,786 INFO
> org.apache.hadoop.hive.ql.exec.FileSinkOperator: 2 finished. closing...
> 2011-12-21 17:57:57,786 INFO
> org.apache.hadoop.hive.ql.exec.FileSinkOperator: 2 forwarded 0 rows
> 2011-12-21 17:57:57,950 INFO org.apache.hadoop.hive.ql.exec.SelectOperator:
> 1 Close done
> 2011-12-21 17:57:57,950 INFO
> org.apache.hadoop.hive.ql.exec.TableScanOperator: 0 Close done
> 2011-12-21 17:57:57,950 INFO org.apache.hadoop.hive.ql.exec.MapOperator: 3
> Close done
> 2011-12-21 17:57:57,950 INFO ExecMapper: ExecMapper: processed 4 rows: used
> memory = 64489208
> 2011-12-21 17:57:57,953 INFO org.apache.hadoop.mapred.Task:
> Task:attempt_201112161932_0283_m_000000_0 is done. And is in the process of
> commiting
> 2011-12-21 17:57:57,960 INFO org.apache.hadoop.mapred.Task: Task
> 'attempt_201112161932_0283_m_000000_0' done.
> 2011-12-21 17:57:57,991 INFO org.apache.hadoop.mapred.TaskLogsTruncater:
> Initializing logs' truncater with mapRetainSize=-1 and reduceRetainSize=-1
>
>
>
> Please see the rows highlighted in red. Why does the map task use so much
> memory, and which parameter is this related to?
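
(Not a definitive answer, since I have the same problem, but as far as I
understand it: the "maximum memory = 514523136" line is the map child JVM's
maximum heap, which in Hadoop 0.20 is set by mapred.child.java.opts (roughly
-Xmx512m here), and the "used memory" figures are just the heap in use when
ExecMapper logs its progress. A sketch of overriding it per session from the
Hive CLI, assuming the parameter is not marked final on your cluster:)

    -- passed through to the MapReduce job launched by the next query;
    -- assumes mapred.child.java.opts is not declared final in mapred-site.xml
    SET mapred.child.java.opts=-Xmx256m;
    SELECT consumerId FROM fool;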
>
> Thanks.
>
