Hi Folks,

I am running Hive in Eclipse. I think I am missing a Hive conf property, either
in the XML file or in the debug configuration. The jar file name is set to null,
as shown in the command that ExecDriver executes:
13/07/16 09:55:23 INFO exec.ExecDriver: Executing: 
/Users/bharati/hadoop/bin/hadoop jar null 
org.apache.hadoop.hive.ql.exec.ExecDriver  -plan 
file:/tmp/bharati/hive_2013-07-16_09-53-30_441_6800320206008341895/-local-10003/plan.xml
   -jobconffile 
file:/tmp/bharati/hive_2013-07-16_09-53-30_441_6800320206008341895/-local-10002/jobconf.xml
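Is hive.jar.path the property I am missing? My guess (the property name and the
jar path below are assumptions about my setup, not something I have verified) is
that Hive needs to be pointed at the hive-exec jar it should pass to "hadoop jar",
e.g. in hive-site.xml:

    <property>
      <!-- guess: tell Hive which jar to hand to "hadoop jar ..." instead of null -->
      <name>hive.jar.path</name>
      <!-- example path only; it would point at the hive-exec jar from my install -->
      <value>/Users/bharati/hive/lib/hive-exec-0.10.0.jar</value>
    </property>

or equivalently as a VM argument in the Eclipse debug configuration:

    -Dhive.jar.path=/Users/bharati/hive/lib/hive-exec-0.10.0.jar

If that is not the right property, please let me know where ExecDriver picks up
the jar path from.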

Thanks,

Warm Regards,
Bharati


Here is the console output.


13/07/16 09:51:21 WARN common.LogUtils: DEPRECATED: Ignoring hive-default.xml 
found on the CLASSPATH at 
/Users/bharati/eclipse/tutorial/src/conf/hive-default.xml
13/07/16 09:51:21 INFO service.HiveServer: Starting hive server on port 10001 
with 100 min worker threads and 2147483647 max worker threads
13/07/16 09:51:21 INFO service.HiveServer: TCP keepalive = true
13/07/16 09:52:26 INFO metastore.HiveMetaStore: 0: Opening raw store with 
implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
13/07/16 09:52:26 INFO metastore.ObjectStore: ObjectStore, initialize called
13/07/16 09:52:26 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" 
requires "org.eclipse.core.resources" but it cannot be resolved.
13/07/16 09:52:26 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" 
requires "org.eclipse.core.runtime" but it cannot be resolved.
13/07/16 09:52:26 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" 
requires "org.eclipse.text" but it cannot be resolved.
13/07/16 09:52:27 INFO metastore.ObjectStore: Setting MetaStore object pin 
classes with 
hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
13/07/16 09:52:27 INFO metastore.ObjectStore: Initialized ObjectStore
Hive history 
file=/tmp/bharati/hive_job_log_bharati_6000@Bharati-Adkars-iMac.local_201307160952_1763040067.txt
13/07/16 09:52:28 INFO exec.HiveHistory: Hive history 
file=/tmp/bharati/hive_job_log_bharati_6000@Bharati-Adkars-iMac.local_201307160952_1763040067.txt
13/07/16 09:53:30 INFO service.HiveServer: Putting temp output to file 
/tmp/bharati/bharati_6000@Bharati-Adkars-iMac.local_201307160952376261539654619795.pipeout
13/07/16 09:53:30 INFO service.HiveServer: Running the query: set 
hive.fetch.output.serde = org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
13/07/16 09:53:30 INFO service.HiveServer: Putting temp output to file 
/tmp/bharati/bharati_6000@Bharati-Adkars-iMac.local_201307160952376261539654619795.pipeout
13/07/16 09:53:30 INFO service.HiveServer: Running the query: select count(*) 
from fact_mobiledata
13/07/16 09:53:30 INFO ql.Driver: <PERFLOG method=Driver.run>
13/07/16 09:53:30 INFO ql.Driver: <PERFLOG method=TimeToSubmit>
13/07/16 09:53:30 INFO ql.Driver: <PERFLOG method=compile>
13/07/16 09:53:30 INFO parse.ParseDriver: Parsing command: select count(*) from 
fact_mobiledata
13/07/16 09:53:30 INFO parse.ParseDriver: Parse Completed
13/07/16 09:53:30 INFO parse.SemanticAnalyzer: Starting Semantic Analysis
13/07/16 09:53:30 INFO parse.SemanticAnalyzer: Completed phase 1 of Semantic 
Analysis
13/07/16 09:53:30 INFO parse.SemanticAnalyzer: Get metadata for source tables
13/07/16 09:53:31 INFO metastore.HiveMetaStore: 0: get_table : db=default 
tbl=fact_mobiledata
13/07/16 09:53:31 INFO HiveMetaStore.audit: ugi=bharati ip=unknown-ip-addr      
cmd=get_table : db=default tbl=fact_mobiledata  
13/07/16 09:53:31 INFO metastore.HiveMetaStore: 0: Opening raw store with 
implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
13/07/16 09:53:31 INFO metastore.ObjectStore: ObjectStore, initialize called
13/07/16 09:53:31 INFO metastore.ObjectStore: Initialized ObjectStore
13/07/16 09:53:31 INFO parse.SemanticAnalyzer: Get metadata for subqueries
13/07/16 09:53:31 INFO parse.SemanticAnalyzer: Get metadata for destination 
tables
13/07/16 09:53:31 INFO parse.SemanticAnalyzer: Completed getting MetaData in 
Semantic Analysis
13/07/16 09:53:31 INFO ppd.OpProcFactory: Processing for FS(6)
13/07/16 09:53:31 INFO ppd.OpProcFactory: Processing for SEL(5)
13/07/16 09:53:31 INFO ppd.OpProcFactory: Processing for GBY(4)
13/07/16 09:53:31 INFO ppd.OpProcFactory: Processing for RS(3)
13/07/16 09:53:31 INFO ppd.OpProcFactory: Processing for GBY(2)
13/07/16 09:53:31 INFO ppd.OpProcFactory: Processing for SEL(1)
13/07/16 09:53:31 INFO ppd.OpProcFactory: Processing for TS(0)
13/07/16 09:53:31 INFO physical.MetadataOnlyOptimizer: Looking for table scans 
where optimization is applicable
13/07/16 09:53:31 INFO physical.MetadataOnlyOptimizer: Found 0 metadata only 
table scans
13/07/16 09:53:31 INFO parse.SemanticAnalyzer: Completed plan generation
13/07/16 09:53:31 INFO ql.Driver: Semantic Analysis Completed
13/07/16 09:53:31 INFO exec.ListSinkOperator: Initializing Self 7 OP
13/07/16 09:53:31 INFO exec.ListSinkOperator: Operator 7 OP initialized
13/07/16 09:53:31 INFO exec.ListSinkOperator: Initialization Done 7 OP
13/07/16 09:53:31 INFO ql.Driver: Returning Hive schema: 
Schema(fieldSchemas:[FieldSchema(name:_c0, type:bigint, comment:null)], 
properties:null)
13/07/16 09:53:31 INFO ql.Driver: </PERFLOG method=compile start=1373993610417 
end=1373993611601 duration=1184>
13/07/16 09:53:31 INFO ql.Driver: <PERFLOG method=Driver.execute>
13/07/16 09:53:31 INFO ql.Driver: Starting command: select count(*) from 
fact_mobiledata
Total MapReduce jobs = 1
13/07/16 09:53:31 INFO ql.Driver: Total MapReduce jobs = 1
13/07/16 09:53:31 INFO ql.Driver: </PERFLOG method=TimeToSubmit 
start=1373993610417 end=1373993611620 duration=1203>
Launching Job 1 out of 1
13/07/16 09:53:31 INFO ql.Driver: Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
13/07/16 09:53:31 INFO exec.Task: Number of reduce tasks determined at compile 
time: 1
In order to change the average load for a reducer (in bytes):
13/07/16 09:53:31 INFO exec.Task: In order to change the average load for a 
reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
13/07/16 09:53:31 INFO exec.Task:   set 
hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
13/07/16 09:53:31 INFO exec.Task: In order to limit the maximum number of 
reducers:
  set hive.exec.reducers.max=<number>
13/07/16 09:53:31 INFO exec.Task:   set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
13/07/16 09:53:31 INFO exec.Task: In order to set a constant number of reducers:
  set mapred.reduce.tasks=<number>
13/07/16 09:53:31 INFO exec.Task:   set mapred.reduce.tasks=<number>
13/07/16 09:55:22 INFO exec.ExecDriver: Generating plan file 
file:/tmp/bharati/hive_2013-07-16_09-53-30_441_6800320206008341895/-local-10003/plan.xml
13/07/16 09:55:23 INFO exec.ExecDriver: Executing: 
/Users/bharati/hadoop/bin/hadoop jar null 
org.apache.hadoop.hive.ql.exec.ExecDriver  -plan 
file:/tmp/bharati/hive_2013-07-16_09-53-30_441_6800320206008341895/-local-10003/plan.xml
   -jobconffile 
file:/tmp/bharati/hive_2013-07-16_09-53-30_441_6800320206008341895/-local-10002/jobconf.xml
Exception in thread "main" java.io.IOException: Error opening job jar: null
        at org.apache.hadoop.util.RunJar.main(RunJar.java:90)
Caused by: java.util.zip.ZipException: error in opening zip file
        at java.util.zip.ZipFile.open(Native Method)
        at java.util.zip.ZipFile.<init>(ZipFile.java:128)
        at java.util.jar.JarFile.<init>(JarFile.java:136)
        at java.util.jar.JarFile.<init>(JarFile.java:73)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:88)
Execution failed with exit status: 1
13/07/16 09:55:38 ERROR exec.Task: Execution failed with exit status: 1
Obtaining error information
13/07/16 09:55:38 ERROR exec.Task: Obtaining error information

Task failed!
Task ID:
  Stage-1

Logs:

13/07/16 09:55:38 ERROR exec.Task: 
Task failed!
Task ID:
  Stage-1

Logs:

13/07/16 09:55:38 ERROR exec.ExecDriver: Execution failed with exit status: 1
FAILED: Execution Error, return code 1 from 
org.apache.hadoop.hive.ql.exec.MapRedTask
13/07/16 09:55:38 ERROR ql.Driver: FAILED: Execution Error, return code 1 from 
org.apache.hadoop.hive.ql.exec.MapRedTask
13/07/16 09:55:38 INFO ql.Driver: </PERFLOG method=Driver.execute 
start=1373993611602 end=1373993738252 duration=126650>
13/07/16 09:55:38 INFO ql.Driver: <PERFLOG method=releaseLocks>
13/07/16 09:55:38 INFO ql.Driver: </PERFLOG method=releaseLocks 
start=1373993738252 end=1373993738252 duration=0>
