Hi Sai,

Local mode is just for trials; for any pre-prod/production environment you need MR jobs.
Hive under the hood stores data in HDFS (mostly), and we definitely use Hadoop/Hive for larger data volumes, so MR has to be in there to process them.

Regards
Bejoy KS

Sent from remote device, please excuse typos

-----Original Message-----
From: Ramki Palle <ramki.pa...@gmail.com>
Date: Sun, 10 Mar 2013 06:58:57
To: <user@hive.apache.org>; Sai Sai <saigr...@yahoo.in>
Reply-To: user@hive.apache.org
Subject: Re: java.lang.NoClassDefFoundError: com/jayway/jsonpath/PathUtil

Well, you get the results faster. Please check this:

https://cwiki.apache.org/Hive/gettingstarted.html#GettingStarted-Runtimeconfiguration

Under the section "Hive, Map-Reduce and Local-Mode", it says:

This can be very useful to run queries over small data sets - in such cases local mode execution is usually significantly faster than submitting jobs to a large cluster.

-Ramki.

On Sun, Mar 10, 2013 at 5:26 AM, Sai Sai <saigr...@yahoo.in> wrote:
> Ramki/John,
> Many thanks, that really helped. I have run the add jars in the new
> session and it appears to be running. However, I was wondering about
> bypassing MR: why would we do it, and what is the use of it? Will
> appreciate any input.
> Thanks
> Sai
>
> ------------------------------
> *From:* Ramki Palle <ramki.pa...@gmail.com>
> *To:* user@hive.apache.org; Sai Sai <saigr...@yahoo.in>
> *Sent:* Sunday, 10 March 2013 4:22 AM
> *Subject:* Re: java.lang.NoClassDefFoundError: com/jayway/jsonpath/PathUtil
>
> When you execute the following query,
>
> hive> select * from twitter limit 5;
>
> Hive runs it in local mode and does not use MapReduce.
>
> For the query,
>
> hive> select tweet_id from twitter limit 5;
>
> I think you need to add the JSON jars to overcome this error. You might
> have added these in a previous session. If you want these jars available
> for all sessions, add the ADD JAR statements to your $HOME/.hiverc file.
>
> To bypass MapReduce, use
>
> set hive.exec.mode.local.auto = true;
>
> to suggest that Hive use local mode to execute the query.
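For example, a $HOME/.hiverc along these lines would register the jars and enable automatic local mode for every session (the jar paths below are placeholders; point them at wherever the SerDe jars actually live):

```sql
-- $HOME/.hiverc: executed at the start of every Hive CLI session.
-- Register the JSON SerDe and its dependencies for all sessions.
ADD JAR /path/to/lib/hive-json-serde-0.3.jar;
ADD JAR /path/to/lib/json-path-0.5.4.jar;
ADD JAR /path/to/lib/json-smart-1.0.6.3.jar;
-- Let Hive choose local mode automatically for small inputs.
SET hive.exec.mode.local.auto=true;
```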
> If it still uses MR, try
>
> set hive.fetch.task.conversion = more;
>
> -Ramki.
>
> On Sun, Mar 10, 2013 at 12:19 AM, Sai Sai <saigr...@yahoo.in> wrote:
>
> Just wondering if anyone has any suggestions.
>
> This executes successfully:
>
> hive> select * from twitter limit 5;
>
> This does not work (I have given the exception info below):
>
> hive> select tweet_id from twitter limit 5;
>
> Here is the output of the first query:
>
> hive> select * from twitter limit 5;
> OK
> tweet_id  created_at  text  user_id  user_screen_name  user_lang
> 122106088022745088  Fri Oct 07 00:28:54 +0000 2011  wkwkw -_- ayo saja mba RT @yullyunet: Sepupuuu, kita lanjalan yok.. Kita karokoe-an.. Ajak mas galih jg kalo dia mau.. "@Dindnf: doremifas  124735434  Dindnf  en
> 122106088018558976  Fri Oct 07 00:28:54 +0000 2011  @egg486 특별히 준비했습니다!  252828803  CocaCola_Korea  ko
> 122106088026939392  Fri Oct 07 00:28:54 +0000 2011  My offer of free gobbies for all if @amityaffliction play Blair snitch project still stands.  168590073  SarahYoungBlood  en
> 122106088035328001  Fri Oct 07 00:28:54 +0000 2011  the girl nxt to me in the lib got her headphones in dancing and singing loud af like she the only one here haha  267296295  MONEYyDREAMS_  en
> 122106088005971968  Fri Oct 07 00:28:54 +0000 2011  @KUnYoong_B2UTY Bị lsao đấy  269182160  b2st_b2utyhp  en
> Time taken: 0.154 seconds
>
> This does not work:
>
> hive> select tweet_id from twitter limit 5;
>
> Total MapReduce jobs = 1
> Launching Job 1 out of 1
> Number of reduce tasks is set to 0 since there's no reduce operator
> Starting Job = job_201303050432_0094, Tracking URL = http://ubuntu:50030/jobdetails.jsp?jobid=job_201303050432_0094
> Kill Command = /home/satish/work/hadoop-1.0.4/libexec/../bin/hadoop job -kill job_201303050432_0094
> Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
> 2013-03-10 00:14:44,509 Stage-1 map = 0%, reduce = 0%
> 2013-03-10 00:15:14,613 Stage-1 map = 100%, reduce = 100%
> Ended Job = job_201303050432_0094 with errors
> Error during job, obtaining debugging information...
> Job Tracking URL: http://ubuntu:50030/jobdetails.jsp?jobid=job_201303050432_0094
> Examining task ID: task_201303050432_0094_m_000002 (and more) from job job_201303050432_0094
>
> Task with the most failures(4):
> -----
> Task ID:
>   task_201303050432_0094_m_000000
>
> URL:
>   http://ubuntu:50030/taskdetails.jsp?jobid=job_201303050432_0094&tipid=task_201303050432_0094_m_000000
> -----
> Diagnostic Messages for this Task:
> java.lang.RuntimeException: Error in configuring object
>     at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
>     at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
>     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:416)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>     at org.apache.hadoop.mapred.Child.main(Child.java:249)
> Caused by: java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:616)
>     at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
>     ... 9 more
> Caused by: java.lang.RuntimeException: Error in configuring object
>     at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
>     at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
>     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>     at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
>     ... 14 more
> Caused by: java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:616)
>     at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
>     ... 17 more
> Caused by: java.lang.RuntimeException: Map operator initialization failed
>     at org.apache.hadoop.hive.ql.exec.ExecMapper.configure(ExecMapper.java:121)
>     ... 22 more
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.contrib.serde2.JsonSerde
>     at org.apache.hadoop.hive.ql.exec.MapOperator.setChildren(MapOperator.java:420)
>     at org.apache.hadoop.hive.ql.exec.ExecMapper.configure(ExecMapper.java:90)
>     ... 22 more
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.contrib.serde2.JsonSerde
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>     at java.lang.Class.forName0(Native Method)
>     at java.lang.Class.forName(Class.java:264)
>     at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:820)
>     at org.apache.hadoop.hive.ql.exec.MapOperator.initObjectInspector(MapOperator.java:243)
>     at org.apache.hadoop.hive.ql.exec.MapOperator.setChildren(MapOperator.java:380)
>     ... 23 more
>
> FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
> MapReduce Jobs Launched:
> Job 0: Map: 1  HDFS Read: 0  HDFS Write: 0  FAIL
> Total MapReduce CPU Time Spent: 0 msec
>
> Thanks
> Sai
>
> ------------------------------
> *From:* Dean Wampler <dean.wamp...@thinkbiganalytics.com>
> *To:* user@hive.apache.org; Sai Sai <saigr...@yahoo.in>
> *Sent:* Friday, 8 March 2013 5:22 AM
> *Subject:* Re: java.lang.NoClassDefFoundError: com/jayway/jsonpath/PathUtil
>
> Unfortunately, you have to also add the JSON jars to Hive's class path
> before it starts, e.g.,
>
> env HADOOP_CLASSPATH=/path/to/lib/*.jar hive
>
> Use the appropriate path to your lib directory.
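Dean's one-liner can be unpacked into a small shell sketch. Everything below is illustrative: the directory and jar names are stand-ins for your actual lib directory, and the final hive launch is left commented out because it depends on a working installation:

```shell
# Illustrative only: LIB_DIR and the jar names are placeholders.
LIB_DIR=/tmp/hive_classpath_demo
rm -rf "$LIB_DIR" && mkdir -p "$LIB_DIR"
touch "$LIB_DIR/hive-json-serde-0.3.jar" "$LIB_DIR/json-path-0.5.4.jar"

# Join every jar under LIB_DIR into a colon-separated classpath.
HADOOP_CLASSPATH=$(printf '%s:' "$LIB_DIR"/*.jar)
HADOOP_CLASSPATH=${HADOOP_CLASSPATH%:}   # drop the trailing colon
echo "$HADOOP_CLASSPATH"

# Launch Hive with the jars visible from startup (commented out here):
# env HADOOP_CLASSPATH="$HADOOP_CLASSPATH" hive
```

The point is that HADOOP_CLASSPATH must be set before the CLI starts, so the SerDe and its dependencies are already visible when Hive loads the table metadata.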
> On Fri, Mar 8, 2013 at 4:53 AM, Sai Sai <saigr...@yahoo.in> wrote:
>
> I have added the jar files successfully like this:
>
> hive (testdb)> ADD JAR lib/hive-json-serde-0.3.jar;
> Added lib/hive-json-serde-0.3.jar to class path
> Added resource: lib/hive-json-serde-0.3.jar
>
> hive (testdb)> ADD JAR lib/json-path-0.5.4.jar;
> Added lib/json-path-0.5.4.jar to class path
> Added resource: lib/json-path-0.5.4.jar
>
> hive (testdb)> ADD JAR lib/json-smart-1.0.6.3.jar;
> Added lib/json-smart-1.0.6.3.jar to class path
> Added resource: lib/json-smart-1.0.6.3.jar
>
> After this I am getting this error:
>
> CREATE EXTERNAL TABLE IF NOT EXISTS twitter (tweet_id BIGINT, created_at STRING, text STRING, user_id BIGINT, user_screen_name STRING, user_lang STRING)
> ROW FORMAT SERDE "org.apache.hadoop.hive.contrib.serde2.JsonSerde"
> WITH SERDEPROPERTIES (
>   "tweet_id"="$.id", "created_at"="$.created_at", "text"="$.text",
>   "user_id"="$.user.id", "user_screen_name"="$.user.screen_name",
>   "user_lang"="$.user.lang")
> LOCATION '/home/satish/data/twitter/input';
>
> java.lang.NoClassDefFoundError: com/jayway/jsonpath/PathUtil
>     at org.apache.hadoop.hive.contrib.serde2.JsonSerde.initialize(Unknown Source)
>     at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:207)
>     at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:266)
>     at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:259)
>     at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:585)
>     at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:550)
>     at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3698)
>     at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:253)
>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
>     at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>     at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1336)
>     at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1122)
>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:935)
>     at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
>     at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:755)
>     at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:616)
>     at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> Caused by: java.lang.ClassNotFoundException: com.jayway.jsonpath.PathUtil
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>     ... 23 more
> FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.DDLTask
>
> Any help would be really appreciated.
> Thanks
> Sai
>
> --
> *Dean Wampler, Ph.D.*
> thinkbiganalytics.com
> +1-312-339-1330
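Putting the thread together, a session that avoids both errors would look roughly like this (the jar paths are placeholders for wherever the jars actually live; the table definition and location are copied from Sai's message):

```sql
-- Register the SerDe and its dependencies in this session (or put these
-- lines in $HOME/.hiverc, or start Hive with HADOOP_CLASSPATH pointing
-- at the jars, as suggested earlier in the thread).
ADD JAR /path/to/lib/hive-json-serde-0.3.jar;
ADD JAR /path/to/lib/json-path-0.5.4.jar;
ADD JAR /path/to/lib/json-smart-1.0.6.3.jar;

CREATE EXTERNAL TABLE IF NOT EXISTS twitter (
  tweet_id BIGINT,
  created_at STRING,
  text STRING,
  user_id BIGINT,
  user_screen_name STRING,
  user_lang STRING)
ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.JsonSerde'
WITH SERDEPROPERTIES (
  "tweet_id"="$.id",
  "created_at"="$.created_at",
  "text"="$.text",
  "user_id"="$.user.id",
  "user_screen_name"="$.user.screen_name",
  "user_lang"="$.user.lang")
LOCATION '/home/satish/data/twitter/input';

-- With the jars registered in the same session, this query no longer
-- fails with ClassNotFoundException in the map tasks.
SELECT tweet_id FROM twitter LIMIT 5;
```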