You could add the ADD JAR statements to your Hive scripts, or, for
interactive sessions, put them in ~/.hiverc.
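
As a minimal sketch (paths and jar versions below are assumptions; match them to what is actually in $HIVE_HOME/lib), a ~/.hiverc that loads the Jackson jars at the start of every interactive CLI session could look like:

    -- ~/.hiverc: executed by the Hive CLI at the start of each interactive session
    -- (adjust paths to your installation)
    ADD JAR /usr/lib/hive/lib/jackson-core-asl-1.8.8.jar;
    ADD JAR /usr/lib/hive/lib/jackson-mapper-asl-1.8.8.jar;

For non-interactive scripts, the same ADD JAR lines can go at the top of the .hql file, or be placed in an init file passed to the CLI with `hive -i /path/to/init.hql`.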

On Fri, Jun 21, 2013 at 12:37 PM, fab wol <darkwoll...@gmail.com> wrote:

> Thanks, this works now, after I found out that I have to add the jars in
> each session. Which config file do I have to edit to register those jars
> permanently?
>
>
> 2013/6/21 Ramki Palle <ramki.pa...@gmail.com>
>
>> Try to add the jar explicitly from hive prompt and see if that works.
>>
>> Regards,
>> Ramki.
>>
>>
>> On Fri, Jun 21, 2013 at 7:32 AM, fab wol <darkwoll...@gmail.com> wrote:
>>
>>> I'm using Hadoop 0.20.2 with Hive 0.11. I have successfully loaded some
>>> CSV files into Hive/HDFS as separate tables. Selects and joins work
>>> flawlessly. When trying to analyze some data, I needed to use Hive's
>>> built-in functions, such as:
>>>
>>>  - substr
>>>  - to_date
>>>  - rand
>>>  - etc.
>>>
>>> for example:
>>>
>>>     select sid, request_id, to_date(times), to_unix_timestamp(times)
>>> from contents where sid = '5000000032066010373';
>>>
>>> sid and request_id are strings here; times is a timestamp column.
>>> Unfortunately, I only get errors (always the same stack trace) when using
>>> these functions:
>>>
>>>     java.lang.RuntimeException: Error in configuring object
>>>             at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
>>>             at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
>>>             at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>>>             at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:354)
>>>             at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
>>>             at org.apache.hadoop.mapred.Child.main(Child.java:170)
>>>     Caused by: java.lang.reflect.InvocationTargetException
>>>             at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>             at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>             at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>             at java.lang.reflect.Method.invoke(Method.java:601)
>>>             at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
>>>             ... 5 more
>>>     Caused by: java.lang.RuntimeException: Error in configuring object
>>>             at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
>>>             at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
>>>             at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>>>             at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
>>>             ... 10 more
>>>     Caused by: java.lang.reflect.InvocationTargetException
>>>             at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>             at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>             at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>             at java.lang.reflect.Method.invoke(Method.java:601)
>>>             at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
>>>             ... 13 more
>>>     Caused by: java.lang.RuntimeException: Map operator initialization failed
>>>             at org.apache.hadoop.hive.ql.exec.ExecMapper.configure(ExecMapper.java:121)
>>>             ... 18 more
>>>     Caused by: java.lang.NoClassDefFoundError: org/codehaus/jackson/JsonFactory
>>>             at org.apache.hadoop.hive.ql.udf.generic.GenericUDTFJSONTuple.<clinit>(GenericUDTFJSONTuple.java:56)
>>>             at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>             at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>>             at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>             at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
>>>             at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:113)
>>>             at org.apache.hadoop.hive.ql.exec.FunctionRegistry.registerGenericUDTF(FunctionRegistry.java:526)
>>>             at org.apache.hadoop.hive.ql.exec.FunctionRegistry.registerGenericUDTF(FunctionRegistry.java:520)
>>>             at org.apache.hadoop.hive.ql.exec.FunctionRegistry.<clinit>(FunctionRegistry.java:423)
>>>             at org.apache.hadoop.hive.ql.exec.DefaultUDFMethodResolver.getEvalMethod(DefaultUDFMethodResolver.java:59)
>>>             at org.apache.hadoop.hive.ql.udf.generic.GenericUDFBridge.initialize(GenericUDFBridge.java:154)
>>>             at org.apache.hadoop.hive.ql.udf.generic.GenericUDF.initializeAndFoldConstants(GenericUDF.java:111)
>>>             at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator.initialize(ExprNodeGenericFuncEvaluator.java:141)
>>>             at org.apache.hadoop.hive.ql.exec.Operator.initEvaluators(Operator.java:970)
>>>             at org.apache.hadoop.hive.ql.exec.Operator.initEvaluatorsAndReturnStruct(Operator.java:996)
>>>             at org.apache.hadoop.hive.ql.exec.SelectOperator.initializeOp(SelectOperator.java:60)
>>>             at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
>>>             at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:451)
>>>             at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:407)
>>>             at org.apache.hadoop.hive.ql.exec.FilterOperator.initializeOp(FilterOperator.java:78)
>>>             at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
>>>             at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:451)
>>>             at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:407)
>>>             at org.apache.hadoop.hive.ql.exec.TableScanOperator.initializeOp(TableScanOperator.java:186)
>>>             at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
>>>             at org.apache.hadoop.hive.ql.exec.MapOperator.initializeOp(MapOperator.java:543)
>>>             at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
>>>             at org.apache.hadoop.hive.ql.exec.ExecMapper.configure(ExecMapper.java:100)
>>>             ... 18 more
>>>     Caused by: java.lang.ClassNotFoundException: org.codehaus.jackson.JsonFactory
>>> What am I doing wrong here? The jackson-core-asl-1.8.8.jar is in the
>>> $HIVE_HOME/lib directory ...
>>>
>>>     SHOW FUNCTIONS;
>>>
>>> shows me that these functions are there ... I already tried
>>> downgrading to Hive 0.10, but the error is the same there. I need to
>>> work with Hadoop 0.20, so unfortunately I can't try Hadoop 1.x.x.
>>>
>>> thanks in advance
>>> cheers
>>> Wolli
>>>
>>
>>
>


-- 
Dean Wampler, Ph.D.
@deanwampler
http://polyglotprogramming.com
