Hey Sanjeev.

Can you post /tmp/hive/hive.log (from the hiveserver2 host) from when you
launch the query?

Best regards.

Tale
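
(For reference, the setting discussed below, hive.fetch.task.conversion.threshold, controls when Hive skips launching a MR/Tez job and instead reads rows directly inside HiveServer2 via FetchOperator. A hedged sketch of the usual mitigations; the exact values here are illustrative, not from the thread:)

```sql
-- Disable fetch-task conversion entirely, so simple SELECTs run as
-- cluster jobs instead of reading splits inside the HiveServer2 JVM:
SET hive.fetch.task.conversion=none;

-- Or keep conversion but cap the eligible input size (in bytes);
-- the thread mentions 1073741824 (1 GiB) -- a smaller cap such as
-- 128 MB pushes larger scans back onto the cluster:
SET hive.fetch.task.conversion.threshold=134217728;
```

Either way, the FetchOperator stack frames in the OOM trace below should disappear from HS2 for queries over that size.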

On Thu, Sep 22, 2016 at 5:03 AM, Sanjeev Verma <sanjeev.verm...@gmail.com>
wrote:

> Lowered 1073741824 to half of that, but I am still getting the same issue.
>
> On Wed, Sep 21, 2016 at 6:44 PM, Sanjeev Verma <sanjeev.verm...@gmail.com>
> wrote:
>
>> It's 1073741824 now, but I can't see anything running on the client side;
>> the job kicked off by the query completed, but HS2 is crashing.
>>
>> On Wed, Sep 21, 2016 at 6:40 PM, Prasanth Jayachandran <
>> pjayachand...@hortonworks.com> wrote:
>>
>>> FetchOperator will run client side. What is the value for
>>> hive.fetch.task.conversion.threshold?
>>>
>>> Thanks
>>> Prasanth
>>> > On Sep 21, 2016, at 6:37 PM, Sanjeev Verma <sanjeev.verm...@gmail.com>
>>> wrote:
>>> >
>>> > I am getting a HiveServer2 OutOfMemoryError even after increasing the
>>> heap size from 8G to 24G; no clue why it still goes OOM with that much heap.
>>> >
>>> > "HiveServer2-HttpHandler-Pool: Thread-58026" prio=5 tid=58026 RUNNABLE
>>> >      at java.lang.OutOfMemoryError.<init>(OutOfMemoryError.java:48)
>>> >      at org.apache.hadoop.util.LineReader.<init>(LineReader.java:140)
>>> >      at org.apache.hadoop.mapreduce.lib.input.SplitLineReader.<init>(SplitLineReader.java:37)
>>> >      at org.apache.hadoop.mapreduce.lib.input.UncompressedSplitLineReader.<init>(UncompressedSplitLineReader.java:46)
>>> >      at org.apache.hadoop.mapred.LineRecordReader.<init>(LineRecordReader.java:128)
>>> >      at org.apache.hadoop.mapred.TextInputFormat.getRecordReader(TextInputFormat.java:67)
>>> >      at org.apache.hadoop.hive.ql.exec.FetchOperator$FetchInputFormatSplit.getRecordReader(FetchOperator.java:682)
>>> >      at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:328)
>>> >      at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:450)
>>> >      at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:419)
>>> >      at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:143)
>>> >      at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1745)
>>> >      at org.apache.hive.service.cli.operation.SQLOperation.getNextRowSet(SQLOperation.java:347)
>>> >      at org.apache.hive.service.cli.operation.OperationManager.getOperationNextRowSet(OperationManager.java:223)
>>> >      at org.apache.hive.service.cli.session.HiveSessionImpl.fetchResults(HiveSessionImpl.java:716)
>>> >      at sun.reflect.GeneratedMethodAccessor15.invoke(<unknown string>)
>>> >      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> >      at java.lang.reflect.Method.invoke(Method.java:606)
>>> >      at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
>>> >      at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
>>> >      at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
>>> >      at java.security.AccessController.doPrivileged(Native Method)
>>> >      at javax.security.auth.Subject.doAs(Subject.java:415)
>>> >      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
>>>
>>>
>>
>
