Satyajit.
>
>
>
> On Tue, Oct 25, 2016 at 8:49 PM, Eugene Koifman
> wrote:
>
>> Which of your tables are transactional? Can you provide the DDL?
>>
>> I don’t think the “File does not exist” error is causing your queries to
>> fail; it’s an INFO-level message.
>
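For context: in Hive 2.1 a table is transactional only if it was created
stored as ORC, bucketed, and with the transactional table property. A minimal
DDL sketch (table, column, and partition names here are hypothetical):

    CREATE TABLE events (
      id BIGINT,
      payload STRING
    )
    PARTITIONED BY (somedate STRING)
    CLUSTERED BY (id) INTO 4 BUCKETS
    STORED AS ORC
    TBLPROPERTIES ('transactional'='true');

Reading or writing such a table also assumes hive.support.concurrency=true
and hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager on the
session.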
> Fetch Operator
>   limit: -1
>   Processor Tree:
>     ListSink
>
> Any suggestions on debugging this issue are appreciated.
>
>
> Regards,
> Satyajit.
>
>
>
>
> On Wed, Oct 26, 2016 at 3:34 PM, Eugene Koifman
> wrote:
>
>> If you can run this, then it’s s…
>
> …medate="2016-10-23";
>
>
> Any clues, or anything you would want me to focus on to debug the
> issue, would be appreciated.
>
> Regards,
> Satyajit.
>
>
>
> On Tue, Oct 25, 2016 at 8:49 PM, Eugene Koifman
> wrote:
>
>> Which of your tables are transactional? Can you provide the DDL?
>
> Eugene
>
>
> From: satyajit vegesna
> Reply-To: "u...@hive.apache.org"
> Date: Tuesday, October 25, 2016 at 5:46 PM
> To: "u...@hive.apache.org" , "dev@hive.apache.org" <
> dev@hive.apache.org>
> Subject: Error with flush_length File in Orc
Hi All,

I am using Hive 2.1.0 and Hadoop 2.7.2, but when I try running queries like a
simple insert:

set mapreduce.job.queuename=default;
set hive.exec.dynamic.partition=true;
set hive.exec.dynamic.partition.mode=nonstrict;
set hive.exec.max.dynamic.partitions.pernode=400;
set hive.exec.max.dynamic.par…
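For context, these settings typically precede a dynamic-partition insert. A
minimal sketch of the kind of statement involved (table and column names are
hypothetical; dynamic partition columns must come last in the SELECT):

    set hive.exec.dynamic.partition=true;
    set hive.exec.dynamic.partition.mode=nonstrict;
    -- hypothetical tables: target is partitioned by somedate
    INSERT INTO TABLE target PARTITION (somedate)
    SELECT id, payload, somedate
    FROM source;

The hive.exec.max.dynamic.partitions* settings only cap how many partitions a
single statement may create, in total and per node.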