Hi,

I was able to resolve the above issue by setting the parameter below:

set hive.msck.path.validation=ignore;
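
For context, hive.msck.path.validation controls what MSCK REPAIR TABLE does when it hits partition directories it cannot validate; as I understand it, "ignore" makes it skip that check instead of failing. After setting it, the repair and a quick verification would look roughly like this (<tablename> is a placeholder, same as in the thread below):

msck repair table <tablename>;
select * from <tablename> limit 10;

With that, the query error from the thread below went away for me.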


Regards,
Anup Tiwari


On Thu, Sep 6, 2018 at 3:48 PM Anup Tiwari <anupsdtiw...@gmail.com> wrote:

> Hi All,
>
> Can anyone look into it?
>
> On Wed, 5 Sep 2018 19:28 Anup Tiwari, <anupsdtiw...@gmail.com> wrote:
>
>> Hi All,
>>
>> I executed "msck repair table <tablename>" on my Hive ACID table and it
>> printed a message that the partitions were added, but when I query the
>> table it gives the error below:
>>
>>
>>
>> Vertex failed, vertexName=Map 1, vertexId=vertex_1536134751043_0020_2_00,
>> diagnostics=[Task failed, taskId=task_1536134751043_0020_2_00_000003,
>> diagnostics=[TaskAttempt 0 failed, info=[Error: Error while running task (
>> failure ) :
>> attempt_1536134751043_0020_2_00_000003_0:java.lang.RuntimeException:
>> java.lang.RuntimeException: java.io.IOException:
>> java.io.FileNotFoundException: *Path is not a file:
>> /user/hive/warehouse/fact_t_mp_basic_all_report_bucket/genddate=2017-04-02*
>>     at
>> org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:76)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:62)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getBlockLocations(FSDirStatAndListingOp.java:152)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1819)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:692)
>>     at
>> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:381)
>>     at
>> org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>>     at
>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
>>     at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:850)
>>     at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:793)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:422)
>>     at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1840)
>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2489)
>>
>>     at
>> org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:211)
>>     at
>> org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:168)
>>     at
>> org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:370)
>>     at
>> org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:73)
>>     at
>> org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:61)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:422)
>>     at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1840)
>>     at
>> org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:61)
>>     at
>> org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:37)
>>     at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
>>     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>>     at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>>     at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>>     at java.lang.Thread.run(Thread.java:748)
>> Caused by: java.lang.RuntimeException: java.io.IOException:
>> java.io.FileNotFoundException: Path is not a file:
>> /user/hive/warehouse/fact_t_mp_basic_all_report_bucket/genddate=2017-04-02
>>
>>
>>
>>
>> I can understand the error, but is it a bug that Hive is not able to
>> navigate down to the final files/data under the partition directory?
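>>
>> A quick way to see what is actually under that partition path from the
>> Hive CLI (path taken from the trace above; just a diagnostic sketch):
>>
>> dfs -ls /user/hive/warehouse/fact_t_mp_basic_all_report_bucket/genddate=2017-04-02;
>>
>> For an ACID table I would expect delta_*/base_* subdirectories there
>> rather than plain files, which would be consistent with a reader failing
>> with "Path is not a file" when it tries to open the partition directory
>> itself.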
>>
>> Regards,
>> Anup Tiwari
>>
>
