Sravani Gadey created HIVE-28525:
------------------------------------

             Summary: Hive Iceberg Interop: after writing data using Flink, 
reading from Hive fails
                 Key: HIVE-28525
                 URL: https://issues.apache.org/jira/browse/HIVE-28525
             Project: Hive
          Issue Type: Bug
      Security Level: Public (Viewable by anyone)
          Components: Hive
            Reporter: Sravani Gadey
            Assignee: Simhadri Govindappa


After writing data to the table using Flink, reading the table from Hive fails 
with the error below.

We observed that only reads through Hive fail; reading the same table through 
Impala works fine.

Error:
{code:java}
Error while compiling statement: java.io.IOException: java.io.IOException: 
Cannot create an instance of InputFormat class 
org.apache.hadoop.mapred.FileInputFormat as specified in mapredWork!
{code}

{code:java}
INFO  : Compiling 
command(queryId=hive_20240805104217_e6857f09-156f-45e6-b628-b0f5268b6859): 


select * from test_flink_ice666
INFO  : No Stats for default@test_flink_ice666, Columns: itemid, queryid, ts
INFO  : Semantic Analysis Completed (retrial = false)
INFO  : Created Hive schema: 
Schema(fieldSchemas:[FieldSchema(name:test_flink_ice666.queryid, type:bigint, 
comment:null), FieldSchema(name:test_flink_ice666.ts, type:bigint, 
comment:null), FieldSchema(name:test_flink_ice666.itemid, type:string, 
comment:null)], properties:null)
INFO  : Completed compiling 
command(queryId=hive_20240805104217_e6857f09-156f-45e6-b628-b0f5268b6859); Time 
taken: 1.582 seconds
INFO  : Executing 
command(queryId=hive_20240805104217_e6857f09-156f-45e6-b628-b0f5268b6859): 


select * from test_flink_ice666
INFO  : Completed executing 
command(queryId=hive_20240805104217_e6857f09-156f-45e6-b628-b0f5268b6859); Time 
taken: 0.042 seconds
INFO  : OK
ERROR : Failed with exception java.io.IOException:java.io.IOException: Cannot 
create an instance of InputFormat class 
org.apache.hadoop.mapred.FileInputFormat as specified in mapredWork!
java.io.IOException: java.io.IOException: Cannot create an instance of 
InputFormat class org.apache.hadoop.mapred.FileInputFormat as specified in 
mapredWork!
        at 
org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:642)
        at 
org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:549)
        at 
org.apache.hadoop.hive.ql.exec.FetchTask.executeInner(FetchTask.java:217)
        at org.apache.hadoop.hive.ql.exec.FetchTask.execute(FetchTask.java:114)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:820)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:550)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:544)
        at 
org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:190)
        at 
org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:235)
        at 
org.apache.hive.service.cli.operation.SQLOperation.access$700(SQLOperation.java:92)
        at 
org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:340)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899)
        at 
org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:360)
        at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:750)
Caused by: java.io.IOException: Cannot create an instance of InputFormat class 
org.apache.hadoop.mapred.FileInputFormat as specified in mapredWork!
        at 
org.apache.hadoop.hive.ql.exec.FetchOperator.getInputFormatFromCache(FetchOperator.java:251)
        at 
org.apache.hadoop.hive.ql.exec.FetchOperator.getNextSplits(FetchOperator.java:392)
        at 
org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:324)
        at 
org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:580)
        ... 21 more
Caused by: java.lang.RuntimeException: java.lang.InstantiationException
        at 
org.apache.hive.common.util.ReflectionUtil.newInstance(ReflectionUtil.java:85)
        at 
org.apache.hadoop.hive.ql.exec.FetchOperator.getInputFormatFromCache(FetchOperator.java:247)
        ... 24 more
Caused by: java.lang.InstantiationException
        at 
sun.reflect.InstantiationExceptionConstructorAccessorImpl.newInstance(InstantiationExceptionConstructorAccessorImpl.java:48)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at 
org.apache.hive.common.util.ReflectionUtil.newInstance(ReflectionUtil.java:83)
        ... 25 more
{code}
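The innermost {{java.lang.InstantiationException}} is the tell: {{org.apache.hadoop.mapred.FileInputFormat}} is an abstract class, so Hive's reflective {{ReflectionUtil.newInstance}} call can never succeed on it. This suggests (an assumption, pending root-cause analysis) that after the Flink write, Hive resolves the table's input format to the abstract base class instead of a concrete Iceberg input format. A minimal, self-contained sketch of why the reflective call fails (hypothetical stand-in class, not Hive code):

{code:java}
import java.lang.reflect.Constructor;

// Hypothetical stand-in for the abstract org.apache.hadoop.mapred.FileInputFormat.
abstract class AbstractInputFormat {
    public AbstractInputFormat() { }
}

public class InstantiationRepro {
    // Mimics what ReflectionUtil.newInstance does: reflectively invoke the
    // no-arg constructor of the configured input format class.
    static String tryInstantiate(Class<?> cls) {
        try {
            Constructor<?> ctor = cls.getDeclaredConstructor();
            ctor.newInstance(); // throws InstantiationException for abstract classes
            return "ok";
        } catch (ReflectiveOperationException e) {
            return e.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        // An abstract class declares a constructor, but newInstance() cannot
        // complete it, mirroring the bottom of the stack trace above.
        System.out.println(tryInstantiate(AbstractInputFormat.class));
    }
}
{code}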

Steps to reproduce:
* Create an Iceberg table using the Hive client:
{code:sql}
CREATE TABLE IF NOT EXISTS test_flink_ice666 (queryid bigint, ts bigint, itemid string)
STORED BY ICEBERG
TBLPROPERTIES ('format-version' = '2');
{code}
* Insert data into the Iceberg table using Flink.
* Try reading the data using Hive.

We also tried setting the below property on the table using ALTER TABLE, but 
reading the data from Hive still failed.
{code:java}
ALTER TABLE test_flink_ice666 SET tblprop
{code}


In other words, once data has been inserted from Flink, running the ALTER TABLE 
command has no effect and subsequent SELECT queries keep failing.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
