[ https://issues.apache.org/jira/browse/HIVE-25567?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Stamatis Zampetakis updated HIVE-25567:
---------------------------------------
    Fix Version/s:     (was: 3.1.3)

I cleared the fixVersion field since this ticket is still open. Please review
this ticket and, if the fix is already committed to a specific version, set the
version accordingly and mark the ticket as RESOLVED. According to the
[JIRA guidelines|https://cwiki.apache.org/confluence/display/Hive/HowToContribute]
the fixVersion should be set only when the issue is resolved/closed.

> Cannot import table from Tibero to Hive.
> ----------------------------------------
>
> Key: HIVE-25567
> URL: https://issues.apache.org/jira/browse/HIVE-25567
> Project: Hive
> Issue Type: Bug
> Components: CLI, Hive
> Affects Versions: 3.1.0
> Environment: 1) RDBMS Tibero
> 2) Apache Hive
> 3) Apache Sqoop
> ============================================================
> Table Schema:
> CREATE EXTERNAL TABLE dmsdba_raw.CMM_CADORG_TB (DORG_CMPN_NO String,
> DORG_CORP_NO String,
> DORG_DLR_NO String,
> DORG_SCTN_TYPE String,
> DORG_ORG_DSCTN String,
> DORG_CLS_CODE String,
> DORG_MNGR_NAME String,
> DORG_CNTRY_CODE String,
> DORG_RGN_CODE_DEL String,
> DORG_STCD_CODE String,
> DORG_CITY_CODE String,
> DORG_ADDR1 String,
> DORG_ADDR2 String,
> DORG_ADDR3 String,
> DORG_PHONE_NO1 String,
> DORG_PHONE_NO2 String,
> DORG_FAX_NO String,
> DORG_EMAIL String,
> DORG_HMI_DFIN_YN String,
> DORG_CRTE_EMP_NO String,
> DORG_CRTE_DTIME String,
> DORG_UPDT_EMP_NO String,
> DORG_UPDT_DTIME String,
> DORG_DLR_TYPE String,
> DORG_CST_NUM String,
> DORG_CST_DATE String,
> DORG_LST_NUM String,
> DORG_LST_DATE String,
> DORG_VAT_NUM String,
> DORG_VAT_DATE String,
> DORG_RC_NO String,
> DORG_RC_TIN String,
> DORG_TRADE_DSCTN String,
> DORG_SUPPLIER String,
> DORG_PRM_YN String,
> DORG_STATUS String,
> DORG_DLR_GRADE String,
> DORG_ORDER_BLOCK String,
> DORG_WRKSHOP_TYPE String,
> DORG_IS_IN_CITY String,
> DORG_WRK_PROFILE String,
> DORG_SCTN_SUB_TYPE String,
> DORG_JDP_YN String,
> DORG_GST_NO String,
> DORG_GST_RANGE_ADD String,
> DORG_IS_UGST String,
> DORG_GRP_CITY_CODE String,
> DORG_CIN_NO String,
> IS_WA_FLAG String,
> DORG_GSTN_DATE String,
> DORG_CITY_CATEGORY String,
> DORG_TEMP1 String,
> DORG_TEMP2 String,
> DORG_TEMP3 String,
> DORG_TEMP4 String,
> DORG_LATITUDE String,
> DORG_LONGITUDE String,
> DORG_SPR_DLR_CODE String,
> DORG_DLR_CODE String,
> DORG_MAIN_DLR_YN String,
> DORG_GRP_DLR_CODE String,
> DORG_PREV_DLR_CODE String,
> DORG_RSY_DLRNO String,
> DORG_WEBSITE_EMAIL String,
> DORG_WEBSITE_URL String,
> DORG_WEBSITE_CONTNO String,
> DORG_SALE_DISTRICT String,
> DORG_WEBSITE_SER_CONTNO String,
> DORG_MERCHANT_ID String,
> DORG_DLR_PPIN_CODE String,
> DORG_DLR_EINVOICE_FLAG String,
> DORG_EINVC_USR_ID String,
> DORG_EINVC_PWD String,
> DORG_CLIENT_CD String,
> DORG_EI_CUTOFF_DATE String,
> DORG_EINVOICE_URL String,
> DORG_SOQ_ELGBL String,
> DORG_WEBSITE_ALLOW String,
> DORG_REMARK String,
> DORG_ACTIVATION_DATE String,
> DORG_UPI_ID String,
> DORG_ACCOUNT_NO String,
> DORG_IFSC_NO String,
> DORG_DLR__IRN_CATEGORY String,
> DORG_SMS_ENABLED String)
> Reporter: Ravi Kumar
> Assignee: Sushant Karki
> Priority: Major
> Labels: Tibero, sqoop-hive, sqoop-import-table
> Original Estimate: 12h
> Remaining Estimate: 12h
>
> I am trying to import a table from the Tibero RDBMS into a Hive table. I am
> able to use sqoop eval, sqoop list-tables, and sqoop list-databases, but
> sqoop import gives the error below.
> ```
> sqoop import "-Dorg.apache.sqoop.splitter.allow_text_splitter=true" \
> > --connect "jdbc:tibero:thin:@hostname:8629:DB" \
> > --driver com.tmax.tibero.jdbc.TbDriver \
> > --username XXX --password XXX@2021 \
> > --split-by 'DORG_LATITUDE' \
> > --table DMSDBA.CMM_CADORG_TB \
> > --fields-terminated-by "," \
> > --hive-import \
> > --create-hive-table \
> > --hive-table DMSDBA_raw.CMM_CADORG_TB2 \
> > --hive-overwrite
> Warning: /usr/hdp/3.0.1.0-187/accumulo does not exist! Accumulo imports will fail.
> Please set $ACCUMULO_HOME to the root of your Accumulo installation.
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> 21/09/28 09:54:10 INFO sqoop.Sqoop: Running Sqoop version: 1.4.8.3.0.1.0-187
> 21/09/28 09:54:10 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
> 21/09/28 09:54:10 WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
> 21/09/28 09:54:10 INFO manager.SqlManager: Using default fetchSize of 1000
> 21/09/28 09:54:10 INFO tool.CodeGenTool: Beginning code generation
> 21/09/28 09:54:10 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM DMSDBA.CMM_CADORG_TB AS t WHERE 1=0
> 21/09/28 09:54:10 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM DMSDBA.CMM_CADORG_TB AS t WHERE 1=0
> 21/09/28 09:54:10 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/3.0.1.0-187/hadoop-mapreduce
> 21/09/28 09:54:12 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hdfs/compile/159d1a61233887f50112dd4fb3fb129b/DMSDBA.CMM_CADORG_TB.jar
> 21/09/28 09:54:13 INFO mapreduce.ImportJobBase: Beginning import of DMSDBA.CMM_CADORG_TB
> 21/09/28 09:54:13 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM DMSDBA.CMM_CADORG_TB AS t WHERE 1=0
> 21/09/28 09:54:14 INFO client.AHSProxy: Connecting to Application History server at itisbdp/10.107.7.20:10200
> 21/09/28 09:54:14 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /user/hdfs/.staging/job_1630059382070_0035
> 21/09/28 09:54:16 INFO db.DBInputFormat: Using read commited transaction isolation
> 21/09/28 09:54:16 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(DORG_LATITUDE), MAX(DORG_LATITUDE) FROM DMSDBA.CMM_CADORG_TB
> 21/09/28 09:54:16 WARN db.TextSplitter: Generating splits for a textual index column.
> 21/09/28 09:54:16 WARN db.TextSplitter: If your database sorts in a case-insensitive order, this may result in a partial import or duplicate records.
> 21/09/28 09:54:16 WARN db.TextSplitter: You are strongly encouraged to choose an integral split column.
> 21/09/28 09:54:16 INFO mapreduce.JobSubmitter: number of splits:4
> 21/09/28 09:54:16 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1630059382070_0035
> 21/09/28 09:54:16 INFO mapreduce.JobSubmitter: Executing with tokens: []
> 21/09/28 09:54:16 INFO conf.Configuration: found resource resource-types.xml at file:/etc/hadoop/3.0.1.0-187/0/resource-types.xml
> 21/09/28 09:54:16 INFO impl.YarnClientImpl: Submitted application application_1630059382070_0035
> 21/09/28 09:54:16 INFO mapreduce.Job: The url to track the job: http://itisbdp:8088/proxy/application_1630059382070_0035/
> 21/09/28 09:54:16 INFO mapreduce.Job: Running job: job_1630059382070_0035
> 21/09/28 09:54:21 INFO mapreduce.Job: Job job_1630059382070_0035 running in uber mode : false
> 21/09/28 09:54:21 INFO mapreduce.Job: map 0% reduce 0%
> 21/09/28 09:54:25 INFO mapreduce.Job: Task Id : attempt_1630059382070_0035_m_000001_0, Status : FAILED
> Error: java.io.IOException: SQLException in nextKeyValue
> at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:275)
> at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:568)
> at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
> at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
> at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
> at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
> at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
>
> Caused by: java.sql.SQLException: JDBC-8022:Invalid end of SQL.
> at line 1, column 1430 of null:
> BLED FROM DMSDBA.CMM_CADORG_TB AS DMSDBA.CMM_CADORG_TB WHERE ( DORG_LATITUDE >=
> ^
> at com.tmax.tibero.jdbc.err.TbError.makeSQLException(Unknown Source)
> at com.tmax.tibero.jdbc.err.TbError.newSQLException(Unknown Source)
> at com.tmax.tibero.jdbc.msg.common.TbMsgError.readErrorStackInfo(Unknown Source)
> at com.tmax.tibero.jdbc.msg.TbMsgEreply.deserialize(Unknown Source)
> at com.tmax.tibero.jdbc.comm.TbStream.readMsg(Unknown Source)
> at com.tmax.tibero.jdbc.comm.TbCommType4.prepareExecute(Unknown Source)
> at com.tmax.tibero.jdbc.driver.TbPreparedStatementImpl.executeCompleteSQL(Unknown Source)
> at com.tmax.tibero.jdbc.driver.TbPreparedStatementImpl.executeInternal(Unknown Source)
> at com.tmax.tibero.jdbc.driver.TbPreparedStatementImpl.executeQuery(Unknown Source)
> at com.tmax.tibero.jdbc.driver.TbPreparedStatement.executeQuery(Unknown Source)
> at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:109)
> at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:233)
> ... 12 more
> 21/09/28 09:54:25 INFO mapreduce.Job: Task Id : attempt_1630059382070_0035_m_000003_0, Status : FAILED
> Error: java.io.IOException: SQLException in nextKeyValue
> at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:275)
> at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:568)
> at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
> at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
> at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
> at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
> at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
> ```

--
This message was sent by Atlassian Jira
(v8.20.10#820010)
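Editor's note: the stack trace above shows Tibero rejecting a statement of the form `FROM DMSDBA.CMM_CADORG_TB AS DMSDBA.CMM_CADORG_TB`. A minimal sketch of why that fails, assuming (not verified against the Sqoop source) that the fallback GenericJdbcManager reuses the user-supplied `--table` value as the table alias:

```python
# Hypothetical reconstruction of the per-split query Sqoop generates when
# --table is schema-qualified and GenericJdbcManager is used as a fallback.
table = "DMSDBA.CMM_CADORG_TB"  # value passed to --table
query = (
    f"SELECT DORG_SMS_ENABLED FROM {table} AS {table} "
    f"WHERE ( DORG_LATITUDE >= ? )"
)

# An SQL table alias must be a plain identifier. Reusing the qualified name
# leaves a dot inside the alias, so the statement is invalid, consistent
# with Tibero's JDBC-8022 "Invalid end of SQL" pointing at the AS clause.
alias = query.split(" AS ")[1].split(" ")[0]
print(alias)            # the alias still contains the schema qualifier
assert "." in alias     # i.e. not a legal alias identifier
```

If this reading is right, possible workarounds to try (both are standard Sqoop options, though their effect on Tibero is untested here): let Sqoop pick a proper connection manager instead of forcing `--driver`, or replace `--table` with a free-form `--query "SELECT * FROM DMSDBA.CMM_CADORG_TB WHERE $CONDITIONS"`, which bypasses the generated alias entirely.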