[ https://issues.apache.org/jira/browse/SQOOP-3482?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Vinodh Kumar R updated SQOOP-3482:
----------------------------------
    Description: 
+*Issue description:*+
 We are trying to sqoop-import records from a PostgreSQL server table that contains a field of datatype 'money'.
 During the import, the Postgres JDBC driver tries to read this field as datatype 'double' and eventually fails to import the record, throwing the exception below -
 Bad value for type double : 100,000.00

Steps to reproduce the behaviour:

+*Environment details -*+
 Tried with Hadoop 2.7.1 and 3.1.4
 sqoop-version = 1.4.6 and 1.4.7
 PostgreSQL JDBC driver = postgresql-42.2.16.jar
 PostgreSQL 12.4
 Linux OS - Ubuntu 20.04.1 LTS

Database Name : stg_db
 Table Name : tbl_cust
 Columns and DataType :
 cust_f_name varchar(25),
 cust_l_name varchar(25),
 trans_amount money
 The table contains 1 row, inserted as below -
 insert into tbl_cust values ('VR','Kumar',100000)

+*Sqoop Statement (Connection string)*+
 sqoop-import --connect jdbc:postgresql://192.168.0.130:5432/stg_db
 --username postgres --password cornerstone
 --table tbl_cust -m 1 --target-dir tbl_cust --delete-target-dir;
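A possible workaround sketch (not verified against this environment; the column mapping below is an assumption): Sqoop's --map-column-java option can force the money column to be read as a Java String, or a free-form --query can cast it to numeric on the server side, so the driver never sees the locale-formatted text:

```
# Option 1 (untested sketch): read the money column as a Java String.
sqoop-import --connect jdbc:postgresql://192.168.0.130:5432/stg_db \
  --username postgres --password cornerstone \
  --table tbl_cust -m 1 --target-dir tbl_cust --delete-target-dir \
  --map-column-java trans_amount=String

# Option 2 (untested sketch): cast money to numeric in a free-form query
# so the driver's getDouble() receives a plain numeric value.
sqoop-import --connect jdbc:postgresql://192.168.0.130:5432/stg_db \
  --username postgres --password cornerstone \
  --query 'SELECT cust_f_name, cust_l_name, trans_amount::numeric AS trans_amount FROM tbl_cust WHERE $CONDITIONS' \
  -m 1 --target-dir tbl_cust --delete-target-dir
```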

Below is the log message -
 _20/09/08 14:24:23 INFO mapreduce.Job: Task Id : attempt_1599505663642_0011_m_000000_0, Status : FAILED_
 _Error: java.io.IOException: SQLException in nextKeyValue_
 _at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)_
 _at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)_
 _at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)_
 _at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)_
 _at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)_
 _at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)_
 _at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)_
 _at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)_
 _at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)_
 _at java.security.AccessController.doPrivileged(Native Method)_
 _at javax.security.auth.Subject.doAs(Subject.java:422)_
 _at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)_
 _at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)_
 *_Caused by: org.postgresql.util.PSQLException: Bad value for type double : 100,000.00_*
 _at org.postgresql.jdbc.PgResultSet.toDouble(PgResultSet.java:3104)_
 _at org.postgresql.jdbc.PgResultSet.getDouble(PgResultSet.java:2432)_
 _at org.apache.sqoop.lib.JdbcWritableBridge.readDouble(JdbcWritableBridge.java:86)_
 _at com.cloudera.sqoop.lib.JdbcWritableBridge.readDouble(JdbcWritableBridge.java:69)_
 _at tabletodrop.readFields(tabletodrop.java:106)_
 _at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:244)_
 _... 12 more_

However, other tables that do not have any field of datatype 'money' are imported without any issues.
 Attached are a screenshot of the record in the PostgreSQL table and the log details for reference.
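For context, a minimal Java sketch (my own illustration, not from the report) of why the driver's getDouble() fails here: PostgreSQL sends money values as locale-formatted text such as 100,000.00, and Double.parseDouble() rejects the grouping comma. The class and method names below are hypothetical.

```java
// Hypothetical illustration of the "Bad value for type double" failure:
// the money column arrives as locale-formatted text, which
// Double.parseDouble() cannot parse because of the grouping comma.
public class MoneyParseDemo {

    // Returns true if the text parses as a Java double.
    static boolean parsesAsDouble(String text) {
        try {
            Double.parseDouble(text);
            return true;
        } catch (NumberFormatException e) {
            return false;
        }
    }

    // Client-side workaround sketch: strip grouping separators first.
    static double parseMoney(String text) {
        return Double.parseDouble(text.replace(",", ""));
    }

    public static void main(String[] args) {
        String moneyText = "100,000.00"; // value from the report's exception
        System.out.println("parses as double: " + parsesAsDouble(moneyText));
        System.out.println("after stripping commas: " + parseMoney(moneyText));
    }
}
```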



> sqoop-import fails when importing records with datatype 'Money' in PostgreSQL
> -----------------------------------------------------------------------------
>
>                 Key: SQOOP-3482
>                 URL: https://issues.apache.org/jira/browse/SQOOP-3482
>             Project: Sqoop
>          Issue Type: Bug
>          Components: connectors/postgresql
>    Affects Versions: 1.4.7
>         Environment: +*!1.JPG!!2.JPG!*+
>            Reporter: Vinodh Kumar R
>            Priority: Minor
>         Attachments: 1.JPG, 2.JPG
>



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
