[ https://issues.apache.org/jira/browse/SQOOP-423?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15356901#comment-15356901 ]
simran commented on SQOOP-423:
------------------------------

But how do I add it to a sqoop job? I tried:

{code}
sqoop job -Duser.timezone=GMT \
  --meta-connect jdbc:hsqldb:hsql://FQDN:16000/sqoop \
  --create JOB_NAME \
  -- import \
  --driver com.mysql.jdbc.Driver \
  --connect jdbc:mysql://IP/DB?zeroDateTimeBehavior=convertToNull \
  --username root --password 'PASSWORD' \
  --table TABLE_NAME \
  --incremental lastmodified --check-column updated_at --last-value 0 \
  --merge-key entity_id --split-by entity_id \
  --target-dir PATH_TO_DIRECTORY \
  --hive-database DB_NAME --hive-drop-import-delims \
  --null-string '\\N' --null-non-string '\\N' \
  --fields-terminated-by '\001' \
  --input-null-string '\\N' --input-null-non-string '\\N' \
  --input-fields-terminated-by '\001'
{code}

and:

{code}
sqoop job -Dmapred.child.java.opts="-Duser.timezone=GMT" \
  --meta-connect jdbc:hsqldb:hsql://FQDN:16000/sqoop \
  --create JOB_NAME \
  -- import \
  ... (remaining arguments identical to the command above)
{code}

but neither of these really worked.
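One possibility worth checking (this is an assumption based on how `sqoop job` separates its own arguments from the saved tool's arguments, not something verified against this exact job): generic Hadoop `-D` options placed before `--create` may be consumed by the `job` tool itself rather than stored with the saved import. In that case they would need to appear after `import`, e.g.:

{code}
# Hypothetical placement sketch, not a verified fix:
# put the generic -D option after "-- import" so it is saved
# as part of the import tool's own arguments.
sqoop job \
  --meta-connect jdbc:hsqldb:hsql://FQDN:16000/sqoop \
  --create JOB_NAME \
  -- import \
  -Dmapred.child.java.opts="-Duser.timezone=GMT" \
  --connect jdbc:mysql://IP/DB?zeroDateTimeBehavior=convertToNull \
  --username root --password 'PASSWORD' \
  --table TABLE_NAME
{code}

(Other options omitted here for brevity; they would be the same as in the full command above.)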
> Sqoop import of timestamps to Avro from Postgres - Timezone Issue
> -----------------------------------------------------------------
>
>                 Key: SQOOP-423
>                 URL: https://issues.apache.org/jira/browse/SQOOP-423
>             Project: Sqoop
>          Issue Type: Bug
>    Affects Versions: 1.3.0
>            Reporter: Lynn Goh
>
> I am running sqoop-1.3.0-cdh3u2 on a Mac, and when I sqoop import from a
> Postgres table with columns of type 'timestamp without time zone', they are
> converted to longs in the time zone of my local operating system, even after
> I have started Hadoop with TZ=GMT or passed in
> HADOOP_OPTS="-Duser.timezone=GMT". My ultimate goal is to sqoop import into
> long representations that are in the GMT timezone rather than my operating
> system's timezone.
> Postgres example:
> {code}
>  acamp_id |     start_time      |      end_time
> ----------+---------------------+---------------------
>         1 | 2008-01-01 00:00:00 | 2011-12-16 00:00:00
> {code}
> After import, you can see the values are 8 hours ahead, even with TZ=GMT and
> user.timezone set properly (this is the JSON representation of the parsed
> imported Avro file):
> {code}
> {"acamp_id": 1, "end_time": 1324022400000, "start_time": 1199174400000}
> {code}
> date utility invocation:
> {code}
> lynngoh@unknown:~$ date -u -r 1199174400
> Tue Jan  1 08:00:00 UTC 2008
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
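The 8-hour skew in the report is exactly what you get when the naive timestamp {{2008-01-01 00:00:00}} is interpreted in a UTC-8 zone (US Pacific standard time, matching the reporter's Mac) instead of GMT. A small sketch of that arithmetic (illustration only; Sqoop itself does this conversion inside the JVM):

{code}
from datetime import datetime, timezone, timedelta

# The naive Postgres value from the report.
naive = datetime(2008, 1, 1, 0, 0, 0)

# Interpreted as UTC -- what the reporter wants:
as_utc = naive.replace(tzinfo=timezone.utc)
# Interpreted in a UTC-8 default zone -- what actually happened:
as_pst = naive.replace(tzinfo=timezone(timedelta(hours=-8)))

millis_utc = int(as_utc.timestamp() * 1000)
millis_pst = int(as_pst.timestamp() * 1000)

print(millis_utc)   # 1199145600000
print(millis_pst)   # 1199174400000 -- the value in the Avro file
print((millis_pst - millis_utc) // 3_600_000)   # 8-hour skew
{code}

So {{date -u -r 1199174400}} printing {{08:00:00 UTC}} confirms the long was produced in the local zone, not GMT.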