[ https://issues.apache.org/jira/browse/SQOOP-3463?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Toan Nguyen updated SQOOP-3463:
-------------------------------
    Description: 
Here is my script:

" sqoop-import --connect $DATASOURCE --username $USERNAME --password $PASSWORD 
-driver com.mysql.jdbc.Driver --query "SELECT * FROM sales_order_item WHERE 
item_id > 0 AND \$CONDITIONS LIMIT $ITEM_ID, 10000" --target-dir 
/user/raw/magento/$MERCHANT_ID/sales_order_item -m 8 --split-by item_id 
--fields-terminated-by "|" --merge-key item_id --hive-import --hive-table 
raw_magento_$MERCHANT_ID.sales_order_item --verbose --direct – "

After the process completed, I got only 9992 records. The same thing happens 
with n mappers: when I increase n, the import loses n records.

Everything is OK with only 1 mapper, but to import a large number of records I 
have to use more mappers. What should I do? Please support me. Thanks in 
advance.
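
For context on the free-form query import: Sqoop replaces \$CONDITIONS in each 
map task's copy of the query with a range predicate over the --split-by 
column. A sketch of what one of the 8 tasks would run, with <lo> and <hi> as 
placeholder split bounds (an illustration, not values taken from this ticket):

    SELECT * FROM sales_order_item
    WHERE item_id > 0
      AND ( item_id >= <lo> AND item_id < <hi> )  -- substituted for $CONDITIONS
    LIMIT $ITEM_ID, 10000                         -- offset and cap applied per split

Because the LIMIT clause is carried into every mapper's query, the offset and 
row cap apply inside each split rather than once over the whole result set, 
which is one plausible reason the total row count varies with the number of 
mappers (an assumption; the ticket does not confirm the cause).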

  was:
Here is my script:

" sqoop-import --connect $DATASOURCE --username $USERNAME --password $PASSWORD 
--driver com.mysql.jdbc.Driver \
--query "SELECT * FROM sales_order_item WHERE item_id > 0 AND \$CONDITIONS 
LIMIT $ITEM_ID, 10000" \
--target-dir /user/raw/magento/$MERCHANT_ID/sales_order_item -m 8 \
--split-by item_id \
--fields-terminated-by "|" \
--merge-key item_id \
--hive-import \
--hive-table raw_magento_$MERCHANT_ID.sales_order_item \
--verbose \
--direct -- "

After the process completed, I got only 9992 records. The same thing happens 
with n mappers: when I increase n, the import loses n records.

Everything is OK with only 1 mapper, but to import a large number of records I 
have to use more mappers. What should I do? Please support me. Thanks in 
advance.


> Sqoop import from MySQL to Hive fails when increasing mappers
> -------------------------------------------------------------
>
>                 Key: SQOOP-3463
>                 URL: https://issues.apache.org/jira/browse/SQOOP-3463
>             Project: Sqoop
>          Issue Type: Bug
>            Reporter: Toan Nguyen
>            Priority: Major
>
> Here is my script:
> " sqoop-import --connect $DATASOURCE --username $USERNAME --password 
> $PASSWORD --driver com.mysql.jdbc.Driver --query "SELECT * FROM 
> sales_order_item WHERE item_id > 0 AND \$CONDITIONS LIMIT $ITEM_ID, 10000" 
> --target-dir /user/raw/magento/$MERCHANT_ID/sales_order_item -m 8 --split-by 
> item_id --fields-terminated-by "|" --merge-key item_id --hive-import 
> --hive-table raw_magento_$MERCHANT_ID.sales_order_item --verbose --direct -- "
> After the process completed, I got only 9992 records. The same thing happens 
> with n mappers: when I increase n, the import loses n records.
> Everything is OK with only 1 mapper, but to import a large number of records 
> I have to use more mappers. What should I do? Please support me. Thanks in 
> advance.
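
A possible workaround, sketched under the assumption that the LIMIT/split 
interaction noted above is the culprit: replace the LIMIT offset with explicit 
key-range predicates, so that the paging window and the split conditions both 
operate on item_id. $UPPER_ID below is a hypothetical variable marking the 
upper key of the intended 10000-row window; the flags used are from the 
original command.

    # $UPPER_ID is hypothetical: the last item_id of the intended window
    sqoop-import --connect $DATASOURCE --username $USERNAME --password $PASSWORD \
      --driver com.mysql.jdbc.Driver \
      --query "SELECT * FROM sales_order_item WHERE item_id > $ITEM_ID AND item_id <= $UPPER_ID AND \$CONDITIONS" \
      --target-dir /user/raw/magento/$MERCHANT_ID/sales_order_item \
      -m 8 --split-by item_id \
      --fields-terminated-by "|" --merge-key item_id \
      --hive-import --hive-table raw_magento_$MERCHANT_ID.sales_order_item

With no LIMIT in the query, each mapper selects a disjoint slice of the same 
fixed key window, so the union of the splits covers exactly the intended rows 
regardless of the mapper count.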



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
