I just started using Hadoop, Hive, and Sqoop today, so sorry in advance for any stupid questions.

I managed to run a sqoop-import command, and its output ends with "Hive import 
complete". But I cannot see the table in Hive: the Hive command "show tables" 
shows nothing.

My Sqoop command is: 
/bin/sqoop-import --connect jdbc:mysql://xxx.com/yyyy --table user_entity \
    --username uuuuuu --password ppppp \
    --hive-table h_user_entity --create-hive-table
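
One thing I noticed while writing this up: the output below warns that the
Hive-specific options are ignored unless --hive-import is passed, and my
command above does not include it (although, confusingly, the log still ends
with "Hive import complete"). In case the missing flag is part of the problem,
this is the variant I plan to try next, identical except for the added
--hive-import:

/bin/sqoop-import --connect jdbc:mysql://xxx.com/yyyy --table user_entity \
    --username uuuuuu --password ppppp \
    --hive-import --hive-table h_user_entity --create-hive-table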
My Hive command is:
show tables;
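
In case I am simply looking in the wrong database (the log below says the data
was loaded into default.h_user_entity), I also plan to check the default
database explicitly from the hive shell:

use default;
show tables;
show tables 'h_user_entity';

(As far as I understand, show tables with a quoted pattern should match the
table name directly, if the table exists.)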
The Sqoop command output is:
Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: $HADOOP_HOME is deprecated.

13/11/19 15:02:16 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/11/19 15:02:16 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
13/11/19 15:02:16 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
13/11/19 15:02:16 WARN tool.BaseSqoopTool: It seems that you've specified at least one of following:
13/11/19 15:02:16 WARN tool.BaseSqoopTool: --hive-home
13/11/19 15:02:16 WARN tool.BaseSqoopTool: --hive-overwrite
13/11/19 15:02:16 WARN tool.BaseSqoopTool: --create-hive-table
13/11/19 15:02:16 WARN tool.BaseSqoopTool: --hive-table
13/11/19 15:02:16 WARN tool.BaseSqoopTool: --hive-partition-key
13/11/19 15:02:16 WARN tool.BaseSqoopTool: --hive-partition-value
13/11/19 15:02:16 WARN tool.BaseSqoopTool: --map-column-hive
13/11/19 15:02:16 WARN tool.BaseSqoopTool: Without specifying parameter --hive-import. Please note that
13/11/19 15:02:16 WARN tool.BaseSqoopTool: those arguments will not be used in this session. Either
13/11/19 15:02:16 WARN tool.BaseSqoopTool: specify --hive-import to apply them correctly or remove them
13/11/19 15:02:16 WARN tool.BaseSqoopTool: from command line to remove this warning.
13/11/19 15:02:16 INFO tool.BaseSqoopTool: Please note that --hive-home, --hive-partition-key,
13/11/19 15:02:16 INFO tool.BaseSqoopTool:  hive-partition-value and --map-column-hive options are
13/11/19 15:02:16 INFO tool.BaseSqoopTool:  are also valid for HCatalog imports and exports
13/11/19 15:02:16 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13/11/19 15:02:16 INFO tool.CodeGenTool: Beginning code generation
13/11/19 15:02:16 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `user_entity` AS t LIMIT 1
13/11/19 15:02:16 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `user_entity` AS t LIMIT 1
13/11/19 15:02:16 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /data/hadoop/current
Note: /tmp/sqoop-root/compile/39e8d314afcc1b81c1cc50a6c3d396b1/user_entity.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/11/19 15:02:17 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/39e8d314afcc1b81c1cc50a6c3d396b1/user_entity.jar
13/11/19 15:02:17 WARN manager.MySQLManager: It looks like you are importing from mysql.
13/11/19 15:02:17 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
13/11/19 15:02:17 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
13/11/19 15:02:17 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
13/11/19 15:02:17 INFO mapreduce.ImportJobBase: Beginning import of user_entity
13/11/19 15:02:18 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`id`), MAX(`id`) FROM `user_entity`
13/11/19 15:02:18 WARN db.TextSplitter: Generating splits for a textual index column.
13/11/19 15:02:18 WARN db.TextSplitter: If your database sorts in a case-insensitive order, this may result in a partial import or duplicate records.
13/11/19 15:02:18 WARN db.TextSplitter: You are strongly encouraged to choose an integral split column.
13/11/19 15:02:19 INFO mapred.JobClient: Running job: job_201311181910_0007
13/11/19 15:02:20 INFO mapred.JobClient:  map 0% reduce 0%
13/11/19 15:02:29 INFO mapred.JobClient:  map 16% reduce 0%
13/11/19 15:02:32 INFO mapred.JobClient:  map 33% reduce 0%
13/11/19 15:02:35 INFO mapred.JobClient:  map 50% reduce 0%
13/11/19 15:02:39 INFO mapred.JobClient:  map 66% reduce 0%
13/11/19 15:02:42 INFO mapred.JobClient:  map 83% reduce 0%
13/11/19 15:02:43 INFO mapred.JobClient:  map 100% reduce 0%
13/11/19 15:02:44 INFO mapred.JobClient: Job complete: job_201311181910_0007
13/11/19 15:02:44 INFO mapred.JobClient: Counters: 18
13/11/19 15:02:44 INFO mapred.JobClient:   Job Counters
13/11/19 15:02:44 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=38087
13/11/19 15:02:44 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
13/11/19 15:02:44 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
13/11/19 15:02:44 INFO mapred.JobClient:     Launched map tasks=6
13/11/19 15:02:44 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
13/11/19 15:02:44 INFO mapred.JobClient:   File Output Format Counters
13/11/19 15:02:44 INFO mapred.JobClient:     Bytes Written=85103032
13/11/19 15:02:44 INFO mapred.JobClient:   FileSystemCounters
13/11/19 15:02:44 INFO mapred.JobClient:     HDFS_BYTES_READ=825
13/11/19 15:02:44 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=389460
13/11/19 15:02:44 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=85103032
13/11/19 15:02:44 INFO mapred.JobClient:   File Input Format Counters
13/11/19 15:02:44 INFO mapred.JobClient:     Bytes Read=0
13/11/19 15:02:44 INFO mapred.JobClient:   Map-Reduce Framework
13/11/19 15:02:44 INFO mapred.JobClient:     Map input records=387423
13/11/19 15:02:44 INFO mapred.JobClient:     Physical memory (bytes) snapshot=787578880
13/11/19 15:02:44 INFO mapred.JobClient:     Spilled Records=0
13/11/19 15:02:44 INFO mapred.JobClient:     CPU time spent (ms)=22420
13/11/19 15:02:44 INFO mapred.JobClient:     Total committed heap usage (bytes)=1205403648
13/11/19 15:02:44 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=6980870144
13/11/19 15:02:44 INFO mapred.JobClient:     Map output records=387423
13/11/19 15:02:44 INFO mapred.JobClient:     SPLIT_RAW_BYTES=825
13/11/19 15:02:44 INFO mapreduce.ImportJobBase: Transferred 81.1606 MB in 26.1394 seconds (3.1049 MB/sec)
13/11/19 15:02:44 INFO mapreduce.ImportJobBase: Retrieved 387423 records.
13/11/19 15:02:44 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `user_entity` AS t LIMIT 1
13/11/19 15:02:44 INFO hive.HiveImport: Removing temporary files from import process: hdfs://localhost:9000/user/root/user_entity/_logs
13/11/19 15:02:44 INFO hive.HiveImport: Loading uploaded data into Hive
13/11/19 15:02:46 INFO hive.HiveImport:
13/11/19 15:02:46 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/data/hive/hive-0.12.0/lib/hive-common-0.12.0.jar!/hive-log4j.properties
13/11/19 15:02:51 INFO hive.HiveImport: OK
13/11/19 15:02:51 INFO hive.HiveImport: Time taken: 5.221 seconds
13/11/19 15:02:51 INFO hive.HiveImport: Loading data to table default.h_user_entity
13/11/19 15:02:52 INFO hive.HiveImport: Table default.h_user_entity stats: [num_partitions: 0, num_files: 7, num_rows: 0, total_size: 85103032, raw_data_size: 0]
13/11/19 15:02:52 INFO hive.HiveImport: OK
13/11/19 15:02:52 INFO hive.HiveImport: Time taken: 0.636 seconds
13/11/19 15:02:52 INFO hive.HiveImport: Hive import complete.
13/11/19 15:02:52 INFO hive.HiveImport: Export directory is empty, removing it.
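
One more check I intend to run: the counters above show HDFS_BYTES_WRITTEN=85103032 
and the Hive load reports total_size: 85103032 for default.h_user_entity, so the 
data files should exist under Hive's warehouse directory. Assuming the default 
warehouse location (I have not changed hive.metastore.warehouse.dir):

hadoop fs -ls /user/hive/warehouse/h_user_entity

If the files are there but "show tables" still returns nothing, my guess is that 
the hive shell I am running is talking to a different metastore than the one 
Sqoop loaded into, but I do not know how to confirm that.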

Any help is highly appreciated.

-Kevin
