I am trying to use the following command to import a simple text file into an HBase table, but I am having issues. The job appears to run end to end (it reads from HDFS and connects to HBase), yet nothing lands in the table, so I suspect some fundamental data-setup problem on my side. I have pasted my input file and a scan of the HBase employee table below. Any help or a pointer to a simple example would be appreciated.
Hadoop version: 1.2.0
HBase version: 0.94.15
HBase table name: employee

Command I am issuing:

/hd/hadoop/bin/hadoop jar /hbase/hbase-0.94.15/hbase-0.94.15.jar importtsv -Dimporttsv.columns=HBASE_ROW_KEY,basic_info:empname,basic_info:age employee /user/hduser/employee1

Job output:

14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.5-1392090, built on 09/30/2012 17:52 GMT
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:host.name=usann01
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:java.version=1.7.0_25
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:java.home=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.25/jre
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/hd/hadoop/libexec/../conf:/usr/bin/java/lib/tools.jar:/hd/hadoop/libexec/..:/hd/hadoop/libexec/../hadoop-core-1.2.1.jar:/hd/hadoop/libexec/../lib/asm-3.2.jar:/hd/hadoop/libexec/../lib/aspectjrt-1.6.11.jar:/hd/hadoop/libexec/../lib/aspectjtools-1.6.11.jar:/hd/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/hd/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/hd/hadoop/libexec/../lib/commons-cli-1.2.jar:/hd/hadoop/libexec/../lib/commons-codec-1.4.jar:/hd/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/hd/hadoop/libexec/../lib/commons-configuration-1.6.jar:/hd/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/hd/hadoop/libexec/../lib/commons-digester-1.8.jar:/hd/hadoop/libexec/../lib/commons-el-1.0.jar:/hd/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/hd/hadoop/libexec/../lib/commons-io-2.1.jar:/hd/hadoop/libexec/../lib/commons-lang-2.4.jar:/hd/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/hd/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/hd/hadoop/libexec/../lib/commons-math-2.1.jar:/hd/hadoop/libexec/../lib/commons-net-3.1.jar:/hd/hadoop/libexec/../lib/core-3.1.1.jar:/hd/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.2.1.jar:/hd/hadoop/libexec/../lib/hadoop-fairscheduler-1.2.1.jar:/hd/hadoop/libexec/../lib/hadoop-thriftfs-1.2.1.jar:/hd/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/hd/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/hd/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/hd/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/hd/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/hd/hadoop/libexec/../lib/jdeb-0.8.jar:/hd/hadoop/libexec/../lib/jersey-core-1.8.jar:/hd/hadoop/libexec/../lib/jersey-json-1.8.jar:/hd/hadoop/libexec/../lib/jersey-server-1.8.jar:/hd/hadoop/libexec/../lib/jets3t-0.6.1.jar:/hd/hadoop/libexec/../lib/jetty-6.1.26.jar:/hd/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/hd/hadoop/libexec/../lib/jsch-0.1.42.jar:/hd/hadoop/libexec/../lib/junit-4.5.jar:/hd/hadoop/libexec/../lib/kfs-0.2.2.jar:/hd/hadoop/libexec/../lib/log4j-1.2.15.jar:/hd/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/hd/hadoop/libexec/../lib/oro-2.0.8.jar:/hd/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/hd/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/hd/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/hd/hadoop/libexec/../lib/xmlenc-0.52.jar:/hd/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/hd/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar::/hbase/hbase-0.94.15/lib/guava-11.0.2.jar:/hbase/hbase-0.94.15/lib/zookeeper-3.4.5.jar:/hbase/hbase-0.94.15/lib/protobuf-java-2.4.0a.jar
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/hd/hadoop/libexec/../lib/native/Linux-i386-32
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:os.arch=i386
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:os.version=2.6.32-358.14.1.el6.i686
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:user.name=hduser
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:user.home=/home/hduser
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:user.dir=/hbase/hbase-0.94.15/bin
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=180000 watcher=hconnection
14/01/29 17:36:02 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 18287@usann01
14/01/29 17:36:02 INFO zookeeper.ClientCnxn: Opening socket connection to server localhost/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error)
14/01/29 17:36:02 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:2181, initiating session
14/01/29 17:36:02 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x143dff9049b0007, negotiated timeout = 40000
14/01/29 17:36:04 INFO mapreduce.TableOutputFormat: Created table instance for employee
14/01/29 17:36:04 INFO input.FileInputFormat: Total input paths to process : 1
14/01/29 17:36:04 INFO util.NativeCodeLoader: Loaded the native-hadoop library
14/01/29 17:36:04 WARN snappy.LoadSnappy: Snappy native library not loaded
14/01/29 17:36:04 INFO mapred.JobClient: Running job: job_201401291611_0001
14/01/29 17:36:05 INFO mapred.JobClient: map 0% reduce 0%
14/01/29 17:36:19 INFO mapred.JobClient: map 100% reduce 0%
14/01/29 17:36:21 INFO mapred.JobClient: Job complete: job_201401291611_0001
14/01/29 17:36:21 INFO mapred.JobClient: Counters: 19
14/01/29 17:36:21 INFO mapred.JobClient:   Job Counters
14/01/29 17:36:21 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=8187
14/01/29 17:36:21 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
14/01/29 17:36:21 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
14/01/29 17:36:21 INFO mapred.JobClient:     Launched map tasks=1
14/01/29 17:36:21 INFO mapred.JobClient:     Data-local map tasks=1
14/01/29 17:36:21 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
14/01/29 17:36:21 INFO mapred.JobClient:   ImportTsv
14/01/29 17:36:21 INFO mapred.JobClient:     Bad Lines=3
14/01/29 17:36:21 INFO mapred.JobClient:   File Output Format Counters
14/01/29 17:36:21 INFO mapred.JobClient:     Bytes Written=0
14/01/29 17:36:21 INFO mapred.JobClient:   FileSystemCounters
14/01/29 17:36:21 INFO mapred.JobClient:     HDFS_BYTES_READ=131
14/01/29 17:36:21 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=79714
14/01/29 17:36:21 INFO mapred.JobClient:   File Input Format Counters
14/01/29 17:36:21 INFO mapred.JobClient:     Bytes Read=24
14/01/29 17:36:21 INFO mapred.JobClient:   Map-Reduce Framework
14/01/29 17:36:21 INFO mapred.JobClient:     Map input records=3
14/01/29 17:36:21 INFO mapred.JobClient:     Physical memory (bytes) snapshot=38928384
14/01/29 17:36:21 INFO mapred.JobClient:     Spilled Records=0
14/01/29 17:36:21 INFO mapred.JobClient:     CPU time spent (ms)=120
14/01/29 17:36:21 INFO mapred.JobClient:     Total committed heap usage (bytes)=9502720
14/01/29 17:36:21 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=347398144
14/01/29 17:36:21 INFO mapred.JobClient:     Map output records=0
14/01/29 17:36:21 INFO mapred.JobClient:     SPLIT_RAW_BYTES=107
[hduser@usann01 bin]$

Here is my file on HDFS:

[hduser@usann01 bin]$ /hd/hadoop/bin/hadoop dfs -cat /user/hduser/employee1
emp1,24
emp2,26
emp3,24

Here is my table in HBase:

hbase(main):003:0> scan 'employee'
ROW          COLUMN+CELL
0 row(s) in 0.0160 seconds
hbase(main):004:0>
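One thing I now suspect is the input format itself: importtsv reads tab-separated input by default, and my -Dimporttsv.columns list names three fields (the row key plus basic_info:empname and basic_info:age), while each line of my file has only two comma-separated values. That would be consistent with Bad Lines=3 and Map output records=0 above. Here is a sketch of what I plan to try locally (the file names and the name1..name3 values are placeholders I made up, not real data):

```shell
# Build a sample input with THREE fields per line to match
# -Dimporttsv.columns=HBASE_ROW_KEY,basic_info:empname,basic_info:age.
printf 'emp1,name1,24\nemp2,name2,26\nemp3,name3,24\n' > employee1.csv

# importtsv's default separator is a tab, so convert the commas to tabs
# before pushing the file up to HDFS with hadoop dfs -put.
tr ',' '\t' < employee1.csv > employee1.tsv
cat employee1.tsv
```

If I understand the docs correctly, passing -Dimporttsv.separator=, would also let importtsv consume the commas directly, but the number of fields per line would still have to match the columns list.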