There was a hadoop version issue. Thanks.

On 13 Apr 2015, 15:13, "Shahab Yunus" <[email protected]> wrote:
> Silvio, did your problem get resolved or not? I am assuming you have
> already seen example 7.2.4 from here:
> http://hbase.apache.org/0.94/book/mapreduce.example.html
>
> There seems to be a type mismatch in the job setup, along with the hadoop
> version.
>
> If you still have an issue, can you paste your code? Both the
> configuration as well as your reducer. Thanks.
>
> Regards,
> Shahab
>
> On Mon, Apr 13, 2015 at 8:21 AM, Silvio Di gregorio <
> [email protected]> wrote:
>
> > The signature of the write method is:
> > write(ImmutableBytesWritable arg0, Writable arg1)
> >
> > arg0 doesn't accept NullWritable.get(), only "null".
> >
> > 2015-04-13 14:13 GMT+02:00 Silvio Di gregorio <
> > [email protected]>:
> >
> > > In the documentation
> > > (http://hbase.apache.org/0.94/book/mapreduce.example.html),
> > > when the Reducer extends the TableReducer class, the write method
> > > puts a null key, or rather NullWritable, like Shahab says.
> > > However, the error disappeared when I removed
> > > "hbase-client-0.96.0-hadoop1.jar" and inserted
> > > "hbase-0.94.6-cdh4.3.0.jar".
> > > Sorry, but I don't understand; I only knew that, with
> > > "hbase-client-0.96.0-hadoop1.jar" in the classpath, the Context
> > > object did not accept a Put where a Writable was required. With
> > > "hbase-0.94.6-cdh4.3.0.jar" it is possible.
> > > 2015-04-13 13:52 GMT+02:00 Jean-Marc Spaggiari <
> > > [email protected]>:
> > >
> > > > Oh, Shahab is right! That's what happens when you write emails
> > > > before your coffee ;) I confused it with your "Put" key ;) Looked
> > > > too quickly...
> > > >
> > > > JM
> > > >
> > > > 2015-04-13 7:46 GMT-04:00 Shahab Yunus <[email protected]>:
> > > >
> > > > > For the null key you should use the NullWritable class, as
> > > > > discussed here:
> > > > > http://stackoverflow.com/questions/16198752/advantages-of-using-nullwritable-in-hadoop
> > > > >
> > > > > Regards,
> > > > > Shahab
> > > > >
> > > > > On Mon, Apr 13, 2015 at 7:01 AM, Jean-Marc Spaggiari <
> > > > > [email protected]> wrote:
> > > > >
> > > > > > Hi Silvio,
> > > > > >
> > > > > > What is the key you are trying to write into your HBase table?
> > > > > > From your code, it sounds like you want your key to be null for
> > > > > > all your values, which is not possible in HBase.
> > > > > >
> > > > > > JM
> > > > > >
> > > > > > 2015-04-13 6:37 GMT-04:00 Silvio Di gregorio <
> > > > > > [email protected]>:
> > > > > >
> > > > > > > Hi,
> > > > > > > In the Reduce phase, when I write to the HBase table
> > > > > > > "PFTableNa" with
> > > > > > > context.write(null, put);
> > > > > > > Eclipse tells me:
> > > > > > > *"The method write(ImmutableBytesWritable, Writable) in the type
> > > > > > > TaskInputOutputContext<Text,BytesWritable,ImmutableBytesWritable,Writable>
> > > > > > > is not applicable for the arguments (null, Put)"*
> > > > > > >
> > > > > > > *put* is org.apache.hadoop.hbase.client.Put, built with the
> > > > > > > Put(byte[] row) constructor:
> > > > > > > byte[] rowkey = key.getBytes();
> > > > > > > Put put = new Put(rowkey);
> > > > > > >
> > > > > > > The signature of the reduce method:
> > > > > > > reduce(Text key, Iterable<BytesWritable> values, Context context)
> > > > > > >
> > > > > > > and
> > > > > > >
> > > > > > > public static class Reduce extends TableReducer<Text,
> > > > > > > BytesWritable, ImmutableBytesWritable> {
> > > > > > >
> > > > > > > In the main method:
> > > > > > > Configuration conf = HBaseConfiguration.create();
> > > > > > > Job job = new Job(conf, "LetturaFileHDFS2HBase");
> > > > > > > ...
> > > > > > > TableMapReduceUtil.initTableReducerJob("PFTableNa", Reduce.class, job);
> > > > > > >
> > > > > > > Thanks a lot
> > > > > > > Silvio
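For anyone landing on this thread with the same compile error: the likely root cause is that in the hbase-client 0.96 line, Put no longer implements org.apache.hadoop.io.Writable (it did in 0.94), so the generated `write(ImmutableBytesWritable, Writable)` overload stops matching `(null, Put)`. The following is a minimal, self-contained sketch using hypothetical stand-in classes (not the real HBase/Hadoop types) that reproduces the compiler's behavior in both situations:

```java
// Stand-in types for illustration only -- hypothetical, not the real
// HBase/Hadoop classes.
interface Writable {}                        // role of org.apache.hadoop.io.Writable

class ImmutableBytesWritable implements Writable {}

class Put094 implements Writable {}          // 0.94-style Put: is a Writable
class Put096 {}                              // 0.96-style Put: no longer a Writable

class Context {
    // Same shape as TaskInputOutputContext.write(KEYOUT, VALUEOUT) with
    // KEYOUT = ImmutableBytesWritable and VALUEOUT = Writable.
    String write(ImmutableBytesWritable key, Writable value) {
        return "key=" + key + " value=" + value.getClass().getSimpleName();
    }
}

public class WritableMismatchDemo {
    public static void main(String[] args) {
        Context ctx = new Context();

        // Compiles: a 0.94-style Put *is* a Writable, and a null key is
        // accepted because null matches any reference type.
        System.out.println(ctx.write(null, new Put094()));

        // Does NOT compile with the 0.96-style Put -- uncommenting this
        // line reproduces the Eclipse error "The method
        // write(ImmutableBytesWritable, Writable) is not applicable for
        // the arguments (null, Put)":
        // ctx.write(null, new Put096());

        // Note: NullWritable.get() would not compile as the key either,
        // because the key parameter is typed ImmutableBytesWritable, not
        // Writable -- matching Silvio's observation above.
    }
}
```

The null key itself is harmless here: as the HBase book's example shows, TableOutputFormat takes the row key from the Put, so the reducer's output key is effectively ignored.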
