----------------------------------------
> From: johnlu...@hotmail.com
> To: user@cassandra.apache.org
> Subject: RE: cassandra hadoop reducer writing to CQL3 - primary key - must it be text type?
> Date: Wed, 9 Oct 2013 18:33:13 -0400
>
> reduce method :
>
>         public void reduce(LongWritable writableRecid, Iterable<LongWritable> values, Context context) throws IOException, InterruptedException
>         {
>             Long sum = 0L;
>             Long recordid = writableRecid.get();
>             List<ByteBuffer> vbles = null;
>             byte[] longByterray = new byte[8];
>             for(int i= 0; i < 8; i++) {
>                 longByterray[i] = (byte)(recordid>> (i * 8));
>             }
>             ByteBuffer recordIdByteBuf = ByteBuffer.allocate(8);
>             recordIdByteBuf.wrap(longByterray);
>             keys.put("recordid", recordIdByteBuf);
>                       ...
>             context.write(keys, vbles);
>         }
>
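
For the record, the snippet above never actually fills recordIdByteBuf:
ByteBuffer.wrap is a static factory, so that call just builds a new buffer and
throws it away, leaving the allocated buffer all zeroes. The shift loop also
writes the low byte first, which is not the big-endian layout Cassandra's
LongType expects. A quick standalone check with plain java.nio (the class name
is just a placeholder):

    import java.nio.ByteBuffer;

    public class WrapCheck
    {
        public static void main(String[] args)
        {
            long recordid = 42L;
            byte[] longBytes = ByteBuffer.allocate(8).putLong(recordid).array();

            // wrap() is static: this call builds and discards a new buffer,
            // so recordIdByteBuf itself stays zero-filled
            ByteBuffer recordIdByteBuf = ByteBuffer.allocate(8);
            recordIdByteBuf.wrap(longBytes);

            System.out.println(recordIdByteBuf.getLong(0));            // prints 0
            System.out.println(ByteBuffer.wrap(longBytes).getLong(0)); // prints 42
        }
    }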

I finally got it working. After finding the LongSerializer class source in
Cassandra, I can see that the correct way to build a ByteBuffer key from a
Long is:

    public ByteBuffer serialize(Long value)
    {
        return value == null ? ByteBufferUtil.EMPTY_BYTE_BUFFER : ByteBufferUtil.bytes(value);
    }
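
So in the reducer, the key column just needs to go in as
ByteBufferUtil.bytes(recordid). Roughly what the reduce method ends up looking
like now (the summing and the single bound variable are only illustrative,
since I left that part out above, and SumToCqlReducer is just a placeholder
name):

    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.util.LinkedHashMap;
    import java.util.LinkedList;
    import java.util.List;
    import java.util.Map;

    import org.apache.cassandra.utils.ByteBufferUtil;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.mapreduce.Reducer;

    // output key = primary key columns by name, output value = bound variables,
    // matching what context.write(keys, vbles) was already passing above
    public class SumToCqlReducer
        extends Reducer<LongWritable, LongWritable, Map<String, ByteBuffer>, List<ByteBuffer>>
    {
        @Override
        public void reduce(LongWritable writableRecid, Iterable<LongWritable> values, Context context)
            throws IOException, InterruptedException
        {
            long sum = 0L;
            for (LongWritable value : values)
                sum += value.get();

            // serialize the Long key the same way Cassandra's LongSerializer does
            Map<String, ByteBuffer> keys = new LinkedHashMap<String, ByteBuffer>();
            keys.put("recordid", ByteBufferUtil.bytes(writableRecid.get()));

            // bound variables for the prepared update, in the order they appear in it
            List<ByteBuffer> vbles = new LinkedList<ByteBuffer>();
            vbles.add(ByteBufferUtil.bytes(sum));

            context.write(keys, vbles);
        }
    }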

John
