Wouldn't it be better to delegate the compression to Cassandra (which
supports Snappy [1])? This way the compression will be completely
transparent to your application.

[1] http://www.datastax.com/dev/blog/whats-new-in-cassandra-1-0-compression
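
For example, something along these lines should do it (just a sketch, run once
against the table from your mail below; the contact point is a placeholder, and
on 1.2 SnappyCompressor may already be the default for newly created tables):

    import com.datastax.driver.core.Cluster;
    import com.datastax.driver.core.Session;

    public class EnableSnappyCompression {
        public static void main(String[] args) {
            // Placeholder contact point; use whatever your DatastaxConnection points at.
            Cluster cluster = Cluster.builder().addContactPoint("127.0.0.1").build();
            Session session = cluster.connect("testingks");

            // Server-side compression is a table property. Once set, Cassandra
            // compresses and decompresses the SSTables transparently, so the
            // application keeps reading and writing plain byte arrays.
            session.execute("ALTER TABLE sample_table "
                    + "WITH compression = {'sstable_compression': 'SnappyCompressor'};");

            cluster.shutdown();  // driver 1.x; newer driver versions use close()
        }
    }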


On Tue, Jan 28, 2014 at 8:51 PM, Check Peck <comptechge...@gmail.com> wrote:

> I am working on a project in which I am supposed to store snappy-compressed
> data in Cassandra, so that when I retrieve the same data from Cassandra it is
> still snappy compressed in memory, and I then decompress it with snappy to
> get the original data back.
>
> I have a byte array in the `bytesToStore` variable; I compress it with the
> Google `Snappy` library and then store it in Cassandra -
>
>     // .. some code here
>     System.out.println(bytesToStore);
>
>     byte[] compressed = Snappy.compress(bytesToStore);
>
>     attributesMap.put("e1", compressed);
>
>     ICassandraClient client = CassandraFactory.getInstance().getDao();
>     // write to Cassandra
>     client.upsertAttributes("0123", attributesMap, "sample_table");
>
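
For the compress/decompress part on its own, here is a minimal round-trip
sketch, assuming the snappy-java library (org.xerial.snappy) that the
Snappy.compress(byte[]) call above suggests; the sample string is the one
visible in your cqlsh output:

    import org.xerial.snappy.Snappy;

    public class SnappyRoundTrip {
        public static void main(String[] args) throws Exception {
            byte[] bytesToStore = "Byte Array Test For Big Endian".getBytes("UTF-8");
            byte[] compressed = Snappy.compress(bytesToStore);  // what you write to Cassandra
            byte[] restored = Snappy.uncompress(compressed);    // what you would do after reading it back
            System.out.println(new String(restored, "UTF-8"));
        }
    }
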
> After inserting the data into Cassandra, I went into cqlsh and queried the
> table, and I can see the data for test_id `0123` -
>
>     cqlsh:testingks> select * from sample_table where test_id = '0123';
>
>      test_id | name | value
>     ---------+------+------------------------------------------------------------------------------------------------
>          0123 |   e1 | 0x2cac7fff0000012c4ebb95550000001e42797465204172726179205465737420466f722042696720456e6469616e
>
>
> Now I am trying to read the same data back from Cassandra, and every time it
> throws an `IllegalArgumentException` -
>
>     public Map<String, byte[]> getDataFromCassandra(final String rowKey, final Collection<String> attributeNames) {
>
>         Map<String, byte[]> dataFromCassandra = new ConcurrentHashMap<String, byte[]>();
>
>         try {
>             String query = "SELECT test_id, name, value from sample_table where test_id = '" + rowKey + "';";
>             // SELECT test_id, name, value from sample_table where test_id = '0123';
>             System.out.println(query);
>
>             DatastaxConnection.getInstance();
>
>             ResultSet result = DatastaxConnection.getSession().execute(query);
>
>             Iterator<Row> it = result.iterator();
>
>             while (it.hasNext()) {
>                 Row r = it.next();
>                 for (String str : attributeNames) {
>                     ByteBuffer bb = r.getBytes(str); // this line is throwing an exception for me
>                     byte[] ba = new byte[bb.remaining()];
>                     bb.get(ba, 0, ba.length);
>                     dataFromCassandra.put(str, ba);
>                 }
>             }
>         } catch (Exception e) {
>             e.printStackTrace();
>         }
>
>         return dataFromCassandra;
>     }
>
> This is the Exception I am getting -
>
>     java.lang.IllegalArgumentException: e1 is not a column defined in this metadata
>
> In the above method, I am passing `0123` as the rowKey, and `attributeNames`
> contains the string `e1`.
>
> I am expecting snappy-compressed data in the `dataFromCassandra` map: the key
> should be `e1` and the value should be the snappy-compressed bytes, if I am
> not wrong. I will then iterate over this map and snappy-decompress the data.
>
> I am using the DataStax Java driver with Cassandra 1.2.9.
>
> Any thoughts on what I am doing wrong here?
>
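
As for the exception itself: the query only returns the columns test_id, name
and value, so the driver's Row metadata has no column called "e1". "e1" is a
value stored in the name column, not a column name, which is why
r.getBytes("e1") fails. A minimal sketch of just the row loop, keeping the
rest of your method as it is:

    while (it.hasNext()) {
        Row r = it.next();
        String attributeName = r.getString("name");    // e.g. "e1"
        if (attributeNames.contains(attributeName)) {
            ByteBuffer bb = r.getBytes("value");        // the compressed blob
            byte[] ba = new byte[bb.remaining()];
            bb.get(ba, 0, ba.length);
            dataFromCassandra.put(attributeName, ba);   // Snappy.uncompress(ba) afterwards
        }
    }

With that, dataFromCassandra should end up keyed by "e1" with the compressed
bytes as the value, ready to be decompressed.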



-- 

:- a)


Alex Popescu
Sen. Product Manager @ DataStax
@al3xandru
