Hello,
I created a feature request for compression of values about a year ago
(https://issues.basho.com/show_bug.cgi?id=412), but unfortunately there
seems to have been no interest in it.
For the time being we have to patch Riak to enable compression, though
we do it in the riak_kv_vnode module by cha
Has there been any talk of using compression, maybe something like
Snappy (http://code.google.com/p/snappy/) since it's fast and
shouldn't affect performance too much?
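For what it's worth, compression can also be done entirely client-side for
now; here is a minimal sketch with the Erlang PB client and OTP's zlib
(bucket/key/value are made up, and a Snappy NIF could slot into the same
two spots):

    {ok, Pid} = riakc_pb_socket:start_link("127.0.0.1", 8087),

    %% compress on the way in
    Value = term_to_binary({user, <<"andrew">>, <<"a@example.com">>}),
    Obj = riakc_obj:new(<<"users">>, <<"andrew">>, zlib:compress(Value),
                        "application/octet-stream"),
    ok = riakc_pb_socket:put(Pid, Obj),

    %% decompress on the way out
    {ok, Fetched} = riakc_pb_socket:get(Pid, <<"users">>, <<"andrew">>),
    Original = binary_to_term(zlib:uncompress(riakc_obj:get_value(Fetched))).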
On Fri, Jun 24, 2011 at 3:29 PM, Aphyr wrote:
> Nope.
>
> On 06/24/2011 03:24 PM, Andrew Berman wrote:
>>
>> And related, does Bitcask have any sort of compression built into it?
And related, does Bitcask have any sort of compression built into it?
On Fri, Jun 24, 2011 at 2:58 PM, Andrew Berman wrote:
> Mathias,
>
> I took the BERT encoding and then encoded that as Base64 which should
> pass the test of valid UTF-8 characters. However, now I'm starting to
> think that ma
Mathias,
I took the BERT encoding and then encoded that as Base64, which should
pass the valid UTF-8 check. However, now I'm starting to
think that maybe doing two encodings and storing that for the purpose
of saving space is not worth the trade-off in performance vs just
storing the
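For concreteness, the double encoding being discussed looks roughly like
this (a sketch, assuming term_to_binary/1 as the BERT step):

    Term = {user, <<"andrew">>, <<"a@example.com">>},
    Bert = term_to_binary(Term),        %% raw BERT: binary, not valid UTF-8
    Encoded = base64:encode(Bert),      %% ASCII-safe, but roughly a third larger
    Term = binary_to_term(base64:decode(Encoded)).  %% round-trips cleanly

So the Base64 step buys JSON-safety at the cost of about 33% more storage
plus the extra encode/decode work, which is exactly the trade-off in question.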
Yes, I am able to do that, but I feel like having to make two different
calls completely defeats the purpose of a link. With your method I might
as well just store the user id in the data for user_email and not use a
link at all. What advantage does a link offer at that point?
On T
On 23 Jun 2011, at 16:55, Jeremiah Peschka wrote:
> HTTP link walking will get you back the data in the way that you'd expect.
>
> It's a two-step process using PBC. MR link phases will give you a list of
> [bucket, key, tag] that you can then use to pull back the records from Riak.
>
The new
HTTP link walking will get you back the data in the way that you'd expect.
It's a two-step process using PBC. MR link phases will give you a list of
[bucket, key, tag] that you can then use to pull back the records from Riak.
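Roughly what the two-step PBC version looks like, if it helps (a sketch;
the bucket, key, and tag here are placeholders, and it assumes the usual
{link, Bucket, Tag, Keep} Erlang query term for the link phase):

    {ok, Pid} = riakc_pb_socket:start_link("127.0.0.1", 8087),

    %% Step 1: the link phase returns [Bucket, Key, Tag] triples.
    {ok, [{_Phase, Triples}]} =
        riakc_pb_socket:mapred(Pid,
                               [{<<"user">>, <<"some-user-key">>}],
                               [{link, <<"user_email">>, '_', true}]),

    %% Step 2: fetch each linked object with an ordinary get.
    Objects = [begin
                   {ok, O} = riakc_pb_socket:get(Pid, B, K),
                   O
               end || [B, K, _Tag] <- Triples].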
---
Jeremiah Peschka
Founder, Brent Ozar PLF, LLC
On Thursday, Jun
Ah, ok, that makes sense. One more question: when I use the HTTP link
walking I do get the data back as expected, so is there a way to
replicate this in a Map-Reduce job or with the Erlang PBC (which, I
forgot to mention, is what I'm using and the reason I'm not using the
HTTP link walking method)?
Andrew,
the data looks like JSON, but it's not valid JSON. Have a look at the list
that's in the data section (which is your BERT-encoded data): the first
character in that list is 131, which is not a valid UTF-8 character, and JSON
only allows valid UTF-8 characters. With a binary-encoded form
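That 131 is easy to see from a shell; it's the version byte of the Erlang
external term format, so every term_to_binary/BERT payload starts with it.
A two-line sketch:

    Bert = term_to_binary({hello, <<"world">>}),
    <<131, _/binary>> = Bert.   %% external term format always begins with byte 131

Base64-encoding that binary, as discussed earlier in the thread, is what
makes it ASCII-only and therefore valid UTF-8.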
But isn't the value itself JSON? Meaning this part:
{struct,
[{<<"bucket">>,<<"user">>},
{<<"key">>,<<"LikiWUPJSFuxtrhCYpsPfg">>},
{<<"vclock">>,
<<"a85hYGBgzGDKBVIsLKaZdzOYEhnzWBmes6Yd58sCAA==">>},
The object has to be JSON-encoded to be marshalled into the JavaScript VM,
and also on the way out if the Accept header indicates application/json. So
there are two places where it needs to be encodable as JSON.
On Thu, Jun 23, 2011 at 11:14 AM, Andrew Berman wrote:
> Mathias,
>
> I thought Ri
Mathias,
I thought Riak was content agnostic when it came to the data being
stored? The map phase is not running Riak.mapValuesJson, so why is
the data itself going through the JSON parser? The JSON value
returned by v with all the info is valid and I see the struct atom in
there so mochijson2 c
Andrew,
you're indeed hitting a JSON encoding problem here. BERT is binary data, and
it won't make the JSON parser happy when Riak tries to deserialize it before
handing it into the map phase. You have two options here, and neither of them
involves JavaScript as the MapReduce language.
1.) Use the
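Sketching the Erlang-MapReduce route (my assumption for option 1, since the
Erlang PBC is already in use): a map function only needs binary_to_term on
the stored value. The module and function names below are placeholders, and
the module has to be compiled and on the code path of the Riak nodes.

    -module(bert_mr).
    -export([map_decoded/3]).

    %% Map phase: decode the BERT-encoded value and emit the resulting term.
    map_decoded(RiakObject, _KeyData, _Arg) ->
        [binary_to_term(riak_object:get_value(RiakObject))].

From the PBC it would be invoked as a {map, {modfun, bert_mr, map_decoded},
none, true} phase.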
That looks just like the JSON of death I was experiencing. Can you try
doing a get on that key and running a JSON validator on it? Riak will
let you put invalid JSON in, but the map/reduce parser will break on
it.
On Wed, Jun 22, 2011 at 10:59 PM, Andrew Berman wrote:
> Hey Ryan,
>
> Here is the e
Hey Ryan,
Here is the error from the SASL log. It looks like some sort of
encoding error. Any thoughts on how to fix this? I am storing the
data as BERT-encoded binary and I set the content-type to
application/octet-stream.
Thanks for your help!
Andrew
=ERROR REPORT==== 9-Jun-2011::21:37:05 ===
Andrew,
Maybe you could elaborate on the error? I tested this against master
(commit below) just now with success.
2b1a474f836d962fa035f48c05452e22fc6c2193 Change dependency to allow for
R14B03 as well as R14B02
-Ryan
On Wed, Jun 22, 2011 at 7:03 PM, Andrew Berman wrote:
> Hello,
>
> I'm hav