Hi Dan,

Since the "/" in my key name is URL-encoded (and stored URL-encoded by
Riak), I've found that the following key filter makes it work:

["urldecode"],
["tokenize", "/", 2],
["string_to_int"],
["less_than", 50]

It looks like tokenizing on the URL-encoded separator ("%2F") itself
doesn't work.
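For anyone following along, the filter pipeline above can be simulated outside Riak. This is a rough Python sketch of the filter semantics only (the function name and the guard for keys missing the separator are illustrative, not Riak internals; note Riak tokenizes with Erlang's string:tokens, which drops empty tokens, while Python's split keeps them, though that doesn't matter for keys like "ccode%2F23"):

```python
from urllib.parse import unquote

def passes_filter(key, threshold=50):
    """Simulate: ["urldecode"] -> ["tokenize", "/", 2]
    -> ["string_to_int"] -> ["less_than", threshold]."""
    decoded = unquote(key)          # ["urldecode"]: "ccode%2F23" -> "ccode/23"
    tokens = decoded.split("/")     # ["tokenize", "/", 2] selects tokens[1]
    if len(tokens) < 2:             # key has no "/" separator; skip it
        return False                # (Riak errors out here instead, per below)
    value = int(tokens[1])          # ["string_to_int"]
    return value < threshold        # ["less_than", 50]

print(passes_filter("ccode%2F23"))   # True  (23 < 50)
print(passes_filter("ccode%2F75"))   # False (75 >= 50)
print(passes_filter("helloworld"))   # False (no separator)
```

This also illustrates the failure mode Dan describes below: a key without the separator yields a single-element token list, so grabbing the second token blows up unless it's guarded.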

-J

On Thu, Mar 3, 2011 at 1:19 PM, Dan Reverri <d...@basho.com> wrote:
> Hi Jason,
> I'm able to reproduce the issue when the keys I am filtering do not contain
> the token. For example, if my token is "-" and my key is "helloworld" the
> tokens for that key become:
> ["helloworld"]
> Grabbing the second element of that list returns an error:
> 3> string:tokens("helloworld", "-").
> ["helloworld"]
> 4> lists:nth(2, ["helloworld"]).
> ** exception error: no function clause matching lists:nth(1,[])
> It seems Riak should catch errors during the filtering process and discard
> keys that cause errors. I will file a bug.
> Thanks,
> Dan
> Daniel Reverri
> Developer Advocate
> Basho Technologies, Inc.
> d...@basho.com
>
>
> On Thu, Mar 3, 2011 at 10:28 AM, Jason J. W. Williams
> <jasonjwwilli...@gmail.com> wrote:
>>
>> If someone could help me understand just this error, that would help a
>> lot: https://gist.github.com/852450
>>
>> Thank you in advance.
>>
>> -J
>>
>> On Wed, Mar 2, 2011 at 11:55 PM, Jason J. W. Williams
>> <jasonjwwilli...@gmail.com> wrote:
>> > Hi,
>> >
>> > I'm experimenting using key filters to implement indexes. My approach
>> > is for each data key in bucket A, to create a new empty key in a
>> > dedicated index bucket where the original key name and value of the
>> > indexed field is encoded in the key name for a new index key.
>> >
>> > Data key looks like this:
>> >
>> > Bucket - riak_perf_test
>> > Key - ccode_<unique_6_digit_ID> : {"redeemed_count": 23 }
>> >
>> > For each data key created, an index key is created:
>> >
>> > Bucket - idx=redeemed_count=ccode
>> > Key - ccode/23
>> >
>> > (in both keys 23 changes on a per-key basis based on what
>> > "redeemed_count" is set to)
>> >
>> >
>> > My goal is to be able to do a key filtered Map Reduce job on
>> > idx=redeemed_count=ccode that generates a list of all data key names
>> > with a redeemed_count < 50.
>> >
>> > The job (using curl) is here: https://gist.github.com/852451
>> >
>> > It errors out almost immediately in sasl-error.log
>> > (https://gist.github.com/852450), but the request doesn't immediately
>> > error out to the client. The only error the client sees is an eventual
>> > timeout error.
>> >
>> > So my question is, what is the error in sasl-error.log telling me is
>> > wrong with my job construction? And also, why is there only a timeout
>> > error generated to the client instead of a map_reduce_error as I've
>> > seen for non-key filtered jobs?
>> >
>> > Thank you in advance for any help. I greatly appreciate it.
>> >
>> > -J
>> >
>>
>> _______________________________________________
>> riak-users mailing list
>> riak-users@lists.basho.com
>> http://lists.basho.com/mailman/listinfo/riak-users_lists.basho.com
>
>
