Hey list,
A script recently introduced to clean up old data by deleting it has caused
one of our old reporting scripts to start failing with “not_found”. I’d
encountered this once before, so I thought the simple introduction of a
reduce phase using Riak.filterNotFound would fix it.
However, now I
BTW, this cluster is still running 1.4.0. If 1.4.2 would fix this issue I
could update.
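For anyone following along, here is a minimal sketch of what a
Riak.filterNotFound-style reduce phase does to its input. Assumptions are
mine: it runs under plain Node.js for illustration (on the cluster you would
just name the built-in "Riak.filterNotFound" in the reduce phase), and
not_found map results arrive as objects carrying a not_found field.

```javascript
// Sketch of a filterNotFound-style reduce function (illustrative only;
// the function name and the exact input shape are assumptions, not the
// cluster's actual built-in).
function filterNotFound(values) {
  // Drop any entry that is a not_found marker; keep real results.
  return values.filter(function (v) {
    return !(v && typeof v === 'object' && 'not_found' in v);
  });
}

// Example inputs as they might arrive from a prior map phase:
var input = [
  { user: 'a', count: 3 },
  { not_found: { bucket: 'users', key: 'gone' } },
  { user: 'b', count: 7 }
];
console.log(JSON.stringify(filterNotFound(input)));
```

The point being: the reduce phase only hides not_found entries after the
map phase has already run, which matters for the tombstone issue below.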
On 21 October 2013 10:42, Matt Black wrote:
> Hey list,
>
> A script recently introduced to cleanup old data by deleting it has caused
> one of our old reporting scripts to start failing with “not_found”. […]
The plot thickens. Having run the same query a couple more times just now,
I see a different error! (We made no changes to the code.)
Exception: Error processing stream message:
  exit:{ucs,{bad_utf8_character_code}}:[{xmerl_ucs,f…
Here is also a graph of “Queries per second”.
At the end of the graph you can see that “SET” (the red one) reports zero.
[inline image: queries-per-second graph]
On 17 Oct 2013, at 10:59 AM, 성동찬_Chan <mailto:c...@kakao.com> wrote:
Hi Luke,
I'm using the Java client, and I don't use HTTP, so it's Protocol Buffers.
I side-stepped this error by adding this little block of code into the top
of my map phase (which we are using elsewhere in the same project):
    if (v.values[0].metadata['X-Riak-Deleted'] !== undefined) {
      return [];  // skip tombstones left behind by deletes
    }
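If anyone wants to sanity-check that guard outside the cluster, this is how
I'd exercise it with hand-built inputs. The function name is hypothetical,
and the shape of v (values / metadata / data) is my assumption about what
Riak passes to a JavaScript map function.

```javascript
// Hypothetical stand-in for a map phase that skips tombstones; the v
// objects below imitate the value Riak hands to a JavaScript map function.
function mapSkippingTombstones(v) {
  // Deleted objects come through with an X-Riak-Deleted metadata entry.
  if (v.values[0].metadata['X-Riak-Deleted'] !== undefined) {
    return [];                              // tombstone: contribute nothing
  }
  return [JSON.parse(v.values[0].data)];    // live object: emit its data
}

var live = { values: [{ metadata: {}, data: '{"count": 3}' }] };
var dead = { values: [{ metadata: { 'X-Riak-Deleted': 'true' }, data: '' }] };
console.log(mapSkippingTombstones(live).length,  // 1
            mapSkippingTombstones(dead).length); // 0
```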
Unfortunately I now have a different problem, which I’ll detail in a
separate thread.
Following on from some earlier errors I was getting, I’m now kind of stuck
between a rock and a hard place.
One of our statistics reports fails with a timeout during a
query.filter_not_found() phase:
Exception:
  {"phase":2,"error":"timeout","input":"[<<\"users\">>,<<\"33782eee0470cac583b136fd063d…
Hi Nathan,
One alternative to the pure 2i-based solution for this would be time
boxing. Sean referenced it a few months back on the list [1] and it's
worth investigating. There are a few other resources I'm failing to
remember at the moment but I'll send them along tomorrow if I do.
That said, 2i
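To make the time-boxing idea above concrete, here is a rough sketch. The
bucket-naming scheme and the per-hour granularity are my own illustrative
choices, not a Riak convention: write each object into a box derived from
its timestamp, so a report only has to list keys in the boxes it covers
instead of running one huge 2i range query.

```javascript
// Rough sketch of time boxing (illustrative; the name timeBox and the
// "prefix-YYYY-MM-DDTHH" scheme are assumptions, not a Riak API).
function timeBox(bucketPrefix, timestampMs) {
  var d = new Date(timestampMs);
  var pad = function (n) { return (n < 10 ? '0' : '') + n; };
  // One box per UTC hour, e.g. "events-2013-10-21T10"
  return bucketPrefix + '-' + d.getUTCFullYear() + '-' +
    pad(d.getUTCMonth() + 1) + '-' + pad(d.getUTCDate()) +
    'T' + pad(d.getUTCHours());
}

console.log(timeBox('events', Date.UTC(2013, 9, 21, 10, 42)));
// prints "events-2013-10-21T10"
```

A report for a given window then just enumerates the handful of hourly
boxes it spans, which keeps each listing small and bounded.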