Hi Chris,

I appreciate the help. Please let me know if I can help in any way. For now I just switched to a "multi" get.
thanks
doug

On Mon, Aug 19, 2013 at 4:09 PM, Chris Meiklejohn <cmeiklej...@basho.com> wrote:

> Hi Doug,
>
> After loading your backup, I can trigger the crash through the JavaScript
> map/reduce job. However, if I do the following:
>
>     {ok, Client} = riak:local_client().
>     {ok, O} = Client:get(<<"BUCKET">>, <<"KEY">>).
>
> both of these calls succeed:
>
>     js_mochijson2:decode(riak_object:get_values(O)).
>     mochijson2:decode(riak_object:get_values(O)).
>
> I'm continuing to track down where the crash is occurring, but it appears
> to be something related to erlang_js.
>
> - Chris
>
> On Fri, Aug 16, 2013 at 10:45 AM, Doug Read <doug.r...@qnary.com> wrote:
>
>> So you are right about the characters in the JSON. I removed most of the
>> key, leaving just some valid JSON with the special characters, and it
>> produces the same error.
>>
>> A simple Node program which reads the JSON from the file and calls
>> JSON.parse succeeds:
>>
>>     var fs = require("fs");
>>
>>     fs.readFile("/tmp/bucket_export/s/e/a/searches_bad.json", function(err, data) {
>>         console.log(data);
>>         var json = JSON.parse(data);
>>         console.log(json);
>>     });
>>
>> So the JavaScript VM, or something before/after it, is having trouble
>> parsing it, but I am not sure how to debug from here.
>>
>> On Fri, Aug 16, 2013 at 1:58 AM, Christopher Meiklejohn <cmeiklej...@basho.com> wrote:
>>
>>> Hi Doug,
>>>
>>> Just going through your email again; I noticed that there is a typo in
>>> the bucket name between those two commands:
>>>
>>> JavaScript:
>>>
>>>     curl -XPOST http://localhost:8098/mapred -H 'Content-Type: application/json' -d '{"inputs":[ [ "aaaaaaaa-4536-9048-87ef2e48ddda", "key_5ad26d0d-4d28-40ca-afcb-1c9895cc5c71" ] ], "query":[ { "map": { "name": "Riak.mapValuesJson", "language": "javascript" } }, { "reduce": { "name": "Riak.filterNotFound", "language": "javascript" } } ] }'
>>>
>>> Erlang:
>>>
>>>     curl -XPOST http://localhost:8098/mapred -H 'Content-Type: application/json' -d '{"inputs":[ [ "aaaaaaaa-50d7-4536-9048-87ef2e48ddda", "key_5ad26d0d-4d28-40ca-afcb-1c9895cc5c71" ] ], "query":[{"map":{"language":"erlang","module":"riak_kv_mapreduce","function":"map_object_value"}} ] }'
>>>
>>> Specifically, 'aaaaaaaa-4536-9048-87ef2e48ddda' in the failed
>>> JavaScript command vs 'aaaaaaaa-50d7-4536-9048-87ef2e48ddda' in the
>>> successful Erlang command.
>>>
>>> I've verified this is triggering not_founds in the dump you supplied.
>>>
>>> - Chris
>>>
>>> --
>>> Christopher Meiklejohn
>>> Software Engineer
>>> Basho Technologies, Inc.
>>>
>>> On Friday, August 16, 2013 at 1:36 AM, Doug Read wrote:
>>>
>>> > No problem.
>>> >
>>> > Riak 1.3.1.
>>> > I have reproduced the error on 1.3.1 on Mac OS X and 1.4.1 on
>>> > Ubuntu. I am on my phone now; I'll get the build and package
>>> > versions.
>>> >
>>> > That is the Erlang map/reduce, which has always worked; it is the
>>> > JavaScript version that fails. The one I originally posted, but
>>> > basically any JavaScript map/reduce which takes that bucket/key as
>>> > input.
>>> >
>>> > On Friday, August 16, 2013, Christopher Meiklejohn wrote:
>>> > > Hi Doug,
>>> > >
>>> > > First, my apologies. I confused two email responses I was
>>> > > writing, and that's why my previous e-mail was a bit out of
>>> > > context.
>>> > >
>>> > > I've loaded your backup file into a locally built cluster here
>>> > > off of the Riak 1.2 branch, but I'm still unable to reproduce the
>>> > > issue using the following map/reduce command:
>>> > >
>>> > >     curl -XPOST http://localhost:8098/mapred -H 'Content-Type: application/json' -d '{"inputs":[ [ "aaaaaaaa-50d7-4536-9048-87ef2e48ddda", "key_5ad26d0d-4d28-40ca-afcb-1c9895cc5c71" ] ], "query":[{"map":{"language":"erlang","module":"riak_kv_mapreduce","function":"map_object_value"}} ] }'
>>> > >
>>> > > Can you please provide which operating system, build, and package
>>> > > version of Riak you are running?
>>> > >
>>> > > - Chris
>>> > >
>>> > > On Friday, August 16, 2013 at 1:23 AM, Christopher Meiklejohn wrote:
>>> > >
>>> > > > Hi Doug,
>>> > > >
>>> > > > Can you provide more information on how you built the image
>>> > > > with Vagrant so I can try to reproduce it? The configuration
>>> > > > alone isn't going to be enough, as this appears to be a
>>> > > > systems-related issue.
>>> > > >
>>> > > > - Chris
>>> > > >
>>> > > > On Friday, August 16, 2013 at 1:20 AM, Doug Read wrote:
>>> > > >
>>> > > > > Hi Chris,
>>> > > > >
>>> > > > > I made a riak-admin backup of the key which reproduces the
>>> > > > > issue. I was wondering if you could give me some direction on
>>> > > > > how to debug the issue.
>>> > > > >
>>> > > > > thanks
>>> > > > > doug
>>> > > > >
>>> > > > > On Thu, Aug 15, 2013 at 3:41 PM, Chris Meiklejohn <cmeiklej...@basho.com> wrote:
>>> > > > > > The best guess I have at this point is probably something
>>> > > > > > related to character encoding, but without a reproduction
>>> > > > > > case, I'm not able to debug it any further.
>>> > > > > >
>>> > > > > > Good luck with the upgrade tonight!
>>> > > > > >
>>> > > > > > - Chris
>>> > > > > >
>>> > > > > > On Thu, Aug 15, 2013 at 3:39 PM, Doug Read <doug.r...@qnary.com> wrote:
>>> > > > > > > I redirected the output of curl into a file on an Ubuntu
>>> > > > > > > box. I am upgrading the cluster to 1.4.1 tonight. To your
>>> > > > > > > point, I PUT the value into the key locally (3-node
>>> > > > > > > cluster) and couldn't reproduce either. Also, I am
>>> > > > > > > turning on the JS VM logging.
>>> > > > > > >
>>> > > > > > > On Thu, Aug 15, 2013 at 3:33 PM, Chris Meiklejohn <cmeiklej...@basho.com> wrote:
>>> > > > > > > > Hi Doug,
>>> > > > > > > >
>>> > > > > > > > I've configured a Riak 1.2 cluster and run the
>>> > > > > > > > aforementioned map/reduce job in Erlang, and I can't
>>> > > > > > > > trigger the crash. I'm getting the expected results of
>>> > > > > > > > the map/reduce job. How did you send me the object that
>>> > > > > > > > you provided off-list?
>>> > > > > > > >
>>> > > > > > > > - Chris
>>> > > > > > > >
>>> > > > > > > > On Thu, Aug 15, 2013 at 12:36 PM, Chris Meiklejohn <cmeiklej...@basho.com> wrote:
>>> > > > > > > > > Hi Doug,
>>> > > > > > > > >
>>> > > > > > > > > Can you provide a sample of the JSON that you're
>>> > > > > > > > > storing in these objects? It appears that
>>> > > > > > > > > mochijson2's tokenizer is crashing because it thinks
>>> > > > > > > > > the JSON is not valid, where the Spidermonkey parsing
>>> > > > > > > > > is succeeding.
>>> > > > > > > > >
>>> > > > > > > > > - Chris
>>> > > > > > > > >
>>> > > > > > > > > On Wed, Aug 14, 2013 at 10:58 AM, Doug Read <doug.r...@qnary.com> wrote:
>>> > > > > > > > > > The following map/reduce job fails using JavaScript
>>> > > > > > > > > > but succeeds when using Erlang.
>>> > > > > > > > > >
>>> > > > > > > > > > Riak 1.2.0 2012-0806 Debian x86_64
>>> > > > > > > > > > 3 nodes, n_val=3
>>> > > > > > > > > >
>>> > > > > > > > > > Riak diag gives a large list of:
>>> > > > > > > > > >     [warning] The following preflists do not satisfy the n_val:
>>> > > > > > > > > > Not really sure what this means, but I thought I
>>> > > > > > > > > > would share.
>>> > > > > > > > > >
>>> > > > > > > > > > JAVASCRIPT:
>>> > > > > > > > > >     curl -XPOST http://localhost:8098/mapred -H 'Content-Type: application/json' -d '{"inputs":[ [ "aaaaaaaa-4536-9048-87ef2e48ddda", "key_5ad26d0d-4d28-40ca-afcb-1c9895cc5c71" ] ], "query":[ { "map": { "name": "Riak.mapValu
_______________________________________________
riak-users mailing list
riak-users@lists.basho.com
http://lists.basho.com/mailman/listinfo/riak-users_lists.basho.com