Hey Sangeetha,
at first sight, what strikes me as odd about your bulk import is that it shells
out to curl. That has a significant impact on the time it takes to load the
data into Riak. As a first step to improve the script and its performance, I'd
recommend looking into using the Riak Erlang client
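For illustration only, here's a rough sketch of the same idea using riak-js
rather than the Erlang client mentioned above (bucket and key names are made
up): keep one client around and push records through it instead of forking a
curl process for every object.

var db = require('riak-js').getClient({ host: '127.0.0.1', port: 8098 });
var records = [ /* rows parsed from your import file */ ];

function saveNext(i) {
  if (i >= records.length) { return console.log('import done'); }
  // reuses the client instead of spawning a new curl process per record
  db.save('imports', 'row-' + i, records[i], function(err) {
    if (err) { console.error('row ' + i + ' failed', err); }
    saveNext(i + 1);
  });
}
saveNext(0);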
Hey!
I'm just about wrapping up the changes and new content for the upcoming update
of the Riak Handbook. To make sure what I wrote makes (mostly) sense, I'm
looking for one or two technical reviewers of the new content. The focus is
more on whether everything makes sense, so not necessarily o
Hey there!
Just thought I'd let you guys know that I just released a new version of the
Riak Handbook, a free update if you already bought it. It comes with more than
40 pages of new content, focussing primarily on use cases and usage scenarios
(data modelling, access patterns, pros and cons),
Anand,
A content type is a cue as to what kind of data you're storing in Riak. The
concept is based on Internet media types [1]. The cue may or may not be
important to your application to figure out what to do with the data. It's also
important for certain Riak features like Riak Search, where
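For illustration, storing an object with an explicit content type, shown here
with riak-js (the bucket, key and option name are assumptions; other clients
expose the same concept):

var db = require('riak-js').getClient();
// tell Riak the value is JSON so clients and features like Riak Search
// know how to interpret it
db.save('users', 'anand', { name: 'Anand' }, { contentType: 'application/json' },
  function(err) { if (err) { console.error(err); } });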
Anand,
I rewrote your script a little bit (you did some odd things in the initialize
method, which isn't expected to return anything), and this way it works:
require 'riak'

Riak::Serializers["text/html"] = Riak::Serializers::TextPlain

class RiakClient
  def initialize
    super
  end

  def client
    @clien
Venki,
You can specify an argument in your map function and hand the argument's value
over when running the MapReduce request. The part in your JavaScript code is as
simple as adding a second and a third argument to your function like so, with
the third argument being the relevant one:
functio
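To sketch the general shape of such a function (the document fields are made
up; arg receives whatever static argument the map phase was given in the
MapReduce request):

function(value, keyData, arg) {
  var doc = JSON.parse(value.values[0].data);
  // arg is the static argument handed to this map phase in the request
  if (doc.country === arg.country) {
    return [doc];
  }
  return [];
}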
Venki,
You don't have to serialize the argument as a JSON string, it can simply be
specified as a normal JSON data structure, just like the other data in the
MapReduce request:
curl -v -d '{"inputs":[["artists", "Beatles"]],
"query":[{"map":{"language":"javascript","source":"function(v, k, a)
egroup][\"users\"];
>
> }
> }
> return [obj];
> }","args":{"gender":"G0,G1","agegroups":"A0,A2","metrics":"users,cpc_median","groupby":"country"},"keep":true}}]}'
;
> var k = arg;
>
> It is giving undefined variable arg.
>
> On the other hand,
> If I am trying to access this variable passing as a third parameter to
> function(v,k,{"",""})
> It is also giving me error.
>
> Please help me on the above.
>
>
o connect to riak in node.js?
> > > Basho seems to push for Voxer's node_riak
> > > (https://github.com/mranney/node_riak), but are there better alternatives?
> >
> >
> >
> > In addition to node_riak, Mathias Meyer (with some help from Sean
> > Cribbs
int are appended.
>
> Thanks.
> C.
>
> On Wed, Sep 26, 2012 at 2:21 PM, Mathias Meyer <me...@paperplanes.de> wrote:
> > The drop of PB in the new js branch is for now just temporary. Allows us to
> > focus on getting the JavaScript code base up to speed at le
Hey folks,
I've been putting some work in the riak-js client for Node.js recently, and as
we're getting pretty close to doing a release, I wanted to keep you posted on
changes and especially breaking changes.
About a year ago, Francisco Treacy started rewriting the code (originally
written i
Hey all,
I'm happy to announce the 0.9.0 release of riak-js, the Riak client for
Node.js. It's a complete rewrite in plain old JavaScript, bringing some new
functionality along the way. You can read all about the fresh release over on
the Basho blog [1].
riak-js now has a new home [2] and full
I mistyped that indeed. The correct name is riak-js on npmjs. Sorry!
Cheers, Mathias
On Tuesday, 13. November 2012 at 21:26, Christopher Meiklejohn wrote:
> On Tuesday, November 13, 2012 at 3:24 PM, Alexander Sicular wrote:
> > Are you published in npm?
> >
> > npm install riak is fo
Hey guys,
A quick one from me. Just shipped riak-js 0.9.1 with some fixes and some neat
additions, most notably request instrumentation, useful to e.g. track metrics
for request times. Here's a simple example that tracks request times per
request method:
var instrument = {
'riak.request.end
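A rough sketch of what such a listener could look like. The
'riak.request.end' event name appears above, but the registration call and the
event fields used here are assumptions, so check the 0.9.1 release notes for
the exact API:

var db = require('riak-js').getClient();
var timings = {};

var instrument = {
  'riak.request.end': function(event) {
    // event payload fields (e.g. the request method) are assumed here
    var method = (event && event.method) || 'unknown';
    timings[method] = timings[method] || [];
    timings[method].push(event);
  }
};

// the name of the registration hook is an assumption
db.registerListener(instrument);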
Hey everyone,
Just shipped two releases of riak-js today. 0.9.3 [1] is a minor bug fix
release. It should now be installable on Windows too, because there's really no
reason it shouldn't.
The 0.10.0pre1 release brings preliminary support for Protocol Buffers, based
on Nathan LaFreniere's riakpbc
Heya,
Just shipped riak-js 0.10.0, with support for Protocol Buffers:
https://npmjs.org/package/riak-js
To use protobufs, specify a different API option when creating a client:
var riak = require('riak-js').getClient({api: 'protobuf'})
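For what it's worth, a hedged usage sketch: the regular calls should work the
same over either transport (bucket, key and value here are made up).

riak.save('airlines', 'KLM', { country: 'NL' }, function(err) {
  if (err) { return console.error(err); }
  riak.get('airlines', 'KLM', function(err, doc) {
    if (err) { return console.error(err); }
    console.log(doc);
  });
});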
I'd love to have some feedback on how the protobuffs supp
inject it into a previous version of riak-js to get useful
> performance. I'd like to drop that hack.
>
>
> Best
>
> Sebastian
>
> On 03.05.2013, at 17:55, Mathias Meyer <me...@paperplanes.de> wrote:
>
> > Heya,
> >
> > Just
; The agent is based on the keep alive capable agent included in
> https://github.com/mikeal/request. In a more recent version this was
> extracted to https://github.com/mikeal/forever-agent, but I haven't used it
> since.
>
> On 03.05.2013, at 18:44, Mathias Meyer (m
to the Debug one and it is fine. Maybe you wanna
> fix it
> José
> On May 3, 2013, at 5:55 PM, Mathias Meyer <me...@paperplanes.de> wrote:
>
> > Heya,
> >
> > Just shipped riak-js 0.10.0, with support for Protocol Buffers:
> > https://npmjs.o
odules/protobuf.js/node_modules/wtf8/wtf8.js
>
> var wtf8 = require('./build/Release/wtf8.node');
>
> but there is no Release folder. Only Debug
>
>
> On May 3, 2013, at 9:15 PM, Mathias Meyer <me...@paperplanes.de> wrote:
> > José,
> >
't use any debug flag. I just did it again and it still created a
> > Debug folder instead of Release. Maybe it's my node_gyp> I need to check.
> > Has anyone else experienced this?
> > BTW I am using OS X Mountain Lion
> > José
> > On May 4, 2013,
Heya!
Hot on the heels of 0.10.0 I shipped riak-js 0.10.1 over the weekend, which
fixes a bug in the protobuffs code.
More importantly though, it now sports connection pooling/load balancing for
HTTP connections by way of the poolee library, courtesy of Andrew J. Stone.
When creating a client
If you decide to go with a RAID, be sure to add
LVM on top for simpler snapshotting; it will be quite painful, if not
impossible, to get consistent snapshots using just EBS snapshots on a bunch of
striped volumes.
Let us know if you have more questions, there's lots of details i
rating.
Or you could set your own custom schema for the bucket, telling Riak Search to
treat the followers field as a numeric field. Both are detailed on our wiki:
http://wiki.basho.com/Riak-Search---Schema.html
Mathias Meyer
Developer Advocate, Basho Technologies
On Tuesday, 5. April 2011
contact1.
You can query the separate fields by using an underscore instead, so you would
run this:
$client->search("bucket","contact_phone1:999")->run()
Mathias Meyer
Developer Advocate, Basho Technologies
On Wednesday, 13. April 2011 at 10:23, khyqo wrote:
>
I took the liberty of wrapping this into a pull request.
The API corresponds to Ana's original suggestion, allowing
bucket.new_binary_from_file(key, filename)
Thanks for contributing!
Mathias Meyer
Developer Advocate, Basho Technologies
On Tuesday, 12. April 2011 at 23:21, Ana Nelson
at")
->map("function (v) { return [v.key]; }")
->reduce("Riak.reduceSort")
->run();
Mathias Meyer
Developer Advocate, Basho Technologies
On Thursday, 14. April 2011 at 20:20, khyqo wrote:
> good day everyone..
>
> i encountered another problem.. i am confu
t variable SKIP_SEARCH=1.
Mathias Meyer
Developer Advocate, Basho Technologies
On Thursday, 21. April 2011 at 21:49, Mikhail Sobolev wrote:
Hi,
>
> (I'm not sure if this is a correct list for posting questions about
> python-riak-client. If it's not, please direct me to the correc
repository's
directory:
$ cake build
$ npm install
Mathias Meyer
Developer Advocate, Basho Technologies
[1]
https://github.com/frank06/riak-js/commit/d65a2fc3ac227aeb3dc17bc5d7c703a4fcb8c232
On Thursday, 28. April 2011 at 17:41, Luc Castera wrote:
Hi folks,
>
> I've encountered an i
work, but for the REST API it's best
to use binaries as it doesn't handle atoms.
Mathias Meyer
Developer Advocate, Basho Technologies
On Tuesday, 17. May 2011 at 08:58, Dmitry Demeshchuk wrote:
> Greetings.
>
> I'm looking for a way to set expiry_secs for specific bucke
Malka,
the most likely reason is that the JavaScript file is not properly accessible
to Riak on some of the nodes in your cluster.
Have you checked that the file is properly distributed throughout the cluster
and js_source_dir is set accordingly, and Riak is restarted on all nodes?
Mathias
Peter,
I wrote my replies inline.
Mathias Meyer
Developer Advocate, Basho Technologies
On Friday, 13. May 2011 at 20:05, Peter Fales wrote:
> Sean,
>
> Thanks to you and Ben for clarifying how that works. Since that was
> so helpful, I'll ask a followup question, and also a
at page to
clarify how they should end up looking when multiple are put together.
The bottom line is that Ripple does produce proper key filter code with
conditions and that you are absolutely correct in bringing up this slight
confusion.
Mathias Meyer
Developer Advocate, Basho Technologies
On
er, unless
you're prepared to deal with the potential conflicts, and e.g. handle siblings
immediately after you reconciled the differences between two objects in your
compare() function, see [1] for more details.
Mathias Meyer
Developer Advocate, Basho Technologies
[1] http://wiki.basho
buckets, and therefore the same distribution
and consistency properties apply to them as to objects stored directly in Riak
KV. Bottom line is there's nothing wrong with just using them instead of
fetching them again from Riak KV.
Mathias Meyer
Developer Advocate, Basho Technologies
On Mit
bly also simple) enough to add something
like that to the Riak Search Solr API.
Mathias Meyer
Developer Advocate, Basho Technologies
On Thursday, 26. May 2011 at 20:50, Greg Pascale wrote:
> Thanks Mathias,
>
> We'll continue to do that then.
>
> It seems to me, though,
be used to look up the
serialized document in Riak KV.
Mathias Meyer
Developer Advocate, Basho Technologies
On Thursday, 26. May 2011 at 21:56, Greg Pascale wrote:
> Eric, I believe the key is the document id, which will be the same as the key
> of the corresponding object in .
>
>
[RiakObject, KeyData]
end.
Timeout = 10.
riakc_pb_socket:search(Pid, "bucket", "query", [{map, {qfun, MapObjectKeydata}, none, true}], Timeout).
Mathias Meyer
Developer Advocate, Basho Technologies
On Monday, 30. May 2011 at 12:37, Hagbard Celine wrote:
> Hi,
>
application and MapReduce code rely on that. If you need to enforce this before
storing data inside Riak, you can use a pre-commit hook to validate the JSON
using e.g. mochijson2:decode() and have your application respond accordingly,
see [1] for an example.
[1] https://gist.github.com/1031311
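The linked gist does this in Erlang with mochijson2:decode(). As a rough
JavaScript sketch of the same idea (pre-commit hooks can also be named
JavaScript functions; the object arrives in the same shape as in MapReduce,
and returning an object with a fail member rejects the write; the function
name here is made up and would have to be referenced in the bucket's precommit
property):

function validateJson(object) {
  try {
    // same value layout as in MapReduce functions
    JSON.parse(object.values[0].data);
    return object;  // well-formed JSON, let the write through
  } catch (e) {
    return { fail: 'document is not valid JSON' };  // reject the write
  }
}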
collecting the results on your
application's side.
Mathias Meyer
Developer Advocate, Basho Technologies
On Tuesday, 21. June 2011 at 14:25, Jeremy Raymond wrote:
> I increased the memory to 3GB on the VMs I'm using for Riak and also replaced
> a JavaScript reduce function I
messages directly to Riak, and then resort to failover should one of
the Rabbits go down.
Mathias Meyer
Developer Advocate, Basho Technologies
[1] http://xing.github.com/beetle/
[2] https://github.com/jtuple/riak_zab
[3] https://github.com/seancribbs/riak_id
[4] https://github.com/jbrisbi
tc., and then
store back the updated value using the vector clock you got when requesting the
object including the siblings.
We have a wiki page [1] dedicated to vector clocks and conflict resolution,
explaining the process in more detail.
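As a rough riak-js-flavoured sketch of that read-resolve-write cycle (db is a
riak-js client; the meta/vclock handling shown is an assumption about the
client API, the underlying mechanism is the X-Riak-Vclock header, and
resolveSiblings stands in for your own merge logic):

db.get('carts', 'user-1', function(err, doc, meta) {
  if (err) { return console.error(err); }
  var resolved = resolveSiblings(doc);  // your application's conflict resolution
  // write back with the vector clock obtained on the read (option name assumed)
  db.save('carts', 'user-1', resolved, { vclock: meta.vclock }, function(err) {
    if (err) { console.error(err); }
  });
});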
Mathias Meyer
Developer Advocate, Basho Technologies
[1
And here's the link I neatly forgot to include:
http://wiki.basho.com/Vector-Clocks.html
Mathias Meyer
Developer Advocate, Basho Technologies
On Wednesday, 22. June 2011 at 17:18, Mathias Meyer wrote:
> Manuel,
>
> what you're seeing is not specific to links, it's
educe request itself, but is
less prone to encoding/decoding issues with JSON.
Mathias Meyer
Developer Advocate, Basho Technologies
[1]
http://lists.basho.com/pipermail/riak-users_lists.basho.com/2011-June/004447.html
[2]
http://lists.basho.com/pipermail/riak-users_lists.basho.com/2011-June/0044
ion of a string, you would
see it as a string in the log file as well, not just as a list of numbers.
Mathias Meyer
Developer Advocate, Basho Technologies
On Thursday, 23. June 2011 at 17:31, Andrew Berman wrote:
> But isn't the value itself JSON? Meaning this part:
>
> {stru
ps://github.com/basho/riak-python-client
[2]
https://github.com/basho/riak-python-client/blob/riak-python-client-1.2.2/README.rst
[3]
https://github.com/basho/riak-python-client/blob/riak-python-client-1.2.2/RELEASE_NOTES.md
Mathias Meyer
Developer Advocate, Bas
ire pain though, as
it only reduces the load of the last step in your MapReduce job, so it's more
of a general practice.
Mathias Meyer
Developer Advocate, Basho Technologies
On Friday, 24. June 2011 at 20:43, David Mitchell wrote:
> I am doing 208 MapReduce jobs in rapid-fire succession
tion,
and try running this setup again to see if you get the same erroneous results?
If you do, some more details on your data and the MapReduce jobs you're running
would be great to reproduce and figure out the problem.
Mathias Meyer
Developer Advocate, Basho Technologies
On Wednesday, 29.
f them. In your case
I'd say it'd be well worth looking into Erlang as an alternative.
Mathias Meyer
Developer Advocate, Basho Technologies
Eric,
are you by any chance still running Riak 0.14.1? There was a bug showing the
same symptoms you're describing, which was fixed in the recent 0.14.2 release.
Mathias Meyer
Developer Advocate, Basho Technologies
On Saturday, 2. July 2011 at 14:40, Eric Stevens wrote:
> I've be
bject from a different part
of your code that's fetching the object initially?
Mathias Meyer
Developer Advocate, Basho Technologies
On Wednesday, 6. July 2011 at 00:30, Claus Guttesen wrote:
> Hi.
>
> When saving a new record using db.update() getting the record using
> c
Matt,
in your JS function, you return value, where instead you must return a list of
values, so changing it to
return [value];
fixes the problem.
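Spelled out, the map function just needs to wrap its result in a list:

function(value, keyData, arg) {
  // a map phase always returns a list, even when it produces a single result
  return [value];
}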
Mathias Meyer
Developer Advocate, Basho Technologies
On Thursday, 7. July 2011 at 05:53, Matt Graham wrote:
> Hi,
> I'm trying
Muhammad,
can you see the merge_index directory fill with data on the second machine
that's not responding to queries? Anything unusual showing up in the log?
Can you access the normal data you're indexing through Riak KV on both machines?
Mathias Meyer
Developer Advocate, Basho Te
r the total
number of requests (vnode_gets_total and vnode_puts_total).
Mathias Meyer
Developer Advocate, Basho Technologies
On Saturday, 9. July 2011 at 06:07, Jon Baer wrote:
> Hi,
>
> I am wondering if there is a clearly explanation or docs somewhere that
> explains a little bit mor
u may want to look into Statebox [1] as an
alternative way, or try to serialize writes through a messaging system like
RabbitMQ to ensure atomicity to a certain extent.
[1] https://github.com/mochi/statebox
Mathias Meyer
Developer Advocate, Basho Technologies
On Friday, 8. July 2011 at 19:1
terns you
have. Providing us with some more details would help us give you an answer here.
If you only access all data for a particular zip code in one go though, the
answer would probably be yes.
Mathias Meyer
Developer Advocate, Basho Technologies
On Tuesday, 12. July 2011 at 15:04, Anton Pod
You don't necessarily have to reindex your data. Copying over the data
directories from the old Riak instances should do.
Mathias Meyer
Developer Advocate, Basho Technologies
On Monday, 11. July 2011 at 14:58, Muhammad Yousaf wrote:
>
> Thanks Mathias & Sylvain,
> Everyt
every complete test run, i.e. stop the
Riak processes, wipe the data, and start them up again.
Ripple (Ruby client for Riak) and riak-js (Node.js client) both include a test
server that runs a Riak instance with an in-memory backend, maybe that would be
an alternative to go with?
Mathias Meyer
entry to a list
stored e.g. as JSON under a specific key in an atomic fashion, which is not
possible in Riak either way.
Mathias Meyer
Developer Advocate, Basho Technologies
On Saturday, 30. July 2011 at 11:47, Jonathan Langevin wrote:
> Re: 4) - This is regarding to updating the data tha
iak-python-client-1.3.0/RELEASE_NOTES.md
Mathias Meyer
Developer Advocate, Basho Technologies
The wiki refers to the current stable release of Riak, which is 0.14.2, and
which still relies on Erlang R13B04. To compile the current development master,
e.g. to try out secondary indexes, you need to use Erlang R14B03 instead.
Mathias Meyer
Developer Advocate, Basho Technologies
On
Which interface for search are you using? Are you using the Solr search
interface, or do you go through MapReduce? The former would be using
client.solr().query(), the latter client.search().
Should you use the latter, do you call obj.get() on every result object before
calling get_data()? The res
The short answer: yes, we can and we should. I had that on my radar for a while
too, because it felt un-Pythonic.
As for deprecation, there's no specific rule for the Python client yet. I'm
happy to accept a patch for it, e.g. for a 1.4.0 version of the client, with an
announcement that support
Is it possible at all you indexed the documents using the Solr interface, and
you're now trying to use them in a MapReduce query? If so, that won't work,
because the get() call expects the objects with their respective bucket and key
to exist in Riak KV, which means you'd have to index via a pre
Your URL is pointing to a non-existing endpoint. Change it to
http://markson.hk:8098/buckets/test/keys/1234
(note the "keys" URL component
before the actual key), and you should be good to go.
Cheers, Mathias
http://nosqlhandbook.com
On Friday, 1
Vamsi,
there's nothing wrong with your cluster setup. Support for multiple nodes in
Ripple is being worked on in the master branch, you can have a lookie here at
the documentation update on how to use it:
https://github.com/seancribbs/ripple/commit/88fed2bdb1900ccc26fd292b3607d66cbcbe82c4
Chee
If you read it thoroughly, he doesn't recommend against using Riak. He
recommends starting out with a relational database like MySQL or Postgres if
you don't know what Riak is and how you'd benefit from it, or how your data will
evolve over time. Start out with one of them, add Riak to the mix la
Hey everyone,
I'm happy to announce the public release of the Riak Handbook, the definitive
guide to Riak.
The most comprehensive book on all things Riak, Riak Search, Riak 2i, MapReduce
and data modeling. I could go on and on about all the things it covers, but you
should go and see for
Hey Sean,
for a while I hacked on an EM-based Riak client but didn't find the time to
investigate it further.
So you can do several things with riak-ruby-client and EM:
1) Just use the client with EM.defer, putting it into other threads. That
arguably defeats the purpose of using EM, but at least
Thanks for the praise of my book. I'm curious though, what does "advanced"
entail for you guys? I'm continuously working on updates for the book, and I'm
happy to look at things that you think are missing, but it'd be great to have
some more concrete examples of what you think a book on advanced