Hello Drew,
Thinking through the delete operation on a strongly consistent bucket, a
question came up for me.
Shouldn't we use delete_object instead to keep strong consistency? The
delete function takes only a key and does not care about the current
state/value of the object itself. If in the meantim
Hi,
We are having issues with a Riak Search custom schema. The application has
multiple POJOs and we are using the Java client to write into Riak KV. We
configured a custom Solr schema with indexed fields from all POJOs. After
adding the "Catch-All" field, Solr is not indexing any fields from any of
the POJOs.
No
How are you storing these objects? What "Content-Type" is being used?
--
Luke Bakken
Engineer
lbak...@basho.com
On Fri, Jan 9, 2015 at 7:04 AM, Santi Kumar wrote:
> Hi,
> We are having issues with a Riak Search custom schema. The application has
> multiple POJOs and we are using the Java client to write
No content type is specified explicitly; the object is passed directly to
StoreValue.Builder(). Somewhere I read that the POJO is extracted as JSON. Am
I missing something? Here is the code.
Location user_location = new Location(nameSpace, key);
StoreValue storeUserOp = new StoreValue.Builder(
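For comparison, here is a minimal, self-contained sketch of storing a POJO
with the 2.x Java client (the Policy class, host, and values below are only
illustrative, borrowed loosely from this thread). When no content type is
given, the client's default converter serializes the POJO to JSON and stores
it as application/json:

import com.basho.riak.client.api.RiakClient;
import com.basho.riak.client.api.commands.kv.StoreValue;
import com.basho.riak.client.core.query.Location;
import com.basho.riak.client.core.query.Namespace;

public class StorePojoExample {
    // Hypothetical POJO standing in for one of the application's classes.
    public static class Policy {
        public String id;
        public String tenantId;
    }

    public static void main(String[] args) throws Exception {
        RiakClient client = RiakClient.newClient("127.0.0.1");
        try {
            Namespace ns = new Namespace("junit_MASTER-Policies");
            Location loc = new Location(ns, "2d29e759-8e30-499a-8ecc-98eb09eeaa9f");

            Policy policy = new Policy();
            policy.id = "2d29e759-8e30-499a-8ecc-98eb09eeaa9f";
            policy.tenantId = "eb0a1917-9762-3dd3-a48f-a681d3061212";

            // No content type is set here: the client's default converter
            // turns the POJO into JSON and stores it as application/json.
            StoreValue store = new StoreValue.Builder(policy)
                    .withLocation(loc)
                    .build();
            client.execute(store);
        } finally {
            client.shutdown();
        }
    }
}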
Can you retrieve the object using "curl" and let me know what the
"Content-Type" header is when returned?
--
Luke Bakken
Engineer
lbak...@basho.com
On Fri, Jan 9, 2015 at 8:44 AM, Santi Kumar wrote:
> No content type is specified explicitly; the object is passed directly to
> StoreValue.Builder(). S
Luke
It's application/json
Here is the curl command and the output dump showing the Content-Type header
curl -v
http://127.0.0.1:8098/buckets/junit_MASTER-Policies/keys/2d29e759-8e30-499a-8ecc-98eb09eeaa9f
* About to connect() to 127.0.0.1 port 8098 (#0)
* Trying 127.0.0.1...
* Adding handle: conn: 0x
Please run your JSON document through the extractor to ensure that
it's being parsed correctly:
http://docs.basho.com/riak/latest/dev/advanced/search/#Extractors
curl -XPUT http://localhost:8098/search/extract \
-H 'Content-Type: application/json' \
--data-binary @object.json
If that w
I ran the command as shown below, but I was getting weird errors. In the
place of @object.json, I used the actual JSON content. The output was
curl -XPUT http://localhost:8098/search/extract -H 'Content-Type:
application/json' --data-binary
@{"id":null,"tenantId":"eb0a1917-9762-3dd3-a48f-a681d3061212
Hello Zsolt,
Whether or not it is correct or valuable for you to use delete
operations is (as with many features of Riak) largely dependent on
your use-case. Deletes in the strongly consistent mode are much less
valuable than deletes in the eventually consistent mode because keys
are never reaped
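For reference, a minimal sketch of a keyed delete with the 2.x Java client
(bucket type, bucket, and key are hypothetical). The point the question
raises is visible here: the delete carries only a Location, not the object's
current value or causal context:

import com.basho.riak.client.api.RiakClient;
import com.basho.riak.client.api.commands.kv.DeleteValue;
import com.basho.riak.client.core.query.Location;
import com.basho.riak.client.core.query.Namespace;

public class KeyedDelete {
    public static void main(String[] args) throws Exception {
        RiakClient client = RiakClient.newClient("127.0.0.1");
        try {
            // Hypothetical strongly consistent bucket type and bucket name.
            Namespace ns = new Namespace("strongly_consistent", "accounts");
            Location loc = new Location(ns, "some-key");

            // The delete is addressed purely by key; nothing about the
            // object's current state is passed along with it.
            DeleteValue delete = new DeleteValue.Builder(loc).build();
            client.execute(delete);
        } finally {
            client.shutdown();
        }
    }
}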
Hi Nirav,
If I'm reading your description correctly, you'd like your secondary
index data to be automatically migrated to Riak Search?
--
Luke Bakken
Engineer
lbak...@basho.com
On Wed, Jan 7, 2015 at 10:36 PM, Nirav Shah wrote:
> Hi Luke,
> Can you please advise?
>
> "I have a production dat
Hi Santi,
This means that the extractor is working. Your next step is to ensure
that the field names returned by the extractor match the field names
in your Solr schema. Also, please check to ensure that your schema has
all required _yz_* fields.
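As a rough illustration of that workflow with the 2.x Java client (the schema
and index names are hypothetical, and the custom schema XML must still keep
the _yz_* field definitions copied from the default Yokozuna schema):

import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

import com.basho.riak.client.api.RiakClient;
import com.basho.riak.client.api.commands.search.StoreIndex;
import com.basho.riak.client.api.commands.search.StoreSchema;
import com.basho.riak.client.core.query.search.YokozunaIndex;
import com.basho.riak.client.core.query.search.YokozunaSchema;

public class RegisterCustomSchema {
    public static void main(String[] args) throws Exception {
        RiakClient client = RiakClient.newClient("127.0.0.1");
        try {
            // Hypothetical schema file; it must still contain the required
            // _yz_* fields from the default Yokozuna schema.
            String schemaXml = new String(
                    Files.readAllBytes(Paths.get("my_custom_schema.xml")),
                    StandardCharsets.UTF_8);

            YokozunaSchema schema = new YokozunaSchema("my_custom_schema", schemaXml);
            client.execute(new StoreSchema.Builder(schema).build());

            // Associate an index with that schema; the index then gets
            // attached to the bucket (or bucket type) being searched.
            YokozunaIndex index = new YokozunaIndex("my_index", "my_custom_schema");
            client.execute(new StoreIndex.Builder(index).build());
        } finally {
            client.shutdown();
        }
    }
}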
--
Luke Bakken
Engineer
lbak...@basho.com
On Fri,
Cool, I'll check that. Appreciate your timely help, Luke.
On Jan 9, 2015 11:53 PM, "Luke Bakken" wrote:
> Hi Santi,
>
> This means that the extractor is working. Your next step is to ensure
> that the field names returned by the extractor match the field names
> in your Solr schema. Also, please che
Hi Luke,
Let me rephrase it.
1. We currently have 2i-based searches in our code. We are trying to move
them to Solr as per Basho’s recommendation.
2. To achieve this, we created a Solr schema. But when we tried it in dev, we
noticed Solr indexes only get created for new data and not for existing
Nirav,
You will have to GET and PUT every object to index it into Solr.
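To make that concrete, here is a rough sketch of a one-off reindex pass with
the 2.x Java client. The bucket name is a placeholder, key listing is shown
only because this is a one-time migration, and sibling/vector-clock handling
is left out:

import com.basho.riak.client.api.RiakClient;
import com.basho.riak.client.api.commands.kv.FetchValue;
import com.basho.riak.client.api.commands.kv.ListKeys;
import com.basho.riak.client.api.commands.kv.StoreValue;
import com.basho.riak.client.core.query.Location;
import com.basho.riak.client.core.query.Namespace;
import com.basho.riak.client.core.query.RiakObject;

public class ReindexBucket {
    public static void main(String[] args) throws Exception {
        RiakClient client = RiakClient.newClient("127.0.0.1");
        try {
            // Placeholder bucket; point this at the bucket whose index
            // was created after the existing data was written.
            Namespace ns = new Namespace("my_bucket");

            // Listing keys walks the whole keyspace; acceptable for a
            // one-off migration, not for routine production use.
            ListKeys.Response keys = client.execute(new ListKeys.Builder(ns).build());

            for (Location loc : keys) {
                FetchValue.Response fetched =
                        client.execute(new FetchValue.Builder(loc).build());
                if (fetched.isNotFound()) {
                    continue;
                }

                // Re-storing the object unchanged is enough for Yokozuna
                // to index it under the now-associated index.
                RiakObject obj = fetched.getValue(RiakObject.class);
                client.execute(new StoreValue.Builder(obj)
                        .withLocation(loc)
                        .build());
            }
        } finally {
            client.shutdown();
        }
    }
}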
--
Luke Bakken
Engineer
lbak...@basho.com
On Fri, Jan 9, 2015 at 10:54 AM, Nirav Shah wrote:
> Hi Luke,
>
> Let me rephrase it.
>
>
> 1. We currently have 2i-based searches in our code. We are trying to move
> them to Solr as