Yeah, I would want to know they made it there. I like to use polyglot
persistence for the availability of data: I build my recommendation engine in
graph, my bulk data is in mongo, and sql is kind of my default/ad hoc store.
This is working really well for me, but I want to ease up on the payload
within my app and provide more streamlined synchronization.
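Roughly what I'm picturing on the app side: instead of three synchronous db
writes, the app publishes a single JSON change event and moves on. This is
just a sketch, not something I've run - the broker address, the topic name
"entity-changes", the record key, and the JSON payload are all placeholders:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class ChangeEventProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");  // placeholder broker
            props.put("acks", "all");  // wait for full acknowledgement so the event isn't lost
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // placeholder JSON change event; the real schema is still undecided
                String json = "{\"entity\":\"user\",\"id\":42,\"op\":\"update\",\"fields\":{\"name\":\"Pat\"}}";

                // keying by entity id keeps all changes for one entity in order on one partition
                producer.send(new ProducerRecord<>("entity-changes", "user-42", json));
                producer.flush();  // block until the broker has acknowledged the send
            }
        }
    }

Once the send is acknowledged I can close the app transaction; the databases
catch up from the log on their own.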

On Fri, Sep 12, 2014 at 8:42 PM, Steve Morin <st...@stevemorin.com> wrote:

> You would need to make sure they were all persisted down properly to each
> database? Why are you persisting it to three different databases (sql,
> mongo, graph)?
> -Steve
>
> On Fri, Sep 12, 2014 at 7:35 PM, Patrick Barker <patrickbarke...@gmail.com> wrote:
>
> > I'm just getting familiar with Kafka. Currently I just save everything to
> > all my db's in a single transaction; if any of them fail, I roll them all
> > back. However, this is slowing my app down. So, as I understand it, I
> > could write to Kafka, close the transaction, and then it would keep on
> > publishing out to my databases. I'm not sure what format I would write it
> > in yet, I guess json.
> >
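To sketch the "keep on publishing out to my databases" part above: my rough
plan is one consumer group per store (mongo-sync, sql-sync, graph-sync), each
replaying the same topic independently. Again untested, and the topic name,
group id, and applyToMongo() helper are made-up placeholders:

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class MongoSyncConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");  // placeholder broker
            props.put("group.id", "mongo-sync");  // one group per target store (sql-sync, graph-sync, ...)
            props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("enable.auto.commit", "false");  // commit offsets ourselves, after the db write

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("entity-changes"));  // placeholder topic
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        applyToMongo(record.value());  // placeholder: parse the JSON and upsert into mongo
                    }
                    consumer.commitSync();  // only commit once the store writes succeed
                }
            }
        }

        static void applyToMongo(String json) {
            // stand-in for the real mongo upsert
            System.out.println("applying change: " + json);
        }
    }

Since each store tracks its own offsets, a slow database doesn't hold the
others back, and because offsets are committed only after the write succeeds,
a crashed consumer just re-reads its last batch.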
> > > On Fri, Sep 12, 2014 at 7:00 PM, Steve Morin <steve.mo...@gmail.com> wrote:
> >
> > > What record format are you writing to Kafka with?
> > >
> > > > On Sep 12, 2014, at 17:45, Patrick Barker <patrickbarke...@gmail.com> wrote:
> > > >
> > > > Oh, I'm not trying to use it for persistence; I'm wanting to sync 3
> > > > databases: sql, mongo, graph. I want to publish to Kafka and then have
> > > > it update the db's. I'm wanting to keep this as efficient as possible.
> > > >
> > > >> On Fri, Sep 12, 2014 at 6:39 PM, cac...@gmail.com <cac...@gmail.com> wrote:
> > > >>
> > > >> I would say that it depends upon what you mean by persistence. I
> > > >> don't believe Kafka is intended to be your permanent data store, but
> > > >> it would work if you were basically write once with appropriate query
> > > >> patterns. It would be an odd way to describe it though.
> > > >>
> > > >> Christian
> > > >>
> > > >>> On Fri, Sep 12, 2014 at 4:05 PM, Stephen Boesch <java...@gmail.com> wrote:
> > > >>>
> > > >>> Hi Patrick, Kafka can be used at any scale, including small ones
> > > >>> (initially, anyway). The issues I ran into personally were with
> > > >>> ZooKeeper management and a bug in deleting topics (is that fixed
> > > >>> yet?). In any case you might try out Kafka, given its highly
> > > >>> performant, scalable, and flexible backbone. After that you will
> > > >>> have little worry about scale, given Kafka's use within massive
> > > >>> web-scale deployments.
> > > >>>
> > > >>> 2014-09-12 15:18 GMT-07:00 Patrick Barker <patrickbarke...@gmail.com>:
> > > >>>
> > > >>>> Hey, I'm new to Kafka and I'm trying to get a handle on how it all
> > > >>>> works. I want to integrate polyglot persistence into my application.
> > > >>>> Kafka looks like exactly what I want, just on a smaller scale. I am
> > > >>>> currently only dealing with about 2,000 users, which may grow, but
> > > >>>> is Kafka a good use case here, or is there another technology that's
> > > >>>> better suited?
> > > >>>>
> > > >>>> Thanks
> > > >>
> > >
> >
>
