I created a wiki page that lists all the MySQL replication options that people
posted, plus a couple of others. People may or may not find it useful.
https://github.com/wushujames/mysql-cdc-projects/wiki
I wasn't sure where to host it, so I put it up on a Github Wiki.
-James
On Mar 17, 2015, at 11:0
LinkedIn's Gobblin compaction tool is using Hive to perform the compaction. Does
that mean Lumos has been replaced?
Confused…
On Mar 17, 2015, at 10:00 PM, Xiao wrote:
> Hi, all,
>
> Do you know whether Linkedin plans to open source Lumos in the near future?
>
> I found the answer from Qiao Lin’s post about replication from Oracle/mySQL
> to Hadoop.
Hi, all,
Do you know whether Linkedin plans to open source Lumos in the near future?
I found the answer from Qiao Lin’s post about replication from Oracle/mySQL to
Hadoop.
- https://engineering.linkedin.com/data-ingestion/gobblin-big-data-ease
At the source side, it can be DataBus-ba
AFAIK, LinkedIn uses Databus to do the same. Aesop is built on top of
Databus, extending its beautiful capabilities to MySQL and HBase.
On Mar 18, 2015 7:37 AM, "Xiao" wrote:
> Hi, all,
>
> Do you know how the LinkedIn team publishes changed rows in Oracle to Kafka? I
> believe they already knew the whole problem very well.
Hi, all,
Do you know how the LinkedIn team publishes changed rows in Oracle to Kafka? I
believe they already knew the whole problem very well.
Using triggers? or directly parsing the log? or using any Oracle GoldenGate
interfaces?
Any lesson or any standard message format? Could the Linkedin peo
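For illustration only, here is a minimal sketch of one possible change-event envelope for this kind of pipeline (the field names and the JSON encoding are assumptions for the example, not LinkedIn's actual format):

import json
import time

# Illustrative change-event envelope; every field name here is an assumption,
# not LinkedIn's wire format.
change_event = {
    "source": {"db": "orders_db", "table": "orders"},  # hypothetical table
    "op": "UPDATE",                                    # INSERT / UPDATE / DELETE
    "before": {"id": 42, "status": "PENDING"},
    "after": {"id": 42, "status": "SHIPPED"},
    "position": {"scn": 123456789},  # e.g. an Oracle SCN, or a MySQL binlog file/offset
    "ts_ms": int(time.time() * 1000),
}

payload = json.dumps(change_event).encode("utf-8")     # what would go into Kafka

Avro with a schema registry is a common alternative to plain JSON for the actual payload.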
This is a great set of projects!
We should put this list of projects on a site somewhere so people can more
easily see and refer to it. These aren't Kafka-specific, but most seem to be
"MySQL CDC." Does anyone have a place where they can host a page? Preferably a
wiki, so we can keep it up to date.
Pretty much a hijack / plug as well (=
https://github.com/mardambey/mypipe
"MySQL binary log consumer with the ability to act on changed rows and
publish changes to different systems with emphasis on Apache Kafka."
Mypipe currently encodes events using Avro before pushing them into Kafka
and is
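To give a rough feel for that Avro-before-Kafka step: mypipe itself is written in Scala, so the schema, topic naming, and libraries below (fastavro and kafka-python) are stand-ins for illustration, not mypipe's actual code.

import io
from fastavro import parse_schema, schemaless_writer
from kafka import KafkaProducer

# Illustrative schema only; mypipe's real Avro schemas differ.
schema = parse_schema({
    "type": "record",
    "name": "RowChange",
    "fields": [
        {"name": "database", "type": "string"},
        {"name": "table", "type": "string"},
        {"name": "op", "type": "string"},        # insert / update / delete
        {"name": "row_json", "type": "string"},  # changed row, JSON-encoded for simplicity
    ],
})

producer = KafkaProducer(bootstrap_servers="localhost:9092")

def publish(change):
    # Avro-encode the changed-row record and push it to a per-table topic.
    buf = io.BytesIO()
    schemaless_writer(buf, schema, change)       # compact binary Avro, schema not embedded
    producer.send(change["database"] + "." + change["table"], value=buf.getvalue())

publish({"database": "shop", "table": "orders", "op": "insert",
         "row_json": '{"id": 1, "total": 9.99}'})
producer.flush()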
Great work.
Sorry for kinda hijacking this thread, but we had built something on top of a
MySQL binlog event propagator and I wanted to share it.
You guys can also look into Aesop (https://github.com/Flipkart/aesop). It's
a change propagation framework. It has relays which listen to the bin log
Really really nice!
Thank you.
On Mon, Mar 16, 2015 at 7:18 AM, Pierre-Yves Ritschard wrote:
> Hi kafka,
>
> I just wanted to mention I published a very simple project which can
> connect as MySQL replication client and stream replication events to
> kafka: https://github.com/pyr/sqlstream
>
>
Hi James,
Thanks for the kind words. I will definitely work on the persistence of the
binlog position, with a couple of persistence options.
The trickier part is figuring out how to correctly derive a key for the topic.
Not all events indicate which database/table/entity they operate on. Getti
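For what it's worth, a minimal sketch of one way the binlog-position persistence could look (sqlstream is Clojure; this Python snippet only shows the checkpoint pattern, and the file name and position values are made up):

import json
import os

CHECKPOINT_PATH = "binlog-position.json"   # hypothetical location

def load_position():
    # Return the last committed (log_file, log_pos), or None on first run.
    try:
        with open(CHECKPOINT_PATH) as f:
            cp = json.load(f)
        return cp["log_file"], cp["log_pos"]
    except FileNotFoundError:
        return None

def save_position(log_file, log_pos):
    # Write a temp file and rename it, so a crash never leaves a torn checkpoint.
    tmp = CHECKPOINT_PATH + ".tmp"
    with open(tmp, "w") as f:
        json.dump({"log_file": log_file, "log_pos": log_pos}, f)
    os.replace(tmp, CHECKPOINT_PATH)

# Typical shape: produce the event to Kafka first, then checkpoint the position
# it came from, so a restart replays at-least-once from that point.
save_position("mysql-bin.000042", 107)
print(load_position())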
Super cool, and super simple.
I like how it is pretty much a pure translation of the binlog into Kafka, with
no interpretation of the events. That means people can layer whatever they want
on top of it. They would have to understand what the mysql binary events mean,
but they would just have to
Hi kafka,
I just wanted to mention I published a very simple project which can
connect as MySQL replication client and stream replication events to
kafka: https://github.com/pyr/sqlstream
When you don't have control over an application, it can provide a simple
way of consolidating SQL data in kafka.
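For readers who want a concrete feel for the approach: sqlstream itself is written in Clojure, but the same idea can be sketched in Python with the python-mysql-replication and kafka-python libraries (the connection settings, server_id, and topic naming below are assumptions):

import json
from kafka import KafkaProducer
from pymysqlreplication import BinLogStreamReader
from pymysqlreplication.row_event import DeleteRowsEvent, UpdateRowsEvent, WriteRowsEvent

MYSQL = {"host": "127.0.0.1", "port": 3306, "user": "repl", "passwd": "secret"}

producer = KafkaProducer(bootstrap_servers="localhost:9092",
                         value_serializer=lambda v: json.dumps(v, default=str).encode("utf-8"))

# Connect as a replication client and follow the binlog, row events only.
stream = BinLogStreamReader(connection_settings=MYSQL, server_id=100,
                            blocking=True, resume_stream=True,
                            only_events=[WriteRowsEvent, UpdateRowsEvent, DeleteRowsEvent])

for event in stream:
    topic = event.schema + "." + event.table     # one topic per table, an arbitrary choice here
    for row in event.rows:
        producer.send(topic, {"type": type(event).__name__, "row": row})

The replication user needs the REPLICATION SLAVE and REPLICATION CLIENT privileges, and the server must run with row-based binlogging for the row events to carry values.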