Hi, we just open sourced a syslog producer for Kafka:
https://github.com/stealthly/go_kafka_client/tree/master/syslog. This
server has a few features that folks might be interested in.

Besides producing your data to Kafka, you can also configure it (via the
command line) to associate metadata with the log data (as a protobuf) as it
is produced. The purpose of this is deeper analytics downstream, e.g.
(--source i-59a059a8 --tag dc=ny9 --tag floor=2 --tag aisle=17 --tag rack=5
--tag u=7 --log.type.id 3), which associates the metadata with the log data
at a persistent point in time. You can set source, tags, and log.type.id to
whatever fits your needs and parse them later on downstream. A bit more is
written up about that here:
http://allthingshadoop.com/2015/01/16/syslog-producer-for-apache-kafka/
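
For example, a fuller invocation might look like this (a sketch, assuming
the metadata flags are accepted alongside --topic and --broker.list on the
same command line; the plain quick-start command is below, and the host,
tags, and ids here are just placeholders):

docker run --net=host stealthly/syslog --topic logs --broker.list
brokerHost:9092 --source i-59a059a8 --tag dc=ny9 --tag floor=2
--tag aisle=17 --tag rack=5 --tag u=7 --log.type.id 3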

It is easy to get started with the Docker image, too:

*docker run --net=host stealthly/syslog --topic logs --broker.list
brokerHost:9092*
And that's it: now send your data to port 5140 (TCP) or 5141 (UDP), both
configurable, on the host you are running the Docker image on.
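
To smoke test it, you can fire a line at it from any machine that can reach
that host (the port numbers assume the defaults above, syslogHost is a
placeholder; logger here is the util-linux version with TCP/UDP options, and
plain nc works just as well):

logger --server syslogHost --port 5140 --tcp "hello kafka syslog"
echo "<14>hello kafka syslog" | nc -u -w1 syslogHost 5141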

We are in the process of rolling this out to clients but wanted to get it
out there for anyone to try and give feedback, so that as we continue to
stabilize the existing deployments we can build in more usage scenarios.

/*******************************************
 Joe Stein
 Founder, Principal Consultant
 Big Data Open Source Security LLC
 http://www.stealth.ly
 Twitter: @allthingshadoop <http://www.twitter.com/allthingshadoop>
********************************************/
