Re: Comparing Pulsar and Kafka: unified queuing and streaming

2017-12-07 Thread Andrew Stevenson
Any estimate of when I can try out Kafka Connect with Pulsar? Can you also point me to where I can find the Kafka-to-Pulsar source and sink? - KN On Wed, Dec 6, 2017 at 2:48 AM, Andrew Stevenson wrote: > In terms of building out the Apache Pulsar ecosystem, Landoop is working > on porting our

Re: Kafka streams on Kubernetes

2017-12-06 Thread Andrew Stevenson
Hi Artur, What’s your Kubernetes setup? Azure, AWS, GKE? You should be able to hit your brokers from any node in the cluster, but it’s best to abstract them behind a Kubernetes Service. BTW, Lenses will support deployment into Kubernetes in our next release. Andrew Stevenson https://landoop.com
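A minimal sketch of the Service abstraction mentioned above, assuming the brokers run as pods labelled app: kafka — the name, label and port are illustrative, not from the thread. A headless Service gives each broker pod a stable DNS name that clients (and advertised.listeners) can resolve:

    apiVersion: v1
    kind: Service
    metadata:
      name: kafka-broker        # hypothetical name
    spec:
      clusterIP: None           # headless: per-pod DNS instead of one virtual IP
      selector:
        app: kafka              # assumed pod label
      ports:
        - port: 9092            # default Kafka listener port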

Re: Comparing Pulsar and Kafka: unified queuing and streaming

2017-12-06 Thread Andrew Stevenson
In terms of building out the Apache Pulsar ecosystem, Landoop is working on porting our Kafka Connect connectors to Pulsar’s framework. We already have a Kafka-to-Pulsar source and sink. On 05/12/2017, 19:59, "Jason Gustafson" wrote: > I believe a lot of users are using the kafka high level

Re: Kafka Monitoring

2017-12-06 Thread Andrew Stevenson
And Lenses - https://www.landoop.com/kafka-lenses/ On 06/12/2017, 11:21, "Abhimanyu Nagrath" wrote: There are a couple of tools for this. 1. LinkedIn Burrow - https://github.com/linkedin/Burrow 2. Yahoo Kafka Monitoring Tool - https://github.com/yahoo/kafka-manager

Re: Kafka and spark integration

2016-10-28 Thread Andrew Stevenson
Spark has a Kafka integration; if you want to write data from Kafka to HDFS, use the HDFS Kafka Connect sink from Confluent. On 27/10/2016, 03:37, "Mohan Nani" wrote: Anybody know the end-to-end Hadoop data flow which has Kafka-Spark integration? I am primarily concerned o
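For reference, a minimal configuration for that HDFS sink might look roughly like this — the topic name and NameNode address are placeholders, and the Confluent docs are the authority on the exact properties:

    name=hdfs-sink
    connector.class=io.confluent.connect.hdfs.HdfsSinkConnector
    tasks.max=1
    # hypothetical topic to drain into HDFS
    topics=my-topic
    # hypothetical NameNode address
    hdfs.url=hdfs://namenode:8020
    # records to buffer before each file is committed
    flush.size=1000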

Re: Architecture recommendations for a tricky use case

2016-09-30 Thread Andrew Stevenson
· Kafka Connect for ingress – the “E”
· Kafka Streams, Flink or Spark Streaming for the “T” (a minimal Streams sketch follows)
  – Read from and write back to Kafka
  – Keep the sources of data for your processing engine small
· Separation of concerns – why should Spark care about where your upstream sources are, for example?
·
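A minimal sketch of that “T” step in Kafka Streams, reading from one topic and writing back to another — the topic names, application id and the trivial uppercase transform are illustrative only, and this uses the current StreamsBuilder API rather than the one from 2016:

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class TransformApp {
        public static void main(String[] args) {
            Properties props = new Properties();
            // application id and broker address are placeholders
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "transform-app");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            // "E": Kafka Connect lands raw data on this topic
            KStream<String, String> events = builder.stream("raw-events");
            // "T": a trivial stand-in transform, written straight back to Kafka
            events.mapValues(v -> v.toUpperCase()).to("transformed-events");

            new KafkaStreams(builder.build(), props).start();
        }
    }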

RE: Running kafka connector application

2016-06-30 Thread Andrew Stevenson
The Twitter connector POM builds a fat jar with all dependencies. You need to add this to the classpath before you start Connect. This is what the Confluent scripts are doing. Regards Andrew From: Ewen Cheslack-Postava Sent: 14/06/20
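Concretely, that amounts to something like the following — the jar path and connector properties file are placeholders; kafka-run-class.sh picks up the CLASSPATH environment variable, so the fat jar ends up on Connect’s classpath:

    # put the connector's fat jar on the classpath, then start Connect
    export CLASSPATH=/path/to/kafka-connect-twitter-fat.jar
    bin/connect-standalone.sh config/connect-standalone.properties \
        config/twitter-source.properties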

RE: Kafka Connect - Source Connector for Apache Cassandra

2016-04-14 Thread Andrew Stevenson
number of sink and source connectors in the works and happy to take more on. www.datamountaineer.com Regards Andrew From: Andrew Stevenson<mailto:asteven...@outlook.com> Sent: 15/04/2016 07:04 To: users@kafka.apache.org<mailto:users@kafka.apache.or

RE: Kafka Connect - Source Connector for Apache Cassandra

2016-04-14 Thread Andrew Stevenson
Andrew, so at this point one cannot go this way: Cassandra > Connector_A > Kafka > Connector_B > Some other storage? On Thu, Apr 14, 2016 at 6:08 AM, Andrew Stevenson < and...@datamountaineer.com> wrote: > And this one. We’re adding a source to the sink plus ssl and user/
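For what it’s worth, that pipeline is exactly what chaining two connectors through a shared topic looks like — a rough sketch with hypothetical names, since the Cassandra source was still in development at the time and its exact configuration keys may differ:

    # Connector_A: Cassandra -> Kafka
    # (class name assumed from the stream-reactor project layout)
    name=cassandra-source
    connector.class=com.datamountaineer.streamreactor.connect.cassandra.source.CassandraSourceConnector
    # plus connector-specific settings mapping the table to the cassandra-events topic

    # Connector_B: Kafka -> some other storage, consuming the same topic
    name=other-store-sink
    connector.class=<whichever sink connector you chain on>
    topics=cassandra-events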

Re: Kafka Connect - Source Connector for Apache Cassandra

2016-04-14 Thread Andrew Stevenson
And this one. We’re adding a source to the sink, plus SSL and username/password support. Kerberos support is planned. https://github.com/datamountaineer/stream-reactor On 14/04/16 01:54, "Joe Stein" wrote: >There is one being worked on here >https://github.com/tuplejump/kafka-connect-cass

Cassandra connector

2016-02-18 Thread Andrew Stevenson
Hi Guys, I posted on the Confluent mailing list about my Cassandra Connect sink. Comments please. Be gentle! https://github.com/andrewstevenson/stream-reactor/tree/master/kafka-connect Regards Andrew

RE: Topic per entity

2015-10-31 Thread Andrew Stevenson
I too would be interested in any responses to this question. I'm using Kafka for event notification, and once it's secured I'll put the real payload in it and take advantage of the durable commit log. I want to let users describe a DAG in OrientDB and have the Kafka Client processor load and execute it. Ea