Hi David,

As Eno said, you can use Kafka Streams to pipe the raw logs and the anomaly events to two different topics, then use two separate Kafka Connect file sinks to read them into two files.
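That first approach might look roughly like this. This is a minimal sketch against the 0.10.0-era Streams DSL; the topic names and the `isAnomaly(...)` criterion are placeholders, not anything from your setup:

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;

public class LogSplitter {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "log-splitter");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
        props.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());

        KStreamBuilder builder = new KStreamBuilder();
        KStream<String, String> logs = builder.stream("web-logs");

        // Pass every raw log line through unchanged to one topic ...
        logs.to("raw-logs-out");
        // ... and only the records matching the alert criteria to another.
        logs.filter((key, line) -> isAnomaly(line)).to("anomaly-events");

        new KafkaStreams(builder, props).start();
    }

    // Placeholder criterion: flag lines containing an HTTP 500 status.
    static boolean isAnomaly(String line) {
        return line.contains(" 500 ");
    }
}
```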
An alternative that uses Kafka Streams only is the `writeAsText` operator in the Streams DSL; or, if that is acceptable, you can implement your own processor with the Processor API to write the stream to local files. You can take a look at this example in the web docs (search for "Applying a custom processor"): http://docs.confluent.io/3.0.0/streams/developer-guide.html#kafka-streams-dsl

Guozhang

On Fri, Jul 15, 2016 at 9:20 AM, Eno Thereska <eno.there...@gmail.com> wrote:
> Hi David,
>
> One option would be to first output your info to a topic using Kafka
> Streams, and then use Connect again (as a sink) to read from the topic and
> write to a file in the file system.
>
> Eno
>
> > On 15 Jul 2016, at 08:24, David Newberger <david.newber...@wandcorp.com> wrote:
> >
> > Hello All,
> >
> > I'm curious whether I can output to a .txt file after doing some stream
> > processing using Kafka Streams. The scenario I'm trying to implement is a
> > simple web-log processing application with alerts on specific criteria. I
> > know I can ingest log files from the local file system into Kafka using
> > Connect, and I also believe I can use Kafka Streams to process the logs,
> > looking for the specific criteria to be met.
> >
> > Where I'm having difficulty is telling whether, directly from Kafka
> > Streams or Kafka Connect, I can output the records that meet the criteria
> > to one .txt file on the local file system and the raw, unprocessed logs to
> > another. I'd like to write the two files to the local file system because
> > this is a simple proof-of-concept application, and I'd like to avoid
> > adding other tools to the chain if possible.
> >
> > Cheers!
> >
> > David Newberger

--
Guozhang
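For the Connect leg of Eno's suggestion, the stock FileStreamSinkConnector that ships with Kafka can do the file writing. A sketch of two standalone sink configs; the topic names and file paths here are placeholders:

```properties
# anomaly-sink.properties -- write the alert topic to a local text file
name=anomaly-file-sink
connector.class=org.apache.kafka.connect.file.FileStreamSinkConnector
tasks.max=1
topics=anomaly-events
file=/tmp/anomalies.txt

# raw-sink.properties -- write the raw pass-through topic to a second file
name=raw-logs-file-sink
connector.class=org.apache.kafka.connect.file.FileStreamSinkConnector
tasks.max=1
topics=raw-logs-out
file=/tmp/raw-logs.txt
```

Both sinks can run in one standalone worker: `bin/connect-standalone.sh config/connect-standalone.properties anomaly-sink.properties raw-sink.properties`.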