Reading commands from Kafka and triggering a Redshift COPY is simple enough
that it could just be a bash script.  But if you've already got a Spark
Streaming job set up, you may as well use it for consistency's sake.  There's
definitely no need to mess around with Akka.
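To make that concrete, here's a minimal sketch of what the second streaming
job could look like, assuming each Kafka message value carries a complete
COPY statement; the topic name, broker list, JDBC URL, and credentials are
all placeholders, and you'd need a Redshift (or Postgres) JDBC driver on the
classpath:

    import java.sql.DriverManager

    import kafka.serializer.StringDecoder
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    object CopyCommandConsumer {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("redshift-copy-consumer")
        val ssc  = new StreamingContext(conf, Seconds(30))

        val kafkaParams = Map("metadata.broker.list" -> "broker1:9092")
        val topics      = Set("redshift-copy-commands")  // hypothetical name

        val stream = KafkaUtils.createDirectStream[
          String, String, StringDecoder, StringDecoder](ssc, kafkaParams, topics)

        stream.foreachRDD { rdd =>
          // Assumes each message value is a full COPY statement.
          // Collecting to the driver is fine: command volume is tiny.
          rdd.map(_._2).collect().foreach { copySql =>
            val conn = DriverManager.getConnection(
              "jdbc:redshift://example.redshift.amazonaws.com:5439/db",  // placeholder
              "user", "password")
            try {
              conn.createStatement().execute(copySql)
            } finally {
              conn.close()
            }
          }
        }

        ssc.start()
        ssc.awaitTermination()
      }
    }

One caveat: the Kafka direct stream gives you at-least-once semantics on
recovery, so the COPY commands should be idempotent (e.g. via a staging
table or manifest) to tolerate replays.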

On Fri, Jan 15, 2016 at 6:25 PM, Afshartous, Nick <nafshart...@turbine.com>
wrote:

>
> Hi,
>
>
> We have a streaming job that consumes from Kafka and outputs to S3.  We're
> going to have the job also send commands (to copy from S3 to Redshift) into
> a different Kafka topic.
>
>
> What would be the best framework for consuming and processing the copy
> commands?  We're considering creating a second streaming job or using Akka.
>
>
> Thanks for any suggestions,
>
> --
>
>     Nick
>
