Let us assume that you want to build an integration test setup where
you run all participating components in Docker.
You create a docker-compose.yml with four Docker images, something like this:
# Start docker-compose.yml
version: '2'
services:
  myapp:
    build: myapp_dir
    links:
      - kafka
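
Filled out with the remaining services, the whole file could look roughly
like this (a sketch only: the image choices and environment variables are
assumptions, based on the wurstmeister images and the Cassandra sink
mentioned elsewhere in the thread):

version: '2'
services:
  zookeeper:
    image: wurstmeister/zookeeper
  kafka:
    image: wurstmeister/kafka
    links:
      - zookeeper
    environment:
      # Point the broker at the zookeeper container and advertise a
      # hostname the other containers can resolve.
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_HOST_NAME: kafka
  cassandra:
    image: cassandra
  myapp:
    build: myapp_dir
    links:
      - kafka
      - cassandra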
Can this Docker image be used to spin up a Kafka cluster in a CI/CD pipeline
like Jenkins to run the integration tests, or can it only be done on a
local machine that has Docker installed? I assume that the box where the
CI/CD pipeline runs should have Docker installed, correct?
On Mon, Jul 4, 2016
The application's output is that it inserts data into Cassandra at the end
of every batch.
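
For context, that write happens once per micro-batch; in Scala with the
DataStax spark-cassandra-connector it looks roughly like the sketch below
(the record type, keyspace, and table names are made-up placeholders, not
taken from the actual app):

import com.datastax.spark.connector._   // adds saveToCassandra to RDDs
import org.apache.spark.streaming.dstream.DStream

// Placeholder record type; the real schema is app-specific.
case class Event(id: String, value: Double)

def sinkToCassandra(events: DStream[Event]): Unit =
  // foreachRDD runs once per batch interval, so the insert happens
  // at the end of every batch, as described above.
  events.foreachRDD { rdd =>
    rdd.saveToCassandra("test_keyspace", "events", SomeColumns("id", "value"))
  }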
On Mon, Jul 4, 2016 at 5:20 AM, Lars Albertsson wrote:
> I created such a setup for a client a few months ago. It is pretty
> straightforward, but it can take some work to get all the wires
> connected.
>
I created such a setup for a client a few months ago. It is pretty
straightforward, but it can take some work to get all the wires
connected.
I suggest that you start with the spotify/kafka
(https://github.com/spotify/docker-kafka) Docker image, since it
includes a bundled zookeeper. The alternative is to run ZooKeeper and
Kafka in separate containers.
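
In the docker-compose file above, that image slots in as a single service,
something like this (ADVERTISED_HOST/ADVERTISED_PORT are the knobs the
image exposes; treat the exact values as an assumption):

  kafka:
    image: spotify/kafka
    ports:
      - "2181:2181"
      - "9092:9092"
    environment:
      # Hostname and port the broker advertises to clients; other
      # containers on the compose network resolve "kafka" by name.
      ADVERTISED_HOST: kafka
      ADVERTISED_PORT: "9092"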
You can use https://github.com/wurstmeister/kafka-docker to spin up a
Kafka cluster and then point your Spark Streaming job at it to consume
from it.
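
Once the broker container is up, pointing Spark Streaming at it is a few
lines with the 0.8 direct API (a sketch; the broker address and topic name
are placeholders for whatever your compose file exposes):

import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

val conf = new SparkConf().setAppName("kafka-integration-test").setMaster("local[2]")
val ssc  = new StreamingContext(conf, Seconds(5))

// Address of the dockerized broker (placeholder).
val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")

// Direct stream: one Kafka partition maps to one Spark partition.
val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
  ssc, kafkaParams, Set("test-topic"))

stream.map(_._2).print()   // messages arrive as (key, value) tuples

ssc.start()
ssc.awaitTermination()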
On Fri, Jul 1, 2016 at 1:19 AM, SRK wrote:
> Hi,
>
> I need to do integration tests using Spark Streaming. My idea is to spin up
> kafka using Docker [...]