Hi Philipp,
AFAIK, Flink doesn't offer this out-of-the-box. You could either hack
something together as suggested or use Kafka to glue different jobs together.
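
If you go the Kafka route, the basic shape would be along these lines (an
untested sketch; the topic name "shared-steps-out", the broker address, and
the example transformations are placeholders for your own setup, using the
Kafka 0.10 connector that ships with Flink 1.3):

import java.util.Properties;

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer010;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public class KafkaGlueSketch {

    // Job 1: run the shared steps once, then publish the result to a Kafka topic.
    public static void sharedStepsJob() throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");

        DataStream<String> shared = env
                .fromElements("a", "b", "c")             // stand-in for your real source
                .map(new MapFunction<String, String>() { // stand-in for the shared steps
                    @Override
                    public String map(String value) {
                        return value.toUpperCase();
                    }
                });

        shared.addSink(new FlinkKafkaProducer010<String>(
                "shared-steps-out", new SimpleStringSchema(), props));
        env.execute("shared-steps");
    }

    // Job 2: consume that topic as its source and continue with job-specific logic.
    public static void downstreamJob() throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "downstream-job");

        DataStream<String> input = env.addSource(new FlinkKafkaConsumer010<String>(
                "shared-steps-out", new SimpleStringSchema(), props));

        input.print();                                   // stand-in for the job-specific sink
        env.execute("downstream-job");
    }
}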

Both may affect exactly-once/at-least-once guarantees, however. See also
https://ci.apache.org/projects/flink/flink-docs-release-1.3/dev/connectors/guarantees.html


Nico

On Thursday, 31 August 2017 16:06:55 CEST Philip Doctor wrote:
> I have a few Flink jobs.  Several of them share the same code.  I was
> wondering if I could make those shared steps their own job and then specify
> that the sink for one process was the source for another process, stitching
> my jobs together.  Is this possible? I didn’t see it in the docs. It
> feels like I could possibly hack something together with writeToSocket() on
> my data stream and then create a source that reads from a socket, but I was
> hoping there was a more fully baked solution to this.
>
> Thanks for your time.
