I would agree with Ron.
If you have the chance to use Scala, it is much easier to compose Flink
process functions (or what have you) into a data stream. Plain functional
programming power.
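As a rough illustration of that composition style, here is a minimal sketch in plain Scala with no Flink dependency; the stage names parseEvent, enrich, and toAlert are hypothetical stand-ins for transformations you would otherwise chain on a DataStream:

```scala
// Minimal sketch: plain Scala functions standing in for Flink operators.
// All names (Raw, Event, Alert, parseEvent, enrich, toAlert) are hypothetical.
object ComposeStages {
  case class Raw(line: String)
  case class Event(id: Int)
  case class Alert(msg: String)

  val parseEvent: Raw => Event = r => Event(r.line.trim.toInt)
  val enrich: Event => Event   = e => e.copy(id = e.id * 10)
  val toAlert: Event => Alert  = e => Alert(s"event-${e.id}")

  // andThen composes the stages into a single pipeline function,
  // analogous to chaining .map / .process calls on a DataStream.
  val pipeline: Raw => Alert = parseEvent andThen enrich andThen toAlert

  def main(args: Array[String]): Unit =
    println(pipeline(Raw(" 7 ")).msg) // prints "event-70"
}
```

Because each stage is an ordinary function value, the stages can be unit-tested in isolation and reassembled freely, which is the composability the message above refers to.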
Coming from a Java background into the Scala ecosystem some time ago, I was
just surprised that proper lan
Hi, Oscar
IMO, Flink is a big data compute engine whose main goal is to provide
general-purpose computing power; it is not a back-end service and does not
solve a specific business problem, so it doesn't need a dependency injection
framework. That's why you didn't find information about it in the Flink
c
Hi,
we are currently migrating some of our jobs to a hexagonal architecture, and
I have seen that we can use Spring as a dependency injection framework; see:
https://getindata.com/blog/writing-flink-jobs-using-spring-dependency-injection-framework/
Has anybody analyzed different JVM DI frameworks, e.
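Independent of which framework that analysis would pick, the underlying pattern the linked post relies on can be sketched without any framework at all: pass a serializable dependency to the operator through its constructor. This is only an illustration; GeocodeService, StubGeocode, and EnrichFn are hypothetical names, not from the post.

```scala
// Hedged sketch of constructor injection without a DI framework.
// A serializable dependency is handed to the operator at construction
// time, mirroring what a DI container would wire up when the job graph
// is built (operators are serialized out to the task managers).
trait GeocodeService extends Serializable {
  def lookup(ip: String): String
}

// Hypothetical in-memory stub, e.g. for unit tests.
class StubGeocode extends GeocodeService {
  def lookup(ip: String): String =
    if (ip.startsWith("10.")) "internal" else "external"
}

// The operator depends only on the trait, so the real implementation
// can be swapped for the stub without touching the job code.
class EnrichFn(geo: GeocodeService) extends (String => String) with Serializable {
  def apply(ip: String): String = s"$ip/${geo.lookup(ip)}"
}

object DiSketch {
  def main(args: Array[String]): Unit = {
    val enrich = new EnrichFn(new StubGeocode)
    println(enrich("10.0.0.1")) // prints "10.0.0.1/internal"
  }
}
```

A DI framework such as Spring mainly automates this wiring and the lifecycle around it; the injection itself is just constructor parameters plus serializable dependencies.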