Hi, Oscar

IMO, Flink is a general-purpose big data compute engine: its main goal is to
provide computing power, not to act as a back-end service or to solve one
specific business problem, so the engine itself doesn't need a dependency
injection framework. That's why you didn't find information about it in the
Flink community. From the point of view of your needs, though, it's different:
you want to use the DataStream or Table API to solve a specific business
problem, which may have many external dependencies, so a dependency injection
framework can be very useful there. Based on what you've found and my
experience, Spring should be a good choice.
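
One practical wrinkle worth keeping in mind with any DI framework on Flink:
operator instances are serialized and shipped to the task managers, so a
dependency that isn't Serializable has to be marked transient and re-created
in the operator's open() hook rather than injected into the constructor. Below
is a minimal plain-Java sketch of that pattern, without the Flink or Spring
dependencies; the class names (ChargeMapper, PaymentClient) and the
initialize-style open() method are hypothetical stand-ins for a Flink
RichMapFunction and an injected client:

```java
import java.io.*;

// Hypothetical non-serializable dependency (e.g. an HTTP client or DB pool).
class PaymentClient {
    String charge(String id) { return "charged:" + id; }
}

// Mimics a Flink RichMapFunction: the instance is serialized and shipped to
// workers, so the dependency is transient and created in open(), not the
// constructor.
class ChargeMapper implements Serializable {
    private transient PaymentClient client;

    // In a real Flink job this would be open(Configuration parameters).
    void open() {
        client = new PaymentClient();
    }

    String map(String id) {
        return client.charge(id);
    }
}

public class DiSketch {
    public static void main(String[] args) throws Exception {
        ChargeMapper mapper = new ChargeMapper();

        // Round-trip through Java serialization, as Flink does when it
        // deploys the job graph to the cluster.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        new ObjectOutputStream(bos).writeObject(mapper);
        ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()));
        ChargeMapper shipped = (ChargeMapper) in.readObject();

        shipped.open(); // dependency (re)created on the worker side
        System.out.println(shipped.map("42")); // charged:42
    }
}
```

Whatever framework you pick (Spring, Guice, etc.), the container is typically
built inside open() on each task manager for exactly this reason, rather than
being injected on the client side.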
Best,
Ron

Oscar Perez via user <user@flink.apache.org> wrote on Tue, Aug 1, 2023 at 23:39:

> Hi,
> we are currently migrating some of our jobs into hexagonal architecture
> and I have seen that we can use spring as dependency injection framework,
> see:
>
>
> https://getindata.com/blog/writing-flink-jobs-using-spring-dependency-injection-framework/
>
> Has anybody analyzed different JVM DI frameworks e.g guice, micronaut, etc
> and feasibility and performance on apache flink?
>
> using google I have found some issues with dagger and flink while
> guice/spring seems better suited but I could not find a study of
> performance recommendations from the flink community.
>
> Thanks!
> Oscar
>