I would agree with Ron.

If you have a chance to use Scala, it is much easier to compose Flink
process functions (or whatever operators you need) into a data stream:
plain functional programming power is enough.
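For illustration, a minimal sketch of what I mean, assuming the
flink-scala DataStream API (the parse/validate/enrich stages are just
made-up placeholders):

    import org.apache.flink.streaming.api.scala._

    object ComposedJob {
      // Each stage is an ordinary Scala function over a DataStream,
      // so the pipeline is wired with plain function composition.
      val parse: DataStream[String] => DataStream[Int]  = _.map(_.trim.toInt)
      val validate: DataStream[Int] => DataStream[Int]  = _.filter(_ >= 0)
      val enrich: DataStream[Int] => DataStream[String] = _.map(n => s"value=$n")

      def main(args: Array[String]): Unit = {
        val env = StreamExecutionEnvironment.getExecutionEnvironment
        val pipeline = parse andThen validate andThen enrich
        pipeline(env.fromElements("1", "-2", "3")).print()
        env.execute("composed-pipeline")
      }
    }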
Coming from a Java background into the Scala ecosystem some time ago, I
was surprised that a sufficiently expressive language does not require any
DI framework at all. Occasionally (maybe 10% of cases) I used the Scala
MacWire library to get DI behaviour.
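For example, a rough sketch of the MacWire style (these service classes
are hypothetical):

    import com.softwaremill.macwire._

    class UserRepository
    class PaymentService(repo: UserRepository)
    class EnrichmentFunction(payments: PaymentService)

    object Wiring {
      // wire[T] expands at compile time into plain constructor calls,
      // resolving arguments from values in scope: no runtime container.
      lazy val userRepository: UserRepository = wire[UserRepository]
      lazy val paymentService: PaymentService = wire[PaymentService]
      lazy val enrichment: EnrichmentFunction = wire[EnrichmentFunction]
    }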
Best regards,
Alexey

On Sun, Aug 6, 2023 at 2:19 PM liu ron <ron9....@gmail.com> wrote:

> Hi, Oscar
>
> IMO, as a big data compute engine, Flink's main goal is to provide
> general-purpose computing power; it is not a back-end service framework
> and does not target any one specific business problem, so it does not
> need a dependency injection framework. That is why you did not find
> information about it in the Flink community. Of course, from the point of
> view of your needs it is different: you want to use the DataStream or
> Table API to solve a specific business problem that may have a lot of
> external dependencies, so a dependency injection framework will be very
> useful there. Based on what you have described and my experience, Spring
> should be a good choice.
>
> Best,
> Ron
>
> Oscar Perez via user <user@flink.apache.org> wrote on Tue, Aug 1, 2023 at 23:39:
>
>> Hi,
>> We are currently migrating some of our jobs to a hexagonal architecture,
>> and I have seen that we can use Spring as a dependency injection
>> framework; see:
>>
>>
>> https://getindata.com/blog/writing-flink-jobs-using-spring-dependency-injection-framework/
>>
>> Has anybody analyzed different JVM DI frameworks (e.g. Guice, Micronaut,
>> etc.) and their feasibility and performance on Apache Flink?
>>
>> Using Google I have found some reported issues with Dagger and Flink,
>> while Guice/Spring seem better suited, but I could not find a study or
>> performance recommendations from the Flink community.
>>
>> Thanks!
>> Oscar
>>
>
