Hi,

issues like this arise from the tight coupling between various parts of our ecosystem - the model, the core, and the runners. We should decouple these and let runners have their own release cycles, because anything else will not scale in the long run. We cannot keep adding runners to our release dependencies: at some point, conflicting requirements will block our ability to cut a stable release. We might want to define a roadmap to avoid this.

Best,

Jan

On 11/26/24 15:46, Kenneth Knowles wrote:
Are we using --target 8 in the CI/CD and/or --source 8 for some modules? Are the problems independent of what those flags control?

Just curious - I am not advocating for anything.

Kenn
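
For context, a minimal sketch of what those flags control, assuming a Gradle build; this snippet is illustrative only and may not match Beam's actual build scripts.

```groovy
// Illustrative Gradle (Groovy DSL) configuration: compile on a newer JDK
// while emitting Java 8-compatible bytecode, roughly equivalent to passing
// javac --source 8 --target 8 (or, preferably, --release 8).
tasks.withType(JavaCompile).configureEach {
    // --release also checks API usage against the Java 8 platform libraries,
    // which separate --source/--target flags do not.
    options.release = 8
}
```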

On Mon, Nov 25, 2024 at 1:38 PM Yi Hu via dev <dev@beam.apache.org> wrote:

    Hi Beam community,

    As the Beam repo CI/CD has moved to Java 11 [1], I would like to
    call attention to two outstanding open issues that could affect
    Spark and Samza runner users [2, 3].

    For the Spark runner: it currently does not support Spark 3.4.0+
    with Java 11, because Spark 3.4.0 upgraded to slf4j 2. The same
    Spark versions still run fine with Java 8.

    For the Samza runner: it currently does not support Java 11.
    Newer Samza versions do, but the upgrade appears nontrivial.

    It would be great if someone actively using the Spark or Samza
    runner could volunteer to add or complete Java 11 support, or
    share their solutions to these issues.

    Thanks for your attention,

    Yi

    [1] https://github.com/apache/beam/issues/31677
    [2] https://github.com/apache/beam/issues/32207
    [3] https://github.com/apache/beam/issues/32208
