I believe that 2.41.0 is the "oldest" safe version[1] to use, as there were
initially some bugs introduced when migrating from PDone to outputting the
write results.
[1]
https://github.com/apache/beam/commit/2cb2ee2ba3b5efb0f08880a9325f092485b3ccf2
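If you do upgrade, one way to pin the Elasticsearch connector to 2.41.0 is in the build file. This is only a sketch using the standard Beam Maven coordinates; adjust the version to whatever Beam SDK release you settle on, and keep all Beam artifacts on the same version:

```xml
<!-- Beam Java SDK core plus the Elasticsearch IO connector.
     2.41.0 is the earliest version suggested above; both
     artifacts must stay on the same Beam version. -->
<dependency>
  <groupId>org.apache.beam</groupId>
  <artifactId>beam-sdks-java-core</artifactId>
  <version>2.41.0</version>
</dependency>
<dependency>
  <groupId>org.apache.beam</groupId>
  <artifactId>beam-sdks-java-io-elasticsearch</artifactId>
  <version>2.41.0</version>
</dependency>
```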
On Thu, Feb 2, 2023 at 3:16 PM Kaggal, Vinod C. (
Thank you for the response!
We are currently using :
2.37.0
2.33.0
Is there a compatible ElasticsearchIO version that may work with our Beam
version?
Thank you!
Vinod
From: Chamikara Jayalath
Date: Thursday, February 2, 2023 at 6:01 PM
To: user@beam.apache.org , Kaggal, Vinod C. (Vinod C),
M
On Thu, Feb 2, 2023 at 1:56 PM Kaggal, Vinod C. (Vinod C.), M.S. via user <
user@beam.apache.org> wrote:
> Hello! Thank you for all the hard work on implementing these useful
> libraries.
>
>
>
> *Background:* We have been using Apache Storm in production for some time
> (over 8 years) and have recently switched over to Beam.
Hello! Thank you for all the hard work on implementing these useful libraries.
Background: We have been using Apache Storm in production for some time (over 8
years) and have recently switched over to Beam. One of the topologies that we
had in Storm was to ingest data, index to elastic (write) a
It looks like Calcite stopped considering field names in RelNode equality
as of Calcite 1.22 (which we use in Beam v2.34.0+). This can result in a
planner state where two nodes that only differ by field name are considered
equivalent.
I have a fix for Beam in https://github.com/apache/beam/pull/25
*~Vincent*
On Thu, Feb 2, 2023 at 3:01 AM Alexey Romanenko
wrote:
> - d...@beam.apache.org
> + user@beam.apache.org
>
> Hi Enzo,
>
> Can you make sure that all your workers were properly added and listed in
> Spark WebUI?
>
> Did you specify a “--master spark://HOST:PORT” option while running your
> Beam job with a SparkRunner?
Hi,
I know we use the portability framework when the SDK language (Python) is
different from the runner's language (Java).
If my runner is Java-based and I want to use the portability framework with
the Java SDK, is there any optimization on the Beam side rather than running
two separate Docker images?
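For context on what is being asked: when the SDK and the runner share a language, the portable runners do not necessarily need a Docker SDK harness at all. Beam's PortablePipelineOptions exposes a defaultEnvironmentType setting, and runners that support it (the portable Flink runner, for example) can run the Java SDK harness embedded in the runner process instead of in a separate container. A hedged sketch of the relevant flags (the job endpoint address is a placeholder; whether EMBEDDED is honored depends on the runner):

```
--runner=PortableRunner \
--jobEndpoint=localhost:8099 \
--defaultEnvironmentType=EMBEDDED
```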
Hi All,
I'm new to using Apache Beam with Go.
pubsubio.Write(scope, "project", "topic", ppMessages)
When I try to publish a message to a topic I get an error message:
"Could not find the sink for pubsub. Check that the sink library specifies
alwayslink = 1"
I found a Stack Overflow post for the sa
- d...@beam.apache.org
+ user@beam.apache.org
Hi Enzo,
Can you make sure that all your workers were properly added and listed in Spark
WebUI?
Did you specify a “--master spark://HOST:PORT” option while running your Beam
job with a SparkRunner?
PS: Please use the user@beam.apache.org mailing list
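For reference, submitting a Beam pipeline jar to a standalone Spark cluster with that option might look like the following. This is a sketch only: HOST:PORT is the master address shown at the top of the Spark Web UI (the standalone default port is 7077), and the jar and class names are placeholders:

```
# Submit a Beam pipeline (built as a fat jar) to a standalone Spark master.
spark-submit \
  --master spark://HOST:PORT \
  --class com.example.MyBeamPipeline \
  my-pipeline.jar \
  --runner=SparkRunner
```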