Thanks for pointing this out, Robert. I somehow had it in my mind that it
was not official until 1.18, but I forgot to double-check. For info, I
was able to build the Go SDK container without any trouble, so I assume
most of the things are 'ready'.
It would be great to take a look at fixing the hardcoded
Just a quick type check: is it the case that a Serializer is
expected to be able to properly serde any subclass of Object? More
generally, that any Serializer should be able to properly serde
V? Typically this isn't the case. Not saying we shouldn't make the proposed
change, but it could result in s
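The type-safety concern above can be made concrete with a toy sketch. This is plain Java, not actual Kafka or Beam code: the one-method Serializer interface here is a hypothetical stand-in (Kafka's real interface also takes a topic name and has configure/close methods), used only to show why a Serializer parameterized on one type cannot in general serde an arbitrary subclass of Object:

```java
// Illustrative sketch, not Beam or Kafka code.
import java.nio.charset.StandardCharsets;

public class SerdeTypeCheck {
    // Hypothetical one-method stand-in for Kafka's Serializer.
    interface Serializer<T> {
        byte[] serialize(T value);
    }

    static Serializer<String> utf8() {
        return v -> v.getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // Matching type parameter: fine.
        System.out.println(utf8().serialize("hello").length); // prints 5

        // Widening to Serializer<Object> compiles only via an unchecked
        // cast, and fails at runtime for a non-String value.
        @SuppressWarnings("unchecked")
        Serializer<Object> anything =
            (Serializer<Object>) (Serializer<?>) utf8();
        try {
            anything.serialize(42); // Integer, not String
        } catch (ClassCastException e) {
            System.out.println("ClassCastException, as the type system warned");
        }
    }
}
```

In other words, pretending every Serializer handles any V trades a compile-time error for a runtime ClassCastException.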
Of course, IMO it would be fine as well to not force developers to use
withKeySerializer() / withValueSerializer() in the first place.
This way you could use the standard way of configuring the Kafka serializer
classes using properties as per the Kafka Consumer/Producer documentation.
Just an idea.
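For reference, the properties-based route suggested above might look roughly like this. The two serializer class names are Kafka's stock ones; the broker address and everything else is illustrative:

```java
// Sketch of the "standard Kafka way": configure serializer classes via
// properties, as in the Kafka Producer documentation, rather than through
// Beam's withKeySerializer()/withValueSerializer().
import java.util.Properties;

public class ProducerConfigSketch {
    public static Properties producerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.ByteArraySerializer");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(producerProps().getProperty("key.serializer"));
    }
}
```

A user coming from plain Kafka could then reuse an existing properties file unchanged, which is the on-ramp argument made below.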
Good point. Doing things the "normal" way for users of the storage system
is a good on-ramp. Conversely, having a "normal Beam" way is good for
people who use Beam more than Kafka. Can we have both easily?
Kenn
On Wed, Feb 9, 2022 at 6:50 AM Matt Casters wrote:
> Of course, IMO it would be fine
Thanks for writing this up. This is an instant classic sort of
bug/inconsistency.
Kenn
On Tue, Feb 8, 2022 at 11:46 AM Anand Inguva wrote:
> Hi all,
>
> For the Python SDK, there is some inconsistency across different APIs when
> it comes to parsing the boolean flags. The issue is briefly defin
This is your daily summary of Beam's current P1 issues, not including flaky
tests
(https://issues.apache.org/jira/issues/?jql=project%20%3D%20BEAM%20AND%20statusCategory%20!%3D%20Done%20AND%20priority%20%3D%20P1%20AND%20(labels%20is%20EMPTY%20OR%20labels%20!%3D%20flake)).
See https://beam.apache.
This is your daily summary of Beam's current flaky tests
(https://issues.apache.org/jira/issues/?jql=project%20%3D%20BEAM%20AND%20statusCategory%20!%3D%20Done%20AND%20labels%20%3D%20flake)
These are P1 issues because they have a major negative impact on the community
and make it hard to determin
> On 8 Feb 2022, at 14:16, Matt Casters wrote:
>
> For KafkaIO.read() we made a specific provision in the form of class
> ConfluentSchemaRegistryDeserializer but it doesn't look like we covered the
> producer side of Avro values yet.
Talking about read from Kafka with Avro and Confluent Schem
+Ismael
Doing it the “normal” way, especially for Kafka, may require some additional
non-obvious steps (well, of course they can be documented). So, I’d prefer to
have a more user-friendly API around it, like we have for reading Avro messages
with a schema stored in Confluent Schema Registry, whic
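To make the shape of such an API concrete, here is a purely hypothetical sketch of what a producer-side counterpart to ConfluentSchemaRegistryDeserializerProvider could look like. None of these names exist in Beam, the registry URL and subject are made up, and the toy implementation just prefixes the subject; a real one would fetch the Avro schema from the registry and return a Confluent Avro serializer:

```java
// Hypothetical sketch only: illustrates the API shape (resolve a
// serializer from a registry URL + subject), not any real Beam class.
import java.nio.charset.StandardCharsets;

public class SerializerProviderSketch {
    interface SerializerProvider<T> {
        byte[] serialize(T value);
    }

    // Toy "provider": a real one would contact the schema registry.
    static SerializerProvider<String> of(String registryUrl, String subject) {
        return v -> (subject + ":" + v).getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        SerializerProvider<String> p =
                of("http://localhost:8081", "orders-value"); // assumed values
        System.out.println(
                new String(p.serialize("ok"), StandardCharsets.UTF_8));
    }
}
```

The point is symmetry: reads already get a one-liner for registry-backed Avro, and writes could get the same ergonomics.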
Dear Ahmet, David
That’s a good suggestion! I’ll try to free up some time in the weekend to
create a PR for this.
Sincerely
Kirill Ism
> On 9 Feb 2022, at 03:15, Ahmet Altay wrote:
>
> Thank you for the feedback Kirill. If you have time we will happily accept
> your contribution in the for
So yes, reading the generic records with a consumer worked great. It's
really convenient to have a way of handling both the coder and the
deserializer at once.
To test I hooked KafkaIO up to a free Confluent Cloud service with schema
registry. Reading works great and once I have my next fixes read
For Go we *will* (and do) need to cross-compile, and in many cases do so
automatically, for amd64 instances, but we don't currently support arm64
workers.
We will likely need to support platform-specific containers for all the
languages, or include boot loaders for all platforms and select which one
w
Based on discussion on https://issues.apache.org/jira/browse/LEGAL-601 I
think it will be simplest to license it under ASL2 and include a NOTICE
file. The user will be free to "clone and go".
I would bring these points back to the dev list:
- ASL2 is what people expect from an ASF project, so it
Hi Brian,
My point mainly is that KafkaIO (and others as well) tend to be restrictive
towards the API user.
In this case for example, errors are thrown at runtime if you don't set the
serializers using the Beam API. [1]
Instead of helping the inexperienced Kafka user, which is great, this
blocked