+1 (non-binding)
Verified sigs, sums, license, build and test.
Tested simple table ops with Spark 3.2.
On Mon, Jun 6, 2022 at 5:39 PM Szehon Ho wrote:
> +1 (non-binding)
>
> 1. Verified signatures
> 2. Verified checksums
> 3. RAT checks
> 4. Build and test
> 5. Tested with Spark 3.2: created a table and ran a few queries
+1 (non-binding)
1. Verified signatures
2. Verified checksums
3. RAT checks
4. Build and test
5. Tested with Spark 3.2: created a table and ran a few queries
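For anyone who wants to reproduce this, the smoke test was roughly the following (a minimal sketch, not the exact commands: the catalog name, warehouse path, and table name are illustrative, and it assumes the Spark 3.2 runtime jar from this RC is on the classpath):

// Smoke test: create an Iceberg table and run a few queries on Spark 3.2.
import org.apache.spark.sql.SparkSession;

public class IcebergSmokeTest {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder()
        .appName("iceberg-rc-smoke-test")
        .master("local[2]")
        // Register a Hadoop-type Iceberg catalog named "local" (illustrative).
        .config("spark.sql.catalog.local", "org.apache.iceberg.spark.SparkCatalog")
        .config("spark.sql.catalog.local.type", "hadoop")
        .config("spark.sql.catalog.local.warehouse", "/tmp/iceberg-warehouse")
        .getOrCreate();

    spark.sql("CREATE TABLE local.db.smoke (id BIGINT, data STRING) USING iceberg");
    spark.sql("INSERT INTO local.db.smoke VALUES (1, 'a'), (2, 'b')");
    spark.sql("SELECT * FROM local.db.smoke ORDER BY id").show();
    spark.stop();
  }
}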
Thanks
Szehon
On Mon, Jun 6, 2022 at 10:46 AM Daniel Weeks wrote:
> +1 (binding)
>
> verified sigs/sums/license/build/tests
>
> There’s also the question of how useful this would be in practice given
> the complexity of using C++ (or Rust etc) within some of the major
> frameworks.
>
One place this would be useful is Arrow's Dataset API [1]. An option the Arrow
community might be open to is hosting parts of the implementation there.
Thanks, Ryan, that's helpful.

I'm curious: is there a Flink-native means to write delete files, then? I've
written mine through the native Java API, e.g. creating an EqualityDeleteWriter,
and applied the delete files to the 'main' Table in a transaction.
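For context, the commit side of that pattern looks roughly like this (a sketch, assuming Iceberg's Java API; the writer construction is elided, and eqDeleteFile stands in for the result of EqualityDeleteWriter#toDeleteFile()):

import org.apache.iceberg.DeleteFile;
import org.apache.iceberg.Table;
import org.apache.iceberg.Transaction;

public class ApplyEqualityDeletes {
  // Atomically commit previously written equality delete files to the table.
  static void applyDeletes(Table table, DeleteFile eqDeleteFile) {
    Transaction txn = table.newTransaction();
    txn.newRowDelta()
        .addDeletes(eqDeleteFile) // file produced by EqualityDeleteWriter#toDeleteFile()
        .commit();                // stage the row delta inside the transaction
    txn.commitTransaction();      // single atomic commit to the table
  }
}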
From: Ryan Blue
Sent: Monday, June 6, 2022 1:52
Andreas,
We haven't built a strategy in the Flink sink that uses position deletes. The
difficulty is that position deletes require knowing where the row that you want
to delete is located, which means you either need expensive row-level indexing
or you need to scan through potential data files to find the rows.
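To make the contrast concrete: a position delete identifies a row by the data file path and the row's ordinal within that file, rather than by column values. A minimal sketch, assuming a PositionDeleteWriter has already been constructed for the table's file format (the path and ordinal parameters are illustrative):

import java.io.IOException;
import org.apache.iceberg.Table;
import org.apache.iceberg.Transaction;
import org.apache.iceberg.deletes.PositionDeleteWriter;

public class ApplyPositionDeletes {
  // Deleting by position requires already knowing which data file holds the
  // row and at which ordinal: exactly the lookup cost described above.
  static void deleteByPosition(Table table, PositionDeleteWriter<?> writer,
                               String dataFilePath, long rowOrdinal) throws IOException {
    writer.delete(dataFilePath, rowOrdinal); // mark one row in one data file
    writer.close();

    Transaction txn = table.newTransaction();
    txn.newRowDelta().addDeletes(writer.toDeleteFile()).commit();
    txn.commitTransaction();
  }
}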
+1 (binding)
verified sigs/sums/license/build/tests
As for the detached commit, I believe I commented on this in a prior release:
the parent commit is the head of the 0.13.x branch and the detached commit is
just the version bump, so I'm OK with it, but it sure would be nice if that
wasn't detached.
Hi folks, I'm processing data from an Iceberg table with Flink and had a
question about positional deletes.
I batch process a source Table to create a DataStream of Records that I'd like
to delete from it. I initially created equality delete files with all the values
from these Records, but for performance I'm wondering whether positional deletes
would be a better fit.
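For reference, the equality delete files were created along these lines (a sketch, assuming the generic writers from the iceberg-data module; the key column "id", the output path, and the unpartitioned-table assumption are all illustrative):

import org.apache.iceberg.FileFormat;
import org.apache.iceberg.Schema;
import org.apache.iceberg.Table;
import org.apache.iceberg.data.GenericAppenderFactory;
import org.apache.iceberg.data.Record;
import org.apache.iceberg.deletes.EqualityDeleteWriter;
import org.apache.iceberg.encryption.EncryptedOutputFile;
import org.apache.iceberg.io.FileAppenderFactory;

public class WriteEqualityDeletes {
  // Opens a writer whose delete rows carry only the equality key column(s).
  static EqualityDeleteWriter<Record> openWriter(Table table, String path) {
    int[] equalityFieldIds = { table.schema().findField("id").fieldId() };
    Schema deleteSchema = table.schema().select("id");
    FileAppenderFactory<Record> factory = new GenericAppenderFactory(
        table.schema(), table.spec(), equalityFieldIds, deleteSchema, null);
    EncryptedOutputFile out =
        table.encryption().encrypt(table.io().newOutputFile(path));
    // Null partition: assumes an unpartitioned table for this sketch.
    return factory.newEqDeleteWriter(out, FileFormat.PARQUET, null);
  }
  // Write each Record to delete with writer.write(record), then close the
  // writer and commit writer.toDeleteFile() in a row delta, as shown earlier.
}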