Hi all,
I’d like to follow up on the discussion about packaging all components into
a framework to simplify distribution of the testing setup across Iceberg
implementations and third-party consumers.
I’ve opened PR https://github.com/apache/iceberg/pull/13533, which sets up
a workflow to build an
Hi all,
Thanks for all the suggestions. I agree that a good starting point would be
to have some fixtures that can be easily reused across different
implementations. This could be a Docker image published from the main
Iceberg repository, bundling everything needed.
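To make this concrete, here is a rough sketch of what consuming such a
shared fixture from the Python client could look like, assuming the image
exposes a REST catalog and S3-compatible storage; the ports, property keys,
and credentials below are purely illustrative, not a proposal:

    # Sketch only: assumes the fixture exposes a REST catalog on :8181 and
    # MinIO-style object storage on :9000 (all values illustrative).
    from pyiceberg.catalog import load_catalog

    catalog = load_catalog(
        "integration",
        **{
            "type": "rest",
            "uri": "http://localhost:8181",
            "s3.endpoint": "http://localhost:9000",
            "s3.access-key-id": "admin",
            "s3.secret-access-key": "password",
        },
    )

    # List tables written by another implementation and read their schemas back.
    for identifier in catalog.list_tables("default"):
        table = catalog.load_table(identifier)
        print(identifier, table.schema())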
Then we can proceed to experim
Hi all, also happy to support this!
I think one thing I'm looking forward to from this work, in addition to
the general improvements, is being able to use this as a building block
for instrumenting different Iceberg implementations. From what I currently
understand, maintainers need to know what diffe
Thanks Leon for bringing this up.
The main reason that all the implementations test against Spark is that it
is well supported and has a nice SQL API to easily set up test cases. But
most importantly, it uses the Iceberg Java SDK underneath, which we
consider the reference implementation of Iceber
Hi, Leon:
> How complex would it be to integrate sqllogictest into non-Rust clients?
This seems non-trivial to me. Note that it's not only about
parsing/executing sqllogictest; the underlying SQL engine also needs to
integrate with Iceberg's language client.
> Should we centralize the shared Docker i
Thank you, Leon
> How complex would it be to integrate sqllogictest into non-Rust clients?
I looked a bit into existing sqllogictest integration projects:
- Java: https://github.com/hydromatic/sql-logic-test
- Go: https://github.com/alkemir/sqllogictest
- Python: https://github.com/duckdb/duc
Hi Xuanwo, Renjie
I think sqllogictest is a good replacement for the JSON spec, and I'm
definitely not recommending the JSON spec, as I think it would be too
complex to execute.
As Renjie pointed out, sqllogictest is only suitable when a SQL engine is
supported, but right now not all of
Hi, Leon:
Thanks for raising this.
In Rust we also have a similar plan to run integration tests against the
Rust and Java implementations: https://github.com/apache/iceberg-rust/pull/581
This approach is purely data-driven and, as Xuanwo mentioned, motivated by
sqllogictest. That is to say, we will define a s
Thank you, Leon, for starting this.
It's very important for open formats like Iceberg to be interoperable across
different implementations, and it's high on the list for iceberg-rust.
My only concern is about the JSON spec. I'm wondering whether it would be a
good idea for us to adopt the sqllogictest format:
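For context, a sqllogictest script is a plain-text file that interleaves
SQL statements with queries and their expected results (exact directives
vary slightly between runners); a minimal example could look roughly like
this:

    # create a table through the engine under test
    statement ok
    CREATE TABLE t1 (id INTEGER, name VARCHAR)

    statement ok
    INSERT INTO t1 VALUES (1, 'a'), (2, 'b')

    # expected results follow the ---- separator; 'I' marks an integer column
    query I
    SELECT count(*) FROM t1
    ----
    2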
Hi Kevin,
Thanks for bringing up the Arrow integration tests as a reference! I’ve
looked into that setup as well. However, I found it difficult to apply the
same model to Iceberg since Arrow and Iceberg are very different. Arrow
tests are centered around in-memory serialization and deserialization
Hi Leon,
Thanks for starting this thread! I think this is a great idea. Happy to
support this in any way I can.
Matt Topol and I have previously discussed cross-client testing regarding
the iceberg-go and iceberg-python implementations. There is a class of
bugs that can be caught in this way. We
Hello all,
I would like to start a discussion on standardizing cross-client
integration testing in the Iceberg projects. With all the active development
among the different client implementations (Python, Rust, Go, etc.), it will
be important to make sure the implementations are interoperable betwe