Oh, I think it must be that Spark wraps StructLike to act like an
InternalRow, so we can reuse it for Record or the metadata rows. Would it
be possible to adapt the Record code to use StructLike instead?
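For illustration only (a sketch, not existing code): the analogous wrapper on the Hive side could be a read-only Record view over the StructLike rows that a metadata DataTask produces. StructLikeRecord and its wrap method below are hypothetical names; Spark's version of this idea is StructInternalRow, if I remember correctly. Nested structs, lists and maps would need recursive wrapping, which I'm leaving out:

import java.util.Map;
import org.apache.iceberg.StructLike;
import org.apache.iceberg.data.Record;
import org.apache.iceberg.types.Types;

public class StructLikeRecord implements Record {
  private final Types.StructType struct;
  private StructLike row = null;

  public StructLikeRecord(Types.StructType struct) {
    this.struct = struct;
  }

  // re-point the wrapper at each incoming row instead of copying per row
  public StructLikeRecord wrap(StructLike newRow) {
    this.row = newRow;
    return this;
  }

  @Override
  public Types.StructType struct() {
    return struct;
  }

  @Override
  public int size() {
    return struct.fields().size();
  }

  @Override
  public Object get(int pos) {
    return row.get(pos, Object.class);
  }

  @Override
  public <T> T get(int pos, Class<T> javaClass) {
    return row.get(pos, javaClass);
  }

  @Override
  public Object getField(String name) {
    Types.NestedField field = struct.field(name);
    return field == null ? null : get(struct.fields().indexOf(field));
  }

  // the metadata rows are read-only, so the mutating half of Record stays unsupported
  @Override
  public void setField(String name, Object value) {
    throw new UnsupportedOperationException("StructLikeRecord is read-only");
  }

  @Override
  public <T> void set(int pos, T value) {
    throw new UnsupportedOperationException("StructLikeRecord is read-only");
  }

  @Override
  public Record copy() {
    throw new UnsupportedOperationException("StructLikeRecord does not support copy");
  }

  @Override
  public Record copy(Map<String, Object> overwriteValues) {
    throw new UnsupportedOperationException("StructLikeRecord does not support copy");
  }
}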
On Mon, Jul 19, 2021 at 10:23 PM Peter Vary wrote:
Thanks Ryan for checking this out!
IcebergWritable wraps a Record into a Container and a Writable, which is why I try to create a Record here.
The problem is that the metadata table scan returns a StructLike, and I have to match that with the metadata schema and then with the read schema.
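As a rough sketch of that matching (names like positions are just for illustration), the read-schema fields could be resolved against the metadata table schema once per task, by field ID, so the per-row transform becomes a plain copy:

// sketch: resolve each read-schema field's position in the metadata table schema by field ID
List<Types.NestedField> tableFields = tableSchema.asStruct().fields();
int[] positions = new int[readSchema.columns().size()];
for (int i = 0; i < positions.length; i++) {
  positions[i] = tableFields.indexOf(tableSchema.findField(readSchema.columns().get(i).fieldId()));
}

// then, for each StructLike row coming from the DataTask:
Record record = GenericRecord.create(readSchema);
for (int i = 0; i < positions.length; i++) {
  record.set(i, row.get(positions[i], Object.class));
}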
Peter,
The "data" tasks produce records using Iceberg's Record class and the
internal representations. I believe that's what the existing Iceberg object
inspectors use. Couldn't you just wrap this with an IcebergWritable and use
the regular object inspectors?
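Something along these lines, assuming the Container and IcebergObjectInspector classes from the iceberg-mr module (method names from memory, so worth double-checking):

// record is the Record built from a metadata row, readSchema the projected read schema
Container<Record> container = new Container<>();
container.set(record);
// the regular object inspectors can then walk the Record's fields as usual
ObjectInspector inspector = IcebergObjectInspector.create(readSchema);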
On Thu, Jul 15, 2021 at 8:53 AM Peter Vary wrote:
I have put together a somewhat working solution:
case METADATA:
  return (CloseableIterable) CloseableIterable.transform(((DataTask) currentTask).rows(), row -> {
    Record record = GenericRecord.create(readSchema);
    List<Types.NestedField> tableFields = tableSchema.asStruct().fields();
    // truncated in the original mail; the intent is to copy each read-schema
    // field from its position in the metadata table schema
    for (int i = 0; i < readSchema.columns().size(); i++) {
      int pos = tableFields.indexOf(tableSchema.findField(readSchema.columns().get(i).fieldId()));
      record.set(i, row.get(pos, Object.class));
    }
    return record;
  });
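One note on the loop above: the fields have to be matched by field ID rather than copied purely by position. If Hive projects only a subset of the metadata columns, the read schema no longer lines up positionally with the StructLike row, and a plain positional copy would read the wrong columns.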