+1 (non-binding)
1. Download the source tarball, signature (.asc), and checksum (.sha512): OK
2. Import gpg keys: download KEYS and run gpg --import /path/to/downloaded/KEYS (optional if this hasn’t changed): OK
3. Verify the signature by running: gpg --verify apache-iceberg-xx.tar.gz.asc: I g
Whoops, I forgot to CC the mailing list.
Ah, the metrics code _does_ allow you to use a name mapping if you specify one
in the call to ParquetUtil.fileMetrics, which is what we did. If you don’t,
though, the mapping property from the table (if present) doesn’t appear to be
used automatically.
To be clear, we should probably fix that so that the mapping from the table is used by default when importing files into a table.
I agree that the regular write path in Spark should handle stats as
expected. That's what we use all the time. I'd recommend trying to move to
it when you can. We're planning on relea
+1
1. Download the source tarball, signature (.asc), and checksum (.sha512): OK
2. Import gpg keys: download KEYS and run gpg --import /path/to/downloaded/KEYS (optional if this hasn’t changed): OK
3. Verify the signature by running: gpg --verify apache-iceberg-xx.tar.gz.asc: I got a warning
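For reference, the checksum half of the checklist above can be sketched as a couple of shell commands. The file names (apache-iceberg-x.y.z.*) are placeholders, and the tarball here is created locally just so the commands run as written; for a real release candidate you would run them against the downloaded artifacts and the .sha512 file published with the release.

```shell
# Stand-in for the downloaded release tarball (placeholder contents).
printf 'release contents' > apache-iceberg-x.y.z.tar.gz

# Stand-in for the published .sha512 file; for a real RC this is downloaded,
# not generated locally.
sha512sum apache-iceberg-x.y.z.tar.gz > apache-iceberg-x.y.z.tar.gz.sha512

# Verify the checksum; prints "apache-iceberg-x.y.z.tar.gz: OK" on a match.
sha512sum -c apache-iceberg-x.y.z.tar.gz.sha512

# The signature step needs the project's KEYS file and the real .asc
# (shown for reference only, since it requires the actual artifacts):
# gpg --import KEYS
# gpg --verify apache-iceberg-x.y.z.tar.gz.asc apache-iceberg-x.y.z.tar.gz
```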
+1
- Validated checksum and signature
- Ran license checks
- Built and ran tests
- Queried a Hadoop FS table created with 0.9.0 in Spark 3.0.1
- Created a Hive table from Spark 3.0.1
- Tested metadata tables from Spark
- Tested Hive and Hadoop table reads in Hive 2.3.7
I was
+1
1. Download the source tarball, signature (.asc), and checksum (.sha512): OK
2. Import gpg keys: download KEYS and run gpg --import /path/to/downloaded/KEYS (optional if this hasn’t changed): OK
3. Verify the signature by running: gpg --verify apache-iceberg-xx.tar.gz.asc: OK
4. Verify the
+1 for 0.10.0 RC4.
Bests,
Dongjoon.
On Wed, Nov 4, 2020 at 7:17 PM Jingsong Li wrote:
> +1
>
> 1. Download the source tarball, signature (.asc), and checksum (.sha512): OK
> 2. Import gpg keys: download KEYS and run gpg --import /path/to/downloaded/KEYS (optional if this hasn’t changed):