+1 (binding)
Verified build, checksums, did some testing

On Thu, Nov 7, 2024, 19:32 Kevin Liu <kevin.jq....@gmail.com> wrote:

> +1 non-binding
>
> Verified signatures, checksums, and license
>
> Ran build and tests with JDK17
>
>
> Best,
>
> Kevin Liu
>
> On Thu, Nov 7, 2024 at 7:24 AM Prashant Singh <prashant010...@gmail.com>
> wrote:
>
>> Thank you Russell!
>>
>> +1 (non-binding)
>> - Verified signature, checksum, license, build.
>> - Ran our internal services' Iceberg integration tests with JDK 17
>> - Manually tested spark-sql
>>
>> Thanks,
>> Prashant Singh
>>
>> On Thu, Nov 7, 2024 at 12:32 AM Fokko Driesprong <fo...@apache.org>
>> wrote:
>>
>>> Thanks Russell for running this release!
>>>
>>> +1 (binding)
>>>
>>> Checked signatures, checksum, licenses and did some local testing.
>>>
>>> Kind regards,
>>> Fokko
>>>
>>> On Thu, Nov 7, 2024 at 08:35, Eduard Tudenhöfner <
>>> etudenhoef...@apache.org> wrote:
>>>
>>>> +1 (binding)
>>>>
>>>> Verified signature/checksum/license and build/test with JDK17
>>>>
>>>> On Thu, Nov 7, 2024 at 3:11 AM Daniel Weeks <dwe...@apache.org> wrote:
>>>>
>>>>> +1 (binding)
>>>>>
>>>>> Verified sigs/sums/license/build/test (Java 17)
>>>>>
>>>>> -Dan
>>>>>
>>>>> On Wed, Nov 6, 2024 at 3:23 PM Jack Ye <yezhao...@gmail.com> wrote:
>>>>>
>>>>>> +1 (binding)
>>>>>>
>>>>>> - Verified signature, checksum, license
>>>>>> - Ran build and test with JDK 11 and 17
>>>>>> - Ran AWS integration tests
>>>>>> - Ran on Spark 3.5 with some manual tests
>>>>>>
>>>>>> Best,
>>>>>> Jack Ye
>>>>>>
>>>>>> On Wed, Nov 6, 2024 at 9:01 AM Amogh Jahagirdar <2am...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> +1 binding
>>>>>>>
>>>>>>> Verified signatures/checksums/license and ran build/tests with JDK17.
>>>>>>>
>>>>>>> Thanks,
>>>>>>>
>>>>>>> Amogh Jahagirdar
>>>>>>>
>>>>>>> On Tue, Nov 5, 2024 at 10:35 PM Yufei Gu <flyrain...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> +1 (binding)
>>>>>>>>
>>>>>>>>
>>>>>>>> Verified signature, checksum, license, build.
>>>>>>>>
>>>>>>>> Successfully tested the following Spark SQL commands on Polaris,
>>>>>>>> using Spark 3.5.3 with the Iceberg 1.7.0 binary artifacts. All
>>>>>>>> operations worked as expected.
>>>>>>>>
>>>>>>>> create database db1;
>>>>>>>> show databases;
>>>>>>>> create table db1.t1 (id int, name string);
>>>>>>>> insert into db1.t1 values (1, 'a');
>>>>>>>> select * from db1.t1;
>>>>>>>> insert into db1.t1 values (2, 'b');
>>>>>>>> call polaris.system.expire_snapshots('db1.t1', timestamp '2024-11-11');
>>>>>>>> select * from db1.t1.snapshots;
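
A session like the one above can be started with the Iceberg Spark runtime and a REST catalog pointing at Polaris. This is only a sketch: the package coordinates match the 1.7.0 RC artifacts, but the endpoint, warehouse, and credential values are placeholders, not values taken from this thread.

```shell
# Sketch: launch spark-sql against a Polaris REST catalog.
# URI, warehouse, and credential below are placeholders.
spark-sql \
  --packages org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.7.0 \
  --conf spark.sql.catalog.polaris=org.apache.iceberg.spark.SparkCatalog \
  --conf spark.sql.catalog.polaris.type=rest \
  --conf spark.sql.catalog.polaris.uri=http://localhost:8181/api/catalog \
  --conf spark.sql.catalog.polaris.warehouse=demo_warehouse \
  --conf spark.sql.catalog.polaris.credential=CLIENT_ID:CLIENT_SECRET
```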
>>>>>>>>
>>>>>>>> Notably, the snapshot summary shows the latest Iceberg version:
>>>>>>>>
>>>>>>>> 2024-11-05 18:31:10.92  2780504056765263301  5332711584219924798  append
>>>>>>>> file:/tmp/polaris/db1/t1/metadata/snap-2780504056765263301-1-634b033c-45dd-40d6-8cb4-468fe6015ba4.avro
>>>>>>>> {
>>>>>>>>   "added-data-files": "1",
>>>>>>>>   "added-files-size": "611",
>>>>>>>>   "added-records": "1",
>>>>>>>>   "app-id": "local-1730860071427",
>>>>>>>>   "changed-partition-count": "1",
>>>>>>>>   "engine-name": "spark",
>>>>>>>>   "engine-version": "3.5.3",
>>>>>>>>   "iceberg-version": "Apache Iceberg 1.7.0 (commit 5f7c992ca673bf41df1d37543b24d646c24568a9)",
>>>>>>>>   "spark.app.id": "local-1730860071427",
>>>>>>>>   "total-data-files": "2",
>>>>>>>>   "total-delete-files": "0",
>>>>>>>>   "total-equality-deletes": "0",
>>>>>>>>   "total-files-size": "1222",
>>>>>>>>   "total-position-deletes": "0",
>>>>>>>>   "total-records": "2"
>>>>>>>> }
>>>>>>>>
>>>>>>>>
>>>>>>>> Yufei
>>>>>>>>
>>>>>>>>
>>>>>>>> On Mon, Nov 4, 2024 at 11:58 PM Jean-Baptiste Onofré <
>>>>>>>> j...@nanthrax.net> wrote:
>>>>>>>>
>>>>>>>>> +1 (non-binding)
>>>>>>>>>
>>>>>>>>> I checked:
>>>>>>>>> - Parquet has been updated
>>>>>>>>> - The planned Avro readers in both Flink and Spark are actually used
>>>>>>>>> - Signature and hash are good
>>>>>>>>> - No binary file found in the source distribution
>>>>>>>>> - ASF header is present in all expected files
>>>>>>>>> - LICENSE and NOTICE look good
>>>>>>>>> - Build is OK
>>>>>>>>> - Tested using Spark SQL with JDBC Catalog and Apache Polaris
>>>>>>>>> without problem
>>>>>>>>>
>>>>>>>>> Thanks!
>>>>>>>>> Regards
>>>>>>>>> JB
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Mon, Nov 4, 2024 at 9:46 PM Russell Spitzer
>>>>>>>>> <russell.spit...@gmail.com> wrote:
>>>>>>>>> >
>>>>>>>>> > Hi y'all!
>>>>>>>>> >
>>>>>>>>> > I propose that we release the following RC as the official
>>>>>>>>> Apache Iceberg 1.7.0 release.
>>>>>>>>> >
>>>>>>>>> > The commit ID is 5f7c992ca673bf41df1d37543b24d646c24568a9
>>>>>>>>> > * This corresponds to the tag: apache-iceberg-1.7.0-rc1
>>>>>>>>> > *
>>>>>>>>> https://github.com/apache/iceberg/commits/apache-iceberg-1.7.0-rc1
>>>>>>>>> > *
>>>>>>>>> https://github.com/apache/iceberg/tree/5f7c992ca673bf41df1d37543b24d646c24568a9
>>>>>>>>> >
>>>>>>>>> > The release tarball, signature, and checksums are here:
>>>>>>>>> > *
>>>>>>>>> https://dist.apache.org/repos/dist/dev/iceberg/apache-iceberg-1.7.0-rc1
>>>>>>>>> >
>>>>>>>>> > You can find the KEYS file here:
>>>>>>>>> > * https://dist.apache.org/repos/dist/dev/iceberg/KEYS
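
For reference, the signature and checksum verification that voters report above typically looks like the sketch below. The download commands are shown as comments because the exact artifact file name inside the RC1 directory is an assumption and should be checked against the dist listing; the executable part demonstrates only the checksum mechanics on a stand-in file.

```shell
# Typical flow (file names are assumptions; check the dist directory listing):
#   curl -LO https://dist.apache.org/repos/dist/dev/iceberg/apache-iceberg-1.7.0-rc1/apache-iceberg-1.7.0.tar.gz
#   curl -LO https://dist.apache.org/repos/dist/dev/iceberg/apache-iceberg-1.7.0-rc1/apache-iceberg-1.7.0.tar.gz.asc
#   curl -LO https://dist.apache.org/repos/dist/dev/iceberg/apache-iceberg-1.7.0-rc1/apache-iceberg-1.7.0.tar.gz.sha512
#   curl -L  https://dist.apache.org/repos/dist/dev/iceberg/KEYS | gpg --import
#   gpg --verify apache-iceberg-1.7.0.tar.gz.asc apache-iceberg-1.7.0.tar.gz
#   sha512sum -c apache-iceberg-1.7.0.tar.gz.sha512

# Self-contained demo of the checksum step on a stand-in file:
printf 'release bytes' > apache-iceberg-demo.tar.gz
sha512sum apache-iceberg-demo.tar.gz > apache-iceberg-demo.tar.gz.sha512
sha512sum -c apache-iceberg-demo.tar.gz.sha512
```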
>>>>>>>>> >
>>>>>>>>> > Convenience binary artifacts are staged on Nexus. The Maven
>>>>>>>>> repository URL is:
>>>>>>>>> > *
>>>>>>>>> https://repository.apache.org/content/repositories/orgapacheiceberg-1176/
>>>>>>>>> >
>>>>>>>>> > Difference between 1.7.0 and 1.6.1
>>>>>>>>> >
>>>>>>>>> https://github.com/apache/iceberg/compare/apache-iceberg-1.6.1...apache-iceberg-1.7.0-rc1
>>>>>>>>> >
>>>>>>>>> > Difference between 1.7.0 RC1 and 1.7.0 RC0
>>>>>>>>> >
>>>>>>>>> https://github.com/apache/iceberg/compare/apache-iceberg-1.7.0-rc0...apache-iceberg-1.7.0-rc1
>>>>>>>>> >
>>>>>>>>> > Please download, verify, and test.
>>>>>>>>> >
>>>>>>>>> > Please vote in the next 72 hours.
>>>>>>>>> >
>>>>>>>>> > [ ] +1 Release this as Apache Iceberg 1.7.0
>>>>>>>>> > [ ] +0
>>>>>>>>> > [ ] -1 Do not release this because...
>>>>>>>>> >
>>>>>>>>> > Only PMC members have binding votes, but other community members
>>>>>>>>> > are encouraged to cast their votes as well.
>>>>>>>>>
>>>>>>>>
