To answer Vlad's question: we use "Apache SkyWalking Eyes" in our GitHub Action.

- https://github.com/apache/skywalking-eyes

- https://github.com/apache/spark-kubernetes-operator/blob/6116bb08c282911389fe2f5af49794a456111e97/.github/workflows/build_and_test.yml#L24
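
For reference, a minimal header-check job with SkyWalking Eyes might look like the sketch below. This is only an illustration: the job name and action version are assumptions, so see the linked workflow for the project's actual configuration.

```yaml
# Hypothetical sketch of a license-header check job; names/versions are assumptions.
name: License check
on: [push, pull_request]
jobs:
  license-header:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Check license headers
        uses: apache/skywalking-eyes/header@v0.6.0
```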

In addition, you can download the Apache RAT jar manually and run it.
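
For the manual route, a rough sketch follows. The RAT version, download URL, and sample directory are assumptions, not the project's actual setup; use whatever RAT release is current on Maven Central.

```shell
# Sketch only: version 0.13 is an assumption; use the latest Apache RAT release.
RAT_VERSION=0.13
RAT_JAR="apache-rat-${RAT_VERSION}.jar"

# Fetch the jar from Maven Central's standard artifact layout.
curl -sSLO "https://repo1.maven.org/maven2/org/apache/rat/apache-rat/${RAT_VERSION}/${RAT_JAR}"

# Make a tiny sample tree to scan (illustrative only; point -d at your source tree).
mkdir -p /tmp/rat-demo
echo "public class A {}" > /tmp/rat-demo/A.java

# Scan the tree; RAT reports files whose license headers are missing or unknown.
java -jar "$RAT_JAR" -d /tmp/rat-demo
```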

Dongjoon.

On 2025/05/05 02:36:01 "Rozov, Vlad" wrote:
> +1 (not binding)
> 
> Checked checksum and signatures, build, and confirmed that binary files are 
> not included in the release.
> 
> Is Apache RAT part of the Gradle build? If not, how are headers validated to 
> include the correct license?
> 
> Thank you,
> 
> Vlad
> 
> 
> 
> > On May 4, 2025, at 5:38 PM, Dongjoon Hyun <dongj...@apache.org> wrote:
> > 
> > 
> > 
> > +1
> > 
> > I checked the checksum and signatures, and tested with K8s v1.32.
> > 
> > Dongjoon.
> > 
> > On 2025/05/04 23:58:54 Zhou Jiang wrote:
> >> +1 , thanks for driving this release!
> >> 
> >> *Zhou JIANG*
> >> 
> >> 
> >> 
> >> On Sun, May 4, 2025 at 16:58 Dongjoon Hyun <dongjoon.h...@gmail.com> wrote:
> >> 
> >>> Please vote on releasing the following candidate as Apache Spark K8s
> >>> Operator 0.1.0. This vote is open for the next 72 hours and passes if a
> >>> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
> >>> 
> >>> [ ] +1 Release this package as Apache Spark K8s Operator 0.1.0
> >>> [ ] -1 Do not release this package because ...
> >>> 
> >>> TAG:
> >>> https://github.com/apache/spark-kubernetes-operator/releases/tag/v0.1.0-rc1
> >>> (commit: fe9ec3af199a3061a282c529107c0077f0f97d34)
> >>> 
> >>> RELEASE FILES:
> >>> 
> >>> https://dist.apache.org/repos/dist/dev/spark/spark-kubernetes-operator-v0.1.0-rc1/
> >>> 
> >>> LIST OF ISSUES:
> >>> https://issues.apache.org/jira/projects/SPARK/versions/12354567
> >>> 
> >>> Thanks,
> >>> Dongjoon.
> >>> 
> >> 
> > 
> > ---------------------------------------------------------------------
> > To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> > 
> 
> 
