On Thu, Oct 19, 2023 at 12:18 PM Kenneth Knowles <k...@apache.org> wrote:

> +1 to more helpful guide on "how to usefully participate in RC validation"
> but also big +1 to Robert, Jack, Johanna.
>
> TL;DR the RC validation is an opportunity for downstream testing.
>
> Robert alluded to the origin of the spreadsheet: I created it long ago to
> validate that the human language on our web page actually works. Maybe
> someone should automate that with an LLM now.
>
> Robert also alluded to clean environment: our gradle scripts and GHA
> scripts and CI environment are heavily enough engineered that they don't
> represent what a user will experience. We could potentially use our starter
> repos for an adequate smoke test here.
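>
> As a rough sketch of what such a smoke test could look like (assuming the
> RC is installed into a fresh virtualenv, e.g. pip install
> apache-beam==2.52.0rc1, where the version is hypothetical), even a trivial
> pipeline on the default DirectRunner exercises a clean install:
>
>     # Minimal smoke test for an RC in a clean environment (sketch).
>     # Assumes a fresh virtualenv with only the RC installed, e.g.:
>     #   pip install apache-beam==2.52.0rc1   # version hypothetical
>     import apache_beam as beam
>
>     # Pipeline() with no options runs on the DirectRunner.
>     with beam.Pipeline() as p:
>         (p
>          | beam.Create(['hello', 'beam'])
>          | beam.Map(str.upper)
>          | beam.Map(print))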
>
> Those are both ways that *we* can pretend to be users. But actual users
> checking the RC to make sure they'll have a smooth upgrade is by far the
> most impactful validation.
>
> This thread honestly makes me want to delete the spreadsheet but maybe
> come up with a guide for downstream projects to validate against an RC.
> Maybe that's an extreme reaction...
>

I would very much be in favor of that.

> On Wed, Oct 18, 2023 at 2:32 PM Robert Bradshaw via dev <dev@beam.apache.org>
> wrote:
>
>> +1 That's a great idea. They have an incentive to make sure the issue was
>> resolved for them, plus we get to ensure there were no other regressions.
>>
>> On Wed, Oct 18, 2023 at 11:30 AM Johanna Öjeling via dev <
>> dev@beam.apache.org> wrote:
>>
>>> When I have contributed to Apache Airflow, they have tagged all
>>> contributors concerned in a GitHub issue when the RC is available and asked
>>> us to validate it. Example: #29424
>>> <https://github.com/apache/airflow/issues/29424>.
>>>
>>> I found that to be an effective way to notify contributors of the RC and
>>> nudge them to help out. In the issue description there is a reference to
>>> the guidelines on how to test the RC and a note that people are encouraged
>>> to vote on the mailing list (which admittedly could be highlighted more,
>>> because I did not pay attention to it until now and was unaware that
>>> contributors had a vote).
>>>
>>> It might be an idea to consider something similar here to increase
>>> participation?
>>>
>>> On Tue, Oct 17, 2023 at 7:01 PM Jack McCluskey via dev <
>>> dev@beam.apache.org> wrote:
>>>
>>>> I'm +1 on helping explain what we mean by "validate the RC," since we're
>>>> really just asking users to see if their existing use cases work, along
>>>> with our typical slate of tests. I don't know if offloading that work to
>>>> our active validators is the right approach, though; documentation or a
>>>> screen share of their specific workflow is definitely less useful than a
>>>> more general outline of how to install the RC and things to look out for
>>>> when testing.
>>>>
>>>> On Tue, Oct 17, 2023 at 12:55 PM Austin Bennett <aus...@apache.org>
>>>> wrote:
>>>>
>>>>> Great effort.  I'm also interested in streamlining releases -- so if
>>>>> there are a lot of manual tests that could be automated, it would be
>>>>> great to discover them and then look to address that.
>>>>>
>>>>> On Tue, Oct 17, 2023 at 8:47 AM Robert Bradshaw via dev <
>>>>> dev@beam.apache.org> wrote:
>>>>>
>>>>>> +1
>>>>>>
>>>>>> I would also strongly suggest that people try out the release against
>>>>>> their own codebases. This has the benefit of ensuring the release won't
>>>>>> break your own code when it goes out, and stress-tests the new code 
>>>>>> against
>>>>>> real-world pipelines. (Ideally our own tests are all passing, and this
>>>>>> validation is automated as much as possible (though ensuring it matches 
>>>>>> our
>>>>>> documentation and works in a clean environment still has value), but
>>>>>> there's a lot of code and uses out there that we don't have access to
>>>>>> during normal Beam development.)
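>>>>>>
>>>>>> As a sketch of what that can look like (the RC version below is
>>>>>> hypothetical), install the RC in a scratch environment and re-run an
>>>>>> existing pipeline test against it:
>>>>>>
>>>>>>     # Sketch: re-run an existing pipeline test against the RC.
>>>>>>     # Assumes the RC was installed first, e.g.:
>>>>>>     #   pip install apache-beam==2.52.0rc1   # version hypothetical
>>>>>>     import apache_beam as beam
>>>>>>     from apache_beam.testing.test_pipeline import TestPipeline
>>>>>>     from apache_beam.testing.util import assert_that, equal_to
>>>>>>
>>>>>>     def test_pipeline_against_rc():
>>>>>>         # Any existing unit test of your pipeline logic works here;
>>>>>>         # this doubled-values example is just a stand-in.
>>>>>>         with TestPipeline() as p:
>>>>>>             out = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * 2)
>>>>>>             assert_that(out, equal_to([2, 4, 6]))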
>>>>>>
>>>>>> On Tue, Oct 17, 2023 at 8:21 AM Svetak Sundhar via dev <
>>>>>> dev@beam.apache.org> wrote:
>>>>>>
>>>>>>> Hi all,
>>>>>>>
>>>>>>> I’ve participated in RC testing for a few releases and have observed
>>>>>>> a bit of a knowledge gap in how releases can be tested. Given that Beam
>>>>>>> encourages contributors to vote on RCs regardless of tenure, and that
>>>>>>> voting on an RC is a relatively low-effort, high-leverage way to 
>>>>>>> influence
>>>>>>> the release of the library, I propose the following:
>>>>>>>
>>>>>>> During the vote for the next release, voters can document the
>>>>>>> process they followed in a separate document and add the link in
>>>>>>> column G here
>>>>>>> <https://docs.google.com/spreadsheets/d/1qk-N5vjXvbcEk68GjbkSZTR8AGqyNUM-oLFo_ZXBpJw/edit#gid=437054928>.
>>>>>>> One step further would be to record a screencast of running the test
>>>>>>> and attach a link to that.
>>>>>>>
>>>>>>> We can keep repeating this through releases until we have
>>>>>>> documentation for many of the different tests. We can then add these 
>>>>>>> docs
>>>>>>> into the repo.
>>>>>>>
>>>>>>> I’m proposing this because I’ve gathered the following feedback from
>>>>>>> colleagues who are tangentially involved with Beam: they are 
>>>>>>> interested in
>>>>>>> participating in release validation, but don’t know how to get started.
>>>>>>> Happy to hear other suggestions too, if there are any to address the
>>>>>>> above.
>>>>>>>
>>>>>>> Thanks,
>>>>>>>
>>>>>>>
>>>>>>> Svetak Sundhar
>>>>>>>
>>>>>>>   Data Engineer
svetaksund...@google.com
>>>>>>>
>>>>>>>
