I can't download all the PR descriptions due to rate limiting. Back to the
plan suggested by Nathan. We need a few more volunteers.

-adam

On Thu, Apr 2, 2020 at 11:55 AM Adam Feuer <a...@starcat.io> wrote:

> I have an improvement to the process. I just made a scrapy
> <https://scrapy.org/> script that can download the PR titles and
> descriptions from GitHub. (Yeah, that should be accessible via the GitHub
> API, but I couldn't figure out how.)
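[Editor's note: for the record, the GitHub REST API does expose this directly
via the list-pulls endpoint (GET /repos/{owner}/{repo}/pulls), which returns
each PR's title and body. A minimal sketch, using only the Python standard
library; the owner/repo names below are placeholders, not taken from the
thread:]

```python
# Sketch: fetch closed-PR titles and descriptions via the GitHub REST API.
# Endpoint: GET /repos/{owner}/{repo}/pulls (paginated, max 100 per page).
import json
import urllib.request

API = "https://api.github.com"

def pulls_url(owner, repo, state="closed", per_page=100, page=1):
    """Build the list-pulls URL for one page of results."""
    return (f"{API}/repos/{owner}/{repo}/pulls"
            f"?state={state}&per_page={per_page}&page={page}")

def summarize(pulls):
    """Reduce the API's JSON objects to (number, title, description) tuples.

    The 'body' field may be null when a PR has no description.
    """
    return [(p["number"], p["title"], p.get("body") or "") for p in pulls]

def fetch_page(owner, repo, page=1):
    """Fetch and summarize one page of closed PRs (requires network access)."""
    req = urllib.request.Request(
        pulls_url(owner, repo, page=page),
        headers={"Accept": "application/vnd.github.v3+json"})
    with urllib.request.urlopen(req) as resp:
        return summarize(json.load(resp))
```

[Looping `page` from 1 upward until an empty list comes back would cover all
613 PRs in 7 requests; unauthenticated clients are rate-limited, so an access
token may be needed for repeated runs.]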
>
> So I can make a spreadsheet or an HTML doc that has all the PR
> descriptions in it. Seems like this would be faster than clicking on
> them individually, but I don't know. I'll give that a shot and post it here.
>
> -adam
>
> On Thu, Apr 2, 2020 at 11:35 AM Nathan Hartman <hartman.nat...@gmail.com>
> wrote:
>
>> On Thu, Apr 2, 2020 at 1:01 PM Adam Feuer <a...@starcat.io> wrote:
>> >
>> > Bumping this up. It seems like we need a plan to tackle going through
>> > the 613 closed PRs and summarizing them (only merged ones need to be
>> > summarized). This would be easier with a team of people... anyone want
>> > to help?
>> >
>> > One way we could do this is one group take bug fixes, and another take
>> > features, and then make a list for each. Then we put them together in a
>> > document.
>>
>> So that we don't duplicate work, I think we should also stick to a range
>> of PRs.
>>
>> For example I could go through PRs 1 through 200 looking for new features
>> -- architectures, drivers, boards, etc -- but *not* bug fixes per Adam's
>> suggestion and make a list of those. Another volunteer could look at PRs
>> 201 through 400, etc.
>>
>> So we need 3 volunteers to look for features; I'm 1, so we need 2 more.
>>
>> And we need the same thing for bug fixes, 3 volunteers.
>>
>> Any takers?
>>
>> Nathan
>>
>
>
> --
> Adam Feuer <a...@starcat.io>
>


-- 
Adam Feuer <a...@starcat.io>
