This is still part of the Apache Spark project, conceptually?
IIRC Apache projects still need to use JIRA, so we can't do this.

On Thu, Aug 8, 2024 at 5:08 AM Mich Talebzadeh <mich.talebza...@gmail.com>
wrote:

> Hi Martin,
>
> If I understood it correctly, your proposal suggests centralizing issue
> tracking for the Spark Connect Go client on GitHub Issues, instead of using
> both Jira and GitHub. The primary motivation is to simplify the
> contribution process for developers?
>
> A few points, if I may:
>
>
>    - How will critical or high-priority issues be handled within GitHub
>    Issues?
>    - What mechanisms will be in place to ensure timely response and
>    resolution of issues?
>    - How will the participants measure and track issue resolution and
>    development progress?
>    - What is the plan for migrating existing Jira issues to GitHub Issues?
>
>
> HTH,
>
> Mich Talebzadeh,
> Architect | Data Engineer | Data Science | Financial Crime
>
> PhD <https://en.wikipedia.org/wiki/Doctor_of_Philosophy> Imperial College
> London <https://en.wikipedia.org/wiki/Imperial_College_London>
> London, United Kingdom
>
> *Disclaimer:* The information provided is correct to the best of my
> knowledge but of course cannot be guaranteed. It is essential to note
> that, as with any advice, "one test result is worth one-thousand
> expert opinions" (Wernher von Braun
> <https://en.wikipedia.org/wiki/Wernher_von_Braun>).
>
>
> On Thu, 8 Aug 2024 at 06:54, Martin Grund <mar...@databricks.com.invalid>
> wrote:
>
>> Hi folks,
>>
>> I wanted to start a discussion for the following proposal: To make it
>> easier for folks to contribute to the Spark Connect Go client, I was
>> contemplating not requiring them to deal with two accounts (one for Jira
>> and one for GitHub), but instead allowing the use of GitHub Issues for bugs
>> and issues that are specific to *only* the Spark Connect Go client.
>>
>> Jira will still be used for issues that span core Spark. This allows us
>> to easily label issues for starter tasks in one place, for example.
>>
>> I am explicitly not proposing to change the behavior for the main Spark
>> repository; here, the existing procedure remains.
>>
>> What do you think?
>>
>> Martin
>>
>
