Hi All,

Indeed, Markdown link checks are inherently flaky because they depend on
external sites being cooperative. That flakiness hinders PR progress.

Here's a recent example [1]:

ERROR: 1 dead links found!
[✖] https://medium.com/@jitenderkmr/demystifying-snowflake-ids-a-unique-identifier-in-distributed-computing-72796a827c9d → Status: 403

That link works fine in a browser.

So +1 to removing the Markdown link checks (we can always fix links when
someone reports a dead one).

[1]
https://github.com/apache/polaris/actions/runs/19534001072/job/55941993028?pr=2802
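
For anyone curious what the change amounts to mechanically: ASF projects
typically declare required status checks through branch protection in
.asf.yaml. A minimal sketch of the idea only (the check names below are
hypothetical placeholders; Robert's [2] has the actual change):

    github:
      protected_branches:
        main:
          required_status_checks:
            # Omitting "Check Markdown links" from this list keeps the
            # workflow running on PRs without letting it block merges.
            contexts:
              - "Gradle Build"     # hypothetical: whatever checks
              - "Spotless Check"   # should remain required

The link-check workflow itself stays untouched, so reviewers can still
look at its results.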

Cheers,
Dmitri.

On Thu, Nov 20, 2025 at 6:12 AM Robert Stupp <[email protected]> wrote:

> Hi all,
>
> As Adam recently mentioned [1], the "Check Markdown links" workflow is
> known to regularly produce false failures.
>
> It seems that some external sites have somewhat aggressive rate limits
> and/or bot protections in place that lead to these false failures.
>
> We can neither control nor work around these externally managed
> protections. PRs are getting blocked by the resulting false failures,
> and retrying the workflow does not help. If the assumptions above are
> correct, retrying actually makes the rate-limit situation worse,
> producing even more false failures from "Check Markdown links".
>
> I propose to remove "Check Markdown links" from the required checks
> [2]. The workflow would still run, but not block PRs. Reviewers can
> still inspect and cross-check potential failures from that workflow.
>
> Thoughts?
>
> Robert
>
> [1] https://github.com/apache/polaris/issues/3097
> [2] https://github.com/apache/polaris/pull/3102
>
