Re: [DISCUSS] When a DAG is paused, change the dag run state from running to failed.

2025-04-03 Thread Brent Bovenzi
The issue is that duration is based on start and end dates. If there is
no end date, we usually default to now. But that is misleading when a dag
run is still running while the dag is paused.
Let me take a look at where we use duration in the 3.0 UI and see if we can
reduce that confusion. We don't have the "5 longest dag runs" in our new
dashboard page, which replaces cluster activity. If we wanted that feature
again, we should be mindful of this and filter out paused dags in the API
request.
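The fallback behaviour described above can be sketched as follows (an illustrative Python snippet, not Airflow's actual code; `dag_run_duration` is a hypothetical name):

```python
from datetime import datetime, timedelta, timezone

def dag_run_duration(start_date, end_date=None):
    # If the run has no end date yet, fall back to "now" -- this is the
    # behaviour that keeps inflating the duration of a paused-but-running run.
    effective_end = end_date or datetime.now(timezone.utc)
    return effective_end - start_date

# A run that started two hours ago and never finished keeps "growing":
start = datetime.now(timezone.utc) - timedelta(hours=2)
print(dag_run_duration(start))                               # >= 2 hours, and rising
print(dag_run_duration(start, start + timedelta(minutes=5)))  # fixed once ended
```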



On Thu, Apr 3, 2025, 1:27 PM Pedro Nunes Leal wrote:

> On 2025-03-31 22:26, Jens Scheffler wrote:
> > Hi,
> >
> > thanks for working on the bug and raising a PR to fix it.
> >
> > As other committers also commented, I think from a product view I'd
> > expect a different resolution. We use "Pause DAG" in most cases for
> > administrative or infrastructure problems, to prevent further failures
> > and/or to drain infra to switch some backend.
> >
> > I assume when we pause a long-running DAG that is in-between execution
> > of tasks we want to really "pause" scheduling; we don't want to set it
> > to failed. That would also not be correct, because once we un-pause,
> > the running DAGs should continue to work. I see no reason to mark this
> > failed and then manually go back and reset the state later.
> >
> > My view on this, as also proposed in the discussion of the bug, is
> > that we should rather filter paused DAGs from cluster activity
> > reporting so that they are not reported with excessive runtime. Also,
> > if later un-paused, it would be "right" that the overall DAG runtime
> > was longer than normal (I would not expect the paused time to be
> > deducted from the runtime of the DAG).
> >
> > If I (as operator/admin) want to really terminate existing running
> > instances, I'd rather walk through Browse -> DAG Runs -> filter for
> > running runs with the paused DAG id, and mark them as failed explicitly.
> >
> > Jens
> >
> > On 31.03.25 20:50, Pedro Nunes Leal wrote:
> >> Hello everyone,
> >>
> >> Currently, I'm trying to fix this bug:
> >> https://github.com/apache/airflow/issues/3
> >>
> >> Basically, the issue is that DAG runs stay stuck in the running state
> >> even though the DAG is paused.
> >> Consequently, the duration of the dag run keeps increasing even
> >> though the DAG is paused.
> >>
> >> My proposal to solve this problem is to change the DAG run state from
> >> running to failed when the DAG is paused, to stop the duration from
> >> increasing.
> >>
> >> Since this can be an impactful change, I would like to hear what
> >> others think about it.
> >>
> >> Link for the Pull Request:
> >> https://github.com/apache/airflow/pull/47557
> >>
> >>
> >> -
> >> To unsubscribe, e-mail: dev-unsubscr...@airflow.apache.org
> >> For additional commands, e-mail: dev-h...@airflow.apache.org
> >>
> >
> That can be a better approach.
>
> However, if I'm not mistaken, the code related to the cluster activity
> page doesn't exist in Airflow 3 (the version where I'm trying to make
> the changes).
>
> So what should I do in this case?
> Is there any other way not involving cluster activity to solve this
> problem?
>
> Changing the state to queued instead of failed was my proposal at the
> beginning, and that really pauses the DAG.
> This is the type of solution I was thinking of because, as I said
> before in the pull request, I feel the cluster activity behavior is
> just a symptom of a bigger problem (the DAGs don't really pause, they
> just keep running).
>
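The "queued instead of failed" idea proposed above can be sketched with a toy state machine (hypothetical names and logic for illustration only; this is not Airflow's scheduler code):

```python
from enum import Enum

class RunState(Enum):
    QUEUED = "queued"
    RUNNING = "running"
    FAILED = "failed"

def on_pause(state: RunState) -> RunState:
    # Instead of failing a running dag run when its DAG is paused,
    # park it back in QUEUED so it can resume after un-pausing.
    return RunState.QUEUED if state is RunState.RUNNING else state

def on_unpause(state: RunState) -> RunState:
    # A parked run resumes once the DAG is un-paused.
    return RunState.RUNNING if state is RunState.QUEUED else state

print(on_unpause(on_pause(RunState.RUNNING)))  # RunState.RUNNING
```

The round trip above is the key property: pausing and un-pausing leaves a running run running, rather than requiring a manual state reset.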




Re: [VOTE] Release Airflow 3.0.0 from 3.0.0rc1 & Task SDK 1.0.0 from 1.0.0rc1

2025-04-03 Thread Vikram Koka
Thanks for the update, Ash!

I am running RC1 based on the Python package above.

Best regards,
Vikram



Airflow Mentors for Summer 2025 MLH Fellowship

2025-04-03 Thread Alex Gornet
Hi Airflow team,

I'm Alex Gornet, a Partner Success Manager at Major League Hacking. Here at
MLH we run an Open Source Fellowship program that functions like an
internship for student developers to contribute to open source projects.
*Fellows from our program have successfully contributed to Airflow since
Fall 2024.*

This year Royal Bank of Canada is sponsoring some seats in the program, and
they'd like their Fellows to contribute to Airflow as a way to give back to
the community. *With the recent release of Airflow 3.0, these Fellows would
be great candidates for tackling the backlog of bugs / issues following the
transition.*

*Right now, we're looking for maintainers / core contributors to provide
technical mentorship to the students*, so we'd love your help to identify 2
maintainers / core contributors (or more!) who could take lead here and
support us bringing on some great new contributors.

   - *The program runs from May 19th to August 8th*, and we'd ask for ~2
   hours of your time each week.

*If this is something you'd be interested in helping out with, you can
simply reply to this email or reach out to me directly at
al...@majorleaguehacking.com* - since we're not too far out from program
launch, the sooner you can let us know the better!

If you have any questions about the program or want to chat more about it,
let me know and I'm happy to find time. You can also reach out to Dennis
Ferruzzi who has mentored the last couple of cohorts to hear more about his
mentoring experience.

Best,
Alex


[VOTE] Release Airflow 3.0.0 from 3.0.0rc1 & Task SDK 1.0.0 from 1.0.0rc1

2025-04-03 Thread Kaxil Naik
Hey fellow Airflowers,

I am thrilled to announce the availability of Apache Airflow 3.0.0rc1 & *Task
SDK 1.0.0rc1* for testing! Airflow 3.0 marks a significant milestone as the
first major release in over four years, introducing improvements that
enhance user experience, task execution, and system scalability.

This email is calling for a vote on the release,
which will last at least 7 days (until 10th April)
and until 3 binding +1 votes have been received.

Consider this my (non-binding) +1.

Airflow 3.0.0rc1 is available at:
https://dist.apache.org/repos/dist/dev/airflow/3.0.0rc1/


"apache-airflow" Meta package:


   - *apache-airflow-3.0.0-source.tar.gz* is a source release that comes
   with INSTALL instructions.
   - *apache-airflow-3.0.0.tar.gz* is the binary Python "sdist" release.
   - *apache_airflow-3.0.0-py3-none-any.whl* is the binary Python
   wheel "binary" release.

"apache-airflow-core" package


   - *apache_airflow_core-3.0.0.tar.gz* is the binary Python "sdist"
   release.
   - *apache_airflow_core-3.0.0-py3-none-any.whl* is the binary Python
   wheel "binary" release.


Task SDK 1.0.0rc1 is available at:
https://dist.apache.org/repos/dist/dev/airflow/task-sdk/1.0.0rc1/


"apache-airflow-task-sdk" package

   - *apache-airflow-task-sdk-1.0.0-source.tar.gz* is a source release
   - *apache_airflow_task_sdk-1.0.0.tar.gz* is the binary Python "sdist"
   release.
   - *apache_airflow_task_sdk-1.0.0-py3-none-any.whl* is the binary Python
   wheel "binary" release.



Public keys are available at:
https://dist.apache.org/repos/dist/release/airflow/KEYS

Please vote accordingly:

[ ] +1 approve
[ ] +0 no opinion
[ ] -1 disapprove with the reason

Only votes from PMC members are binding, but all members of the community
are encouraged to test the release and vote with "(non-binding)".

The test procedure for PMC members is described in:
https://github.com/apache/airflow/blob/main/dev/README_RELEASE_AIRFLOW.md#verify-the-release-candidate-by-pmc-members

The test procedure for contributors and members of the community who would
like to test this RC is described in:
https://github.com/apache/airflow/blob/main/dev/README_RELEASE_AIRFLOW.md#verify-the-release-candidate-by-contributors

Please note that the version number excludes the 'rcX' string, so it's now
simply 3.0.0 for the Airflow package and 1.0.0 for the Task SDK. This will
allow us to rename the artifacts without modifying
the artifact checksums when we actually release.

Release Notes:
https://github.com/apache/airflow/blob/3.0.0rc1/RELEASE_NOTES.rst


*Testing Instructions using PyPI*:

You can build a virtualenv that installs this, and other required packages
(e.g. task sdk), like this:

```
uv venv
uv pip install apache-airflow apache-airflow-providers-standard==0.3.0rc1 --pre
```

Get Involved

We encourage the community to test this release and report any issues or
feedback. Your contributions help us ensure a stable and reliable Airflow
3.0.0 release. Please report issues using GitHub at
https://github.com/apache/airflow/issues and mark that this is an issue in
3.0.0. An updated list of all known issues in this RC can also be
found at the above link with the label “affected_version:3.0.0rc”.

A huge thank you to all the contributors who have worked on this milestone
release!
Best,
Kaxil

---
What's new in 3.0.0?

Notable Features

DAG versioning & Bundles

Airflow now tracks DAG versions, offering better visibility into historical
DAG changes and execution states. The introduction of DAG Bundles ensures
tasks run with the correct code version, even as DAGs evolve.

Modern Web Application

The UI has been rebuilt using React and a complete API-driven structure,
improving maintainability and extensibility. It includes a new
component-based design system and an enhanced information architecture. A
new React-based plugin system supports custom widgets, improved workflow
visibility, and integration with external tools.

Task Execution Interface

Airflow 3.0 adopts a client / server architecture, decoupling task
execution from the internal meta-database via API-based interaction. This
allows for remote execution across networks, multi-language support,
enhanced security, and better dependency management. The Edge Executor
further enables seamless remote task execution without direct database
connections.

Data Assets & Asset-Centric Syntax

Airflow 3.0 enhances dataset management by introducing Data Assets,
expanding beyond tables and files to include ML models and more. Assets can
be explicitly defined using the @asset decorator, simplifying tracking and
dependencies.

External Event-Driven Scheduling

Airflow now supports event-driven DAG triggers from external sources like
message queues and blob stores. This builds upon dataset scheduling and
enhances integration with the external data ecosystem.


Re: [VOTE] Release Airflow 3.0.0 from 3.0.0rc1 & Task SDK 1.0.0 from 1.0.0rc1

2025-04-03 Thread Vikram Koka
Awesome!
Thank you Kaxil for all your work and also thank you to all the
contributors whose hard work and dedication made this release a reality.

Best regards,
Vikram



Re: [VOTE] Release Airflow 3.0.0 from 3.0.0rc1 & Task SDK 1.0.0 from 1.0.0rc1

2025-04-03 Thread Kaxil Naik
Couldn’t sleep so there you go!

The following images are now available:

docker pull apache/airflow:3.0.0.rc1.post4
docker pull apache/airflow:3.0.0.rc1.post4-python3.12
docker pull apache/airflow:3.0.0.rc1.post4-python3.11
docker pull apache/airflow:3.0.0.rc1.post4-python3.10



Re: [VOTE] Release Airflow 3.0.0 from 3.0.0rc1 & Task SDK 1.0.0 from 1.0.0rc1

2025-04-03 Thread Ash Berlin-Taylor
Anyone waiting for the docker images is going to have to wait until tomorrow, 
(or perhaps even Monday) as the build isn’t currently behaving itself after the 
split of airflow-core and the new meta package airflow

  #95 5.136 The conflict is caused by:
  #95 5.136 The user requested apache-airflow-core==3.0.0rc1.post1
  #95 5.136 apache-airflow 3.0.0rc1.post1 depends on 
apache-airflow-core==3.0.0.rc1

It’s a quirk of the RC naming; we’ll fix it and get the docker images built.

-ash
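The version-pin mismatch in the build output above can be reproduced with the `packaging` library (an illustrative sketch of the naming quirk, not the Docker build itself):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# PEP 440 normalizes "3.0.0.rc1" to "3.0.0rc1", so the two spellings are
# the same version...
assert Version("3.0.0.rc1") == Version("3.0.0rc1")

# ...but the installed post-release does not satisfy the exact pin,
# which is the conflict pip reports during the image build:
installed = Version("3.0.0rc1.post1")
pin = SpecifierSet("==3.0.0.rc1")
print(installed in pin)  # False
```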


Re: [VOTE] Release Airflow 3.0.0 from 3.0.0rc1 & Task SDK 1.0.0 from 1.0.0rc1

2025-04-03 Thread Kaxil Naik
Docker images will be out soon too.

On Fri, 4 Apr 2025 at 02:35, Kaxil Naik  wrote:

> Hey fellow Airflowers,
>
> I am thrilled to announce the availability of Apache Airflow 3.0.0rc1 & *Task
> SDK 1.0.0rc1* for testing! Airflow 3.0 marks a significant milestone as
> the first major release in over four years, introducing improvements that
> enhance user experience, task execution, and system scalability.
>
> This email is calling for a vote on the release, which will last at least
> 7 days, until 10th April, and until 3 binding +1 votes have been received.
>
> Consider this my (non-binding) +1.
>
> Airflow 3.0.0rc1 is available at:
> https://dist.apache.org/repos/dist/dev/airflow/3.0.0rc1/
>
>
> "apache-airflow" Meta package:
>
>
>- *apache-airflow-3.0.0-source.tar.gz* is a source release that comes
>with INSTALL instructions.
>- *apache-airflow-3.0.0.tar.gz* is the binary Python "sdist" release.
>- *apache_airflow-3.0.0-py3-none-any.whl* is the binary Python
>wheel "binary" release.
>
> "apache-airflow-core" package
>
>
>- *apache_airflow_core-3.0.0.tar.gz* is the binary Python "sdist"
>release.
>    - *apache_airflow_core-3.0.0-py3-none-any.whl* is the binary Python
>    wheel "binary" release.
>
>
> Task SDK 1.0.0rc1 is available at:
> https://dist.apache.org/repos/dist/dev/airflow/task-sdk/1.0.0rc1/
>
>
> "apache-airflow-task-sdk" package
>
>- *apache-airflow-task-sdk-1.0.0-source.tar.gz* is a source release
>- *apache_airflow_task_sdk-1.0.0.tar.gz* is the binary Python "sdist"
>release.
>- *apache_airflow_task_sdk-1.0.0-py3-none-any.whl* is the binary
>Python wheel "binary" release.
>
>
>
> Public keys are available at:
> https://dist.apache.org/repos/dist/release/airflow/KEYS
>
> Please vote accordingly:
>
> [ ] +1 approve
> [ ] +0 no opinion
> [ ] -1 disapprove with the reason
>
> Only votes from PMC members are binding, but all members of the community
> are encouraged to test the release and vote with "(non-binding)".
>
> The test procedure for PMC members is described in:
>
> https://github.com/apache/airflow/blob/main/dev/README_RELEASE_AIRFLOW.md#verify-the-release-candidate-by-pmc-members
>
> The test procedure for contributors and members of the community who would
> like to test this RC is described in:
>
> https://github.com/apache/airflow/blob/main/dev/README_RELEASE_AIRFLOW.md#verify-the-release-candidate-by-contributors
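As a minimal illustration of the checksum step in those verification procedures (the full process also checks the GPG `.asc` signatures against the KEYS file), the sha512 mechanics can be tried on a local file:

```shell
# Illustrative only: the sha512 checksum mechanics used when verifying
# release artifacts, demonstrated on a locally created file (no network).
printf 'example artifact' > artifact.bin

# A release publishes a .sha512 file next to each artifact.
sha512sum artifact.bin > artifact.bin.sha512

# -c recomputes the digest and compares it with the recorded one.
sha512sum -c artifact.bin.sha512
```

For real artifacts you would download the corresponding `.sha512` and `.asc` files from the dist directory above, run the same `-c` check, and additionally run `gpg --verify` after importing the KEYS file.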
>
> Please note that the version number excludes the 'rcX' string, so it's now
> simply 3.0.0 for Airflow package and 1.0.0 for Task SDK. This will allow
> us to rename the artifact without modifying
> the artifact checksums when we actually release.
>
> Release Notes:
> https://github.com/apache/airflow/blob/3.0.0rc1/RELEASE_NOTES.rst
>
>
> *Testing Instructions using PyPI*:
>
> You can build a virtualenv that installs this, and other required packages
> (e.g. task sdk), like this:
>
> ```
> uv venv
> uv pip install apache-airflow apache-airflow-providers-standard==0.3.0rc1 --pre
> ```
>
> Get Involved
>
> We encourage the community to test this release and report any issues or
> feedback. Your contributions help us ensure a stable and reliable Airflow
> 3.0.0 release. Please report issues on GitHub at
> https://github.com/apache/airflow/issues and mark that the issue affects
> 3.0.0. An updated list of all known issues in the release candidate can
> also be found at the above link with the label “affected_version:3.0.0rc”.
>
> A huge thank you to all the contributors who have worked on this milestone
> release!
> Best,
> Kaxil
>
> ---
> What's new in 3.0.0?
>
> Notable Features
>
> DAG versioning & Bundles
>
> Airflow now tracks DAG versions, offering better visibility into
> historical DAG changes and execution states. The introduction of DAG
> Bundles ensures tasks run with the correct code version, even as DAGs
> evolve.
>
> Modern Web Application
>
> The UI has been rebuilt using React and a complete API-driven structure,
> improving maintainability and extensibility. It includes a new
> component-based design system and an enhanced information architecture. A
> new React-based plugin system supports custom widgets, improved workflow
> visibility, and integration with external tools.
>
> Task Execution Interface
>
> Airflow 3.0 adopts a client / server architecture, decoupling task
> execution from the internal meta-database via API-based interaction. This
> allows for remote execution across networks, multi-language support,
> enhanced security, and better dependency management. The Edge Executor
> further enables seamless remote task execution without direct database
> connections.
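As a rough conceptual sketch of the decoupling described above (these class and method names are invented for illustration, not the actual Task Execution Interface API), the idea is a worker that only ever talks to an API layer and never to the meta-database directly:

```python
# Conceptual sketch only: illustrates the client/server decoupling idea.
# The names here are invented, not real Airflow 3.0 classes or endpoints.
from typing import Dict, Optional


class ExecutionAPIServer:
    """Stands in for the API layer; it alone touches the meta-database."""

    def __init__(self) -> None:
        self._db: Dict[str, str] = {"task-1": "pending"}  # pretend meta-database

    def fetch_task(self) -> Optional[str]:
        # Hand out one pending task and mark it running.
        for task_id, state in self._db.items():
            if state == "pending":
                self._db[task_id] = "running"
                return task_id
        return None

    def report_result(self, task_id: str, state: str) -> None:
        self._db[task_id] = state


class RemoteWorker:
    """Runs anywhere on the network; talks only to the API, never the DB."""

    def __init__(self, api: ExecutionAPIServer) -> None:
        self.api = api

    def run_once(self) -> None:
        task_id = self.api.fetch_task()
        if task_id is not None:
            # Execute the task here, then report the outcome via the API.
            self.api.report_result(task_id, "success")


api = ExecutionAPIServer()
RemoteWorker(api).run_once()
```

Because the worker holds no database connection, it could just as well run on a remote machine or be written in another language, which is the property the Edge Executor builds on.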
>
> Data Assets & Asset-Centric Syntax
>
> Airflow 3.0 enhances dataset management by introducing Data Assets,
> expanding beyond tables and files to include ML models and more. Assets can
> be explicitly defined using the @asset decorator, simplifying tracking and
> dependencies.
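As a purely illustrative sketch of the decorator idea (the registry and function below are invented for this sketch, not the actual Airflow 3.0 @asset API):

```python
# Purely illustrative sketch of an @asset-style decorator; ASSET_REGISTRY
# and this `asset` function are invented here, not the Airflow 3.0 API.
from typing import Callable, Dict

ASSET_REGISTRY: Dict[str, Callable] = {}


def asset(func: Callable) -> Callable:
    """Register the decorated function as a named data asset."""
    ASSET_REGISTRY[func.__name__] = func
    return func


@asset
def daily_sales():
    # A real asset would materialize a table, file, or ML model.
    return [("2025-04-01", 100), ("2025-04-02", 120)]
```

The point of the pattern is that declaring the asset and registering it for dependency tracking happen in one place, at definition time.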
>
> External Event-Driven Scheduling
>
> Airflow now su

Re: [ANNOUNCE] UV mandatory tooling and (much nicer) doc building coming

2025-04-03 Thread Kunal Bhattacharya
Thank you so much for this Jarek, incredible effort. uv really has been a
lifesaver and I can't wait to now try out the simplified dev workflow.

Regards,
Kunal Bhattacharya



On Thu, Apr 3, 2025 at 11:16 AM Amogh Desai 
wrote:

> Thank you for the effort Jarek!
>
> I am looking forward to having a simpler life building docs. Personally it
> was a nightmare for me earlier.
> Will shout out if / when I get to that stage.
>
> Thanks & Regards,
> Amogh Desai
>
>
> On Wed, Apr 2, 2025 at 10:37 PM Buğra Öztürk 
> wrote:
>
> > Amazing news! Thanks for the huge effort Jarek! :)
> >
> > On Wed, 2 Apr 2025, 18:45 Pavankumar Gopidesu, 
> > wrote:
> >
> > > That's awesome Jarek, thank you for this :)
> > >
> > > Regards,
> > > Pavan
> > >
> > > On Wed, Apr 2, 2025 at 1:09 PM Jarek Potiuk  wrote:
> > >
> > > > You might also want to do *docker system prune* or even *docker system
> > > > prune --all*, or run *breeze doctor*, to clean up stale caches, images,
> > > > and docker volumes.
> > > >
> > > > Also, for IntelliJ/PyCharm users, *uv run setup_idea.py* will add some
> > > > missing directories and regenerate your IntelliJ project configuration.
> > > >
> > > >
> > > >
> > > >
> > > >
> > > > On Wed, Apr 2, 2025 at 1:11 PM Jarek Potiuk 
> wrote:
> > > >
> > > > > Hello here,
> > > > >
> > > > > As part of the packaging work - I merged the
> > > > > https://github.com/apache/airflow/pull/48223
> > > > >
> > > > > *TL;DR: Rebase all PRs, rebuild images, and run uv sync - and
> > > > > hopefully everything should work as before even if a lot of things
> > > > > moved. Hopefully the move will be largely transparent (except
> > > > > changing *include: in open PRs in docs).*
> > > > >
> > > > > *NOTE! UV is now mandatory and a lot of code is gone thanks to
> that.
> > > > > Breeze will also refuse to work if uv is not installed.*
> > > > >
> > > > > It took a bit of time, but we are in a much more standard and
> better
> > > > shape
> > > > > now - and as a side effect (which was intended but I had to
> implement
> > > it
> > > > as
> > > > > part of that monster PR to fix documentation) we now have a much
> > > simpler
> > > > > (more guidelines are coming) way to iterate on doc building.
> > > > >
> > > > > *Few important things first: *
> > > > >
> > > > > 1) Make sure to rebase your PRs, run `breeze image build`, and run
> > > > > `uv sync`. Due to the way git handles things you **might** have some
> > > > > dangling generated directories in your repo, and they might cause
> > > > > some problems. Run "git status" after the rebase and check whether
> > > > > there are files you need to delete (manually).
> > > > >
> > > > > 2) If you are brave enough - you might want to run `breeze doctor`
> > > > > and clean up the git repo. It should clean all files that should be
> > > > > removed, but it might also remove some of your custom configurations
> > > > > and files you created.
> > > > >
> > > > > 3) Generally everything should work as it worked before with breeze
> > > > > (for example, the `breeze build-docs` command works as before). But
> > > > > a number of folders/distributions/code (not airflow nor providers
> > > > > directly) were moved/updated. For now you can just continue to build
> > > > > docs as before - with breeze. But simpler/faster ways are coming as
> > > > > a follow-up.
> > > > >
> > > > > 4) If you have some new examples or documentation included in your
> > > > > PRs, the doc build might start failing for you - but this is because
> > > > > `include::` or `exampleinclude::` might need to be updated. Look at
> > > > > other examples - I fixed the includes in all providers. More
> > > > > explanation is coming in a follow-up doc build improvement PR - in
> > > > > the meantime, feel free to ask for help on Slack or in the PR.
> > > > >
> > > > > *Generated provider_dependencies.json does not need to be updated*
> > > > >
> > > > > The "generated/provider_dependencies.json" is no longer committed
> > > > > to the repo - it is .gitignored. We are generating it as needed on
> > > > > the fly. It should be automatically regenerated when you run
> > > > > pre-commits locally and when you build the breeze image.
> > > > >
> > > > > There might be some cases when we add dependencies and you will
> > > > > need to regenerate it, but that should happen automatically as
> > > > > needed.
> > > > >
> > > > > *New, updated folders*
> > > > >
> > > > > The changes are mostly in these:
> > > > >
> > > > > ./dev/pyproject.toml
> > > > > ./devel-common/pyproject.toml
> > > > > ./doc
> > > > > ./docker-stack-docs
> > > > > ./providers-summary-docs
> > > > >
> > > > > *More explanation for distributions/folder changes*
> > > > >
> > > > > The dev is now a separate distribution with its own pyproject.toml
> > > > > dependencies that are used for all the release management and
> general
> > > dev
> > > > > ho

Re: [ANNOUNCE] UV mandatory tooling and (much nicer) doc building coming

2025-04-03 Thread Vincent Beck
Another huge PR for another massive change. Thanks for the effort Jarek!

On 2025/04/03 10:05:50 Kunal Bhattacharya wrote:
> Thank you so much for this Jarek, incredible effort. uv really has been a
> lifesaver and I can't wait to now try out the simplified dev workflow.