I definitely think there's potential to interact with an Airflow MCP server. Though before investing effort in building and maintaining one for Airflow, I'd be interested to see how many people are making use of MCP servers in the wild, and how frequently. I'm sure the data is available out there; it just needs finding.

--
Regards,
Aritra Basu
On Wed, 28 May 2025, 11:18 pm Julian LaNeve, <jul...@astronomer.io.invalid> wrote:

> I think this would be interesting now that the Streamable HTTP spec
> <https://modelcontextprotocol.io/specification/2025-03-26/basic/transports>
> is out. In theory, we could first publish this as an Airflow provider that
> installs a plugin to expose an MCP endpoint, as a PoC - this becomes a much
> nicer experience than a local stdio one.
>
> --
> Julian LaNeve
> CTO
>
> Email: jul...@astronomer.io
> Mobile: 330 509 5792
>
> > On May 28, 2025, at 1:25 PM, Shahar Epstein <sha...@apache.org> wrote:
> >
> > Dear community,
> >
> > Following the thread on Slack [1], initiated by Jason Sebastian Kusuma,
> > I'd like to start an effort to officially support MCP in Airflow's
> > codebase.
> >
> > *Some background*
> > Model Context Protocol (MCP) is an open standard, open-source framework
> > that standardizes the way AI models such as LLMs integrate and share data
> > with external tools, systems, and data sources. Think of it as a "USB-C
> > for AI" - a universal connector that simplifies and standardizes AI
> > integrations. A notable example of an MCP server is GitHub's official
> > implementation [3], which allows LLM-based clients ("MCP clients") such
> > as Claude, Copilot, and OpenAI to fetch pull request details, analyze
> > code changes, and generate review summaries.
> >
> > *How could an MCP server be useful in Airflow?*
> > Imagine the possibilities when LLMs can seamlessly interact with
> > Airflow's API: triggering DAGs using natural language, retrieving DAG run
> > history, enabling smart debugging, and more. This kind of integration
> > opens the door to a more intuitive, conversational interface for workflow
> > orchestration.
> >
> > *Why do we need to support it officially?*
> > Quid pro quo - LLMs become an integral part of the modern development
> > experience, while Airflow evolves into the go-to tool for orchestrating
> > AI workflows. By officially supporting MCP, we'll enable many users to
> > interact with Airflow through their LLMs, streamlining automation and
> > improving accessibility across diverse workflows. All of this is viable
> > with relatively small development effort (see next paragraph).
> >
> > *How should it be implemented?*
> > As of today, there have been several implementations of MCP servers for
> > the Airflow API, the most visible one [4] made by Abhishek Bhakat from
> > Astronomer. The effort of implementing and maintaining it in our codebase
> > shouldn't be too cumbersome (at least in theory), as we could utilize
> > packages like fastmcp to auto-generate the server from the existing
> > OpenAPI specs. I'd be very happy if Abhishek could share his experience
> > in this thread.
> >
> > *Where else could we utilize MCP?*
> > Beyond the scope of the public API, I could also imagine using it to
> > communicate with Breeze.
> >
> > *How do we proceed from here?*
> > Feel free to share your thoughts here in this discussion.
> > If there are no objections, I'll be happy to start working on an AIP.
> >
> >
> > Sincerely,
> > Shahar Epstein
> >
> >
> > *References:*
> > [1] Slack discussion,
> > https://apache-airflow.slack.com/archives/C06K9Q5G2UA/p1746121916951569
> > [2] Introducing the Model Context Protocol,
> > https://www.anthropic.com/news/model-context-protocol
> > [3] GitHub official MCP server,
> > https://github.com/github/github-mcp-server
> > [4] Unofficial MCP server made by Abhishek Bhakat,
> > https://github.com/abhishekbhakat/airflow-mcp-server