Orchestration is no longer just about moving data; it is about governing enterprise intelligence. To reflect our deep commitment to open-source software, we shared earlier that Cloud Composer is now officially Managed Service for Apache Airflow.
We announced a massive leap forward in our orchestration capabilities, fundamentally reimagining how data teams operate in the AI era. With four major launches, we are embedding AI directly into your workflows to democratize access, accelerate productivity, and power your most demanding MLOps.
1. Apache Airflow 3.1 is now Generally Available
We announced Apache Airflow 3.1 in General Availability to power your most demanding AI and MLOps workloads. This release combines the significant foundation of Airflow 3.0 with the recent community innovations of 3.1.
Key capabilities include:
- Decoupled architecture: A clean separation between the core Airflow components and the task execution layer, delivering better scalability and enhanced security.
- DAG versioning: Native support for automated DAG versioning, retaining the historical structure and run history.
- Powerful managed backfills: A redesigned backfill system that is now a first-class citizen, fully managed by the scheduler.
- Event-driven scheduling and data assets: Enhanced capabilities for triggering workflows based on assets as well as external events, like messages arriving in a message queue.
- Human-in-the-Loop (HITL) and deadline alerts: Pause execution for human decision-making via the UI, and set proactive time-based thresholds for critical pipelines.
- And many more…

2. Agentic troubleshooting with Data Engineering Agents
Managing complex pipelines just got significantly easier. The Data Engineering Agent is now embedded directly in your Managed Airflow dashboard to quickly analyze logs, identify root causes, and suggest fixes.
- Rapid resolution: By integrating Gemini Cloud Assist Investigations¹, you can leverage AI to troubleshoot DAG run failures and receive personalized fix proposals directly in the console.
- Reduced MTTR: This agentic approach helps minimize Mean Time to Repair (MTTR) by eliminating manual log parsing. Furthermore, troubleshooting is now elevated to the DAG execution level—rather than just the task level—providing a holistic view of pipeline health.

3. Orchestration pipelines and deployment automation framework
You no longer need to be an Apache Airflow expert to harness its power. Orchestration pipelines are a core component of our new cross-product Deployment Automation Framework, allowing you to create end-to-end data pipelines efficiently.
- Declarative orchestration: Define your entire pipeline—including the orchestration logic, infrastructure configuration, and dependencies—in simple, human-readable YAML files.
- Cross-product bundles: These YAML definitions are easily deployed as a complete bundle to the cloud. For example, without knowing Airflow syntax, a user can quickly create and deploy a comprehensive data integration pipeline across dbt, Spark, DTS, and more.
- Unified IDE experience: Alongside automated validation and deployment via GitHub Actions, the Google Data Cloud extension makes agentic authoring and troubleshooting the centerpiece of your workflow. You can now rely on powerful AI agents to build and debug pipelines directly in your IDE, with the ability to visually inspect the agent-generated DAGs for complete oversight.
Crucially, this declarative approach breaks down the traditional silos between advanced Python developers and data analysts. By shifting to human-readable YAML, we are fostering a more inclusive data culture where a wider range of practitioners can independently author, understand, and manage critical data workflows.
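To give a feel for the declarative style, the sketch below shows what a YAML pipeline definition of this kind could look like. This is a purely illustrative example; every key, task type, and value here is a hypothetical placeholder, not the framework's actual schema:

```yaml
# Hypothetical example only — not the actual framework schema.
pipeline:
  name: sales_ingestion
  schedule: "0 6 * * *"        # daily at 06:00
  tasks:
    - id: extract_raw
      type: dts_transfer        # placeholder: a DTS transfer step
    - id: transform
      type: dbt_run             # placeholder: a dbt model run
      depends_on: [extract_raw]
    - id: aggregate
      type: spark_job           # placeholder: a Spark job
      depends_on: [transform]
```

The idea is that the framework compiles a bundle like this into the corresponding Airflow DAG, so authors describe dependencies and steps without writing Python or Airflow syntax directly.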
4. MCP Server for Managed Airflow (Public Preview)
To further bridge the gap between AI and orchestration, we are launching the Managed Airflow MCP Server in Public Preview.
- Agentic tooling: This server provides tools like `list_environments`, `get_dag_run`, and `get_task_instance` to fetch critical information about your environments.
- Seamless integration & reduced context-switching: Both humans and agents can use these tools to simplify task management. Most importantly, this drastically reduces the context-switching developers face when debugging complex DAGs. By bringing environment and task data directly into your preferred interfaces, you can troubleshoot faster without constantly pivoting between different consoles.
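For a sense of how an agent or client reaches these tools, MCP servers speak JSON-RPC 2.0, with tool invocations carried in a `tools/call` request. The sketch below builds such a request for the `get_dag_run` tool named above; the argument keys and values are assumptions for illustration, not the server's documented parameters:

```python
# Sketch of the JSON-RPC 2.0 request an MCP client sends to invoke a tool.
# Tool names come from the announcement; argument keys are assumed examples.
import json

def mcp_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Serialize an MCP `tools/call` request for the given tool."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# e.g., ask the Managed Airflow MCP server about a specific DAG run
payload = mcp_tool_call(1, "get_dag_run", {
    "dag_id": "daily_sales",              # hypothetical DAG id
    "dag_run_id": "scheduled__2025-01-01" # hypothetical run id
})
```

In practice an MCP-aware client (such as an IDE agent) constructs and sends these requests for you; the point is that the same small, uniform tool surface serves both humans and agents.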
Embrace the future of data orchestration
With these launches, we are fundamentally lowering the barrier to entry for orchestration while simultaneously raising the ceiling for what power users can achieve. By taking away the infrastructure burden and providing native, agentic tooling, data teams can stop wrestling with boilerplate code and start focusing primarily on deriving insights and driving business value.
Whether you are a seasoned Data Engineer building dynamic Python DAGs or a Data Analyst defining straightforward YAML pipelines, Managed Service for Apache Airflow is built for you.
Get started today
Ready to experience the next generation of data pipeline orchestration? Create a new environment in the Google Cloud Console, explore the Google Cloud Data Agent Kit extension, and start building your agentic future today.
1. Availability might be limited (details)