Migrating from Apache Airflow v2 to v3

News · March 16, 2026 · Artifice Prime

During the 2025 holidays, I had some downtime and decided to migrate one of my pet projects from Apache Airflow 2.10.3 to 3.0.6. This article is based entirely on my hands-on experience with that migration and captures my initial takeaways: what worked well, what felt rough around the edges, and where Airflow 3 genuinely changes how we think about workflow orchestration.

Rather than covering every new feature, I want to focus on the five changes that stood out the most during the migration.

1. Unified SDK imports improved developer experience

One of the first changes I appreciated as a developer was the move to SDK-first imports. In Airflow 2.x, DAG authoring often required importing objects from multiple modules such as airflow.decorators and airflow.models. Airflow 3 consolidates this into a more intuitive and unified SDK surface, making DAG code easier to read, write and maintain.

Airflow 2.10.3

from airflow.decorators import dag, task
from airflow.models import Param

Airflow 3.0.6

from airflow.sdk import dag, task, chain
from airflow.sdk.definitions.param import Param

This may look like a small change, but across a growing codebase it significantly reduces cognitive overhead and improves consistency in DAG authoring.

2. Clearer separation between DAG code and the metadata database

The SDK shift also reflects a larger architectural change in Airflow 3: DAG and task code are now intentionally decoupled from the metadata database.

In practice, this resulted in:

  • Fewer accidental dependencies on metadata DB objects
  • Clearer boundaries between orchestration and execution
  • A safer and more scalable execution model built around APIs

As someone who prefers lightweight and decoupled architectures, this felt like a very welcome change and a solid foundation for the future of Airflow.

3. DAG versioning made historical runs trustworthy

With Airflow 3.x, each DAG run is tied to a specific DAG version, and this turned out to be one of the most valuable improvements for day-to-day operations.

In Airflow 2.x, even small changes such as renaming a task or refactoring logic could cause historical runs to appear inconsistent or confusing in the UI. Debugging older runs often meant mentally mapping today’s DAG code to yesterday’s execution.

With DAG versioning:

  • Each run executes against the exact DAG definition it started with
  • Historical runs remain accurate and easy to reason about
  • Debugging past failures no longer depends on current code

This alone significantly improved traceability and confidence when evolving workflows.

4. New UI: modern, but still a work in progress

The redesigned UI in Airflow 3 is one area where my experience was mixed. I have been on the receiving end of feedback from users and clients who found the new layout disorienting, mostly due to buttons moving or workflows changing. While I am personally open to UI changes, the new interface did feel rough around the edges. Some of these behaviours may vary depending on deployment configuration and version.

Some of the issues I noticed:

  • Task ordering appeared inconsistent in the DAG Grid view
  • The enable/disable DAG toggle intermittently disappeared
  • The Delete DAG action was harder to discover
  • Page loads felt slower at times
  • Searching historical DAG runs using date-based filters was less intuitive than before

The UI is clearly more modern and designed to scale better long-term, but today it feels less polished for everyday operational workflows. I expect many of these issues to be addressed as Airflow 3.x matures, though I still find myself missing the predictability of the older UI.

5. Asset-based scheduling simplified cross-DAG dependencies

One of the most impactful conceptual changes in Airflow 3 is the shift from Datasets to Assets, especially for modelling cross-DAG dependencies.

In my project, several workflows follow a familiar pattern:

  • A file lands in S3
  • One job processes the file
  • A downstream job runs only after that data is ready

In Airflow 2.x, this usually meant chaining sensors and explicit DAG triggers. While functional, this approach added coupling and operational complexity.

With Assets, the focus shifts from “which DAG triggers which?” to “what data is now available?” Workflows become data-driven rather than DAG-driven, resulting in cleaner definitions, fewer sensors and better visibility into real data dependencies.

This felt like a much more natural way to express how data pipelines actually work.

Airflow 2.10.3

from airflow.operators.trigger_dagrun import TriggerDagRunOperator

trigger_downstream = TriggerDagRunOperator(
    task_id="trigger_downstream",
    trigger_dag_id="downstream_dag"
)

Airflow 3.0.6

Upstream job

from airflow.sdk import task
from airflow.sdk.definitions.asset import Asset

MY_JOB_ASSET = Asset("db://my-data")

@task(outlets=[MY_JOB_ASSET])
def produce_data():
    pass

Downstream job

from airflow.sdk import dag, task
from airflow.sdk.definitions.asset import Asset
from datetime import datetime

# Assets are matched by URI, so declaring the same URI here
# links this DAG to the upstream producer.
MY_JOB_ASSET = Asset("db://my-data")

@dag(
    start_date=datetime(2024, 1, 1),
    schedule=[MY_JOB_ASSET]
)
def downstream_dag():

    @task
    def consume_data():
        pass

    consume_data()

downstream_dag()

Why this migration matters

Beyond individual features, migrating to Airflow 3 felt less like an optional upgrade and more like a necessary step forward. Airflow 3 represents a clear architectural direction for the project: API-driven execution, better isolation, data-aware scheduling and a platform designed for modern scale.

While Airflow 2.x is still widely used, it is clearly moving toward long-term maintenance (end-of-life April 2026) with most innovation and architectural investment happening in the 3.x line. Delaying migration only widens the gap:

  • More breaking changes accumulate
  • Provider compatibility becomes harder to manage
  • Teams miss out on improvements that simplify debugging and orchestration

For me, moving from 2.10.3 to 3.0.6 wasn’t just about staying current; it was about aligning with where Airflow is headed. Even with a few rough edges, Airflow 3 feels like the foundation the project needed for its next phase.

Disclaimer: The views expressed are my own and do not represent those of my employer.

This article is published as part of the Foundry Expert Contributor Network.

Original Link:https://www.infoworld.com/article/4145050/migrating-from-apache-airflow-v2-to-v3.html
Originally Posted: Mon, 16 Mar 2026 09:00:00 +0000


Artifice Prime

Artifice Prime is an AI enthusiast with over 25 years of experience as a Linux sysadmin. They have an interest in artificial intelligence, its use as a tool to further humankind, and its impact on society.
