- Learning Path Libraries: This path is only available in the libraries listed below. To access it, purchase a license for one of those libraries.
- Data
Data Orchestration and Workflow Management with Apache Airflow
Apache Airflow is an open-source platform for orchestrating complex workflows and data pipelines. It lets you schedule, monitor, and manage workflows through a user-friendly web interface. Written in Python, Airflow supports dynamic pipeline generation and is scalable and extensible, making it well suited to automating and optimizing data engineering processes. This learning path teaches data engineers how to build and manage data pipelines with Apache Airflow for better data orchestration and workflow management.
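To give a flavor of what "pipelines as Python code" means, here is a minimal sketch of an ETL-style DAG using Airflow's TaskFlow API. This assumes Airflow 2.4 or later is installed; the DAG and task names (`daily_etl`, `extract`, `transform`, `load`) are hypothetical, for illustration only. Because a DAG file is pipeline configuration-as-code, it is parsed by an Airflow scheduler rather than run as a standalone script.

```python
from datetime import datetime
from airflow.decorators import dag, task

# Minimal ETL-style DAG using the TaskFlow API (Airflow 2.4+ assumed).
# Names here are hypothetical, chosen only to illustrate the pattern.
@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_etl():
    @task
    def extract() -> list[int]:
        # Stand-in for pulling rows from a source system.
        return [1, 2, 3]

    @task
    def transform(rows: list[int]) -> list[int]:
        # Stand-in for a transformation step.
        return [r * 10 for r in rows]

    @task
    def load(rows: list[int]) -> None:
        # Stand-in for writing to a warehouse table.
        print(f"loaded {len(rows)} rows")

    # Calling tasks like functions wires up the dependency graph:
    # extract -> transform -> load.
    load(transform(extract()))

daily_etl()
```

Dropping a file like this into Airflow's `dags/` folder is enough for the scheduler to pick it up and run it on the declared schedule.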
Content in this path
Data Orchestration and Workflow Management with Apache Airflow
Watch the following course to get started!
Hands-on Practice with Apache Airflow
The following labs will help you get practical experience with Apache Airflow.
- How to get up and running with Airflow
- How to create basic, complex, and dynamic DAGs
- How to optimize workflows in Airflow
- How to manage workflows in Airflow
- How to integrate databases with Airflow
- How to work with APIs in Airflow
- How to integrate external services with Airflow
- How to build data pipelines with Airflow
- How to monitor and maintain Airflow
- How to implement version control in Airflow
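All of the labs above revolve around DAGs (directed acyclic graphs). Conceptually, a DAG is just a set of tasks plus dependency edges, executed in an order where every task runs after its dependencies. As a reminder of that underlying idea, here is a plain-Python sketch using the standard library's `graphlib` (no Airflow required); the task names are hypothetical, for illustration only.

```python
from graphlib import TopologicalSorter

# A tiny pipeline expressed as a DAG: each task maps to the set of
# tasks it depends on (hypothetical task names for illustration).
pipeline = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load": {"transform"},
    "report": {"load"},
}

def run_order(dag: dict[str, set[str]]) -> list[str]:
    """Return one valid execution order for the DAG."""
    return list(TopologicalSorter(dag).static_order())

print(run_order(pipeline))
# → ['extract', 'validate', 'transform', 'load', 'report']
```

Airflow's scheduler does essentially this, plus scheduling, retries, and state tracking on top.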
Learners should have the following knowledge to gain the most from this learning path:
- Basic Python knowledge
- Basic SQL and database knowledge
- Basic Command Line Interface (CLI) skills
- Experience with data pipeline concepts and processes
- Data Engineering
- Databases
- ETL
- Python