Overview
Learning objectives: Understand the Astro platform, set up your local development environment, build and run your first DAG on Astro, deploy DAGs to Astro, and monitor DAG and task metrics and set up alerts.
Intended audience: Data engineers, analytics engineers, and anyone who wants to run Airflow in the cloud with Astro.
Prerequisites: Basic prior knowledge of Airflow.
Ready to run Apache Airflow without the infrastructure headaches? Astro by Astronomer is the modern way to build, test, and deploy data pipelines—and this course will get you up and running fast.
In this hands-on course, you'll go from zero to deploying your first DAGs on Astro Cloud. You'll learn how to set up your local development environment with the Astro CLI, build and test DAGs using the Astro IDE, and explore multiple deployment strategies to fit your team's workflow. We'll also cover essential platform features like metrics, alerting, and even coding with Astro AI.
What you'll learn:
Understand what Astro is and why it's transforming how teams manage Airflow
Install the Astro CLI and run Airflow locally in minutes
Build, test, and debug DAGs using the Astro IDE (a minimal example DAG follows this list)
Work with connections, variables, and environment management
Deploy your projects using project deploy, DAG-only, image-only, and Git-based workflows
Monitor your pipelines with built-in metrics and alerting
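To give you a taste of what you'll be building, here is a minimal sketch of a DAG written with Airflow's TaskFlow API. The dag_id, task names, and values are purely illustrative; placed in the dags/ folder of an Astro project, this is the kind of pipeline you'll run locally with the Astro CLI and later deploy to Astro.

```python
# Illustrative sketch only: a tiny two-step pipeline using Airflow's TaskFlow API.
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="my_first_astro_dag",  # hypothetical name, use your own
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
)
def my_first_astro_dag():
    @task
    def extract() -> dict:
        # Stand-in for pulling data from an API or a database.
        return {"records": 42}

    @task
    def load(payload: dict) -> None:
        # Stand-in for writing results to a warehouse or reporting layer.
        print(f"Loaded {payload['records']} records")

    load(extract())


my_first_astro_dag()
```

Don't worry if this doesn't fully make sense yet; the course walks through every piece step by step, from running it locally to deploying it to Astro.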
Who this course is for: Data engineers, analytics engineers, and developers who want a faster, simpler way to manage Apache Airflow pipelines in the cloud.
No prior Astronomer experience required—just bring some basic Python knowledge and you're good to go.
Enjoy the course!
Marc Lamberti
Hi there,
My name is Marc Lamberti, I'm 27 years old, and I'm very happy to spark your curiosity! I currently work full-time as a Big Data Engineer for the biggest online bank in France, serving more than 1,500,000 clients. For more than 3 years now, I have built different ETLs to address the problems a bank faces every day: a platform that monitors the information system in real time to detect anomalies and reduce the number of client calls, a tool that detects suspicious transactions and potential fraudsters in real time, an ETL that loads and extracts value from massive amounts of data in Cassandra, and more.
The biggest challenge as a Big Data Engineer is keeping up with the growing number of available open-source tools. You have to know how to use them, when to use them, and how they connect to each other in order to build robust, secure, and performant systems that solve your underlying business needs.
I strongly believe that the best way to learn and understand a new skill is a hands-on approach: just enough theory to explain the concepts and a big dose of practice to get you ready for a production environment. That's why in each of my courses you will always find practical examples paired with theoretical explanations.
Have a great learning time!
