Running my dbt project with Airflow using Astronomer.
Instead of running `dbt run` manually, this sets up Airflow to do it automatically every day. It uses Astronomer Cosmos, which is pretty cool - it reads your dbt project and creates an Airflow task for each model.
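A minimal sketch of what the Cosmos DAG could look like. The project path, profile/target names, database, schema, and schedule here are placeholder assumptions, not this project's actual values - only the `snowflake_conn` ID is taken from the setup below:

```python
from datetime import datetime

from cosmos import DbtDag, ProjectConfig, ProfileConfig
from cosmos.profiles import SnowflakeUserPasswordProfileMapping

# Map the Airflow Snowflake connection into a dbt profile at runtime.
# conn_id must match the connection created in the Airflow UI.
profile_config = ProfileConfig(
    profile_name="my_dbt_project",   # assumed profile name
    target_name="dev",               # assumed target
    profile_mapping=SnowflakeUserPasswordProfileMapping(
        conn_id="snowflake_conn",
        profile_args={"database": "MY_DB", "schema": "MY_SCHEMA"},  # placeholders
    ),
)

# DbtDag parses the dbt project and creates one Airflow task per model.
dbt_dag = DbtDag(
    dag_id="dbt_dag",
    project_config=ProjectConfig("/usr/local/airflow/dags/dbt"),  # assumed path
    profile_config=profile_config,
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
)
```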
Need Docker running first, then run `astro dev start`. Go to http://localhost:8080 (the Airflow UI) and log in:
- username: admin
- password: admin
In the Airflow UI, go to Admin > Connections and add:
- conn id: snowflake_conn
- conn type: Snowflake
- fill in your Snowflake details (account, user, password, warehouse, etc.)
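If you'd rather not click through the UI, Airflow also picks up connections from `AIRFLOW_CONN_<CONN_ID>` environment variables (put it in the project's `.env` so `astro dev start` loads it). The values below are placeholders:

```shell
# Same connection as the UI setup, expressed as a URI. Placeholder values.
export AIRFLOW_CONN_SNOWFLAKE_CONN='snowflake://MY_USER:MY_PASSWORD@/?account=MY_ACCOUNT&warehouse=MY_WH&database=MY_DB&role=MY_ROLE'
```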
Project layout:
- `dags/dbt_dag.py` - the actual DAG code
- `dags/dbt/` - the dbt project lives here
- `Dockerfile` - container stuff
- `requirements.txt` - Python packages
- need to have Docker Desktop running before `astro dev start`
- the Snowflake connection ID in the DAG has to match what you create in the Airflow UI
- Cosmos automatically figures out the order to run dbt models, which is nice
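That ordering comes from dbt's own dependency graph (each model's `ref()` calls), which is conceptually just a topological sort. A toy illustration with a made-up model graph, using the stdlib:

```python
from graphlib import TopologicalSorter

# Hypothetical dbt models: each model maps to the models it ref()s (its upstreams).
deps = {
    "stg_orders": set(),
    "stg_customers": set(),
    "orders_enriched": {"stg_orders", "stg_customers"},
    "daily_revenue": {"orders_enriched"},
}

# static_order() yields models so every upstream comes before its downstreams -
# the same constraint Cosmos enforces between the generated Airflow tasks.
order = list(TopologicalSorter(deps).static_order())
print(order)
```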