
# dbt-dag

Running my dbt project with Airflow using Astronomer.

## what this is

Instead of running `dbt run` manually, this sets up Airflow to run the dbt project automatically every day. It uses [Cosmos](https://github.com/astronomer/astronomer-cosmos), which is pretty cool - it parses your dbt project and creates an Airflow task for each model, wired up in dependency order.
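
For reference, here's a minimal sketch of what `dags/dbt_dag.py` could look like with Cosmos' `DbtDag`. The project path, database/schema names, and dbt executable path are placeholder assumptions, not necessarily what this repo uses:

```python
from datetime import datetime

from cosmos import DbtDag, ExecutionConfig, ProfileConfig, ProjectConfig
from cosmos.profiles import SnowflakeUserPasswordProfileMapping

# map the Airflow connection onto a dbt profile at runtime
profile_config = ProfileConfig(
    profile_name="default",
    target_name="dev",
    profile_mapping=SnowflakeUserPasswordProfileMapping(
        conn_id="snowflake_conn",  # the connection created in the Airflow UI
        profile_args={"database": "dbt_db", "schema": "dbt_schema"},  # placeholders
    ),
)

# DbtDag parses the dbt project and emits one Airflow task per model,
# already ordered by the models' dependencies
dbt_dag = DbtDag(
    dag_id="dbt_dag",
    project_config=ProjectConfig("/usr/local/airflow/dags/dbt/my_dbt_project"),  # placeholder path
    profile_config=profile_config,
    execution_config=ExecutionConfig(
        dbt_executable_path="/usr/local/airflow/dbt_venv/bin/dbt",  # assumes dbt installed in a venv
    ),
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
)
```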

## how to run locally

Docker needs to be running first, then:

```sh
astro dev start
```

Go to http://localhost:8080 (the Airflow UI) and log in with the default local credentials:

- username: `admin`
- password: `admin`

## setting up snowflake connection

In the Airflow UI, go to Admin > Connections and add a new connection:

- conn id: `snowflake_conn` (has to match the `conn_id` used in the DAG)
- conn type: Snowflake
- fill in your Snowflake details (account, login, password, warehouse, database, role, schema)

If you'd rather not click through the UI, the sketch below shows the same connection defined programmatically.
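
A hedged sketch with placeholder values throughout - Airflow can also pick up a connection from an `AIRFLOW_CONN_<CONN_ID>` environment variable, which is what the last line prints:

```python
import json

from airflow.models.connection import Connection

# all values below are placeholders - swap in your own Snowflake details
conn = Connection(
    conn_id="snowflake_conn",  # must match the conn_id referenced in the DAG
    conn_type="snowflake",
    login="MY_USER",
    password="MY_PASSWORD",
    schema="DBT_SCHEMA",
    extra=json.dumps(
        {
            "account": "xy12345",  # your Snowflake account locator
            "warehouse": "DBT_WH",
            "database": "DBT_DB",
            "role": "TRANSFORM_ROLE",
        }
    ),
)

# Airflow reads connections from AIRFLOW_CONN_* env vars,
# e.g. drop this line into the project's .env file
print(f"AIRFLOW_CONN_SNOWFLAKE_CONN='{conn.get_uri()}'")
```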

## files

- `dags/dbt_dag.py` - the actual DAG code
- `dags/dbt/` - the dbt project lives here
- `Dockerfile` - container setup for the Astro runtime
- `requirements.txt` - python packages the project needs

## things that tripped me up

- Docker Desktop has to be running before `astro dev start`, or it fails
- the Snowflake connection id in the DAG has to match the one you create in the Airflow UI (`snowflake_conn` here)
- Cosmos automatically figures out the order to run dbt models from their `ref()` dependencies, which is nice
