Use when running a dbt Fusion project with Astronomer Cosmos. Covers Cosmos 1.11+ configuration for Fusion on Snowflake/Databricks with ExecutionMode.LOCAL. Before implementing, verify dbt engine is Fusion (not Core), warehouse is supported, and local execution is acceptable. Does not cover dbt Core.
```shell
npx skill4agent add astronomer/agents cosmos-dbt-fusion
```

Version note: dbt Fusion support was introduced in Cosmos 1.11.0. Requires Cosmos ≥1.11.

Reference: See reference/cosmos-config.md for ProfileConfig, operator_args, and Airflow 3 compatibility details.
Before starting, confirm: (1) dbt engine = Fusion (not Core → use cosmos-dbt-core), (2) warehouse = Snowflake or Databricks only, (3) `ExecutionMode.LOCAL` is acceptable (no containerized/async/virtualenv execution).
| Constraint | Details |
|---|---|
| Execution mode | `ExecutionMode.LOCAL` only |
| No async | Async/deferrable execution is not supported |
| No containerized | Docker/Kubernetes execution modes are not supported |
| No virtualenv | Fusion is a binary, not a Python package |
| Warehouse support | Snowflake + Databricks only (public beta) |
CRITICAL: Cosmos 1.11.0 introduced dbt Fusion compatibility.

```shell
# Check installed version
pip show astronomer-cosmos

# Install/upgrade if needed
pip install "astronomer-cosmos>=1.11.0"
```

Install the Fusion binary in the Airflow image (Dockerfile):

```dockerfile
USER root
RUN apt-get update && apt-get install -y curl
ENV SHELL=/bin/bash
RUN curl -fsSL https://public.cdn.getdbt.com/fs/install/install.sh | sh -s -- --update
USER astro
```

| Environment | Typical path |
|---|---|
| Astro Runtime | `/home/astro/.local/bin/dbt` |
| System-wide | `/usr/local/bin/dbt` |
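Rather than hard-coding one of these paths, a small helper can probe the likely locations and fall back to a `PATH` lookup. A sketch — the `find_dbt_binary` helper and its default candidate list are illustrative assumptions, not part of Cosmos:

```python
import os
import shutil


def find_dbt_binary(candidates=("/home/astro/.local/bin/dbt",)):
    """Return the first executable dbt path among candidates, else a PATH lookup."""
    for path in candidates:
        if os.access(path, os.X_OK):
            return path
    return shutil.which("dbt")  # None if dbt is not on PATH either
```

The result (after checking it is not `None`) can be passed as `dbt_executable_path` when building the `ExecutionConfig`.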
Verify the `dbt` binary with `dbt --version`.

| Load mode | When to use | Required inputs |
|---|---|---|
| `LoadMode.DBT_MANIFEST` | Large projects; fastest parsing | Pre-generated `manifest.json` |
| `LoadMode.DBT_LS` | Complex selectors; need dbt-native selection | Fusion binary accessible to scheduler |
| `LoadMode.AUTOMATIC` | Simple setups; let Cosmos pick | (none) |
```python
from cosmos import RenderConfig, LoadMode

_render_config = RenderConfig(
    load_method=LoadMode.AUTOMATIC,  # or LoadMode.DBT_MANIFEST, LoadMode.DBT_LS
)
```

Reference: See reference/cosmos-config.md for full ProfileConfig options and examples.
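For the manifest-based load mode, the render and project configs pair like this. A sketch assuming a `manifest.json` pre-generated in CI; the paths shown are illustrative:

```python
from cosmos import ProjectConfig, RenderConfig, LoadMode

# Parse a pre-generated manifest instead of invoking dbt at parse time.
_render_config = RenderConfig(
    load_method=LoadMode.DBT_MANIFEST,
)

_project_config = ProjectConfig(
    dbt_project_path="/usr/local/airflow/dbt/my_project",
    manifest_path="/usr/local/airflow/dbt/my_project/target/manifest.json",  # illustrative path
)
```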
| Warehouse | ProfileMapping class |
|---|---|
| Snowflake | `SnowflakeUserPasswordProfileMapping` |
| Databricks | `DatabricksTokenProfileMapping` |
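A Databricks `ProfileConfig` can be sketched as follows — assuming `DatabricksTokenProfileMapping` from `cosmos.profiles` and a `databricks_default` Airflow connection; adjust the conn id to your deployment:

```python
from cosmos import ProfileConfig
from cosmos.profiles import DatabricksTokenProfileMapping

_profile_config = ProfileConfig(
    profile_name="default",
    target_name="dev",
    profile_mapping=DatabricksTokenProfileMapping(
        conn_id="databricks_default",  # assumed connection id
    ),
)
```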
```python
from cosmos import ProfileConfig
from cosmos.profiles import SnowflakeUserPasswordProfileMapping

_profile_config = ProfileConfig(
    profile_name="default",
    target_name="dev",
    profile_mapping=SnowflakeUserPasswordProfileMapping(
        conn_id="snowflake_default",
    ),
)
```

CRITICAL: dbt Fusion with Cosmos requires `ExecutionMode.LOCAL` with `dbt_executable_path` pointing to the Fusion binary.
```python
from cosmos import ExecutionConfig

_execution_config = ExecutionConfig(
    dbt_executable_path="/home/astro/.local/bin/dbt",  # REQUIRED: path to Fusion binary
    # execution_mode is LOCAL by default - do not change
)
```

| Allowed | Not allowed |
|---|---|
| ✅ Install Fusion binary into Airflow image/runtime | ❌ Virtualenv execution (Fusion is not a pip package) |
| ✅ Point `dbt_executable_path` at the Fusion binary | ❌ Containerized execution (Docker/Kubernetes) |
| | ❌ Async execution |
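A wrong `dbt_executable_path` otherwise only surfaces when a task runs; a parse-time check fails earlier. A minimal sketch — the `validate_fusion_path` helper is illustrative, not a Cosmos API:

```python
import os


def validate_fusion_path(dbt_executable_path: str) -> str:
    """Raise at DAG-parse time if the Fusion binary is missing or not executable."""
    if not os.access(dbt_executable_path, os.X_OK):
        raise FileNotFoundError(
            f"dbt Fusion binary not found or not executable: {dbt_executable_path}"
        )
    return dbt_executable_path
```

Wrapping the path (e.g. `ExecutionConfig(dbt_executable_path=validate_fusion_path(...))`) turns a silent misconfiguration into an import error.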
```python
from cosmos import ProjectConfig

_project_config = ProjectConfig(
    dbt_project_path="/path/to/dbt/project",
    # manifest_path="/path/to/manifest.json",  # for the DBT_MANIFEST load mode
    # install_dbt_deps=False,  # if deps precomputed in CI
)
```

Full `DbtDag` example:

```python
from cosmos import DbtDag, ProjectConfig, ProfileConfig, ExecutionConfig, RenderConfig
from cosmos.profiles import SnowflakeUserPasswordProfileMapping
from pendulum import datetime

_project_config = ProjectConfig(
    dbt_project_path="/usr/local/airflow/dbt/my_project",
)

_profile_config = ProfileConfig(
    profile_name="default",
    target_name="dev",
    profile_mapping=SnowflakeUserPasswordProfileMapping(
        conn_id="snowflake_default",
    ),
)

_execution_config = ExecutionConfig(
    dbt_executable_path="/home/astro/.local/bin/dbt",  # Fusion binary
)

_render_config = RenderConfig()

my_fusion_dag = DbtDag(
    dag_id="my_fusion_cosmos_dag",
    project_config=_project_config,
    profile_config=_profile_config,
    execution_config=_execution_config,
    render_config=_render_config,
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
)
```

`DbtTaskGroup` example, mixing dbt with other tasks:

```python
from airflow.sdk import dag, task  # Airflow 3.x
# from airflow.decorators import dag, task  # Airflow 2.x
from airflow.models.baseoperator import chain
from cosmos import DbtTaskGroup, ProjectConfig, ProfileConfig, ExecutionConfig
from pendulum import datetime

_project_config = ProjectConfig(dbt_project_path="/usr/local/airflow/dbt/my_project")
_profile_config = ProfileConfig(profile_name="default", target_name="dev")
_execution_config = ExecutionConfig(dbt_executable_path="/home/astro/.local/bin/dbt")

@dag(start_date=datetime(2025, 1, 1), schedule="@daily")
def my_dag():
    @task
    def pre_dbt():
        return "some_value"

    dbt = DbtTaskGroup(
        group_id="dbt_fusion_project",
        project_config=_project_config,
        profile_config=_profile_config,
        execution_config=_execution_config,
    )

    @task
    def post_dbt():
        pass

    chain(pre_dbt(), dbt, post_dbt())

my_dag()
```

Cosmos setting, exposed as an Airflow environment variable:

```
AIRFLOW__COSMOS__PRE_DBT_FUSION=1
```