setting-up-astro-project


# Astro Project Setup


This skill helps you initialize and configure Airflow projects using the Astro CLI.
To run the local environment, see the managing-astro-local-env skill. To write DAGs, see the authoring-dags skill.


## Initialize a New Project


```bash
astro dev init
```

This creates the following structure:

```
project/
├── dags/                 # DAG files
├── include/              # SQL, configs, supporting files
├── plugins/              # Custom Airflow plugins
├── tests/                # Unit tests
├── Dockerfile            # Image customization
├── packages.txt          # OS-level packages
├── requirements.txt      # Python packages
└── airflow_settings.yaml # Connections, variables, pools
```
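As a sanity check, the scaffold above can be verified programmatically. A minimal Python sketch — the `EXPECTED` list and `missing_entries` helper are illustrative, not part of the Astro CLI:

```python
# Illustrative check that a project root contains the entries `astro dev init` creates.
# Demonstrated on a throwaway temp dir; point `missing_entries` at a real project instead.
import pathlib
import tempfile

EXPECTED = ["dags", "include", "plugins", "tests",
            "Dockerfile", "packages.txt", "requirements.txt", "airflow_settings.yaml"]

def missing_entries(root: str) -> list[str]:
    """Return expected scaffold entries absent from `root`."""
    return [name for name in EXPECTED if not (pathlib.Path(root) / name).exists()]

with tempfile.TemporaryDirectory() as d:
    for name in EXPECTED[:4]:    # directories
        (pathlib.Path(d) / name).mkdir()
    for name in EXPECTED[4:]:    # files
        (pathlib.Path(d) / name).touch()
    print(missing_entries(d))  # []
```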


## Adding Dependencies


### Python Packages (requirements.txt)


```
apache-airflow-providers-snowflake==5.3.0
pandas==2.1.0
requests>=2.28.0
```
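Pinning exact versions keeps image builds reproducible. A quick sketch of a check that flags unpinned lines such as `requests>=2.28.0` above — `unpinned` is a hypothetical helper, not an Astro CLI feature:

```python
# Flag requirements that are not pinned with ==, since version ranges
# can resolve differently between image builds.
def unpinned(lines):
    return [line for line in (raw.strip() for raw in lines)
            if line and not line.startswith("#") and "==" not in line]

reqs = [
    "apache-airflow-providers-snowflake==5.3.0",
    "pandas==2.1.0",
    "requests>=2.28.0",
]
print(unpinned(reqs))  # ['requests>=2.28.0']
```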

### OS Packages (packages.txt)


```
gcc
libpq-dev
```

### Custom Dockerfile


For complex setups (private PyPI indexes, custom scripts):

```dockerfile
FROM quay.io/astronomer/astro-runtime:12.4.0

RUN pip install --extra-index-url https://pypi.example.com/simple my-package
```

After modifying dependencies, run `astro dev restart` to rebuild the image.


## Configuring Connections & Variables


### airflow_settings.yaml


Loaded automatically on environment start:

```yaml
airflow:
  connections:
    - conn_id: my_postgres
      conn_type: postgres
      host: host.docker.internal
      port: 5432
      login: user
      password: pass
      schema: mydb

  variables:
    - variable_name: env
      variable_value: dev

  pools:
    - pool_name: limited_pool
      pool_slot: 5
```
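Mistakes in this file only surface when the environment starts, so a quick pre-flight check can help. A sketch, with the YAML above mirrored as an equivalent Python dict (load the real file with a YAML parser such as PyYAML); `missing_fields` is a hypothetical helper:

```python
# Pre-flight check: every connection needs at least conn_id and conn_type.
# `settings` mirrors the airflow_settings.yaml example above.
settings = {
    "airflow": {
        "connections": [
            {"conn_id": "my_postgres", "conn_type": "postgres",
             "host": "host.docker.internal", "port": 5432,
             "login": "user", "password": "pass", "schema": "mydb"},
        ],
        "variables": [{"variable_name": "env", "variable_value": "dev"}],
        "pools": [{"pool_name": "limited_pool", "pool_slot": 5}],
    }
}

def missing_fields(settings: dict) -> list[str]:
    """Report connections missing required fields."""
    problems = []
    for conn in settings.get("airflow", {}).get("connections", []):
        for field in ("conn_id", "conn_type"):
            if field not in conn:
                problems.append(f"connection missing {field}")
    return problems

print(missing_fields(settings))  # []
```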

### Export/Import


```bash
# Export from a running environment
astro dev object export --connections --file connections.yaml

# Import into an environment
astro dev object import --connections --file connections.yaml
```

---

## Validate Before Running


Parse DAGs to catch errors without starting the full environment:

```bash
astro dev parse
```
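For an even faster pure-syntax pre-check before `astro dev parse`, the Python standard library alone can compile everything under `dags/`. A sketch — `syntax_check` is a hypothetical helper, and it catches only syntax errors, whereas `astro dev parse` also surfaces import and DAG-level errors:

```python
# Compile every .py file under a DAG directory to surface syntax errors early.
import pathlib
import py_compile
import tempfile

def syntax_check(dag_dir: str) -> list[str]:
    """Return one error string per .py file that fails to compile."""
    errors = []
    for path in pathlib.Path(dag_dir).rglob("*.py"):
        try:
            py_compile.compile(str(path), doraise=True)
        except py_compile.PyCompileError as exc:
            errors.append(f"{path}: {exc.msg}")
    return errors

# Demo on a throwaway directory containing one broken file:
with tempfile.TemporaryDirectory() as d:
    (pathlib.Path(d) / "broken_dag.py").write_text("def oops(:\n    pass\n")
    print(len(syntax_check(d)))  # 1
```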


## Related Skills


- managing-astro-local-env: Start, stop, and troubleshoot the local environment
- authoring-dags: Write and validate DAGs (uses MCP tools)
- testing-dags: Test DAGs (uses MCP tools)