# ADK Project Scaffolding Guide

Requires: `uv tool install google-agents-cli` — install uv first if needed.

Use the `agents-cli` CLI to create new ADK agent projects or enhance existing ones with deployment, CI/CD, and infrastructure scaffolding.
## Prerequisite: Clarify Requirements (MANDATORY for new projects)

Before scaffolding a new project, load /google-agents-cli-workflow and complete Phase 0 — clarify the user's requirements before running any command. Ask what the agent should do, what tools/APIs it needs, and whether they want a prototype or a full deployment.
## Step 1: Choose Architecture
Mapping user choices to CLI flags:

| Choice | CLI flag |
|---|---|
| RAG with vector search | `--agent agentic_rag --datastore agent_platform_vector_search` |
| RAG with document search | `--agent agentic_rag --datastore agent_platform_search` |
| A2A protocol | `--agent adk_a2a` |
| Prototype (no deployment) | `--prototype` |
| Deployment target | `--deployment-target <agent_runtime\|cloud_run\|gke>` |
| CI/CD runner | `--cicd-runner <github_actions\|cloud_build>` |
| Session storage | `--session-type <in_memory\|cloud_sql\|agent_platform_sessions>` |
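The mapping above can be sketched as a small shell helper that assembles the corresponding flags; a minimal illustration where the function name and choice keys are hypothetical — only the flag strings come from the table:

```shell
# Illustrative helper: translate an architecture choice into the
# agents-cli scaffold flags from the table above.
flags_for_choice() {
  case "$1" in
    rag_vector) echo "--agent agentic_rag --datastore agent_platform_vector_search" ;;
    rag_search) echo "--agent agentic_rag --datastore agent_platform_search" ;;
    a2a)        echo "--agent adk_a2a" ;;
    prototype)  echo "--prototype" ;;
    *)          echo "unknown choice: $1" >&2; return 1 ;;
  esac
}

# Usage sketch: agents-cli scaffold create my-agent $(flags_for_choice rag_vector)
```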
### Product name mapping
The platform formerly known as "Vertex AI" is now Gemini Enterprise Agent Platform (short: Agent Platform). Users may refer to products by different names. Map them to the correct CLI values:
| User may say | CLI value |
|---|---|
| Agent Engine, Vertex AI Agent Engine, Agent Runtime | `--deployment-target agent_runtime` |
| Vertex AI Search, Agent Search | `--datastore agent_platform_search` |
| Vertex AI Vector Search, Vector Search | `--datastore agent_platform_vector_search` |
| Agent Engine sessions, Agent Platform Sessions | `--session-type agent_platform_sessions` |
The Python SDK package name is unchanged.
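As a sketch, the alias table can be encoded as a lookup (the function name is illustrative; the CLI values are the ones from the table above):

```shell
# Illustrative lookup: map a user-facing product name to the CLI value.
cli_value_for() {
  case "$1" in
    "Agent Engine"|"Vertex AI Agent Engine"|"Agent Runtime")
      echo "--deployment-target agent_runtime" ;;
    "Vertex AI Search"|"Agent Search")
      echo "--datastore agent_platform_search" ;;
    "Vertex AI Vector Search"|"Vector Search")
      echo "--datastore agent_platform_vector_search" ;;
    "Agent Engine sessions"|"Agent Platform Sessions")
      echo "--session-type agent_platform_sessions" ;;
    *) return 1 ;;
  esac
}
```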
## Step 2: Create or Enhance the Project

### Create a New Project
```bash
agents-cli scaffold create <project-name> \
  --agent <template> \
  --deployment-target <target> \
  --region <region> \
  --prototype
```
Constraints:
- Project name must be 26 characters or less: lowercase letters, numbers, and hyphens only.
- Do NOT create the project directory before running the command — the CLI creates it automatically. If you `mkdir` first, the CLI will fail or behave unexpectedly.
- Auto-detect the guidance filename based on the IDE you are running in and pass `--agent-guidance-filename` accordingly (`GEMINI.md` for Gemini CLI, `CLAUDE.md` for Claude Code, `AGENTS.md` for OpenAI Codex/other).
- When enhancing an existing project, check where the agent code lives. If it's not in the default agent directory, pass the appropriate directory flag. Getting this wrong causes enhance to miss or misplace files.
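The name constraint can be checked up front; a minimal sketch (the function name is illustrative, not part of agents-cli):

```shell
# Validate a project name against the documented constraints:
# at most 26 characters; lowercase letters, digits, and hyphens only.
valid_project_name() {
  local name="$1"
  [ "${#name}" -le 26 ] && [[ "$name" =~ ^[a-z0-9-]+$ ]]
}

# valid_project_name my-agent   -> exit 0
# valid_project_name My_Agent   -> exit 1 (uppercase and underscore)
```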
### Reference Files

| File | Contents |
|---|---|
|  | Full flag reference for the `create` and `enhance` commands |
### Enhance an Existing Project

```bash
agents-cli scaffold enhance . --deployment-target <target>
agents-cli scaffold enhance . --cicd-runner <runner>
```

Run this from inside the project directory (or pass the project path instead of `.`).
### Upgrade a Project
Upgrade an existing project to a newer agents-cli version, intelligently applying updates while preserving your customizations:
```bash
agents-cli scaffold upgrade                  # Upgrade current directory
agents-cli scaffold upgrade <project-path>   # Upgrade specific project
agents-cli scaffold upgrade --dry-run        # Preview changes without applying
agents-cli scaffold upgrade --auto-approve   # Auto-apply non-conflicting changes
```
## Execution Modes

The CLI defaults to strict programmatic mode — all required params must be supplied as CLI flags or an error is raised. No approval flags are needed. Pass all required params explicitly.
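Because strict programmatic mode has no interactive prompts to fall back on, a calling script may want to fail fast before invoking the CLI. A minimal bash sketch — the function and parameter names are assumptions, not part of agents-cli:

```shell
# Pre-flight check: error out if any named variable is empty or unset,
# mirroring the CLI's strict requirement that all params be supplied.
require_params() {
  local missing=0
  for name in "$@"; do
    if [ -z "${!name:-}" ]; then
      echo "missing required param: $name" >&2
      missing=1
    fi
  done
  return "$missing"
}

# Usage sketch:
#   AGENT=adk REGION=us-central1
#   require_params AGENT REGION && \
#     agents-cli scaffold create my-agent --agent "$AGENT" --region "$REGION"
```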
## Common Workflows
Always ask the user before running these commands. Present the options (CI/CD runner, deployment target, etc.) and confirm before executing.
```bash
# Add deployment to an existing prototype (strict programmatic)
agents-cli scaffold enhance . --deployment-target agent_runtime

# Add CI/CD pipeline (ask: GitHub Actions or Cloud Build?)
agents-cli scaffold enhance . --cicd-runner github_actions
```
## Template Options

| Template | Deployment | Description |
|---|---|---|
| `adk` | Agent Runtime, Cloud Run, GKE | Standard ADK agent (default) |
| `adk_a2a` | Agent Runtime, Cloud Run, GKE | Agent-to-agent coordination (A2A protocol) |
| `agentic_rag` | Agent Runtime, Cloud Run, GKE | RAG with data ingestion pipeline |
## Deployment Options

| Target | Description |
|---|---|
| `agent_runtime` | Managed by Google (Vertex AI Agent Runtime). Sessions handled automatically. |
| `cloud_run` | Container-based deployment. More control, requires Dockerfile. |
| `gke` | Container-based on GKE Autopilot. Full Kubernetes control. |
| `--prototype` (no target) | No deployment scaffolding. Code only. |
"Prototype First" Pattern (Recommended)
Start with
to skip CI/CD and Terraform. Focus on getting the agent working first, then add deployment later with
:
```bash
# Step 1: Create a prototype
agents-cli scaffold create my-agent --agent adk --prototype

# Step 2: Iterate on the agent code...

# Step 3: Add deployment when ready
agents-cli scaffold enhance . --deployment-target agent_runtime
```
## Agent Runtime and session_type

When using `agent_runtime` as the deployment target, Agent Runtime manages sessions internally. If your code sets a `session_type`, clear it — Agent Runtime overrides it.
## Step 3: Load Dev Workflow
After scaffolding, save the agent guidance file to the project root if it isn't there already. Then immediately load /google-agents-cli-workflow — it contains the development workflow, coding guidelines, and operational rules you must follow when implementing the agent.
Key files to customize: the agent definition (instruction, tools, model), the custom tool functions module, and the environment config (project ID, location, API keys).

Files to preserve: the CLI-managed config section (the CLI reads this), the deployment configs, and the package metadata (the package name must match the agent directory name).
**RAG projects (`agentic_rag`) — provision the datastore first:** before running or testing your RAG agent, you must provision the datastore and ingest data:
```bash
agents-cli infra datastore   # Provision datastore infrastructure
agents-cli data-ingestion    # Ingest data into the datastore
```
Use `agents-cli infra datastore` — not the full infrastructure provisioning command. Both provision the datastore, but the datastore-only command is faster because it skips unrelated Terraform. Without this step, the agent won't have data to search over.
**Vector Search region:** this has its own default, separate from the project `--region`. It sets both the Vector Search collection region and the BigQuery ingestion dataset region, which are kept colocated to avoid cross-region data movement. Override per invocation with `agents-cli data-ingestion --vector-search-location <region>`.
**Verifying your agent works:** use `agents-cli run "test prompt"` for quick smoke tests, then run evals for systematic validation. Do NOT write pytest tests that assert on LLM response content — that belongs in eval.
## Scaffold as Reference

When you need specific files (Terraform, CI/CD workflows, Dockerfile) but don't want to scaffold the current project directly, create a temporary reference project in `/tmp`:
```bash
agents-cli scaffold create /tmp/ref-project \
  --agent adk \
  --deployment-target cloud_run
```
Inspect the generated files, adapt what you need, and copy into the actual project. Delete the reference project when done.
This is useful for:
- Non-standard project structures that can't handle `scaffold enhance`
- Cherry-picking specific infrastructure files
- Understanding what the CLI generates before committing to it
## Critical Rules
- NEVER skip requirements clarification — load /google-agents-cli-workflow Phase 0 and clarify the user's intent before running `scaffold create`
- NEVER change the model in existing code unless explicitly asked
- NEVER `mkdir` the project directory before `scaffold create` — the CLI creates the directory; pre-creating it causes enhance mode instead of create mode
- NEVER create a Git repo or push to remote without asking — confirm repo name, public vs private, and whether the user wants it created at all
- Always ask before choosing a CI/CD runner — present GitHub Actions and Cloud Build as options, don't default silently
- Agent Runtime clears session_type — if deploying to `agent_runtime`, remove any `session_type` setting from your code
- Start with `--prototype` for quick iteration — add deployment later with `scaffold enhance`
- Project names must be ≤26 characters: lowercase letters, numbers, and hyphens only
- NEVER write A2A code from scratch — the A2A Python API surface (import paths, schema, signatures) is non-trivial and changes across versions. Always use the `adk_a2a` template to scaffold A2A projects.
## Examples
**Using scaffold as reference:**

User says: "I need a Dockerfile for my non-standard project"

Actions:
- Create a temp project: `agents-cli scaffold create /tmp/ref --agent adk --deployment-target cloud_run`
- Copy the relevant files (Dockerfile, etc.) from /tmp/ref
- Delete the temp project

Result: Infrastructure files adapted to the actual project
**A2A project:**

User says: "Build me a Python agent that exposes A2A and deploys to Cloud Run"

Actions:
- Follow the standard flow (understand requirements, choose architecture, scaffold)
- Run `agents-cli scaffold create my-a2a-agent --agent adk_a2a --deployment-target cloud_run --prototype`

Result: Valid A2A imports and Dockerfile — no manual A2A code written.
## Troubleshooting

### `agents-cli` command not found

See /google-agents-cli-workflow → Setup section.
## Related Skills

- /google-agents-cli-workflow — Development workflow, coding guidelines, and the build-evaluate-deploy lifecycle
- /google-agents-cli-adk-code — ADK Python API quick reference for writing agent code
- /google-agents-cli-deploy — Deployment targets, CI/CD pipelines, and production workflows
- — Evaluation methodology, evalset schema, and the eval-fix loop