Manage Serverless Project
Perform day-2 operations on Elastic Cloud Serverless projects using the Serverless REST API.
Prerequisites and permissions
- Ensure `EC_API_KEY` is configured. If not, run the setup skill first.
- Updating project settings requires Admin or Editor role on the target project.
- This skill does not perform a separate role pre-check. Attempt the requested operation and let the API enforce authorization. If the API returns an authorization error (for example, HTTP 403), stop and ask the user to verify the provided API key permissions.
Manual setup fallback (when the setup skill is unavailable)
If this skill is installed standalone and the setup skill is not available, instruct the user to configure Cloud environment variables manually before running commands. Never ask the user to paste API keys in chat.
| Variable | Required | Description |
|---|---|---|
| `EC_API_KEY` | Yes | Elastic Cloud API key used for project management operations. |
| `EC_BASE_URL` | No | Cloud API base URL (default: `https://api.elastic-cloud.com`). |
Note: If `EC_API_KEY` is missing, or the user does not have a Cloud API key yet, direct the user to generate one at Elastic Cloud API keys, then configure it locally using the steps below.
Preferred method (agent-friendly): create a `.env` file in the project root:

```bash
EC_API_KEY=your-api-key
EC_BASE_URL=https://api.elastic-cloud.com
```
All skill scripts auto-load `.env` from the working directory.
Alternative: export directly in the terminal:

```bash
export EC_API_KEY="<your-cloud-api-key>"
export EC_BASE_URL="https://api.elastic-cloud.com"
```

Terminal exports may not be visible to sandboxed agents running in separate shell sessions, so prefer the `.env` file when using an agent.
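The environment check can be sketched as a small pre-flight snippet; the hard-coded key is a placeholder standing in for a value loaded from `.env`:

```shell
# Pre-flight sketch: fail fast if EC_API_KEY is missing, and default EC_BASE_URL.
EC_API_KEY="example-key"   # placeholder; in practice this comes from .env or an export
: "${EC_API_KEY:?EC_API_KEY must be set before running project commands}"
EC_BASE_URL="${EC_BASE_URL:-https://api.elastic-cloud.com}"
echo "Cloud API base: ${EC_BASE_URL}"
```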
Critical principles
- Never display secrets in chat. Do not echo, log, or repeat API keys, passwords, or credentials in conversation messages or agent thinking. Direct the user to the `.elastic-credentials` file instead. The admin password must never appear in chat history, thinking traces, or agent output; even when using it to create an API key, pass it directly via shell variable substitution without echoing.
- Confirm before destructive actions. Always ask the user to confirm before deleting a project or resetting
credentials.
- Credentials are saved to file. After a credential reset, the script writes the new password to `.elastic-credentials` automatically. The password is redacted from stdout. Never read or display the contents of `.elastic-credentials` in chat.
- Admin credentials are for API key creation only. The password saved by `reset-credentials` exists solely to bootstrap a scoped API key; never use it for direct Elasticsearch operations. `load-credentials` excludes admin credentials by default; pass `--include-admin` only for key creation.
- Always prefer API keys. Do not proceed with Elasticsearch operations until an `ELASTICSEARCH_API_KEY` is set. If only admin credentials are available, create a scoped API key via the `elasticsearch-authn` skill. If that skill is not installed, ask the user to install it or create the key manually in Kibana > Stack Management > API keys.
- Identify projects by type and ID. Every command requires both `--type` and `--id` (except `list`, which only needs `--type`).
- Two kinds of API keys. This skill uses the Cloud API key (`EC_API_KEY`) for project management operations (list, get, update, delete). Elasticsearch operations require a separate Elasticsearch API key (`ELASTICSEARCH_API_KEY`) that authenticates against the project's Elasticsearch endpoint. Do not confuse the two.
Workflow: Connect to an existing project
Use this workflow when the user asks to query or manage a project the agent did not create in the current session. It
resolves the project, saves its endpoints, and ensures working Elasticsearch credentials before proceeding.
This workflow only applies to Elastic Cloud Serverless projects. If the user's Elasticsearch instance is
self-managed or Elastic Cloud Hosted, this skill does not apply — skip it and proceed with the relevant skill directly.
If unsure, ask the user: "Is your Elasticsearch instance an Elastic Cloud Serverless project?"
```text
Connect to Existing Project:
- [ ] Step 1: Resolve the project
- [ ] Step 2: Get project details and load credentials
- [ ] Step 3: Acquire Elasticsearch credentials
```
Step 1: Resolve the project
Ask the user for the project name if not already provided. Infer the project type from the user's request:
| User says | `--type` |
|---|---|
| "search project", "elasticsearch project", vector search | `elasticsearch` |
| "observability project", "o11y", logs, metrics, traces, APM | `observability` |
| "security project", "SIEM", detections, endpoint protection | `security` |
If the type is ambiguous, list all three types to find the project.
```bash
python3 skills/cloud/manage-project/scripts/manage-project.py list \
  --type elasticsearch
```
Match the user's reference (name, partial name, or alias) against the list results. If multiple projects match or none
match, present the candidates and ask the user to pick.
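The matching step can be approximated with a simple substring check against the listed names; project names below are placeholders:

```shell
# Sketch: collect candidate projects whose name contains the user's reference.
ref="prod"
candidates=""
for name in prod-search staging-search dev-logs; do
  case "$name" in *"$ref"*) candidates="$candidates $name" ;; esac
done
candidates="${candidates# }"   # trim leading space
echo "candidates: $candidates"
```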
Step 2: Get project details and load credentials
Once a single project is identified, check whether `.elastic-credentials` already has entries for this project (from a previous session). If so, load them with `load-credentials`:
```bash
eval $(python3 skills/cloud/manage-project/scripts/manage-project.py load-credentials \
  --name "<project-name>")
```
This sets all saved environment variables for the project (endpoints and any previously created Elasticsearch API keys) in a single command. Admin credentials (the admin username and password) are intentionally excluded.
Later sections for the same project automatically overwrite earlier values, so the most recent credentials always win.
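The overwrite rule can be illustrated by eval-ing two export lines for the same variable, as `load-credentials` output might contain after a key rotation; the values are placeholders:

```shell
# Later export statements win: eval applies them in order.
exports='export ELASTICSEARCH_API_KEY="old-key"
export ELASTICSEARCH_API_KEY="new-key"'
eval "$exports"
echo "active key: $ELASTICSEARCH_API_KEY"
```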
If `load-credentials` reports no matching entries, fetch the project details from the API and export endpoints manually:
```bash
python3 skills/cloud/manage-project/scripts/manage-project.py get \
  --type elasticsearch \
  --id <project-id>
```
Then export the endpoint URLs from the response. The available endpoints depend on the project type.
All project types:
```bash
export ELASTICSEARCH_URL="<elasticsearch_endpoint>"
export KIBANA_URL="<kibana_endpoint>"
```
Observability projects (additional):
```bash
export APM_URL="<apm_endpoint>"
export INGEST_URL="<ingest_endpoint>"
```
Security projects (additional):
```bash
export INGEST_URL="<ingest_endpoint>"
```
Step 3: Acquire Elasticsearch credentials
If `load-credentials` set `ELASTICSEARCH_API_KEY`, verify the credentials work:

```bash
curl -H "Authorization: ApiKey ${ELASTICSEARCH_API_KEY}" \
  "${ELASTICSEARCH_URL}/_security/_authenticate"
```
Confirm the response contains a valid `username` and `"authentication_type": "api_key"` before proceeding. If verification succeeds, skip the rest of this step.
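A minimal sketch of the verification check, run here against a canned `_security/_authenticate` response rather than a live endpoint:

```shell
# Parse the authenticate response and confirm API-key authentication.
response='{"username":"my-service","authentication_type":"api_key"}'
auth_type=$(printf '%s' "$response" | python3 -c 'import sys, json; print(json.load(sys.stdin)["authentication_type"])')
if [ "$auth_type" = "api_key" ]; then
  echo "verified: api_key authentication"
else
  echo "unexpected authentication type: $auth_type"
fi
```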
If no credentials were loaded, or verification fails, ask the user: "Do you have an existing Elasticsearch API key for
this project?"
If yes, have the user add it to `.elastic-credentials` (see "Credential file format"). Do not accept keys in chat.
Reload and verify:

```bash
eval $(python3 skills/cloud/manage-project/scripts/manage-project.py load-credentials \
  --name "<project-name>")
curl -H "Authorization: ApiKey ${ELASTICSEARCH_API_KEY}" \
  "${ELASTICSEARCH_URL}/_security/_authenticate"
```
If no, follow this recovery path:

- Confirm with the user, then reset the admin bootstrap credentials:

  ```bash
  python3 skills/cloud/manage-project/scripts/manage-project.py reset-credentials \
    --type elasticsearch \
    --id <project-id>
  ```

  The new password is saved to `.elastic-credentials` with the project name in the header. Direct the user to that file; do not display its contents.
- Load credentials with `--include-admin` so the admin password is available for API key creation:

  ```bash
  eval $(python3 skills/cloud/manage-project/scripts/manage-project.py load-credentials \
    --name "<project-name>" --include-admin)
  ```

  Use the admin credentials to create a scoped Elasticsearch API key via the `elasticsearch-authn` skill if available. If that skill is not installed, ask the user to install it or create the key manually in Kibana > Stack Management > API keys. Scope the key to only the privileges the user needs.
- After creating the API key, save it to `.elastic-credentials` using the project-specific header format (see "Credential file format" below). Then reload without `--include-admin` to drop admin credentials from the environment and verify:

  ```bash
  eval $(python3 skills/cloud/manage-project/scripts/manage-project.py load-credentials \
    --name "<project-name>")
  curl -H "Authorization: ApiKey ${ELASTICSEARCH_API_KEY}" \
    "${ELASTICSEARCH_URL}/_security/_authenticate"
  ```

  Confirm the response shows a valid `username` and `"authentication_type": "api_key"` before proceeding.
Credential file format
See references/credential-file-format.md for the full format specification.
Workflow: Load project credentials
```bash
eval $(python3 skills/cloud/manage-project/scripts/manage-project.py load-credentials \
  --name "<project-name>")
```
Or by project ID:
```bash
eval $(python3 skills/cloud/manage-project/scripts/manage-project.py load-credentials \
  --id <project-id>)
```
Parses `.elastic-credentials`, merges all sections for the matching project, and prints `export` statements. Admin credentials (the admin username and password) are excluded by default; only endpoints and API keys are exported. Add `--include-admin` when you need admin credentials to create an API key.
Workflow: List projects
```bash
python3 skills/cloud/manage-project/scripts/manage-project.py list \
  --type elasticsearch
```
Use `--type observability` or `--type security` to list other project types.
Workflow: Get project details
```bash
python3 skills/cloud/manage-project/scripts/manage-project.py get \
  --type elasticsearch \
  --id <project-id>
```
Workflow: Update a project
```bash
python3 skills/cloud/manage-project/scripts/manage-project.py update \
  --type elasticsearch \
  --id <project-id> \
  --name "new-project-name"
```
Only the fields provided are updated (PATCH semantics). Supported flags: `--name`, `--alias`, `--tag`, `--search-power`, `--boost-window`, `--max-retention-days`, `--default-retention-days`.
Alias
The alias is an RFC-1035 domain label (lowercase alphanumeric and hyphens, max 50 chars) that becomes part of the
project's endpoint URLs. Changing the alias changes all endpoint URLs, which breaks existing clients pointing to the
old URLs. Warn the user about this before applying.
```bash
python3 skills/cloud/manage-project/scripts/manage-project.py update \
  --type elasticsearch \
  --id <project-id> \
  --alias "prod-search"
```
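Before calling `update`, the alias can be validated locally as an RFC-1035 label; this is a sketch, not a substitute for the API's own validation:

```shell
# RFC-1035 label: lowercase alphanumerics and hyphens, no leading/trailing
# hyphen, at most 50 characters.
alias_label="prod-search"
if printf '%s' "$alias_label" | grep -Eq '^[a-z0-9]([a-z0-9-]{0,48}[a-z0-9])?$'; then
  valid=yes
else
  valid=no
fi
echo "alias '$alias_label' valid: $valid"
```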
Tags
Tags are key-value metadata pairs for team tracking, cost attribution, and organization. Pass
for each
tag. Multiple tags can be set in a single update.
```bash
python3 skills/cloud/manage-project/scripts/manage-project.py update \
  --type elasticsearch \
  --id <project-id> \
  --tag env:prod \
  --tag team:search
```
Tags are sent as key-value metadata in the API request. Setting tags replaces all existing tags on the project; include any existing tags the user wants to keep.
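Because an update replaces the full tag set, a safe flow is to resend the existing tags alongside the new one. A sketch, with placeholder tag values:

```shell
# Build the repeatable --tag arguments from existing tags plus a new tag.
existing_tags="env:prod team:search"   # as returned by a prior `get`
new_tag="cost-center:42"
args=""
for t in $existing_tags $new_tag; do
  args="$args --tag $t"
done
echo "update$args"
```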
Elasticsearch search_lake settings
For Elasticsearch projects, two fields control query performance and data caching in the Search AI Lake. Ingested data
is stored in cost-efficient general storage. A cache layer on top provides faster search speed for recent and frequently
queried data — this cached data is considered search-ready.
| Flag | Range | Description |
|---|---|---|
| `--search-power` | 28–3000 | Query performance level. Higher values improve performance but increase cost |
| `--boost-window` | 1–180 | Days of data eligible for boosted caching (default: 7) |
Search Power
Search Power controls the speed of searches by provisioning more or fewer query resources. Common presets (matching the
Cloud UI):
| Value | Preset | Behavior |
|---|---|---|
| 28 | On-demand | Autoscales with lower baseline. More variable latency, reduced max throughput |
| 100 | Performant | Consistently low latency, autoscales for moderately high throughput |
| 250 | High availability | Optimized for high-throughput scenarios, maintains low latency at high volumes |
When the user asks for a preset by name, map it to the corresponding value. Custom values within 28–3000 are also valid.
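The preset-to-value mapping can be sketched as a simple lookup; the preset string stands in for whatever the user asked for:

```shell
# Map a Cloud UI preset name to its search power value.
preset="Performant"
preset_lc=$(printf '%s' "$preset" | tr '[:upper:]' '[:lower:]')
case "$preset_lc" in
  on-demand)           power=28  ;;
  performant)          power=100 ;;
  "high availability") power=250 ;;
  *)                   power=""  ;;  # anything else: treat as a custom 28-3000 value
esac
echo "search power: ${power:-custom}"
```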
Warn the user about cost implications before updating `--search-power`. Higher values increase VCU consumption and may result in higher bills. Confirm the new value with the user before applying.
Search Boost Window
Non-time-series data is always search-ready. The boost window determines how much time-series data (documents with a `@timestamp` field) is also kept in the fast cache layer. Increasing the window means a larger portion of time-series data becomes search-ready, which improves query speed for recent data but increases the search-ready data volume.
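The eligibility rule can be sketched as a simple age comparison; the ages here are illustrative, and real documents are judged by their timestamp field:

```shell
# A time-series document is boost-eligible when its age is within the window.
boost_window_days=7
doc_age_days=3
if [ "$doc_age_days" -le "$boost_window_days" ]; then
  tier="boosted cache (search-ready)"
else
  tier="general storage"
fi
echo "document tier: $tier"
```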
Security data retention settings
For security projects, two fields control how long data is retained in the Search AI Lake. Retention is configured per
data stream, but these project-level settings enforce global boundaries.
| Flag | Unit | Description |
|---|---|---|
| `--max-retention-days` | days | Maximum retention period for any data stream in the project |
| `--default-retention-days` | days | Default retention applied to data streams without a custom one |
- Maximum retention — enforces an upper bound across all data streams. When lowered, it replaces the retention for
any stream that currently has a longer period. Data older than the new maximum is permanently deleted.
- Default retention — automatically applied to data streams that do not have a custom retention period set. Does not
affect streams with an existing custom retention.
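The interaction of the two settings can be sketched as: a stream's effective retention is its custom period capped at the maximum, or the default when no custom period is set. Numbers are illustrative:

```shell
# Cap a stream's custom retention at the project maximum; fall back to the default.
max_retention=396
default_retention=90
custom_retention=500   # stream configured longer than the new maximum
if [ -z "$custom_retention" ]; then
  effective=$default_retention
elif [ "$custom_retention" -lt "$max_retention" ]; then
  effective=$custom_retention
else
  effective=$max_retention
fi
echo "effective retention: ${effective} days"
```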
Warn the user before reducing `--max-retention-days`. Lowering the maximum permanently deletes data older than the new limit. Confirm the new value with the user before applying.
Workflow: Reset project credentials
Always confirm with the user before resetting.
```bash
python3 skills/cloud/manage-project/scripts/manage-project.py reset-credentials \
  --type elasticsearch \
  --id <project-id>
```
The new password is saved to `.elastic-credentials` automatically. Tell the user to open that file; do not display its contents in chat.
Workflow: Delete a project
Always confirm with the user before deleting.
```bash
python3 skills/cloud/manage-project/scripts/manage-project.py delete \
  --type elasticsearch \
  --id <project-id>
```
Workflow: Resume a suspended project
Projects can be automatically suspended after their trial period expires. Resume with:
```bash
python3 skills/cloud/manage-project/scripts/manage-project.py resume \
  --type elasticsearch \
  --id <project-id>
```
After resuming, poll the project status until the phase reports that the project is fully running rather than suspended or resuming.
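A polling loop might look like the sketch below; the canned phase list stands in for repeated `get` calls, and the phase names are illustrative placeholders:

```shell
# Poll until the phase leaves the transitional state (canned responses here).
for phase in resuming resuming initialized; do
  echo "phase: $phase"
  if [ "$phase" != "resuming" ]; then
    break
  fi
done
echo "final phase: $phase"
```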
Examples
List all security projects
```bash
python3 skills/cloud/manage-project/scripts/manage-project.py list --type security
```
Update search power on an Elasticsearch project
```bash
python3 skills/cloud/manage-project/scripts/manage-project.py update \
  --type elasticsearch \
  --id abc123def456abc123def456abc12345 \
  --search-power 500
```
Set data retention on a security project
```bash
python3 skills/cloud/manage-project/scripts/manage-project.py update \
  --type security \
  --id abc123def456abc123def456abc12345 \
  --max-retention-days 396 \
  --default-retention-days 396
```
Change project alias
```bash
python3 skills/cloud/manage-project/scripts/manage-project.py update \
  --type elasticsearch \
  --id abc123def456abc123def456abc12345 \
  --alias "prod-search"
```
Tag a project for cost tracking
```bash
python3 skills/cloud/manage-project/scripts/manage-project.py update \
  --type observability \
  --id abc123def456abc123def456abc12345 \
  --tag env:prod \
  --tag team:platform
```
Rename an observability project
```bash
python3 skills/cloud/manage-project/scripts/manage-project.py update \
  --type observability \
  --id abc123def456abc123def456abc12345 \
  --name "production-logs"
```
Connect to an existing project (credentials on file)
```text
User: I want to run a query against my search project "prod-search"
Agent: Let me find that project.
[runs: list --type elasticsearch]
I found "prod-search" (ID: abc123...). Loading saved credentials...
[runs: eval $(load-credentials --name prod-search)]
Credentials loaded. Verifying...
[runs: curl -H "Authorization: ApiKey ..." "${ELASTICSEARCH_URL}/_security/_authenticate"]
Verified — authenticated as user "my-service" via API key. You're all set.
```
Connect to a project (no saved credentials)
```text
User: let's ingest some logs into my o11y project "siem-prod"
Agent: Let me find that project.
[runs: list --type observability]
Found "siem-prod" (ID: def456...). No saved credentials found for this project.
[runs: get --type observability --id def456...]
Endpoints set. Since there are no Elasticsearch credentials on file, I can reset the
admin bootstrap password to get initial access. Shall I proceed?
User: yes
Agent: [runs: reset-credentials --type observability --id def456...]
Credentials saved to .elastic-credentials — open that file to retrieve the password.
[runs: eval $(load-credentials --name siem-prod --include-admin)]
Admin credentials loaded for API key creation. Creating a scoped Elasticsearch API key
now — what access does it need?
User: read-only on logs-*
Agent: [creates scoped API key via elasticsearch-authn using admin credentials]
[appends API key to .elastic-credentials with project header]
[runs: eval $(load-credentials --name siem-prod)]
[verifies against ELASTICSEARCH_URL]
API key created and verified. Admin credentials removed from environment.
```
Delete a project after confirmation
```text
User: delete my elasticsearch project abc123...
Agent: Are you sure you want to delete project abc123...? This cannot be undone.
User: yes
```

```bash
python3 skills/cloud/manage-project/scripts/manage-project.py delete \
  --type elasticsearch \
  --id abc123def456abc123def456abc12345
```
Guidelines
- Run the setup skill first if `EC_API_KEY` is not set.
- Use the project-creation skill to create new projects; this skill handles existing projects only.
- When the user refers to a project the agent did not create, follow the "Connect to an existing project" workflow.
- Deletion is permanent. Always confirm with the user before proceeding.
- After resetting credentials, remind the user to update any stored passwords or environment variables.
- Warn about cost implications before increasing `--search-power`. Confirm the new value with the user first.
- Warn about data loss before reducing `--max-retention-days`. Data older than the new maximum is permanently deleted.
- Warn users that changing a project alias changes all endpoint URLs, which breaks existing clients.
- Setting tags replaces all existing tags. Retrieve current tags with `get` first and include any the user wants to keep.
Script reference
| Command | Description |
|---|---|
| `list` | List projects by type |
| `get` | Get project details by ID |
| `update` | Update project name, alias, tags, or search_lake settings |
| `reset-credentials` | Reset project credentials (new password) |
| `delete` | Delete a project |
| `resume` | Resume a suspended project |
| `load-credentials` | Load a project's saved credentials from `.elastic-credentials` |
| Flag | Commands | Description |
|---|---|---|
| `--type` | list, get, update, reset-credentials, delete, resume | Project type: `elasticsearch`, `observability`, `security` |
| `--id` | get, update, reset-credentials, delete, resume, load-credentials | Project ID |
| `--name` | update, load-credentials | Project name (update: new name; load-credentials: lookup) |
| `--alias` | update | New project alias |
| `--tag` | update | Tag as KEY:VALUE (repeatable, replaces all tags) |
| `--search-power` | update | Search power 28–3000 (elasticsearch only) |
| `--boost-window` | update | Boost window 1–180 days (elasticsearch only) |
| `--max-retention-days` | update | Max data retention in days (security only) |
| `--default-retention-days` | update | Default data retention in days (security only) |
| `--include-admin` | load-credentials | Include admin username/password (API key bootstrapping only) |
| `--wait` | reset-credentials | Seconds to wait for credential propagation (0 to skip) |
Environment variables
| Variable | Required | Description |
|---|---|---|
| `EC_API_KEY` | Yes | Elastic Cloud API key (project management operations) |
| `EC_BASE_URL` | No | Cloud API base URL (default: `https://api.elastic-cloud.com`) |
| `ELASTICSEARCH_URL` | Output | Elasticsearch URL (set after resolving a project for downstream skills) |
| `KIBANA_URL` | Output | Kibana URL (set after resolving a project for downstream skills) |
| `APM_URL` | Output | APM endpoint (observability projects only) |
| `INGEST_URL` | Output | OTLP ingest endpoint (observability and security projects) |
| `ELASTICSEARCH_API_KEY` | Output | Elasticsearch API key (for stack-level operations) |
Additional resources
- For full API details, request/response schemas, and project-type options, see the Serverless Projects API.
- For official documentation on Search AI Lake settings, data retention, and project features, see Project settings.