# model-failover

Automatic LLM provider failover with fallback chains, inspired by OpenClaw/ZeroClaw model configuration.
## Installation

```bash
npx skill4agent add winsorllc/upgraded-carnival model-failover
```

## Configuration

```bash
# Comma-separated list of providers (in fallback order)
export LLM_PROVIDER_CHAIN="anthropic:claude-3-5-sonnet-20241022,openai:gpt-4o-mini,google:gemini-1.5-flash"

# API keys for each provider
export ANTHROPIC_API_KEY="sk-ant-..."
export OPENAI_API_KEY="sk-..."
export GOOGLE_API_KEY="..."
```

## Usage

```bash
# Send a message through the failover chain
{baseDir}/model-failover.js chat "Your message here"

# Manage the provider chain
{baseDir}/model-failover.js add-provider anthropic claude-3-5-sonnet-20241022
{baseDir}/model-failover.js remove-provider openai

# Inspect and reset provider state
{baseDir}/model-failover.js list
{baseDir}/model-failover.js health
{baseDir}/model-failover.js reset
```

## Environment Variables

| Environment Variable | Description | Default |
|---|---|---|
| `LLM_PROVIDER_CHAIN` | Comma-separated list of providers (in fallback order) | - |
| `ANTHROPIC_API_KEY` | Anthropic API key | - |
| `OPENAI_API_KEY` | OpenAI API key | - |
| `GOOGLE_API_KEY` | Google API key | - |
| | Custom provider API key | - |
| | Max retries per provider | 2 |
| | Delay between retries (ms) | 1000 |
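The retry settings above can be sketched as a simple failover loop. This is a minimal sketch, not the skill's actual implementation: `callProvider`, `chatWithFailover`, and the option names are hypothetical stand-ins.

```javascript
// Minimal failover sketch: try each provider in the chain in order,
// with per-provider retries and a delay between attempts.
// `callProvider` is a hypothetical stand-in for a real API call.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function chatWithFailover(message, callProvider, opts = {}) {
  const chain = (process.env.LLM_PROVIDER_CHAIN || "").split(",").filter(Boolean);
  const maxRetries = opts.maxRetries ?? 2;    // "Max retries per provider"
  const retryDelay = opts.retryDelay ?? 1000; // "Delay between retries" (ms)
  let lastError;

  for (const entry of chain) {
    const [provider, model] = entry.split(":");
    for (let attempt = 0; attempt <= maxRetries; attempt++) {
      try {
        return await callProvider(provider, model, message);
      } catch (err) {
        lastError = err;
        if (attempt < maxRetries) await sleep(retryDelay);
      }
    }
    // All retries for this provider failed; fall through to the next one.
  }
  throw lastError ?? new Error("No providers configured in LLM_PROVIDER_CHAIN");
}
```

A provider is only skipped after exhausting its retries, so a transient error on the primary provider does not immediately shift traffic to a fallback.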
Each entry in `LLM_PROVIDER_CHAIN` uses the `provider:model` format. Supported providers: `anthropic`, `openai`, `google`, and `custom` (an OpenAI-compatible endpoint configured via `OPENAI_BASE_URL`).
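Parsing the `provider:model` entries can be sketched as follows. This is an illustrative helper (`parseProviderChain` is a hypothetical name), assuming the format described above.

```javascript
// Parse a chain string like
// "anthropic:claude-3-5-sonnet-20241022,openai:gpt-4o-mini"
// into [{ provider, model }, ...]. Splits on the FIRST ":" only,
// so a model name containing ":" is preserved intact.
function parseProviderChain(chain) {
  return chain
    .split(",")
    .map((entry) => entry.trim())
    .filter(Boolean)
    .map((entry) => {
      const i = entry.indexOf(":");
      if (i === -1) {
        throw new Error(`Invalid chain entry "${entry}" (expected provider:model)`);
      }
      return { provider: entry.slice(0, i), model: entry.slice(i + 1) };
    });
}
```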