Skill by ara.so — Daily 2026 Skills collection.
**Quick install (pip):**

```bash
pip install copaw
copaw init --defaults  # non-interactive setup with sensible defaults
copaw app              # starts the web Console + backend
```

**Install script (macOS/Linux):**

```bash
curl -fsSL https://copaw.agentscope.io/install.sh | bash
```
**Windows CMD:**

```cmd
curl -fsSL https://copaw.agentscope.io/install.bat -o install.bat && install.bat
```

**Windows PowerShell:**

```powershell
irm https://copaw.agentscope.io/install.ps1 | iex
```

Then initialize and start:

```bash
copaw init --defaults
copaw app
```
**Install from source (development):**

```bash
git clone https://github.com/agentscope-ai/CoPaw.git
cd CoPaw
pip install -e ".[dev]"
copaw init --defaults
copaw app
```
**CLI overview:**

```bash
copaw init            # interactive workspace setup
copaw init --defaults # non-interactive setup
copaw app             # start the Console (http://127.0.0.1:8088/)
copaw app --port 8090 # use a custom port
copaw --help          # list all commands
```
`copaw init` creates the workspace at `~/.copaw/workspace/`:

```
~/.copaw/workspace/
├── config.yaml      # agent, provider, channel configuration
├── skills/          # custom skill files (auto-loaded)
│   └── my_skill.py
├── memory/          # conversation memory storage
└── logs/            # runtime logs
```

Files in `skills/` are loaded automatically; everything else is driven by `config.yaml`, which `copaw init` generates.
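The `skills/` directory is described as auto-loaded. A plausible sketch of such a loader, importing every `.py` file found in the directory, is shown below; this is an illustration of the idea, not CoPaw's actual loader, and `load_skills` is a hypothetical name:

```python
import importlib.util
from pathlib import Path


def load_skills(skills_dir: str) -> dict:
    """Import every .py module found in skills_dir, keyed by module name."""
    modules = {}
    for path in sorted(Path(skills_dir).glob("*.py")):
        spec = importlib.util.spec_from_file_location(path.stem, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        modules[path.stem] = module
    return modules
```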
An example `providers` section:

```yaml
providers:
  - id: openai-main
    type: openai
    api_key: ${OPENAI_API_KEY}  # use env var reference
    model: gpt-4o
    base_url: https://api.openai.com/v1
  - id: local-ollama
    type: ollama
    model: llama3.2
    base_url: http://localhost:11434
```
Agent settings:

```yaml
agent:
  name: CoPaw
  language: en  # en, zh, ja, etc.
  provider_id: openai-main
  context_limit: 8000
```
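`context_limit` caps how much conversation history is sent to the model. Whether CoPaw counts tokens or characters is not specified here; the sketch below shows the general idea with a character budget, using a hypothetical `trim_history` helper that keeps only the most recent messages that fit:

```python
def trim_history(messages: list[str], context_limit: int) -> list[str]:
    """Keep the most recent messages whose total length fits the budget."""
    kept, used = [], 0
    for msg in reversed(messages):
        if used + len(msg) > context_limit:
            break
        kept.append(msg)
        used += len(msg)
    return list(reversed(kept))


print(trim_history(["aaaa", "bbbb", "cc"], 7))  # ['bbbb', 'cc']
```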
Channel configuration, per platform:

**DingTalk:**

```yaml
channels:
  - type: dingtalk
    app_key: ${DINGTALK_APP_KEY}
    app_secret: ${DINGTALK_APP_SECRET}
    agent_id: ${DINGTALK_AGENT_ID}
    mention_only: true  # only respond when @mentioned in groups
```

**Feishu:**

```yaml
channels:
  - type: feishu
    app_id: ${FEISHU_APP_ID}
    app_secret: ${FEISHU_APP_SECRET}
    mention_only: false
```

**Discord:**

```yaml
channels:
  - type: discord
    token: ${DISCORD_BOT_TOKEN}
    mention_only: true
```

**Telegram:**

```yaml
channels:
  - type: telegram
    token: ${TELEGRAM_BOT_TOKEN}
```

**QQ:**

```yaml
channels:
  - type: qq
    uin: ${QQ_UIN}
    password: ${QQ_PASSWORD}
```

**Mattermost:**

```yaml
channels:
  - type: mattermost
    url: ${MATTERMOST_URL}
    token: ${MATTERMOST_TOKEN}
    team: my-team
```

**Matrix:**

```yaml
channels:
  - type: matrix
    homeserver: ${MATRIX_HOMESERVER}
    user_id: ${MATRIX_USER_ID}
    access_token: ${MATRIX_ACCESS_TOKEN}
```

Custom skills are Python files placed in `~/.copaw/workspace/skills/`. Example weather skill (the imports and surrounding function are reconstructed from the fragment; the name `get_weather` matches the skill referenced in the cron example):

```python
import os

import requests


def get_weather(city: str) -> str:
    """Current weather for a city via the OpenWeather API."""
    api_key = os.environ["OPENWEATHER_API_KEY"]
    url = "https://api.openweathermap.org/data/2.5/weather"
    resp = requests.get(url, params={"q": city, "appid": api_key, "units": "metric"})
    resp.raise_for_status()
    data = resp.json()
    temp = data["main"]["temp"]
    desc = data["weather"][0]["description"]
    return f"{city}: {temp}°C, {desc}"
```
Skills can also be async. An example that fetches a web page with `httpx` (the wrapper function name here is illustrative):

```python
import httpx


async def fetch_page(url: str) -> str:
    async with httpx.AsyncClient(timeout=15) as client:
        resp = await client.get(url)
        text = resp.text[:4000]  # truncate to respect the context limit
        return f"Content preview from {url}:\n{text}"
```
---

Scheduled tasks live in `config.yaml` under the `cron` key:

```yaml
cron:
  - id: daily-digest
    schedule: "0 8 * * *"  # every day at 08:00
    skill: get_weather
    skill_args:
      city: "Tokyo"
    channel_id: dingtalk-main  # must match a configured channel id
    message_template: "Good morning! Today's weather: {result}"
  - id: hourly-news
    schedule: "0 * * * *"
    skill: fetch_tech_news
    channel_id: discord-main
```
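`message_template` appears to receive the skill's return value as `{result}`. Assuming Python `str.format` semantics, rendering would look like:

```python
template = "Good morning! Today's weather: {result}"
result = "Tokyo: 18.5°C, clear sky"
message = template.format(result=result)
print(message)  # Good morning! Today's weather: Tokyo: 18.5°C, clear sky
```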
**LM Studio:**

```yaml
providers:
  - id: lmstudio-local
    type: lmstudio
    model: lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF
    base_url: http://localhost:1234/v1
```

**llama.cpp** (install the extra first):

```bash
pip install "copaw[llamacpp]"
```

```yaml
providers:
  - id: llamacpp-local
    type: llamacpp
    model_path: /path/to/model.gguf
```
Risky tool calls can be gated via `tool_guard` in the `agent` section of `config.yaml`:

```yaml
agent:
  tool_guard:
    enabled: true
    risk_patterns:
      - "rm -rf"
      - "DROP TABLE"
      - "os.system"
    auto_approve_low_risk: true
```
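The page does not spell out how `risk_patterns` are matched; a minimal reading is substring matching against the proposed tool call, sketched below with a hypothetical `is_risky` helper (not CoPaw's implementation):

```python
RISK_PATTERNS = ["rm -rf", "DROP TABLE", "os.system"]


def is_risky(tool_call: str) -> bool:
    """Flag a tool call if it contains any configured risk pattern."""
    return any(pattern in tool_call for pattern in RISK_PATTERNS)


print(is_risky("print('hello')"))            # False
print(is_risky("os.system('rm -rf /tmp')"))  # True
```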
---

Secrets in `config.yaml` can be written as `${VAR_NAME}` references, which are resolved from the environment when `copaw app` starts.
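Assuming the `${VAR_NAME}` syntax follows the usual shell-style expansion, Python's `string.Template` reproduces the behavior; this is an illustration of the substitution rule, not CoPaw's resolver:

```python
import os
from string import Template

os.environ["OPENAI_API_KEY"] = "sk-demo"  # set here for illustration only

raw = "api_key: ${OPENAI_API_KEY}"
resolved = Template(raw).substitute(os.environ)
print(resolved)  # api_key: sk-demo
```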
More skill examples. A daily-greeting skill (function name illustrative):

```python
import datetime


def morning_digest() -> str:
    today = datetime.date.today().strftime("%A, %B %d")
    # Add your own data sources here
    return f"Good morning! Today is {today}. Have a productive day!"
```

A file-preview skill (function name illustrative):

```python
import os


def read_file(file_path: str) -> str:
    path = os.path.expanduser(file_path)
    if not os.path.exists(path):
        return f"File not found: {path}"
    with open(path, "r", encoding="utf-8", errors="ignore") as f:
        content = f.read(8000)
    return f"File: {path}\nSize: {os.path.getsize(path)} bytes\nContent preview:\n{content}"
```
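Per the troubleshooting notes, skill modules must define `SKILL_NAME`, `SKILL_DESCRIPTION`, and `SKILL_SCHEMA`. A minimal hypothetical module putting the three together; the schema shape is assumed to be JSON-Schema-style, so check the shipped examples for the exact format:

```python
# ~/.copaw/workspace/skills/echo.py (hypothetical example)
SKILL_NAME = "echo"
SKILL_DESCRIPTION = "Repeat the given text back to the user."
SKILL_SCHEMA = {  # assumed JSON-Schema-style parameter spec
    "type": "object",
    "properties": {"text": {"type": "string"}},
    "required": ["text"],
}


def echo(text: str) -> str:
    return f"Echo: {text}"
```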
---

**Troubleshooting**

- Skill not loading: confirm the file is in `~/.copaw/workspace/skills/` and defines `SKILL_NAME`, `SKILL_DESCRIPTION`, and `SKILL_SCHEMA`; check `~/.copaw/workspace/logs/` for errors, then restart `copaw app`.
- Bot silent in group chats: check `mention_only: true` in `config.yaml` and confirm the bot has the Send Messages permission.
- If the model is not responding:
- For Ollama: confirm `ollama serve` is running and `base_url` matches
- For OpenAI-compatible APIs: verify `base_url` ends with `/v1`
- LLM calls auto-retry with exponential backoff — transient failures resolve automatically
Or set it in the environment:

```bash
export PYTHONIOENCODING=utf-8
```
---