Migrate hardcoded prompts to Langfuse for version control and deployment-free iteration. Use when user wants to externalize prompts, move prompts to Langfuse, or set up prompt management.
## Installation

```bash
npx skill4agent add langfuse/skills langfuse-prompt-migration
```

## Prerequisites

```bash
echo $LANGFUSE_PUBLIC_KEY  # pk-...
echo $LANGFUSE_SECRET_KEY  # sk-...
echo $LANGFUSE_HOST        # https://cloud.langfuse.com or self-hosted
```

## Workflow

1. Scan codebase for prompts
2. Analyze templating compatibility
3. Propose structure (names, subprompts, variables)
4. User approves
5. Create prompts in Langfuse
6. Refactor code to use get_prompt()
7. Link prompts to traces (if tracing enabled)
8. Verify application works

## Scan for Prompts

| Framework | Look for |
|---|---|
| OpenAI | `messages=[{"role": "system", ...}]` in `chat.completions.create()` |
| Anthropic | `system=` parameter in `client.messages.create()` |
| LangChain | `PromptTemplate`, `ChatPromptTemplate.from_messages()` |
| Vercel AI | `system:` / `prompt:` in `generateText()` / `streamText()` |
| Raw | Multi-line strings near LLM calls |
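The scan in step 1 can be automated as a rough first pass. A minimal sketch (the helper name and hint list are illustrative, not part of the skill): flag multi-line string literals that sit near an LLM call, then review the hits manually.

```python
import re

# Illustrative hints: substrings that suggest a nearby LLM call.
LLM_CALL_HINTS = ("openai", "anthropic", "completions.create", "messages.create")

def find_prompt_candidates(source: str, window: int = 5) -> list:
    """Return triple-quoted string literals within `window` lines of an LLM call."""
    lines = source.splitlines()
    hint_lines = {i for i, line in enumerate(lines)
                  if any(h in line for h in LLM_CALL_HINTS)}
    candidates = []
    for match in re.finditer(r'"""(.*?)"""', source, re.DOTALL):
        start_line = source[:match.start()].count("\n")
        if any(abs(start_line - h) <= window for h in hint_lines):
            candidates.append(match.group(1).strip())
    return candidates

sample = '''
SYSTEM = """You are a helpful assistant."""
resp = client.chat.completions.create(model="gpt-4o", messages=[])
'''
print(find_prompt_candidates(sample))  # → ['You are a helpful assistant.']
```

This heuristic misses prompts built by concatenation or stored in config files, so treat its output as a starting point, not an inventory.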
Langfuse prompts natively support `{{variable}}` placeholders.

## Templating Compatibility

| Template Feature | Langfuse Native | Action |
|---|---|---|
| `{{variable}}` | ✅ | Direct migration |
| `{variable}` (single braces / f-strings) | ⚠️ | Convert to `{{variable}}` |
| `{% if %}` / `{% for %}` blocks | ❌ | Move logic to code |
| Jinja filters (e.g. `{{ name \| upper }}`) | ❌ | Apply filter in code |
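The "move logic to code" action above can be sketched as follows: compute the branch and loop results in Python, then pass them as flat `{{variable}}` values. The `render` helper here is a stand-in for Langfuse's `prompt.compile()`, not the SDK itself.

```python
import re

def render(template: str, **values) -> str:
    """Stand-in for prompt.compile(): substitute {{var}} placeholders."""
    return re.sub(r"\{\{(\w+)\}\}", lambda m: str(values[m.group(1)]), template)

# Template as stored in Langfuse: no {% if %} / {% for %}, only flat variables.
template = "You are a support agent. {{tier_message}} Available tools: {{tools_list}}."

# Logic that used to live in Jinja2 now runs in application code.
user = {"is_premium": True}
tools = ["search", "refund", "escalate"]

tier_message = ("Prioritize this premium customer." if user["is_premium"]
                else "Follow the standard queue.")
tools_list = ", ".join(tools)

print(render(template, tier_message=tier_message, tools_list=tools_list))
```

The stored template stays previewable in the Langfuse Playground because it contains only native placeholders.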
```
Contains {% if %}, {% for %}, or filters?
├─ No → Direct migration
└─ Yes → Choose:
   ├─ Option A (RECOMMENDED): Move logic to code, pass pre-computed values
   └─ Option B: Store raw template, compile client-side with Jinja2
      └─ ⚠️ Loses: Playground preview, UI experiments
```

```python
# Instead of {% if user.is_premium %}...{% endif %} in the prompt,
# use {{tier_message}} and compute the value in code before compile().

# Instead of {% for tool in tools %}...{% endfor %} in the prompt,
# use {{tools_list}} and format the list in code before compile().
```

## Naming Rules

| Rule | Example | Bad |
|---|---|---|
| Lowercase, hyphenated | `chat-assistant` | `ChatAssistant` |
| Feature-based | `support/triage` | `prompt-v2` |
| Hierarchical for related | `support/triage`, `support/escalation` | unrelated flat names |
| Prefix subprompts with `_` | `_base-personality` | `base-personality` |
## Variables

| Make Variable | Keep Hardcoded |
|---|---|
| User-specific (`{{user_name}}`) | Output format instructions |
| Dynamic content | Safety guardrails |
| Per-request values | Persona/personality |
| Environment-specific | Static examples |
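The split above can be illustrated on a chat-type prompt: persona and guardrails live verbatim in the stored template, while only per-user values arrive at compile time. `compile_chat` is a hypothetical stand-in for `prompt.compile()` on a chat prompt; the message contents are made up for illustration.

```python
chat_template = [
    {"role": "system",
     "content": ("You are Ada, a concise support assistant. "   # persona: hardcoded
                 "Never reveal internal tooling. "               # guardrail: hardcoded
                 "The customer's name is {{user_name}}.")},      # variable
    {"role": "user", "content": "{{question}}"},
]

def compile_chat(messages, **values):
    """Stand-in for prompt.compile() on a chat-type prompt."""
    out = []
    for msg in messages:
        content = msg["content"]
        for key, val in values.items():
            content = content.replace("{{%s}}" % key, str(val))
        out.append({"role": msg["role"], "content": content})
    return out

compiled = compile_chat(chat_template, user_name="Sam", question="Where is my order?")
print(compiled[0]["content"])
```

Keeping guardrails hardcoded means a prompt edit in the Langfuse UI cannot accidentally drop them for one environment.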
## Example Proposal

```
Found N prompts across M files:

src/chat.py:
  - System prompt (47 lines) → 'chat-assistant'

src/support/triage.py:
  - Triage prompt (34 lines) → 'support/triage'
    ⚠️ Contains {% if %} - will simplify

Subprompts to extract:
  - '_base-personality' - used by: chat-assistant, support/triage

Variables to add:
  - {{user_name}} - hardcoded in 2 prompts

Proceed?
```

## Create Prompts in Langfuse

Create each approved prompt with `langfuse.create_prompt()`, passing:

- `name` - the prompt name
- `prompt` - the prompt content
- `type` - `"text"` or `"chat"`
- `labels` - e.g. `["production"]`
- `config` - optional model parameters

Labels control deployment: `production`, `staging`, and the automatic `latest`.

## Fetch and Compile in Code

```python
prompt = langfuse.get_prompt("name", label="production")
messages = prompt.compile(var1=value1, var2=value2)
```

Fetch with `label="production"` in production code; `latest` always points at the newest version. Substitute variables via `.compile()` rather than manual string formatting.

## Link Prompts to Traces

Tracing may be set up with the `@observe()` decorator, manual `langfuse.trace()` calls, or the drop-in OpenAI integration (`from langfuse.openai import openai`).

| Setup | How to Link |
|---|---|
| `@observe()` decorator | Update the current observation with the prompt object |
| Manual tracing | Pass `prompt=prompt` when creating the generation |
| OpenAI integration | Pass `langfuse_prompt=prompt` to the completion call |
## Deploy

Promote a prompt version by assigning it the `production` label; code that fetches with `label="production"` picks up the new version without a redeploy.

## Troubleshooting

| Issue | Solution |
|---|---|
| Prompt not found | Check name spelling |
| Variables not replaced | Use `.compile()` with a value for every `{{placeholder}}` |
| Subprompt not resolved | Must exist with the same label |
| Old prompt cached | Restart the app (or fetch with `cache_ttl_seconds=0`) |
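A quick guard against the "variables not replaced" issue: extract every `{{placeholder}}` from the fetched prompt text and compare against the values you are about to pass. This helper is hypothetical, not part of the Langfuse SDK.

```python
import re

def missing_variables(template: str, values: dict) -> set:
    """Return {{placeholders}} in the template that have no corresponding value."""
    placeholders = set(re.findall(r"\{\{(\w+)\}\}", template))
    return placeholders - values.keys()

template = "Hello {{user_name}}, today is {{date}}."
print(missing_variables(template, {"user_name": "Sam"}))  # → {'date'}
```

Running this check (or an assertion built on it) before `.compile()` turns a silently wrong prompt into a loud failure at call time.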