Found 3 Skills
Configure Ollama as the embedding provider for GrepAI. Use this skill for local, private embedding generation.
Auto-configure Ollama when the user needs local LLM deployment, free AI alternatives, or wants to eliminate hosted API costs. Trigger phrases: "install ollama", "local AI", "free LLM", "self-hosted AI", "replace OpenAI", "no API costs".
Configure a Mac mini as a reliable local LLM server with remote access, observability, and power-safe operation.
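As a hedged sketch of the kind of setup these skills automate — install Ollama, pull an embedding-capable model, and query its local HTTP API. This assumes Ollama's official install script and its documented `/api/embeddings` endpoint on the default port 11434; the model choice (`nomic-embed-text`) is one common option, not a requirement of any of the skills above.

```shell
# Install Ollama via the official install script (macOS/Linux)
curl -fsSL https://ollama.com/install.sh | sh

# Pull an embedding-capable model (nomic-embed-text is a common choice)
ollama pull nomic-embed-text

# Request an embedding from Ollama's local HTTP API (default port 11434)
curl http://localhost:11434/api/embeddings \
  -d '{"model": "nomic-embed-text", "prompt": "hello world"}'
```

The last command returns a JSON object with an `embedding` array, which a tool such as GrepAI can consume in place of a hosted embedding API.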