Use when implementing ANY Apple Intelligence or on-device AI feature. Covers Foundation Models, @Generable, LanguageModelSession, structured output, Tool protocol, iOS 26 AI integration.
Install: `npx skill4agent add charleswiltgen/axiom axiom-ai`

| Developer Intent | Route To |
|---|---|
| On-device text generation (Apple Intelligence) | Stay here → Foundation Models skills |
| Custom ML model deployment (PyTorch, TensorFlow) | See skills/ios-ml.md → CoreML conversion, compression |
| Computer vision (image analysis, OCR, segmentation) | /skill axiom-vision → Vision framework |
| Cloud API integration (OpenAI, etc.) | /skill axiom-networking → URLSession patterns |
| System AI features (Writing Tools, Genmoji) | No custom code needed — these are system-provided |
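Before routing to the Foundation Models skills, it is worth gating on model availability. A minimal sketch, assuming the iOS 26 `FoundationModels` API (`SystemLanguageModel.default.availability`); verify case names against the current SDK:

```swift
import FoundationModels

// Sketch: decide whether the on-device Foundation Models route is
// usable on this device right now.
func canUseOnDeviceModel() -> Bool {
    switch SystemLanguageModel.default.availability {
    case .available:
        // Safe to create a LanguageModelSession.
        return true
    case .unavailable(let reason):
        // e.g. device not eligible, Apple Intelligence not enabled,
        // or the model is still downloading — take another route.
        print("Foundation Models unavailable: \(reason)")
        return false
    @unknown default:
        return false
    }
}
```

Checking availability up front keeps the routing decision explicit: if the model is unavailable, fall back to a cloud API route or a non-AI code path rather than failing at request time.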
Foundation Models resources:

- skills/foundation-models.md
- skills/foundation-models-ref.md
- skills/foundation-models-diag.md
- foundation-models-auditor (`/axiom:audit foundation-models`)

| Thought | Reality |
|---|---|
| "Foundation Models is just LanguageModelSession" | Foundation Models also includes @Generable, the Tool protocol, streaming, and guardrails. The foundation-models skill covers all of them. |
| "I'll figure out the AI patterns as I go" | AI APIs have specific error-handling and fallback requirements. The foundation-models skill prevents runtime failures. |
| "I've used LLMs before, this is similar" | Apple's on-device models have unique constraints (guardrails, context limits). The foundation-models skill is Apple-specific. |
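To illustrate why "just LanguageModelSession" undersells the framework, here is a sketch of structured output via @Generable, assuming the iOS 26 `respond(to:generating:)` API. `TripIdea` and its fields are hypothetical names for illustration:

```swift
import FoundationModels

// @Generable lets the model produce a typed Swift value directly;
// @Guide steers each property's generation.
@Generable
struct TripIdea {
    @Guide(description: "A short, catchy trip title")
    var title: String

    @Guide(description: "Three activities for the trip")
    var activities: [String]
}

func generateTripIdea() async throws -> TripIdea {
    let session = LanguageModelSession(
        instructions: "You suggest concise travel ideas."
    )
    // Constrained generation: the output is decoded into TripIdea,
    // so there is no manual JSON parsing or validation step.
    let response = try await session.respond(
        to: "Suggest a weekend trip near the coast.",
        generating: TripIdea.self
    )
    return response.content
}
```

This is the pattern the table's first row points at: structured output is a framework feature, not something to hand-roll on top of raw text responses.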
Referenced skills and tools:

- skills/foundation-models.md
- skills/foundation-models-diag.md
- skills/foundation-models-ref.md
- axiom-concurrency
- foundation-models-auditor
- skills/ios-ml.md
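The error-handling and guardrail constraints called out in the table above can be sketched as follows. Case names follow my understanding of the iOS 26 `LanguageModelSession.GenerationError` type and should be verified against the current SDK:

```swift
import FoundationModels

// Sketch: handle the failure modes unique to on-device models —
// safety guardrails and a bounded context window.
func summarize(_ text: String) async -> String? {
    let session = LanguageModelSession()
    do {
        let response = try await session.respond(
            to: "Summarize in one sentence: \(text)"
        )
        return response.content
    } catch let error as LanguageModelSession.GenerationError {
        switch error {
        case .guardrailViolation:
            // Safety guardrails rejected the prompt or the output;
            // surface a graceful fallback rather than retrying blindly.
            return nil
        case .exceededContextWindowSize:
            // Input too long for the on-device context limit; a real
            // app would chunk the input and summarize incrementally.
            return nil
        default:
            return nil
        }
    } catch {
        return nil
    }
}
```

These two cases are exactly what the "I've used LLMs before" row warns about: both failures are routine on-device and must be handled, not treated as exceptional.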