Expert in photorealistic and stylized VR avatar systems for Apple Vision Pro, Meta Quest, and cross-platform metaverse. Specializes in facial tracking (52+ blend shapes), subsurface scattering, Persona-style generation, Photon networking, and real-time LOD. Activate on 'VR avatar', 'Vision Pro Persona', 'Meta avatar', 'facial tracking', 'blend shapes', 'avatar networking', 'photorealistic avatar'. NOT for 2D profile pictures (use image generation), non-VR game characters (use game engine tools), static 3D models (use modeling tools), or motion capture hardware setup.
```shell
npx skill4agent add erichowens/some_claude_skills vr-avatar-engineer
```

| MCP | Purpose |
|---|---|
| Stability AI | Generate avatar concept art, texture references |
| Firecrawl | Research Meta/Apple SDKs, avatar papers |
| WebFetch | Fetch ARKit, Meta SDK documentation |
| Topic | Novice | Expert |
|---|---|---|
| Blend shapes | "Just use morph targets" | Knows ARKit has 52 specific shapes; Meta has different set; mapping required |
| Skin rendering | "Just use PBR" | SSS is essential; different models for different skin tones |
| Eye tracking | "Point eyes at target" | Saccades, microsaccades, blink patterns make presence |
| Networking | "Send all data every frame" | Delta compression, interpolation, dead reckoning |
| Frame rate | "60fps is fine" | Quest: 72/90/120 Hz modes; Vision Pro: 90 Hz minimum; dropped frames = nausea |
| LOD | "Lower poly for distance" | Foveated rendering integration, dynamic LOD based on gaze |
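The networking row above ("delta compression, interpolation, dead reckoning") is the least obvious one. A minimal snapshot-interpolation sketch in Python — names, the buffer shape, and the render delay are illustrative, not tied to any SDK:

```python
# Minimal snapshot interpolation for remote avatars (illustrative only):
# render the remote avatar slightly in the past and blend between the two
# received snapshots that bracket the render time, holding the last value
# when the buffer runs dry (the crudest form of dead reckoning).

from dataclasses import dataclass

@dataclass
class Snapshot:
    time: float          # sender timestamp, seconds
    jaw_open: float      # one blend-shape weight, 0..1

def interpolate(snapshots: list[Snapshot], render_time: float) -> float:
    """Linear blend between the snapshots bracketing render_time."""
    for a, b in zip(snapshots, snapshots[1:]):
        if a.time <= render_time <= b.time:
            t = (render_time - a.time) / (b.time - a.time)
            return a.jaw_open + t * (b.jaw_open - a.jaw_open)
    return snapshots[-1].jaw_open  # past the buffer: hold last value

buf = [Snapshot(0.00, 0.0), Snapshot(0.05, 0.4), Snapshot(0.10, 1.0)]
print(interpolate(buf, 0.025))  # halfway between the first two → 0.2
```

The render delay (commonly ~100 ms) trades latency for smoothness: a bigger buffer survives more jitter but makes the remote avatar lag further behind.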
```swift
// ARKit face tracking to blend shape weights
import ARKit

func mapARKitToAvatar(faceAnchor: ARFaceAnchor) -> [String: Float] {
    let arkit = faceAnchor.blendShapes
    var weights: [String: Float] = [:]

    // Direct mappings (ARKit names → avatar shapes)
    weights["jawOpen"] = arkit[.jawOpen]?.floatValue ?? 0
    weights["mouthSmileLeft"] = arkit[.mouthSmileLeft]?.floatValue ?? 0
    weights["mouthSmileRight"] = arkit[.mouthSmileRight]?.floatValue ?? 0
    weights["eyeBlinkLeft"] = arkit[.eyeBlinkLeft]?.floatValue ?? 0
    weights["eyeBlinkRight"] = arkit[.eyeBlinkRight]?.floatValue ?? 0

    // Derived expressions (combinations)
    let smile = ((weights["mouthSmileLeft"] ?? 0) + (weights["mouthSmileRight"] ?? 0)) / 2
    weights["expression_happy"] = smile
    return weights
}
```
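The ARKit names in the mapping above don't transfer directly to other rigs; the usual bridge is a table-driven remap. A minimal Python sketch of the idea — the target-rig names here are hypothetical placeholders, not Meta's actual identifiers:

```python
# Table-driven remap from ARKit blend-shape names to another avatar rig's
# names. Target names ("JawDrop", "SmileL", ...) are ILLUSTRATIVE placeholders;
# the point is the lookup table plus silently dropping unmapped shapes.

ARKIT_TO_RIG = {
    "jawOpen": "JawDrop",
    "mouthSmileLeft": "SmileL",
    "mouthSmileRight": "SmileR",
    "eyeBlinkLeft": "BlinkL",
    "eyeBlinkRight": "BlinkR",
}

def remap(arkit_weights: dict[str, float]) -> dict[str, float]:
    """Translate ARKit weight names to the target rig, dropping unmapped shapes."""
    return {
        ARKIT_TO_RIG[name]: w
        for name, w in arkit_weights.items()
        if name in ARKIT_TO_RIG
    }

print(remap({"jawOpen": 0.7, "browInnerUp": 0.3}))  # {'JawDrop': 0.7}
```

In practice the table also needs per-shape gain/offset terms, since the same named shape often has a different visual range on each rig.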
```csharp
// Photon Fusion - efficient avatar sync (INetworkStruct is a Fusion type;
// 'fixed' buffers require an unsafe struct)
public unsafe struct AvatarState : INetworkStruct
{
    // Pack position: 3 floats → 6 bytes (half precision)
    public Half3 Position;
    // Pack rotation: quaternion → 4 bytes (compressed)
    public CompressedQuaternion Rotation;
    // Blend shapes: 52 weights → 52 bytes (uint8 each, 0-255 → 0-1)
    public fixed byte BlendShapes[52];
    // Eye gaze: 2 directions × 4 bytes = 8 bytes
    public Half2 LeftEyeGaze;
    public Half2 RightEyeGaze;
    // Total: ~70 bytes per update (vs 400+ uncompressed)
}
```
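The one-byte-per-weight packing in the struct above is easy to sanity-check: worst-case round-trip error is 1/510 ≈ 0.002, well below anything visible on a face. A language-agnostic Python sketch:

```python
# Quantizing 0..1 blend-shape weights to one byte each, as in the avatar
# state struct: 52 floats (208 bytes) shrink to 52 bytes, and the worst-case
# round-trip error is half a quantization step, 1/510.

def quantize(w: float) -> int:
    """Clamp to [0, 1] and map to 0..255."""
    return round(max(0.0, min(1.0, w)) * 255)

def dequantize(b: int) -> float:
    return b / 255

weights = [0.0, 0.25, 0.5, 1.0]
packed = bytes(quantize(w) for w in weights)   # 4 bytes instead of 16
restored = [dequantize(b) for b in packed]
assert all(abs(a - b) < 1 / 255 for a, b in zip(weights, restored))
print(len(packed))  # → 4
```

Delta compression stacks on top of this: most frames, only a handful of the 52 bytes change, so sending (index, value) pairs for changed shapes cuts the payload further.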
```hlsl
// Simplified SSS for real-time (pre-integrated lookup texture)
sampler2D _SSSLookup;

float3 SubsurfaceScattering(float3 normal, float3 light, float3 view,
                            float curvature, float3 skinColor, float melanin)
{
    float NdotL = dot(normal, light);
    // Wrap lighting for a soft terminator
    float wrap = 0.5;
    float diffuse = saturate((NdotL + wrap) / (1 + wrap));
    // Pre-integrated scattering lookup (curvature-based)
    float2 sssUV = float2(NdotL * 0.5 + 0.5, curvature);
    float3 sss = tex2D(_SSSLookup, sssUV).rgb;
    // Melanin affects scattering color
    float3 scatterColor = lerp(float3(1, 0.4, 0.25), float3(0.8, 0.5, 0.4), melanin);
    return skinColor * diffuse + sss * scatterColor * (1 - diffuse);
}
```

| Platform | Frame Rate | Avatar Poly Budget | Texture Budget |
|---|---|---|---|
| Quest 2 | 72-90 fps | 10-15k tris | 512×512 |
| Quest 3 | 90-120 fps | 20-30k tris | 1024×1024 |
| Vision Pro | 90 fps | 50-100k tris | 2048×2048 |
| PCVR | 90-144 fps | 100k+ tris | 4096×4096 |
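The texture budgets in the table are easier to reason about in bytes. A quick Python sketch, assuming uncompressed RGBA8 against 4×4 ASTC block compression (1 byte per pixel; mip chains add roughly a third on top of either):

```python
# Rough texture-memory check for the per-platform budgets above:
# RGBA8 stores 4 bytes per pixel; ASTC 4x4 stores a 16-byte block per
# 4x4 pixels (1 byte per pixel), a 4x reduction before mips.

def rgba8_bytes(size: int) -> int:
    return size * size * 4

def astc4x4_bytes(size: int) -> int:
    return (size // 4) * (size // 4) * 16

for size in (512, 1024, 2048):
    raw = rgba8_bytes(size)
    astc = astc4x4_bytes(size)
    print(f"{size}: {raw // 1024} KB raw, {astc // 1024} KB ASTC")
```

At the Vision Pro budget (2048×2048), that is 16 MB raw vs 4 MB compressed per texture, which is why block compression is non-negotiable on standalone headsets.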