Build, debug, and optimize RealityKit scenes for visionOS, including entity/component setup, rendering, animation, physics, audio, input, attachments, and custom systems. Use when implementing RealityKit features or troubleshooting ECS behavior on visionOS.
Install:

```shell
npx skill4agent add tomkrikorian/visionosagents realitykit-visionos-developer
```

Key APIs: `Component`, `Codable`, `Component.registerComponent()`, `RealityView`, `Entity(named:)`, `Entity(contentsOf:)`, `ViewAttachmentComponent`, `System`, `EntityQuery`, `QueryPredicate`, `update(context:)`, `SystemDependency`

| Component | When to Use |
|---|---|
| `ModelComponent` | When rendering 3D geometry with meshes and materials on entities. |
| `ModelSortGroupComponent` | When experiencing depth fighting (z-fighting) with overlapping geometry, or when you need to control draw order. |
| `OpacityComponent` | When creating fade effects, making entities semi-transparent, or implementing visibility transitions. |
| | When optimizing performance in large scenes by reducing render quality for distant objects. |
| `ModelDebugOptionsComponent` | When debugging rendering issues, visualizing model geometry, or inspecting bounding boxes during development. |
| `MeshInstancesComponent` | When rendering many copies of the same mesh efficiently (trees, crowds, particle-like objects). |
| `BlendShapeWeightsComponent` | When implementing facial animation, character expressions, or morphing mesh deformations. |
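As a minimal sketch (not from the original skill), a `ModelComponent` supplies an entity's geometry and materials, and an `OpacityComponent` fades it:

```swift
import RealityKit

// Sketch: a box whose geometry comes from ModelComponent
// and whose 50% transparency comes from OpacityComponent.
let box = Entity()
box.components.set(ModelComponent(
    mesh: .generateBox(size: 0.2),
    materials: [SimpleMaterial(color: .red, isMetallic: false)]
))
box.components.set(OpacityComponent(opacity: 0.5))
```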
| Component | When to Use |
|---|---|
| `InputTargetComponent` | When making entities interactive (tappable, draggable) or handling user input events. |
| `ManipulationComponent` | When implementing built-in drag, rotate, and scale interactions with hand gestures or trackpad. |
| `GestureComponent` | When implementing custom gesture recognition beyond what ManipulationComponent provides. |
| `HoverEffectComponent` | When providing visual feedback when users look at or hover over interactive entities. |
| `AccessibilityComponent` | When making entities accessible to screen readers, VoiceOver, or other assistive technologies. |
| `BillboardComponent` | When creating 2D sprites, text labels, or UI elements that should always face the viewer. |
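A sketch of the typical input setup: a `CollisionComponent` is required so the system can hit-test the entity, an `InputTargetComponent` makes it receive input, and a `HoverEffectComponent` highlights it on gaze:

```swift
import RealityKit

// Sketch: an entity that receives input and highlights on hover.
let button = ModelEntity(mesh: .generateSphere(radius: 0.05),
                         materials: [SimpleMaterial(color: .blue, isMetallic: false)])
button.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.05)]))
button.components.set(InputTargetComponent())
button.components.set(HoverEffectComponent())
```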
| Component | When to Use |
|---|---|
| `AnchoringComponent` | When anchoring virtual content to detected planes, tracked images, hand locations, or world targets. |
| `ARKitAnchorComponent` | When accessing the underlying ARKit anchor data for an anchored entity. |
| `SceneUnderstandingComponent` | When accessing scene understanding data such as detected objects or room reconstruction. |
| `DockingRegionComponent` | When defining regions where content can automatically dock or snap into place. |
| | When implementing lazy loading of external entity assets or referencing entities in other files. |
| | When attaching an entity's transform to another entity for hierarchical positioning. |
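A minimal anchoring sketch: `AnchorEntity` wraps an `AnchoringComponent` and places its children relative to a detected surface:

```swift
import RealityKit

// Sketch: anchor a small box to any horizontal, table-classified plane
// at least 30 cm on each side.
let anchor = AnchorEntity(.plane(.horizontal,
                                 classification: .table,
                                 minimumBounds: [0.3, 0.3]))
anchor.addChild(ModelEntity(mesh: .generateBox(size: 0.1)))
// Add `anchor` to a RealityView's content to activate it.
```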
| Component | When to Use |
|---|---|
| `PerspectiveCameraComponent` | When configuring a perspective camera with depth and field of view for 3D scenes. |
| `OrthographicCameraComponent` | When configuring an orthographic camera without perspective distortion for 2D-like views. |
| `ProjectiveTransformCameraComponent` | When implementing custom camera projection transforms for specialized rendering needs. |
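A hedged sketch of configuring a perspective camera; note that in visionOS immersive spaces the system renders from the user's viewpoint, so explicit cameras apply to non-immersive or offscreen rendering paths:

```swift
import RealityKit

// Sketch: a camera entity with a 60-degree field of view.
let cameraEntity = Entity()
var camera = PerspectiveCameraComponent()
camera.fieldOfViewInDegrees = 60
cameraEntity.components.set(camera)
```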
| Component | When to Use |
|---|---|
| `PointLightComponent` | When adding an omnidirectional point light that radiates in all directions from a position. |
| `DirectionalLightComponent` | When adding a directional light with parallel rays (like sunlight) for consistent scene lighting. |
| `SpotLightComponent` | When adding a cone-shaped spotlight for focused, directional lighting effects. |
| `ImageBasedLightComponent` | When applying environment lighting from HDR textures for realistic reflections and ambient lighting. |
| `ImageBasedLightReceiverComponent` | When enabling entities to receive and respond to image-based lighting in the scene. |
| `GroundingShadowComponent` | When adding grounding shadows to visually anchor floating content to surfaces. |
| `DynamicLightShadowComponent` | When enabling real-time dynamic shadows cast by light sources onto entities. |
| `EnvironmentLightingConfigurationComponent` | When configuring environment lighting behavior, intensity, or blending modes. |
| `VirtualEnvironmentProbeComponent` | When implementing reflection probes for accurate reflections in virtual environments. |
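A sketch of image-based lighting: an `ImageBasedLightComponent` holds the environment, and a receiver component points the lit entity at it. `"Sunlight"` is a hypothetical `EnvironmentResource` name in the app bundle:

```swift
import RealityKit

// Sketch: light an entity with an HDR environment resource.
func applyLighting(to model: Entity) async {
    guard let environment = try? await EnvironmentResource(named: "Sunlight") else { return }
    let light = Entity()
    light.components.set(ImageBasedLightComponent(source: .single(environment)))
    // The receiver references the entity that carries the light.
    model.components.set(ImageBasedLightReceiverComponent(imageBasedLight: light))
    model.addChild(light)
}
```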
| Component | When to Use |
|---|---|
| `SpatialAudioComponent` | When playing 3D positioned audio that changes based on listener position and orientation. |
| `AmbientAudioComponent` | When playing non-directional ambient audio that doesn't change with listener position. |
| `ChannelAudioComponent` | When playing channel-based audio content (stereo, surround, etc.) without spatialization. |
| `AudioLibraryComponent` | When storing and managing multiple audio resources for reuse across entities. |
| `ReverbComponent` | When applying reverb effects to an entity's audio for spatial acoustic simulation. |
| `AudioMixGroupsComponent` | When grouping audio sources for centralized mixing control and volume management. |
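A minimal spatial-audio sketch; `"chime.wav"` is a hypothetical audio file in the main bundle:

```swift
import RealityKit

// Sketch: play a bundled sound spatially from an entity's position.
func playChime(from emitter: Entity) async {
    emitter.components.set(SpatialAudioComponent(gain: -6)) // gain in decibels
    guard let sound = try? await AudioFileResource(named: "chime.wav") else { return }
    emitter.playAudio(sound)
}
```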
| Component | When to Use |
|---|---|
| `AnimationLibraryComponent` | When storing multiple animations (idle, walk, run) on a single entity for character animation. |
| `CharacterControllerComponent` | When implementing character movement with physics, collision, and ground detection. |
| `CharacterControllerStateComponent` | When storing runtime state (velocity, grounded status) for a character controller. |
| `SkeletalPosesComponent` | When providing skeletal pose data for skeletal animation and bone transformations. |
| `IKComponent` | When implementing inverse kinematics for procedural animation (e.g., reaching, pointing). |
| | When integrating ARKit body tracking data to animate entities based on real-world body poses. |
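A sketch of playing a named clip from an entity's `AnimationLibraryComponent`; `"idle"` is a hypothetical clip name authored in Reality Composer Pro:

```swift
import RealityKit

// Sketch: look up a clip by name and loop it on the entity.
func playIdle(on character: Entity) {
    guard let library = character.components[AnimationLibraryComponent.self],
          let idle = library.animations["idle"] else { return }
    character.playAnimation(idle.repeat())
}
```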
| Component | When to Use |
|---|---|
| `CollisionComponent` | When defining collision shapes for hit testing, raycasting, or physics interactions. |
| `PhysicsBodyComponent` | When adding physical behavior (mass, gravity, forces) to entities for physics simulation. |
| `PhysicsMotionComponent` | When controlling linear and angular velocity of physics bodies programmatically. |
| `PhysicsSimulationComponent` | When configuring global physics simulation parameters like gravity or timestep. |
| `ParticleEmitterComponent` | When emitting particle effects (smoke, sparks, debris) from an entity position. |
| `ForceEffectComponent` | When applying force fields (gravity wells, explosions) that affect multiple physics bodies. |
| | When creating joints (hinges, springs) between physics bodies for articulated structures. |
| | When defining geometric attachment points for connecting entities at specific locations. |
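A minimal rigid-body sketch: a `CollisionComponent` supplies the shape and a dynamic `PhysicsBodyComponent` makes the entity fall under gravity and collide:

```swift
import RealityKit

// Sketch: a dynamic rigid body.
let ball = ModelEntity(mesh: .generateSphere(radius: 0.05),
                       materials: [SimpleMaterial(color: .white, isMetallic: false)])
ball.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.05)]))
ball.components.set(PhysicsBodyComponent(massProperties: .default,
                                         material: .default,
                                         mode: .dynamic))
```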
| Component | When to Use |
|---|---|
| `PortalComponent` | When creating portals that render a separate world or scene through an opening. |
| `WorldComponent` | When designating an entity hierarchy as a separate renderable world for portal rendering. |
| `PortalCrossingComponent` | When controlling behavior (teleportation, scene switching) when entities cross portal boundaries. |
| `EnvironmentBlendingComponent` | When blending virtual content with the real environment for mixed reality experiences. |
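A sketch of the portal pattern: a `WorldComponent` marks content that renders only through a portal, and a `PortalComponent` on a `PortalMaterial` surface opens the window into it:

```swift
import RealityKit

// Sketch: a separate world seen through a circular opening.
let world = Entity()
world.components.set(WorldComponent()) // children render only through a portal
// ... add skybox and scenery as children of `world` ...

let portal = Entity()
portal.components.set(ModelComponent(
    mesh: .generatePlane(width: 0.5, depth: 0.5, cornerRadius: 0.25),
    materials: [PortalMaterial()]
))
portal.components.set(PortalComponent(target: world))
// Add both `world` and `portal` to the RealityView content.
```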
| Component | When to Use |
|---|---|
| `ViewAttachmentComponent` | When embedding SwiftUI views into 3D space for interactive UI elements or labels. |
| `PresentationComponent` | When presenting SwiftUI modals, sheets, or system UI from an entity interaction. |
| `TextComponent` | When rendering 3D text directly on entities without using SwiftUI views. |
| `ImagePresentationComponent` | When displaying images or textures on entities in 3D space. |
| `VideoPlayerComponent` | When playing video content on entity surfaces using AVPlayer. |
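A sketch of video playback on an entity via `VideoPlayerComponent`; `"intro.mp4"` is a hypothetical file in the main bundle:

```swift
import AVFoundation
import RealityKit

// Sketch: drive an entity's surface from an AVPlayer.
if let url = Bundle.main.url(forResource: "intro", withExtension: "mp4") {
    let player = AVPlayer(url: url)
    let screen = Entity()
    screen.components.set(VideoPlayerComponent(avPlayer: player))
    player.play()
}
```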
| Component | When to Use |
|---|---|
| `SynchronizationComponent` | When synchronizing entity state, transforms, and components across networked multiplayer sessions. |
| `TransientComponent` | When marking entities as temporary, non-persistent, and excluded from network synchronization. |
| System/API | When to Use |
|---|---|
| `System` / `Component` protocols | When creating custom systems for continuous, per-frame behavior or custom components for per-entity state. |
Load an entity asynchronously inside a `RealityView`:

```swift
RealityView { content in
    do {
        let entity = try await Entity(named: "Scene")
        content.add(entity)
    } catch {
        print("Failed to load entity: \(error)")
    }
}
```

Make an entity manipulable with the built-in drag/rotate/scale gestures (the `CollisionComponent` is required for hit testing):

```swift
let entity = ModelEntity(mesh: .generateBox(size: 0.1))
entity.components.set(CollisionComponent(shapes: [.generateBox(size: [0.1, 0.1, 0.1])]))
entity.components.set(InputTargetComponent())
entity.components.set(ManipulationComponent())
```

Define a custom component and system, then register both before any scene uses them:

```swift
import RealityKit

// Per-entity state: rotation speed in radians per second.
struct SpinComponent: Component, Codable {
    var speed: Float
}

// Rotates every entity that has a SpinComponent, once per frame.
struct SpinSystem: System {
    static let query = EntityQuery(where: .has(SpinComponent.self))

    init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
            guard let spin = entity.components[SpinComponent.self] else { continue }
            entity.transform.rotation *= simd_quatf(
                angle: spin.speed * Float(context.deltaTime),
                axis: [0, 1, 0]
            )
        }
    }
}

// Register once, early in the app's lifecycle.
SpinComponent.registerComponent()
SpinSystem.registerSystem()
```