Expert in drone systems, computer vision, and autonomous navigation. Specializes in flight control, SLAM, object detection, sensor fusion, and path planning. Activate on "drone", "UAV", "SLAM", "visual odometry", "PID control", "MAVLink", "Pixhawk", "path planning", "A*", "RRT", "EKF", "sensor fusion", "optical flow", "ByteTrack". NOT for domain-specific inspection tasks like fire detection, roof damage assessment, or thermal analysis (use drone-inspection-specialist), GPU shader optimization (use metal-shader-expert), or general image classification without drone context (use clip-aware-embeddings).
npx skill4agent add erichowens/some_claude_skills drone-cv-expert

**Routing decision tree:**

User mentions drones or UAVs?
├─ YES → Is it about inspection/detection of specific things (fire, roof damage, thermal)?
│ ├─ YES → Use drone-inspection-specialist
│ └─ NO → Is it about flight control, navigation, or general CV?
│ ├─ YES → Use THIS SKILL (drone-cv-expert)
│ └─ NO → Is it about GPU rendering/shaders?
│ ├─ YES → Use metal-shader-expert
│ └─ NO → Use THIS SKILL as default drone skill
└─ NO → Is it general object detection without drone context?
    ├─ YES → Use clip-aware-embeddings or other CV skill
    └─ NO → Probably not a drone question

**Camera settings by altitude:**

| Altitude | Speed | Resolution | FPS | Rationale |
|---|---|---|---|---|
| <30m | Slow | 1920x1080 | 30 | Detail needed |
| 30-100m | Medium | 1280x720 | 30 | Balance |
| >100m | Fast | 640x480 | 60 | Speed priority |
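One way to apply the table above in code is a small altitude-keyed lookup. This is an illustrative sketch (the function name, thresholds, and returned keys are assumptions, not part of any flight stack):

```python
def camera_profile(altitude_m: float) -> dict:
    """Pick capture settings from altitude, following the table above.

    Purely illustrative; tune the thresholds for your airframe and camera.
    """
    if altitude_m < 30:
        # Low altitude: detail matters, full HD at standard frame rate
        return {"resolution": (1920, 1080), "fps": 30, "speed": "slow"}
    elif altitude_m <= 100:
        # Mid altitude: balance detail against processing load
        return {"resolution": (1280, 720), "fps": 30, "speed": "medium"}
    else:
        # High altitude: prioritize frame rate over resolution
        return {"resolution": (640, 480), "fps": 60, "speed": "fast"}
```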
**Processing pipeline (one thread per stage):**

Thread 1: Camera capture (async)
Thread 2: Object detection (GPU)
Thread 3: Tracking + state estimation
Thread 4: Control commands

**Detection model selection:**

| Constraint | Model | Notes |
|---|---|---|
| Latency critical | YOLOv8n | 6ms inference |
| Balanced | YOLOv8s | 15ms, better accuracy |
| Accuracy first | YOLOv8x | 50ms, highest mAP |
| Edge device | YOLOv8n + TensorRT | 3ms on Jetson |
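The capture → detection → tracking → control threading listed above can be sketched with bounded queues, so a slow detector drops frames instead of stalling capture. Stage bodies here are hypothetical placeholders, not a real detector or controller:

```python
import queue
import threading

def run_pipeline(frames):
    """Minimal sketch of the multi-thread pipeline; stage logic is stubbed."""
    detect_q = queue.Queue(maxsize=2)   # capture -> detection
    track_q = queue.Queue(maxsize=2)    # detection -> tracking/control
    commands = []

    def capture():
        for frame in frames:            # Thread 1: camera capture
            try:
                detect_q.put(frame, timeout=0.1)
            except queue.Full:
                pass                    # drop frame if the detector is behind
        detect_q.put(None)              # sentinel: end of stream

    def detect():
        while (frame := detect_q.get()) is not None:   # Thread 2: detection
            # Placeholder box; a real stage would run the detector here
            track_q.put({"frame": frame, "boxes": [(0, 0, 10, 10)]})
        track_q.put(None)

    def track_and_control():
        while (det := track_q.get()) is not None:      # Threads 3 + 4
            commands.append(("goto", det["boxes"][0]))

    threads = [threading.Thread(target=f) for f in (capture, detect, track_and_control)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return commands
```

The bounded `maxsize` is the key design choice: back-pressure is absorbed by dropping stale frames at the capture stage, which is usually preferable to acting on old imagery.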
**Classical vs. deep learning approaches:**

| Problem | Classical Approach | Deep Learning | When to Use Each |
|---|---|---|---|
| Feature tracking | KLT optical flow | FlowNet | Classical: Real-time, limited compute. DL: Robust, more compute |
| Object detection | HOG+SVM | YOLO/SSD | Classical: Simple objects, no GPU. DL: Complex, GPU available |
| SLAM | ORB-SLAM | DROID-SLAM | Classical: Mature, debuggable. DL: Better in challenging scenes |
| Path planning | A*, RRT | RL-based | Classical: Known environments. DL: Complex, dynamic |
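For the path-planning row, a minimal A* on a 4-connected occupancy grid shows the classical side of the comparison. This is a stdlib-only sketch with a Manhattan heuristic; a real planner would add costs for altitude, turn rate, and no-fly zones:

```python
import heapq

def astar(grid, start, goal):
    """A* over a 2D occupancy grid (1 = obstacle).

    Returns the path as a list of (row, col) cells, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), start)]
    came_from, g = {}, {start: 0}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:
            path = [cur]                    # walk parents back to start
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and not grid[nxt[0]][nxt[1]]:
                ng = g[cur] + 1             # uniform step cost
                if ng < g.get(nxt, float("inf")):
                    g[nxt], came_from[nxt] = ng, cur
                    heapq.heappush(open_set, (ng + h(nxt), nxt))
    return None
```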
**Key MAVLink messages:**

| Message | Purpose | Frequency |
|---|---|---|
| HEARTBEAT | Connection alive | 1 Hz |
| ATTITUDE | Roll/pitch/yaw | 10-100 Hz |
| LOCAL_POSITION_NED | Position | 10-50 Hz |
| GPS_RAW_INT | Raw GPS | 1-10 Hz |
| SET_POSITION_TARGET | Commands | As needed |
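A telemetry loop has to honor per-message rates like those above. The sketch below is a plain-Python rate scheduler, not pymavlink; the message names only echo the table, and the exact rates are illustrative:

```python
RATES_HZ = {"HEARTBEAT": 1, "ATTITUDE": 50, "LOCAL_POSITION_NED": 30, "GPS_RAW_INT": 5}

class TelemetryScheduler:
    """Track, per message type, when its send period has elapsed."""

    def __init__(self, rates_hz):
        self.period = {name: 1.0 / hz for name, hz in rates_hz.items()}
        self.last_sent = {name: float("-inf") for name in rates_hz}

    def due(self, now_s):
        """Return messages due at time now_s (seconds) and mark them sent."""
        out = []
        for name, period in self.period.items():
            if now_s - self.last_sent[name] >= period:
                self.last_sent[name] = now_s
                out.append(name)
        return out
```

Driving this from the main loop keeps high-rate messages like ATTITUDE flowing without flooding the link with slow ones like HEARTBEAT.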
**EKF tuning intuition:**

| Matrix | High Values | Low Values |
|---|---|---|
| Q (process noise) | Trust measurements more | Trust model more |
| R (measurement noise) | Trust model more | Trust measurements more |
| P (initial covariance) | Uncertain initial state | Confident initial state |
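The Q/R trade-off in the table is easiest to see in a scalar Kalman filter. This toy filter assumes a constant-state model (`x_k = x_{k-1}`), which is a deliberate simplification of the full EKF:

```python
def kalman_1d(measurements, q, r, x0=0.0, p0=1.0):
    """Scalar Kalman filter with a constant-state process model.

    Large q (or small r) makes the estimate chase measurements;
    small q (or large r) makes it trust the model and smooth them.
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                 # predict: process noise inflates covariance
        k = p / (p + r)           # Kalman gain: how much to trust z
        x = x + k * (z - x)       # update toward the measurement
        p = (1 - k) * p
        estimates.append(x)
    return estimates
```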
**Coordinate frames:**

| Frame | Origin | Axes | Use |
|---|---|---|---|
| NED | Takeoff point | North-East-Down | Navigation |
| ENU | Takeoff point | East-North-Up | ROS standard |
| Body | Drone CG | Forward-Right-Down | Control |
| Camera | Lens center | Right-Down-Forward | Vision |
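Converting between the NED and ENU frames above is an axis swap plus a sign flip on the vertical; note that rotations need the matching permutation applied to the rotation matrix, not just the vector:

```python
def ned_to_enu(ned):
    """Map a (north, east, down) vector to (east, north, up)."""
    n, e, d = ned
    return (e, n, -d)

def enu_to_ned(enu):
    """Map a (east, north, up) vector to (north, east, down)."""
    e, n, u = enu
    return (n, e, -u)
```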
References:
- references/navigation-algorithms.md
- references/sensor-fusion-ekf.md
- references/object-detection-tracking.md

**Simulation tools:**

| Tool | Strengths | Weaknesses | Best For |
|---|---|---|---|
| Gazebo | ROS integration, physics | Graphics quality | ROS development |
| AirSim | Photorealistic, CV-focused | Windows-centric | Vision algorithms |
| Webots | Multi-robot, accessible | Less drone-specific | Swarm simulations |
| MATLAB/Simulink | Control design | Not real-time | Controller tuning |