remotion-expert


Remotion Expert - High-Performance Video Generation

Senior Specialist in Remotion v4.0+, React 19, and Next.js 16. Expert in programmatic video generation, sub-frame animation precision, and AI-driven video workflows for 2026.

🧭 Overview

Remotion allows you to create videos programmatically using React. This skill expands the LLM's capabilities to handle complex animations, dynamic data-driven videos, and high-fidelity rendering pipelines.

Core Capabilities

- Programmatic Animation: Frame-perfect control via `useCurrentFrame` and `interpolate`.
- Dynamic Compositions: Parameterized videos that adapt to external data.
- Modern Stack: Fully optimized for React 19.3, Next.js 16.2, and Tailwind CSS 4.0.
- AI Orchestration: Integration with Remotion Skills for instruction-driven video editing.



⚡ Quick Start

Scaffold a new project using Bun (recommended for 2026):

```bash
bun create video@latest my-video
cd my-video
bun start
```

Basic Composition Pattern

```tsx
import { AbsoluteFill, interpolate, useCurrentFrame, useVideoConfig } from 'remotion';

export const MyVideo = () => {
  const frame = useCurrentFrame();
  const { fps, durationInFrames } = useVideoConfig();

  // Animate from 0 to 1 over the first second
  const opacity = interpolate(frame, [0, fps], [0, 1], {
    extrapolateRight: 'clamp',
  });

  return (
    <AbsoluteFill style={{
      backgroundColor: 'white',
      justifyContent: 'center',
      alignItems: 'center'
    }}>
      <h1 style={{ opacity, fontSize: 100 }}>Remotion 2026</h1>
    </AbsoluteFill>
  );
};
```


🛡️ Mandatory Rules & Anti-Patterns

1. NO CSS ANIMATIONS: Never use standard CSS `@keyframes` or `transition`. They are not deterministic and will fail during rendering. Use `interpolate()` or `spring()`.
2. Deterministic Logic: Ensure all calculations are derived from `frame`. Avoid `Math.random()` or `Date.now()` inside components unless seeded.
3. Zod Validation: Always use Zod for `defaultProps` to ensure type safety in parameterized videos.
4. Asset Preloading: Use `staticFile()` for local assets and ensure remote assets are reachable during render.


🧠 Core Concepts

1. Frame-Based Animation

Everything is a function of the current frame.

```tsx
import { Easing, interpolate, useCurrentFrame } from 'remotion';

const frame = useCurrentFrame();
const scale = interpolate(frame, [0, 20], [0, 1], { easing: Easing.bezier(0.25, 0.1, 0.25, 1) });
```
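Conceptually, a two-point `interpolate()` call is just a linear map from the input range to the output range (clamped when an extrapolate option is set to `'clamp'`). A minimal pure-TypeScript sketch of that idea — illustrative only, not Remotion's actual implementation:

```typescript
// Clamped linear map from [inMin, inMax] to [outMin, outMax].
function lerpClamped(
  value: number,
  [inMin, inMax]: [number, number],
  [outMin, outMax]: [number, number],
): number {
  const t = Math.min(1, Math.max(0, (value - inMin) / (inMax - inMin)));
  return outMin + t * (outMax - outMin);
}

console.log(lerpClamped(10, [0, 20], [0, 1])); // 0.5
console.log(lerpClamped(30, [0, 20], [0, 1])); // 1 (clamped)
```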

2. Composition Architecture

Compositions are the "entry points". They define the canvas.

```tsx
<Composition
  id="Main"
  component={MyComponent}
  durationInFrames={300}
  fps={60}
  width={1920}
  height={1080}
  defaultProps={{ title: 'Hello' }}
/>
```
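Rule 3 (Zod validation) plugs in at this level. A sketch of the pattern, assuming the standard `zod` package and Remotion v4's `schema` prop on `<Composition>` (with the same `MyComponent` as above):

```tsx
import { z } from 'zod';
import { Composition } from 'remotion';

export const mainSchema = z.object({
  title: z.string(),
});

// Remotion validates defaultProps (and Studio edits) against the schema,
// and the component's props are typed as z.infer<typeof mainSchema>.
<Composition
  id="Main"
  component={MyComponent}
  schema={mainSchema}
  durationInFrames={300}
  fps={60}
  width={1920}
  height={1080}
  defaultProps={{ title: 'Hello' }}
/>
```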


🚀 Advanced Patterns

AI-Driven Video Modification (2026)

Integration with "Remotion Skills" allows natural-language instructions to modify compositions.

```tsx
// Pattern: Instruction-driven prop updates
export const aiUpdateHandler = async (instruction: string, currentProps: Props) => {
  // Logic to map LLM output to Remotion props
  return updatedProps;
};
```

Dynamic Metadata Calculation

Fetch data before the composition renders to set duration or dimensions.

```tsx
export const calculateMetadata = async ({ props }) => {
  const response = await fetch(`https://api.v2.com/video-data/${props.id}`);
  const data = await response.json();
  return {
    // data.duration is in seconds; this composition runs at 60 fps
    durationInFrames: data.duration * 60,
    props: { ...props, content: data.content }
  };
};
```
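The `* 60` above hardcodes the composition's frame rate. A small fps-aware helper (illustrative; the name is not a Remotion API) keeps the conversion correct if the frame rate changes, and guarantees a valid positive integer frame count:

```typescript
// Convert a duration in seconds to a whole number of frames.
// durationInFrames must be a positive integer, so round up.
function secondsToFrames(seconds: number, fps: number): number {
  return Math.max(1, Math.ceil(seconds * fps));
}

console.log(secondsToFrames(5, 60));   // 300
console.log(secondsToFrames(2.5, 30)); // 75
```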


🚫 The "Do Not" List (Common Mistakes)

- DO NOT use `setTimeout` or `setInterval`. They do not sync with the renderer.
- DO NOT use `npm` for 2026 workflows; prefer `bun` for sub-second install and execution.
- DO NOT forget to use `<Sequence>` for delaying elements. Manual frame offsets are error-prone.
- DO NOT use Tailwind 3.x patterns; leverage Tailwind 4.0 native container queries for responsive video layouts.
- DO NOT use `useState` for animation progress. Animation state must always be derived from `frame`.
- DO NOT perform heavy computations inside the render loop without `useMemo`. Remember that the component renders every frame.
- DO NOT use external libraries that rely on `window.requestAnimationFrame`. They won't be captured by the Remotion renderer.
- DO NOT hardcode frame counts. Always use constants or relative calculations like `2 * fps`.


📚 References

- Animations & Timing - Precision interpolation and springs.
- Compositions & Props - Structuring complex video projects.
- Media & Assets - Handling Video, Audio, and Lottie.
- Sequencing & Series - Timeline orchestration.
- Next.js Integration - SSR and Server Actions for Video.

Updated: January 22, 2026 - 20:00