Upload, sync, and manage files across cloud storage providers using rclone. Use when uploading files (images, videos, documents) to S3, Cloudflare R2, Backblaze B2, Google Drive, Dropbox, or any S3-compatible storage. Triggers on "upload to S3", "sync to cloud", "rclone", "backup files", "upload video/image to bucket", or requests to transfer files to remote storage.
```shell
npx skill4agent add everyinc/compound-engineering-plugin rclone
```

```shell
# Check if rclone is installed
command -v rclone >/dev/null 2>&1 && echo "rclone installed: $(rclone version | head -1)" || echo "NOT INSTALLED"

# List configured remotes
rclone listremotes 2>/dev/null || echo "NO REMOTES CONFIGURED"
```

```shell
# macOS
brew install rclone

# Linux (script install)
curl https://rclone.org/install.sh | sudo bash

# Or via package manager
sudo apt install rclone   # Debian/Ubuntu
sudo dnf install rclone   # Fedora
```

To configure a remote interactively:

```shell
rclone config
```

| Provider | Type | Key Settings |
|---|---|---|
| AWS S3 | `s3` | access_key_id, secret_access_key, region |
| Cloudflare R2 | `s3` | access_key_id, secret_access_key, endpoint (account_id.r2.cloudflarestorage.com) |
| Backblaze B2 | `b2` | account (keyID), key (applicationKey) |
| DigitalOcean Spaces | `s3` | access_key_id, secret_access_key, endpoint (region.digitaloceanspaces.com) |
| Google Drive | `drive` | OAuth flow (opens browser) |
| Dropbox | `dropbox` | OAuth flow (opens browser) |
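Remotes can also be defined non-interactively. As a sketch, an entry for a Cloudflare R2 remote in rclone's config file (`rclone config file` prints its location), with placeholder credentials:

```ini
[r2]
type = s3
provider = Cloudflare
access_key_id = YOUR_ACCESS_KEY
secret_access_key = YOUR_SECRET_KEY
endpoint = ACCOUNT_ID.r2.cloudflarestorage.com
acl = private
```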
```shell
# Cloudflare R2
rclone config create r2 s3 \
  provider=Cloudflare \
  access_key_id=YOUR_ACCESS_KEY \
  secret_access_key=YOUR_SECRET_KEY \
  endpoint=ACCOUNT_ID.r2.cloudflarestorage.com \
  acl=private

# AWS S3
rclone config create aws s3 \
  provider=AWS \
  access_key_id=YOUR_ACCESS_KEY \
  secret_access_key=YOUR_SECRET_KEY \
  region=us-east-1
```

```shell
# Upload a single file
rclone copy /path/to/file.mp4 remote:bucket/path/ --progress

# Upload a folder
rclone copy /path/to/folder remote:bucket/folder/ --progress

# Sync (makes destination match source; deletes extra files on the remote)
rclone sync /local/path remote:bucket/path/ --progress

# List remote contents
rclone ls remote:bucket/
rclone lsd remote:bucket/   # directories only

# Preview an operation without transferring
rclone copy /path remote:bucket/ --dry-run
```

| Flag | Purpose |
|---|---|
| `--progress` (`-P`) | Show transfer progress |
| `--dry-run` | Preview without transferring |
| `-v` / `-vv` | Verbose output |
| `--transfers N` | Parallel transfers (default 4) |
| `--bwlimit RATE` | Bandwidth limit (e.g., `10M`) |
| `--checksum` | Compare by checksum, not size/time |
| `--exclude PATTERN` | Exclude patterns |
| `--include PATTERN` | Include only matching files |
| `--min-size SIZE` | Skip files smaller than SIZE |
| `--max-size SIZE` | Skip files larger than SIZE |
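As an illustration of combining these flags (the remote name, bucket, and patterns below are placeholders): sync a folder with 8 parallel transfers, a 10M bandwidth cap, and an exclude rule, previewing first:

```shell
rclone sync /local/media remote:bucket/media \
  --progress \
  --transfers 8 \
  --bwlimit 10M \
  --exclude "*.tmp" \
  --dry-run
```

Drop `--dry-run` to perform the transfer once the preview looks right.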
```shell
# S3 multipart upload (automatic for >200MB)
rclone copy large_video.mp4 remote:bucket/ --s3-chunk-size=64M --progress

# Resume interrupted transfers
rclone copy /path remote:bucket/ --progress --retries=5
```

```shell
# Check that source and destination match
rclone check /local/path remote:bucket/path

# Get file info (size, mod time, path)
rclone lsl remote:bucket/path/to/file
```

```shell
# Test connection
rclone lsd remote:

# Debug connection issues
rclone lsd remote: -vv

# Check config
rclone config show remote
```
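`--retries` only retries within a single rclone run; if the whole process is killed (laptop sleep, dropped network), re-invoking the same command also effectively resumes, since `rclone copy` skips files that already match on the remote. A minimal re-invocation wrapper, as a sketch (the attempt count and fixed 1-second pause are arbitrary choices):

```shell
#!/bin/sh
# retry N CMD...: run CMD up to N times, pausing between attempts.
retry() {
  attempts=$1; shift
  i=1
  while :; do
    "$@" && return 0                      # success: stop retrying
    [ "$i" -ge "$attempts" ] && return 1  # out of attempts: give up
    i=$((i + 1))
    sleep 1                               # pause before the next try
  done
}

# Usage (placeholder paths): retry an upload up to 5 times
# retry 5 rclone copy /path/to/file.mp4 remote:bucket/path/ --progress
```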