# Surf APIs

Unified pay-per-use API at `https://surf.cascade.fyi`. No signup, no API keys, just pay per call.

10 composite REST endpoints + 9 MCP tools. Payments in USDC on Solana, Base, or Tempo (MPP).
| Surface | Endpoints | Price |
|---|---|---|
| Twitter | search, tweet, user | $0.005/call |
| Reddit | search, post, subreddit, user | $0.005/call |
| Web | search | $0.01/call |
| Web | crawl | $0.002/call |
| Inference | 15 models | from $0.001 (flat or dynamic) |
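Because every call has a fixed (or lower-bounded) price, you can budget a workload up front. A minimal sketch using the per-call prices from the table above (batch crawls of up to 20 URLs count as one $0.002 call):

```typescript
// Per-call prices from the table above, in USD.
const PRICE_USD = {
  twitterSearch: 0.005,
  redditSearch: 0.005,
  webSearch: 0.01,
  webCrawl: 0.002,
};

// Example budget: 3 Twitter searches, 1 web search, 2 batch crawls.
const total =
  3 * PRICE_USD.twitterSearch +
  1 * PRICE_USD.webSearch +
  2 * PRICE_USD.webCrawl;

console.log(total.toFixed(3)); // "0.029"
```

Dynamic inference models are priced per request based on input size and `max_tokens`, so treat their listed prices as floors when budgeting.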
## Try it

Test any endpoint with `x402-proxy`:
```bash
# Twitter user profile + recent tweets ($0.005)
npx x402-proxy https://surf.cascade.fyi/api/v1/twitter/user/cascade_fyi

# Search Reddit ($0.005)
npx x402-proxy -X POST -H "Content-Type: application/json" \
  -d '{"query":"x402 protocol"}' \
  https://surf.cascade.fyi/api/v1/reddit/search

# Chat with Kimi K2.5 ($0.004)
npx x402-proxy -X POST -H "Content-Type: application/json" \
  -d '{"model":"moonshotai/kimi-k2.5","messages":[{"role":"user","content":"Hello"}]}' \
  https://surf.cascade.fyi/api/v1/inference/completions

# Web search ($0.01)
npx x402-proxy -X POST -H "Content-Type: application/json" \
  -d '{"query":"x402 protocol"}' \
  https://surf.cascade.fyi/api/v1/web/search

# Crawl a page ($0.002)
npx x402-proxy -X POST -H "Content-Type: application/json" \
  -d '{"url":"https://example.com"}' \
  https://surf.cascade.fyi/api/v1/web/crawl
```
First run walks you through wallet setup automatically.
## MCP Server

Unified MCP server at `https://surf.cascade.fyi/mcp` with 9 tools. Supports tool filtering via the `tools` query param.
Add to Claude Code:
```bash
npx x402-proxy mcp add surf https://surf.cascade.fyi/mcp
```
Or add a filtered subset:
```bash
npx x402-proxy mcp add surf "https://surf.cascade.fyi/mcp?tools=surf_twitter_search,surf_web_crawl"
```
## Endpoint Reference

All data endpoints support both `POST` (JSON body) and `GET` (path/query params). OpenAPI spec at `https://surf.cascade.fyi/openapi.json`.
GET convenience routes include `/api/v1/twitter/user/:ref`, `/api/v1/twitter/tweet/:ref`, and `/api/v1/reddit/subreddit/:name`.
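A GET convenience route and its POST counterpart carry the same information, just in different places. A sketch of the two equivalent request shapes for the Twitter user endpoint (base URL from this doc; the `ref` body field matches the pagination example later on):

```typescript
const BASE = "https://surf.cascade.fyi/api/v1";

// GET form: the ref goes in the path.
function twitterUserGet(ref: string): string {
  return `${BASE}/twitter/user/${encodeURIComponent(ref)}`;
}

// POST form: the ref goes in the JSON body.
function twitterUserPost(ref: string): {
  url: string;
  init: { method: string; headers: Record<string, string>; body: string };
} {
  return {
    url: `${BASE}/twitter/user`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ ref }),
    },
  };
}

console.log(twitterUserGet("cascade_fyi"));
// https://surf.cascade.fyi/api/v1/twitter/user/cascade_fyi
```

Either form can be passed to a payment-wrapped fetch such as the `fetchX402` built in the integration section below.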
### Twitter

All $0.005/call. MCP tools use the same params and return the same data.

**POST `/api/v1/twitter/search`** | MCP: `surf_twitter_search`
Search tweets with 50+ advanced operators. Returns ~20 tweets per page with engagement summary.
| Param | Type | Required | Default | Description |
|---|---|---|---|---|
| `query` | string | yes | | Search query (e.g. `from:elonmusk AI min_faves:100`) |
| | enum | no | | Sort order |
| `cursor` | string | no | | Pagination cursor |
| | | no | | Only tweets on or after this date |
| | | no | | Only tweets before this date |
Search operators include `from:`, `min_faves:`, and 50+ more.
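Operator queries are just space-joined tokens, so they compose cleanly in code. A small sketch, assuming the search body takes a `query` field as in the Reddit and web search examples elsewhere in this doc:

```typescript
// Compose a Twitter search query from advanced operators.
// Operator syntax follows the from:/min_faves: example above.
function buildQuery(parts: string[]): string {
  return parts.filter(Boolean).join(" ");
}

const query = buildQuery(["from:elonmusk", "AI", "min_faves:100"]);
const body = JSON.stringify({ query });

console.log(query); // "from:elonmusk AI min_faves:100"
```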
**POST `/api/v1/twitter/tweet`**
Fetch a tweet with full thread context (all conversation participants), parent tweet, and optionally replies/quotes.
| Param | Type | Required | Default | Description |
|---|---|---|---|---|
| `ref` | string | yes | | Tweet ID or URL |
| | array | no | | replies, quotes, or both |
**POST `/api/v1/twitter/user`**
Fetch user profile with ~20 recent tweets per page.
| Param | Type | Required | Default | Description |
|---|---|---|---|---|
| `ref` | string | yes | | Username or @username |
| | boolean | no | | Include replies in timeline |
| | boolean | no | | Include mentions timeline |
| `cursor` | string | no | | Pagination cursor |
Enriched fields: thread context (full conversation up to 20 tweets), engagement_rate, content_type (original/reply/quote/retweet/media/link_share), topic extraction (hashtags, domains, mentions), auto-crawled article content from URLs in tweets.
### Reddit

All $0.005/call. MCP tools use the same params and return the same data.

**POST `/api/v1/reddit/search`**
Search posts across Reddit with sort and time filters.
| Param | Type | Required | Default | Description |
|---|---|---|---|---|
| `query` | string | yes | | Search query |
| | enum | no | | Sort order |
| | enum | no | | Time range |
| | integer | no | | Max results (1-100) |
| `cursor` | string | no | | Pagination cursor |
| | | no | | Only posts on or after this date |
| | | no | | Only posts before this date |
**POST `/api/v1/reddit/post`**
Fetch a post with comments, depth/sort control.
| Param | Type | Required | Default | Description |
|---|---|---|---|---|
| | string | yes | | Post ID or Reddit URL |
| | enum | no | | Comment sort |
| | integer | no | | Max comments (0-200) |
| | integer | no | | Max nesting depth (0-10) |
**POST `/api/v1/reddit/subreddit`**
Fetch subreddit info and top posts.
| Param | Type | Required | Default | Description |
|---|---|---|---|---|
| `name` | string | yes | | Subreddit name |
| | enum | no | | Post sort |
| | enum | no | | Time range |
| | integer | no | | Max posts (1-100) |
**POST `/api/v1/reddit/user`**
Fetch a user profile with recent posts and comments.
| Param | Type | Required | Default | Description |
|---|---|---|---|---|
| | string | yes | | Reddit username |
| | boolean | no | | Include recent posts |
| | boolean | no | | Include recent comments |
| | integer | no | | Max posts/comments (1-100) |
Enriched fields: domain, stickied, locked, edited, distinguished, awards, crosspost_parent, comment link context.
### Web

**POST `/api/v1/web/search`** | $0.01/call

Semantic web search powered by Exa. Returns titles, URLs, snippets, and highlights.
| Param | Type | Required | Default | Description |
|---|---|---|---|---|
| `query` | string | yes | | Search query |
| | enum | no | | Search depth |
| | string[] | no | | Restrict to specific domains |
| | string[] | no | | Exclude specific domains |
| | ISO string | no | | Only results published after this date |
| | ISO string | no | | Only results published before this date |
| | enum | no | | Result category |
**POST `/api/v1/web/crawl`** | MCP: `surf_web_crawl` | $0.002/call

Extract content from web pages as markdown, HTML, or text. Supports PDF extraction.
| Param | Type | Required | Default | Description |
|---|---|---|---|---|
| `url` | string | one of url/urls | | Single URL to crawl |
| `urls` | string[] | one of url/urls | | Multiple URLs (max 20, one payment covers all) |
| | enum | no | | Output format (markdown, HTML, or text) |
| | string | no | | CSS selector for targeted extraction |
| | boolean | no | | Use proxy for blocked sites |
Enriched fields (search): published_date, author, score, query-relevant highlights, autoprompt query rewriting.
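Since one payment covers up to 20 URLs, it is worth building the crawl body as a batch. A sketch of the request body using the `urls` field from the table above:

```typescript
// Batch-crawl several pages in one paid $0.002 call via the urls array.
const urls = [
  "https://example.com",
  "https://example.org",
  "https://example.net",
];
if (urls.length > 20) throw new Error("max 20 URLs per call");

// POST body for https://surf.cascade.fyi/api/v1/web/crawl
const body = JSON.stringify({ urls });
console.log(body);
```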
## Inference

**POST `/api/v1/inference/completions`** | OpenAI-compatible chat completion

Model list at `GET /api/v1/inference/models`.
| Param | Type | Required | Description |
|---|---|---|---|
| `model` | string | yes | Model identifier (see table below) |
| `messages` | array | yes | Chat messages with `role` and `content` |
| `stream` | boolean | no | Enable SSE streaming |
| `max_tokens` | integer | no | Max tokens to generate (affects dynamic pricing) |
| | integer | no | Preferred over max_tokens for Anthropic models |
| `temperature` | number | no | Sampling temperature (0-2) |
| `top_p` | number | no | Nucleus sampling |
| `tools` | array | no | Tool/function definitions |
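Because dynamic models price by input size and `max_tokens`, setting `max_tokens` explicitly caps the cost of a call. A minimal request body sketch, using the flat-priced model and the param names from the table above:

```typescript
// Minimal OpenAI-style completion body; max_tokens bounds dynamic pricing.
const body = {
  model: "qwen/qwen-2.5-7b-instruct",
  messages: [{ role: "user", content: "Summarize x402 in one sentence." }],
  max_tokens: 128,
};

console.log(JSON.stringify(body));
```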
Models (15):

| Model | Price | Notes |
|---|---|---|
| `qwen/qwen-2.5-7b-instruct` | $0.001 flat | Lightweight, fast utility |
| `moonshotai/kimi-k2.5` | $0.004 flat | Strong reasoning, code, long context |
| | from $0.006 | Dynamic - fast general-purpose, 196K context |
| | from $0.007 | Dynamic - best-in-class tool calling, 2M context |
| | from $0.012 | Dynamic - MoE 230B/10B active, strong coding |
| | from $0.030 | Dynamic - strongest open-weight coding model |
| | from $0.032 | Dynamic - xAI flagship, lowest hallucination rate |
| `x-ai/grok-4.20-multi-agent-beta` | from $0.064 | Dynamic - multi-agent (4-16 parallel agents) |
| `anthropic/claude-sonnet-4.5` | from $0.10 | Dynamic - varies by token usage |
| `anthropic/claude-sonnet-4.6` | from $0.10 | Dynamic - varies by token usage |
| `anthropic/claude-opus-4.5` | from $0.17 | Dynamic - varies by token usage |
| `anthropic/claude-opus-4.6` | from $0.17 | Dynamic - varies by token usage |
| `x-ai/grok-4.1-fast:online` | from $0.007 | grok-4.1-fast + live X/Twitter & web search |
| `x-ai/grok-4.20-beta:online` | from $0.037 | grok-4.20-beta + live X/Twitter & web search |
| `x-ai/grok-4.20-multi-agent-beta:online` | from $0.074 | multi-agent + live X/Twitter & web search |
Flat models charge a fixed price per request. Dynamic models price based on input size, max_tokens, and model rates. The `:online` variants include live X/Twitter + web search via xAI native tools.

Streaming: Set `stream: true` for SSE. Parse `data: ` lines, stop on `data: [DONE]`.
```typescript
const res = await fetchX402("https://surf.cascade.fyi/api/v1/inference/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "moonshotai/kimi-k2.5",
    messages: [{ role: "user", content: "Write a haiku" }],
    stream: true,
  }),
});

const reader = res.body!.getReader();
const decoder = new TextDecoder();
let buffer = ""; // SSE events can split across chunks, so buffer partial lines
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  buffer += decoder.decode(value, { stream: true });
  const lines = buffer.split("\n");
  buffer = lines.pop() ?? ""; // keep the trailing partial line for the next chunk
  for (const line of lines) {
    if (line.startsWith("data: ") && line !== "data: [DONE]") {
      const data = JSON.parse(line.slice(6));
      process.stdout.write(data.choices[0]?.delta?.content ?? "");
    }
  }
}
```
Rate limits: 20 requests per 60 seconds per wallet. Duplicate payment headers are rejected.
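When you hit the per-wallet limit, back off and retry rather than burning payments. A hypothetical retry helper; it assumes rate-limited responses use HTTP 429 (not confirmed by this doc, so adjust if the API signals limits differently):

```typescript
// Retry a fetch on HTTP 429 with exponential backoff (1s, 2s, 4s, ...).
// doFetch should perform the (payment-wrapped) request and return a Response.
async function fetchWithRetry(
  doFetch: () => Promise<Response>,
  maxRetries = 3,
  baseDelayMs = 1000,
): Promise<Response> {
  for (let attempt = 0; ; attempt++) {
    const res = await doFetch();
    if (res.status !== 429 || attempt >= maxRetries) return res;
    await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
  }
}
```

Keeping well under 20 requests per 60 seconds per wallet avoids triggering the limit at all; the backoff is a safety net for bursts.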
## Integrate with @x402/fetch

```bash
npm install @x402/fetch @x402/evm @x402/svm
```
### Solana wallet

```typescript
import { x402Client, wrapFetchWithPayment } from "@x402/fetch";
import { registerExactSvmScheme } from "@x402/svm/exact/client";
import { createKeyPairSignerFromBytes } from "@solana/kit";
import { base58 } from "@scure/base";

const svmSigner = await createKeyPairSignerFromBytes(base58.decode(process.env.SVM_PRIVATE_KEY!));
const client = new x402Client();
registerExactSvmScheme(client, { signer: svmSigner });

const fetchX402 = wrapFetchWithPayment(fetch, client);
```
### Base (EVM) wallet

```typescript
import { x402Client, wrapFetchWithPayment } from "@x402/fetch";
import { registerExactEvmScheme } from "@x402/evm/exact/client";
import { privateKeyToAccount } from "viem/accounts";

const signer = privateKeyToAccount(process.env.EVM_PRIVATE_KEY as `0x${string}`);
const client = new x402Client();
registerExactEvmScheme(client, { signer });

const fetchX402 = wrapFetchWithPayment(fetch, client);
```
Then use like normal fetch:
```typescript
const res = await fetchX402("https://surf.cascade.fyi/api/v1/twitter/user/cascade_fyi");
const { data } = await res.json();
```
## Debugging

```bash
npx x402-proxy https://surf.cascade.fyi/api/v1/twitter/user/cascade_fyi | jq '.data'
npx x402-proxy wallet          # addresses and balances
npx x402-proxy wallet history  # payment log
```
## Pagination

Paginated endpoints return `meta.next_cursor` and `meta.has_next_page`. Pass the cursor back as `cursor` in the next request:
```typescript
let cursor: string | undefined;
const allTweets = [];
do {
  const res = await fetchX402("https://surf.cascade.fyi/api/v1/twitter/user", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ ref: "cascade_fyi", cursor }),
  });
  const { data, meta } = await res.json();
  allTweets.push(...data.tweets);
  cursor = meta?.has_next_page ? meta.next_cursor : undefined;
} while (cursor);
```
## Tips

- Use markdown output for LLM-friendly web crawl results
- Batch URLs with the `urls` array to crawl multiple pages in one paid request
- The inference API is OpenAI-compatible, so existing code works by changing the base URL
- Set the max-comments parameter to 0 to fetch a Reddit post without comments (faster)
- Quote URLs containing `?` or `&` in shell commands to avoid shell glob expansion