bknd-bulk-operations

Bulk Operations

Perform large-scale insert, update, and delete operations efficiently using Bknd's bulk APIs.

Prerequisites

  • Bknd project running (local or deployed)
  • Entity exists (use bknd-create-entity first)
  • SDK configured or API endpoint known

When to Use UI Mode

UI mode is not recommended for bulk operations. Use the Admin Panel only for:
  • Importing small CSV files (<100 records)
  • Manual data cleanup
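
For CSV imports that outgrow the Admin Panel, a small parser can feed createMany directly. This is a minimal sketch, not a full CSV implementation: it assumes a header row, comma separators, and no quoted fields, and coerces numeric-looking values so fields like price and stock insert as numbers.

```typescript
// Naive CSV parser: first line is the header, no quoted fields.
// Numeric-looking values are coerced so they insert as numbers.
function csvToRecords(csv: string): Record<string, unknown>[] {
  const [header, ...rows] = csv.trim().split("\n");
  const keys = header.split(",").map((k) => k.trim());
  return rows.map((row) => {
    const values = row.split(",").map((v) => v.trim());
    return Object.fromEntries(
      keys.map((key, i) => {
        const v = values[i] ?? "";
        const n = Number(v);
        return [key, v !== "" && !Number.isNaN(n) ? n : v];
      })
    );
  });
}

// Usage sketch:
// const records = csvToRecords(await file.text());
// await api.data.createMany("products", records);
```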

When to Use Code Mode

  • Migrating data from another system
  • Seeding large datasets
  • Batch updates (publish all drafts, archive old records)
  • Mass deletion (cleanup, GDPR requests)
  • ETL pipelines

Code Approach

Bulk Insert: createMany

```typescript
import { Api } from "bknd";

const api = new Api({ host: "http://localhost:7654" });

// Insert multiple records in one call
const { ok, data, error } = await api.data.createMany("products", [
  { name: "Product A", price: 10, stock: 100 },
  { name: "Product B", price: 20, stock: 50 },
  { name: "Product C", price: 15, stock: 75 },
]);

if (ok) {
  console.log(`Created ${data.length} products`);
  // data contains array of created records with IDs
}
```

Bulk Update: updateMany

```typescript
// Update all records matching where clause
const { ok, data } = await api.data.updateMany(
  "posts",
  { status: { $eq: "draft" } },     // where clause (required)
  { status: "archived" }             // update data
);

// Archive old posts
await api.data.updateMany(
  "posts",
  {
    status: { $eq: "published" },
    created_at: { $lt: "2024-01-01" },
  },
  { status: "archived" }
);

// Feature a specific set of posts by ID
await api.data.updateMany(
  "posts",
  { id: { $in: [1, 2, 3, 4, 5] } },
  { featured: true }
);
```

Bulk Delete: deleteMany

```typescript
// Delete all records matching where clause
const { ok, data } = await api.data.deleteMany("logs", {
  created_at: { $lt: "2023-01-01" },  // Delete old logs
});

// Delete by IDs
await api.data.deleteMany("temp_files", {
  id: { $in: [10, 11, 12, 13] },
});

// Delete archived items
await api.data.deleteMany("posts", {
  status: { $eq: "archived" },
  deleted_at: { $isnull: false },
});
```

Warning: the where clause is required; this prevents an accidental delete-all.

Chunked Processing

For large datasets, process in chunks to avoid timeouts and memory issues:

Basic Chunking

```typescript
async function bulkInsertChunked(
  api: Api,
  entity: string,
  items: object[],
  chunkSize = 100
): Promise<object[]> {
  const results: object[] = [];

  for (let i = 0; i < items.length; i += chunkSize) {
    const chunk = items.slice(i, i + chunkSize);
    const { ok, data, error } = await api.data.createMany(entity, chunk);

    if (!ok) {
      throw new Error(`Chunk ${i / chunkSize + 1} failed: ${error.message}`);
    }

    results.push(...data);
  }

  return results;
}

// Usage
const products = generateProducts(5000);  // Large dataset
const created = await bulkInsertChunked(api, "products", products);
console.log(`Created ${created.length} products`);
```

With Progress Callback

```typescript
type ProgressCallback = (done: number, total: number, chunk: number) => void;

async function bulkInsertWithProgress(
  api: Api,
  entity: string,
  items: object[],
  onProgress?: ProgressCallback,
  chunkSize = 100
): Promise<{ success: object[]; failed: object[] }> {
  const success: object[] = [];
  const failed: object[] = [];

  for (let i = 0; i < items.length; i += chunkSize) {
    const chunkNum = Math.floor(i / chunkSize) + 1;
    const chunk = items.slice(i, i + chunkSize);

    const { ok, data, error } = await api.data.createMany(entity, chunk);

    if (ok) {
      success.push(...data);
    } else {
      failed.push(...chunk);
      console.warn(`Chunk ${chunkNum} failed:`, error.message);
    }

    onProgress?.(Math.min(i + chunkSize, items.length), items.length, chunkNum);
  }

  return { success, failed };
}

// Usage with progress
await bulkInsertWithProgress(
  api,
  "products",
  products,
  (done, total, chunk) => {
    const percent = Math.round((done / total) * 100);
    console.log(`Progress: ${percent}% (chunk ${chunk})`);
  }
);
```

Parallel Chunk Processing

Process multiple chunks concurrently (use with caution):
```typescript
async function bulkInsertParallel(
  api: Api,
  entity: string,
  items: object[],
  chunkSize = 100,
  concurrency = 3
): Promise<object[]> {
  const chunks: object[][] = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    chunks.push(items.slice(i, i + chunkSize));
  }

  const results: object[] = [];

  // Process in batches of concurrent requests
  for (let i = 0; i < chunks.length; i += concurrency) {
    const batch = chunks.slice(i, i + concurrency);
    const promises = batch.map((chunk) =>
      api.data.createMany(entity, chunk)
    );

    const responses = await Promise.all(promises);
    for (const { ok, data } of responses) {
      if (ok) results.push(...data);
    }
  }

  return results;
}
```

REST API Approach

Bulk Insert

```bash
curl -X POST http://localhost:7654/api/data/products \
  -H "Content-Type: application/json" \
  -d '[
    {"name": "Product A", "price": 10},
    {"name": "Product B", "price": 20}
  ]'
```

Bulk Update

```bash
curl -X PATCH http://localhost:7654/api/data/posts \
  -H "Content-Type: application/json" \
  -d '{
    "where": {"status": {"$eq": "draft"}},
    "data": {"status": "archived"}
  }'
```

Bulk Delete

```bash
curl -X DELETE http://localhost:7654/api/data/logs \
  -H "Content-Type: application/json" \
  -d '{"where": {"created_at": {"$lt": "2023-01-01"}}}'
```

Server-Side Seeding

For initial data population, use the seed function:
```typescript
import { App, em, entity, text, number } from "bknd";

const schema = em({
  products: entity("products", {
    name: text().required(),
    price: number().required(),
    stock: number({ default_value: 0 }),
  }),
});

new App({
  ...schema,
  options: {
    seed: async (ctx) => {
      // Check if already seeded
      const { data } = await ctx.em.repo("products").count();
      if (data.count > 0) return;

      // Bulk insert via mutator
      await ctx.em.mutator("products").insertMany([
        { name: "Widget", price: 9.99, stock: 100 },
        { name: "Gadget", price: 19.99, stock: 50 },
        { name: "Gizmo", price: 14.99, stock: 75 },
      ]);

      console.log("Seeded products");
    },
  },
});
```

Error Handling Strategies

All-or-Nothing (Fail Fast)

Stop on first error:
```typescript
async function bulkInsertStrict(api: Api, entity: string, items: object[]) {
  for (let i = 0; i < items.length; i += 100) {
    const chunk = items.slice(i, i + 100);
    const { ok, error } = await api.data.createMany(entity, chunk);

    if (!ok) {
      throw new Error(`Failed at chunk ${i / 100 + 1}: ${error.message}`);
    }
  }
}
```

Best Effort (Continue on Error)

Collect failures, continue processing:
```typescript
async function bulkInsertBestEffort(api: Api, entity: string, items: object[]) {
  const results = { success: [] as object[], failed: [] as object[] };

  for (let i = 0; i < items.length; i += 100) {
    const chunk = items.slice(i, i + 100);
    const { ok, data } = await api.data.createMany(entity, chunk);

    if (ok) {
      results.success.push(...data);
    } else {
      results.failed.push(...chunk);
    }
  }

  return results;
}
```

Individual Fallback

Fall back to individual inserts on chunk failure:
```typescript
async function bulkInsertWithFallback(api: Api, entity: string, items: object[]) {
  const success: object[] = [];
  const failed: object[] = [];

  for (let i = 0; i < items.length; i += 100) {
    const chunk = items.slice(i, i + 100);
    const { ok, data } = await api.data.createMany(entity, chunk);

    if (ok) {
      success.push(...data);
    } else {
      // Fall back to individual inserts
      for (const item of chunk) {
        const { ok: itemOk, data: itemData } = await api.data.createOne(
          entity,
          item
        );
        if (itemOk) {
          success.push(itemData);
        } else {
          failed.push(item);
        }
      }
    }
  }

  return { success, failed };
}
```

Common Patterns

Data Migration

```typescript
async function migrateData(
  sourceApi: Api,
  targetApi: Api,
  entity: string,
  transform?: (record: object) => object
) {
  let offset = 0;
  const limit = 100;
  let total = 0;

  while (true) {
    const { data, meta } = await sourceApi.data.readMany(entity, {
      limit,
      offset,
    });

    if (data.length === 0) break;

    const transformed = transform ? data.map(transform) : data;
    await targetApi.data.createMany(entity, transformed);

    total += data.length;
    offset += limit;

    console.log(`Migrated ${total}/${meta.total} records`);
  }

  return total;
}
```

Conditional Bulk Update

```typescript
// Publish all posts by specific author
await api.data.updateMany(
  "posts",
  {
    author_id: { $eq: authorId },
    status: { $eq: "draft" },
  },
  { status: "published", published_at: new Date().toISOString() }
);

// Mark inactive users
const thirtyDaysAgo = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000)
  .toISOString();

await api.data.updateMany(
  "users",
  { last_login: { $lt: thirtyDaysAgo } },
  { status: "inactive" }
);
```

Soft Delete Cleanup

```typescript
// Permanently delete soft-deleted records older than 30 days
const thirtyDaysAgo = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000)
  .toISOString();

await api.data.deleteMany("posts", {
  deleted_at: { $lt: thirtyDaysAgo, $isnull: false },
});
```

Bulk Update Relations

```typescript
// Add tag to multiple posts
const postIds = [1, 2, 3, 4, 5];

for (const postId of postIds) {
  await api.data.updateOne("posts", postId, {
    tags: { $add: [newTagId] },
  });
}
```

Note: Bknd doesn't support bulk relation updates in a single call. Loop through records.

Transaction-like Pattern

Bknd doesn't have explicit transactions. Use this pattern for related operations:
```typescript
async function createOrderWithItems(
  api: Api,
  orderData: object,
  items: object[]
) {
  // Create order
  const { ok, data: order, error } = await api.data.createOne("orders", orderData);
  if (!ok) throw new Error(`Order failed: ${error.message}`);

  // Create order items
  const itemsWithOrder = items.map((item) => ({
    ...item,
    order: { $set: order.id },
  }));

  const { ok: itemsOk, error: itemsError } = await api.data.createMany(
    "order_items",
    itemsWithOrder
  );

  if (!itemsOk) {
    // Rollback: delete the order
    await api.data.deleteOne("orders", order.id);
    throw new Error(`Items failed, order rolled back: ${itemsError.message}`);
  }

  return order;
}
```

React Integration

Bulk Import Component

```tsx
import { useApp } from "bknd/react";
import { useState } from "react";

function BulkImport({ entity }: { entity: string }) {
  const { api } = useApp();
  const [progress, setProgress] = useState(0);
  const [status, setStatus] = useState<"idle" | "importing" | "done">("idle");

  async function handleFileUpload(e: React.ChangeEvent<HTMLInputElement>) {
    const file = e.target.files?.[0];
    if (!file) return;

    const text = await file.text();
    const items = JSON.parse(text);  // Assume JSON array

    setStatus("importing");
    setProgress(0);

    const chunkSize = 100;
    for (let i = 0; i < items.length; i += chunkSize) {
      const chunk = items.slice(i, i + chunkSize);
      await api.data.createMany(entity, chunk);
      // Clamp so the last partial chunk doesn't report over 100%
      setProgress(
        Math.round((Math.min(i + chunkSize, items.length) / items.length) * 100)
      );
    }

    setStatus("done");
  }

  return (
    <div>
      <input type="file" accept=".json" onChange={handleFileUpload} />
      {status === "importing" && <p>Importing... {progress}%</p>}
      {status === "done" && <p>Import complete!</p>}
    </div>
  );
}
```

Bulk Delete with Confirmation

```tsx
function BulkDeleteButton({
  entity,
  where,
  onComplete,
}: {
  entity: string;
  where: object;
  onComplete: () => void;
}) {
  const { api } = useApp();
  const [loading, setLoading] = useState(false);

  async function handleDelete() {
    // Get count first
    const { data } = await api.data.count(entity, where);
    const confirmed = window.confirm(
      `Delete ${data.count} records? This cannot be undone.`
    );

    if (!confirmed) return;

    setLoading(true);
    await api.data.deleteMany(entity, where);
    setLoading(false);
    onComplete();
  }

  return (
    <button onClick={handleDelete} disabled={loading}>
      {loading ? "Deleting..." : "Delete All Matching"}
    </button>
  );
}
```

Performance Tips

  1. Optimal chunk size: 100-500 records per chunk (balance speed vs memory)
  2. Avoid parallel writes to same entity (can cause locks)
  3. Use server-side seed for initial large datasets
  4. Index fields used in bulk update/delete where clauses
  5. Monitor memory when processing very large datasets client-side

Common Pitfalls

Missing Where Clause on deleteMany

Problem: Calling deleteMany without a where clause is rejected.
Fix: Always provide a where clause:
```typescript
// Wrong - no where clause
await api.data.deleteMany("posts");  // Error!

// Correct
await api.data.deleteMany("posts", { status: { $eq: "archived" } });

// To delete all (intentionally):
await api.data.deleteMany("posts", { id: { $gt: 0 } });
```

Memory Issues with Large Datasets

Problem: Out of memory when processing millions of records.
Fix: Process in chunks, avoid loading all at once:
```typescript
// Wrong - loads everything
const { data } = await api.data.readMany("logs", { limit: 1000000 });
await api.data.deleteMany("logs", { id: { $in: data.map((d) => d.id) } });

// Correct - delete directly with where
await api.data.deleteMany("logs", { created_at: { $lt: cutoffDate } });
```

No Unique Constraint Handling

Problem: Bulk insert fails on duplicate key.
Fix: Deduplicate before insert, or use an upsert pattern:
```typescript
// Deduplicate by email before insert
const uniqueItems = [...new Map(items.map((i) => [i.email, i])).values()];
await api.data.createMany("users", uniqueItems);
```
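
Deduplicating only guards against duplicates within the batch itself. Against rows already in the database, a read-then-write upsert is one option. This is a sketch under assumptions: it keys on a hypothetical unique email field, uses the single-record readOne/createOne-style calls from the related CRUD skills, and assumes readMany accepts a where filter as in bknd-query-filter. It is not atomic, so keep error handling around it in real use.

```typescript
// Read-then-write upsert keyed on a unique field (here: email).
// Not atomic: a concurrent insert between the read and the write
// can still hit the unique constraint.
async function upsertByEmail(
  api: any, // an Api-like client exposing api.data.*
  entity: string,
  items: { email: string }[]
) {
  for (const item of items) {
    const { data } = await api.data.readMany(entity, {
      where: { email: { $eq: item.email } },
      limit: 1,
    });
    if (data.length > 0) {
      await api.data.updateOne(entity, data[0].id, item);  // existing row: update
    } else {
      await api.data.createOne(entity, item);              // new row: insert
    }
  }
}
```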

Timeout on Very Large Operations

Problem: Request times out on huge bulk operation.
Fix: Use smaller chunks and add a short delay between them:
```typescript
for (let i = 0; i < items.length; i += 50) {
  await api.data.createMany(entity, items.slice(i, i + 50));
  await new Promise((r) => setTimeout(r, 100));  // Small delay
}
```
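
When failures are transient rather than chronic, retries with exponential backoff pair well with the delay above. A generic sketch: the wrapped call is assumed to throw on failure, so with bknd's { ok, error } results you would check ok and throw yourself, as in the usage comment.

```typescript
// Retry an async operation with exponential backoff: waits
// baseDelayMs, then 2x, 4x, ... between attempts before giving up.
async function withRetry<T>(
  fn: () => Promise<T>,
  retries = 3,
  baseDelayMs = 200
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < retries) {
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
      }
    }
  }
  throw lastError;
}

// Usage sketch:
// await withRetry(async () => {
//   const { ok, data, error } = await api.data.createMany(entity, chunk);
//   if (!ok) throw new Error(error.message);
//   return data;
// });
```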

DOs and DON'Ts

DO:
  • Use chunking for large datasets (>100 records)
  • Provide where clause for updateMany/deleteMany
  • Track progress for user feedback
  • Handle partial failures gracefully
  • Use server-side seed for initial data
DON'T:
  • Load millions of records into memory
  • Run parallel bulk writes to same entity
  • Assume bulk operations are atomic
  • Forget to handle unique constraint errors
  • Skip confirmation for destructive bulk deletes

Related Skills

  • bknd-crud-create - Single record insertion
  • bknd-crud-update - Single record updates
  • bknd-crud-delete - Single record deletion
  • bknd-seed-data - Server-side initial data population
  • bknd-query-filter - Build where clauses for bulk operations