d1-do-storage

Apply Cloudflare D1 and Durable Object SQLite storage best practices with standard Workers APIs and Drizzle ORM. Use when implementing queries, write flows, schema changes, retries, and performance-sensitive storage paths.

Safety Notice

This listing is imported from skills.sh public index metadata. Review upstream SKILL.md and repository scripts before running.

Install skill "d1-do-storage" with this command: npx skills add sillyvan/d1-do-storage-best-practices/sillyvan-d1-do-storage-best-practices-d1-do-storage

D1 + Durable Object Storage Best Practices

Use this skill for Cloudflare storage design and query implementation with:

  • D1 Worker bindings (env.DB)
  • Durable Objects SQLite storage (ctx.storage.sql)
  • Drizzle ORM on D1 (drizzle-orm/d1)

Quick Reference Selector

  • D1/SQLite query tuning, indexes, and schema checks -> references/sqlite-d1-best-practices.md
  • D1/DO error normalization and tagging -> references/storage-error-classification.md
  • Retry behavior and idempotent write rules -> references/storage-retry-idempotency.md
  • SQLite-backed Durable Object SQL, transactions, lifecycle, and PITR -> references/durable-object-sqlite-patterns.md
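The error-classification reference above is not reproduced here, but the tagging idea can be sketched as a small helper. The tag names and message patterns below are illustrative assumptions, not the reference's actual taxonomy:

```typescript
// Hypothetical error tags; the real taxonomy lives in
// references/storage-error-classification.md.
type StorageErrorTag = "retryable" | "constraint" | "fatal";

// Classify a thrown D1/DO storage error by message substring.
// The matched patterns are illustrative, not exhaustive.
function classifyStorageError(err: unknown): StorageErrorTag {
  const message = err instanceof Error ? err.message : String(err);
  if (/SQLITE_BUSY|reset because its code was updated|network/i.test(message)) {
    return "retryable";
  }
  if (/UNIQUE constraint failed|SQLITE_CONSTRAINT/i.test(message)) {
    return "constraint";
  }
  return "fatal";
}
```

Callers can then gate retries on the tag instead of re-parsing messages at every call site.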

When to Use

  • Adding or refactoring D1 queries
  • Building multi-statement write flows
  • Porting raw SQL to Drizzle (or mixed raw SQL + Drizzle)
  • Defining retry/error handling for D1 or Durable Object storage
  • Reviewing performance regressions due to DB round trips

Non-Negotiable Rules

1. Prefer prepare + bind and batch

  • Use env.DB.prepare(...).bind(...) for normal D1 queries.
  • Use env.DB.batch([...]) (or Drizzle db.batch([...])) for related statements to reduce round trips.
  • Avoid env.DB.exec() for normal app paths; reserve it for one-shot admin/maintenance jobs.

2. Treat D1 transactions as unsupported in app code

  • Do not use SQL BEGIN TRANSACTION, COMMIT, or SAVEPOINT in D1 Worker paths.
  • Do not use Drizzle db.transaction(...) on D1.
  • Use batch for atomic multi-statement units.
  • If code must run on D1, assume there is no user-managed transaction support.

3. Use Durable Objects for serialized coordination

  • Use SQLite-backed Durable Objects for per-entity coordination/serialization.
  • Persist important state in storage, not only in memory (objects can be evicted/restarted).
  • Prefer the SQLite backend for new Durable Object classes.
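A sketch of per-entity serialization in a SQLite-backed Durable Object. The structural interfaces below stand in for Cloudflare's runtime types (real code would use `SqlStorage` from `@cloudflare/workers-types` and extend `DurableObject`); the class name and schema are illustrative:

```typescript
// Structural stand-ins for Cloudflare's runtime types.
interface SqlCursor {
  toArray(): Record<string, unknown>[];
}
interface SqlStorage {
  exec(query: string, ...bindings: unknown[]): SqlCursor;
}

// Each Durable Object instance processes requests one at a time, so the
// read-modify-write below needs no extra locking.
class CounterObject {
  constructor(private sql: SqlStorage) {
    // Idempotent schema setup; safe to run on every construction, and it
    // persists state beyond eviction instead of relying on memory.
    this.sql.exec(
      "CREATE TABLE IF NOT EXISTS counters (name TEXT PRIMARY KEY, value INTEGER NOT NULL)"
    );
  }

  increment(name: string): number {
    this.sql.exec(
      "INSERT INTO counters (name, value) VALUES (?, 1) " +
        "ON CONFLICT(name) DO UPDATE SET value = value + 1",
      name
    );
    const row = this.sql
      .exec("SELECT value FROM counters WHERE name = ?", name)
      .toArray()[0];
    return row.value as number;
  }
}
```

In a real Worker, one object instance per counter name gives the serialized coordination this rule describes.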

4. Design for D1 limits and throughput

  • A single D1 database processes queries one at a time; optimize query duration.
  • Add indexes for high-volume lookup/filter columns.
  • Chunk large updates/deletes (for example 500-1000 rows per batch).
  • Retry only when idempotent and when the error is retryable.
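Chunking can be sketched with a pure helper plus a batch loop. The `deleteStaleRows` wrapper, the `events` table, and the minimal `D1DatabaseLike` interface are illustrative assumptions, not part of the upstream skill:

```typescript
// Split an array into fixed-size chunks (pure helper, no Workers APIs).
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// Minimal structural type for the D1 binding methods used below.
interface D1DatabaseLike {
  prepare(query: string): {
    bind(...values: unknown[]): { run(): Promise<unknown> };
  };
}

// Illustrative D1 usage: delete rows 500 at a time so no single
// statement holds the database for too long.
async function deleteStaleRows(db: D1DatabaseLike, ids: string[]): Promise<void> {
  for (const group of chunk(ids, 500)) {
    const placeholders = group.map(() => "?").join(", ");
    await db
      .prepare(`DELETE FROM events WHERE id IN (${placeholders})`)
      .bind(...group)
      .run();
  }
}
```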

5. Use SQLite-backed Durable Object storage correctly

  • Only SQLite-backed DO classes can use ctx.storage.sql and PITR.
  • In DO SQL paths, do not execute BEGIN TRANSACTION / COMMIT / SAVEPOINT with sql.exec().
  • Use ctx.storage.transactionSync() for synchronous SQL-only transaction blocks.
  • Use ctx.storage.transaction() for async KV-style transaction flows when needed.
  • Initialize critical state with ctx.blockConcurrencyWhile(...) and persist data needed after eviction.
  • For full teardown, call both ctx.storage.deleteAlarm() and ctx.storage.deleteAll().
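The `transactionSync` and teardown rules can be sketched as below. The structural `DOStorage` interface stands in for `ctx.storage` (real code uses `@cloudflare/workers-types`), and the `accounts` schema is a hypothetical example:

```typescript
// Structural stand-ins for the Durable Object storage API.
interface SqlCursor {
  toArray(): Record<string, unknown>[];
}
interface DOStorage {
  sql: { exec(query: string, ...bindings: unknown[]): SqlCursor };
  transactionSync<T>(closure: () => T): T;
  deleteAlarm(): Promise<void>;
  deleteAll(): Promise<void>;
}

// Move funds between two accounts atomically: transactionSync rolls
// back both UPDATEs if the closure throws.
function transfer(storage: DOStorage, from: string, to: string, amount: number): void {
  storage.transactionSync(() => {
    const row = storage.sql
      .exec("SELECT balance FROM accounts WHERE id = ?", from)
      .toArray()[0];
    if ((row.balance as number) < amount) throw new Error("insufficient funds");
    storage.sql.exec("UPDATE accounts SET balance = balance - ? WHERE id = ?", amount, from);
    storage.sql.exec("UPDATE accounts SET balance = balance + ? WHERE id = ?", amount, to);
  });
}

// Full teardown: clear the alarm as well as all stored data, per rule 5.
async function destroyObjectState(storage: DOStorage): Promise<void> {
  await storage.deleteAlarm();
  await storage.deleteAll();
}
```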

Recommended Workflow

  1. Confirm data placement:
  • D1 for shared relational data.
  • Durable Object SQLite for single-key/per-entity coordination and strongly-consistent object-local state.
  2. Build statements with parameters:
  • Standard API: prepare().bind()
  • Drizzle: query builder with placeholders/typed values
  3. Collapse round trips:
  • Convert related statements into a single batch call.
  4. Enforce transaction rule:
  • No SQL BEGIN/COMMIT/SAVEPOINT on D1 paths.
  • No Drizzle db.transaction() for D1.
  5. Validate production safety:
  • Add indexes for read paths.
  • Make write retries idempotent.
  • Add chunking for large write/migration operations.
  6. Apply SQLite/D1 performance rules:
  • Use workload-first schema/query design (read/write mix and hottest queries first).
  • Validate index coverage with PRAGMA index_list, PRAGMA index_info, and sqlite_master.
  • Run PRAGMA optimize after schema/index changes.
  • Prefer cursor pagination and avoid function-wrapped predicates on indexed columns.
  • Review index count regularly; do not over-index write-heavy tables.
  • See the SQLite checklist in references/sqlite-d1-best-practices.md.
  7. If using Durable Objects SQLite:
  • Keep DO SQL access on ctx.storage.sql only for SQLite-backed classes.
  • Use sql.exec(..., bindings) for parameterized SQL and inspect rowsRead / rowsWritten on cursors for cost/perf signals.
  • Use PITR bookmarks for recovery workflows (production only; not supported in local development).
  • See references/durable-object-sqlite-patterns.md.
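The rowsRead / rowsWritten cost signal from the workflow above can be sketched as a thin wrapper. The `MeteredCursor` interface stands in for Cloudflare's cursor type, and the 10x scan threshold is an illustrative assumption:

```typescript
// Structural stand-in for the cursor returned by ctx.storage.sql.exec()
// (real code gets SqlStorageCursor from @cloudflare/workers-types).
interface MeteredCursor<T> {
  toArray(): T[];
  rowsRead: number;
  rowsWritten: number;
}

// Run a query and surface its row-scan cost: a high rowsRead relative to
// rows returned usually means a missing index.
function execWithCostCheck<T>(
  exec: (query: string, ...bindings: unknown[]) => MeteredCursor<T>,
  query: string,
  ...bindings: unknown[]
): T[] {
  const cursor = exec(query, ...bindings);
  const rows = cursor.toArray();
  if (cursor.rowsRead > 10 * Math.max(rows.length, 1)) {
    console.warn(
      `possible table scan: ${cursor.rowsRead} rows read for ${rows.length} returned`,
      query
    );
  }
  return rows;
}
```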

Core Patterns

Standard D1 API (env.DB.batch)

const insertUser = env.DB
  .prepare("INSERT INTO users (id, email) VALUES (?, ?)")
  .bind(id, email);

const insertProfile = env.DB
  .prepare("INSERT INTO profiles (user_id, display_name) VALUES (?, ?)")
  .bind(id, displayName);

await env.DB.batch([insertUser, insertProfile]);

Drizzle on D1 (db.batch)

import { drizzle } from "drizzle-orm/d1";

const db = drizzle(env.DB);

await db.batch([
  db.insert(users).values({ id, email }),
  db.insert(profiles).values({ userId: id, displayName }),
]);

Avoid on D1

// Do not do this on D1:
await db.transaction(async (tx) => {
  // ...
});

References

  • references/sqlite-d1-best-practices.md
  • references/storage-error-classification.md
  • references/storage-retry-idempotency.md
  • references/durable-object-sqlite-patterns.md
