# Create Tasks Document

Creates a tasks document based on the requirements and design documents. This command reads both documents and generates an implementation plan with tracked tasks.
## When to use

Use this skill when the user needs to:

- Create an implementation plan from existing requirements and design
- Generate a task breakdown for development work
- Plan the order of implementation with dependencies
## Instructions

### Step 1: Locate Documents

- If `<args>` contains a spec name, look for:
  - Requirements at `.specs/<spec-name>/requirements.md`
  - Research at `.specs/<spec-name>/research.md` (optional but recommended)
  - Design at `.specs/<spec-name>/design.md`
- If no spec name is provided, list the available specs in `.specs/` and use the AskUserQuestion tool to let the user choose
- Read and analyze all available documents
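The document-location step above can be sketched in TypeScript (Node). This is a minimal illustration only; the spec name `user-auth` and the helper names are hypothetical, not part of this command:

```typescript
import * as fs from "node:fs";
import * as path from "node:path";

// Resolve the expected document paths for a spec (layout from Step 1).
function locateSpecDocs(specName: string, root = ".specs") {
  const base = path.join(root, specName);
  const docs = ["requirements.md", "research.md", "design.md"];
  return docs.map((doc) => {
    const p = path.join(base, doc);
    return { doc, path: p, exists: fs.existsSync(p) };
  });
}

// List available specs when no name was provided.
function listSpecs(root = ".specs"): string[] {
  if (!fs.existsSync(root)) return [];
  return fs
    .readdirSync(root, { withFileTypes: true })
    .filter((entry) => entry.isDirectory())
    .map((entry) => entry.name);
}

// "user-auth" is a hypothetical spec name used for illustration.
const found = locateSpecDocs("user-auth");
```

Research is optional, so a missing `research.md` should not abort the flow, while a missing `requirements.md` or `design.md` should trigger the fallback of listing specs and asking the user.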
### Step 2: Analyze the Design

Before creating tasks:

- Review the architecture and components from the design
- Review the chosen solutions in research.md to understand the rationale behind design decisions
- Identify dependencies between components
- Determine the optimal order of implementation
- Note checkpoints for verification
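Determining the order of implementation from dependencies is essentially a topological sort. A minimal sketch, with entirely made-up task names:

```typescript
// A task and the names of the tasks it depends on.
type Task = { name: string; dependsOn: string[] };

// Order tasks so each appears after everything it depends on
// (Kahn's algorithm, simplified). Throws on circular dependencies.
function orderByDependencies(tasks: Task[]): string[] {
  const remaining = new Map(tasks.map((t) => [t.name, new Set(t.dependsOn)]));
  const ordered: string[] = [];
  while (remaining.size > 0) {
    // A task is ready when none of its dependencies are still pending.
    const ready = [...remaining.entries()].filter(([, deps]) =>
      [...deps].every((d) => !remaining.has(d))
    );
    if (ready.length === 0) throw new Error("circular dependency between tasks");
    for (const [name] of ready) {
      ordered.push(name);
      remaining.delete(name);
    }
  }
  return ordered;
}

// Hypothetical example: schema before API, API before UI.
const plan = orderByDependencies([
  { name: "UI component", dependsOn: ["API endpoint"] },
  { name: "API endpoint", dependsOn: ["schema"] },
  { name: "schema", dependsOn: [] },
]);
// plan: ["schema", "API endpoint", "UI component"]
```

In practice the ordering is done by reading the design, but the same rule applies: a task may only be scheduled once everything it depends on is already in the plan.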
### Step 3: Verify Against the Codebase

Do not blindly trust the documents. Cross-check key assumptions against the real codebase:

- **Check existing code**: verify that files, modules, and APIs mentioned in the design actually exist and match the described structure
- **Validate assumptions**: if the design references specific patterns, frameworks, or utilities, confirm they are present and used as described
- **Detect drift**: if the codebase has changed since the documents were written, note the discrepancies and adjust tasks accordingly
- **Identify missing context**: look for related code, tests, or configs that the documents may have overlooked but that the tasks should account for

If you find significant discrepancies between the documents and the codebase, mention them in the Notes section of the tasks document.
### Step 4: Create the Tasks Document

Create the document at `.specs/<spec-name>/tasks.md` with this structure:

```markdown
# Implementation Plan: [Feature Name]

## Overview

[Brief description of the implementation and goals]

## Tasks

- [ ] 1. [Major Task Name]
  - [ ] 1.1 [Subtask Name]
    - [Detailed description of what to do]
    - [File to create/modify: path/to/file.ts]
    - [Key implementation points]
    - Requirements: X.X, X.X
  - [ ] 1.2 [Subtask Name]
    - [Details]
    - Requirements: X.X

- [ ] 2. Checkpoint - [Verification point]
  - [What to verify]
  - [Run tests for the group: test command or path]
  - [Run existing tests for affected files to check for regressions]

- [ ] 3. [Next Major Task]
  - [ ] 3.1 [Subtask]
    - [Details]
    - Requirements: X.X
  - [ ] 3.2 [Subtask]

[Continue with all tasks]

- [ ] N. Final checkpoint - Complete verification
  - Run the full test suite to ensure all new and existing tests pass
  - Run tests for all files affected by the implementation to check for regressions
  - Verify all requirements are met

## Notes

- [Important notes about the implementation]
- [Dependencies or constraints]
- [Any special considerations]
```
### Task Structure Guidelines

- **Group related tasks** - Major tasks contain related subtasks
- **Include file paths** - Specify which files to create or modify
- **Reference requirements** - Link each task to requirements with `Requirements: X.X`
- **Add test tasks per group** - Each major task group MUST end with a subtask for writing tests covering the implemented functionality. Use the test strategy from the design document to determine test types (unit, integration, e2e) and coverage expectations
- **Add checkpoints** - Include verification points after major milestones (see Checkpoint Guidelines)
- **Order by dependencies** - Tasks that depend on others come later
- **Be specific** - Each subtask should be actionable and clear
- **Trace the full data flow** - When a task introduces a new field, entity, or data attribute, it MUST include subtasks for EVERY layer in the data flow. Missing even one layer causes bugs that require follow-up fix sessions. Use this checklist:
  - Schema/model definition
  - Database migration (if applicable)
  - Query/mutation that reads or writes the field
  - API response type/DTO that exposes the field
  - Frontend type/interface that receives the field
  - UI component that renders or edits the field
  - Validation (if the field has constraints)

If a single subtask spans multiple layers, explicitly list every file path. Do not rely on the implementer to infer which files need changes.
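As an illustration of why every layer matters, here is how a single new field touches several of the layers above. The `nickname` field, the types, and the file paths in the comments are all invented for this example:

```typescript
// --- Schema/model definition (e.g. src/models/user.ts, hypothetical path) ---
interface User {
  id: string;
  email: string;
  nickname: string; // the new field: added here first
}

// --- API response type/DTO (e.g. src/api/dto.ts, hypothetical path) ---
interface UserResponse {
  id: string;
  nickname: string; // if omitted here, the frontend never receives the field
}

function toUserResponse(user: User): UserResponse {
  return { id: user.id, nickname: user.nickname };
}

// --- Frontend type that receives the field (e.g. web/src/types.ts) ---
interface UserView {
  id: string;
  nickname: string;
}

// --- Validation for the field's constraints (assumed: 1-30 characters) ---
function isValidNickname(nickname: string): boolean {
  return nickname.length >= 1 && nickname.length <= 30;
}

const dto = toUserResponse({ id: "u1", email: "a@b.c", nickname: "ada" });
```

Skipping any one of these layers (say, the DTO mapping) compiles fine in isolation but silently drops the field at runtime, which is exactly the class of bug the checklist is meant to prevent.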
### Task Types

- **Implementation tasks** - Create or modify code
- **Configuration tasks** - Update configs and dependencies
- **Testing tasks** - Write unit/integration tests
- **Cleanup tasks** - Remove old code, update references
- **Checkpoint tasks** - Verify implementation milestones
### Checkpoint Guidelines

Every checkpoint task MUST include:

- **Run new tests** - execute the tests written for the preceding task group
- **Run affected tests** - execute existing tests for files that were created or modified in the group to catch regressions
- **Verify functionality** - describe what to check manually or programmatically
### Checkbox States

- `[ ]` - Pending (not started)
- `[-]` - In progress
- `[x]` - Completed
### Step 5: Confirm with User

After creating the document, show the user:

- The location of the created file
- A summary of the task breakdown
- The total number of tasks and checkpoints

Then use the AskUserQuestion tool to ask whether they want to make changes or proceed, with options like "Looks good, start execution", "I want to make changes", and "Review tasks first".
## Arguments

- `<args>` - The spec name (e.g., "user-auth", "payment-flow"). If not provided, list the available specs and ask the user to choose.