nlm-skill

NotebookLM CLI & MCP Expert


Install skill "nlm-skill" with this command: npx skills add jacob-bd/notebooklm-mcp-cli/jacob-bd-notebooklm-mcp-cli-nlm-skill


This skill provides comprehensive guidance for using NotebookLM via both the nlm CLI and MCP tools.

Tool Detection (CRITICAL - Read First!)

ALWAYS check which tools are available before proceeding:

  • Check for MCP tools: Look for tools starting with mcp__notebooklm-mcp__* or mcp_notebooklm_*

  • If BOTH MCP tools AND CLI are available: ASK the user which they prefer to use before proceeding

  • If only MCP tools are available: Use them directly (refer to tool docstrings for parameters)

  • If only CLI is available: Use nlm CLI commands via Bash

Decision Logic:

has_mcp_tools = check_available_tools()  # Look for mcp__notebooklm-mcp__* or mcp_notebooklm_*
has_cli = check_bash_available()         # Can run nlm commands

if has_mcp_tools and has_cli:
    # ASK USER: "I can use either MCP tools or the nlm CLI. Which do you prefer?"
    user_preference = ask_user()
elif has_mcp_tools:
    # Use MCP tools directly
    mcp__notebooklm-mcp__notebook_list()
else:
    # Use CLI via Bash
    bash("nlm notebook list")

This skill documents BOTH approaches. Choose the appropriate one based on tool availability and user preference.

Quick Reference

Run nlm --ai to get comprehensive AI-optimized documentation - this provides a complete view of all CLI capabilities.

nlm --help           # List all commands
nlm <command> --help # Help for specific command
nlm --ai             # Full AI-optimized documentation (RECOMMENDED)
nlm --version        # Check installed version

Critical Rules (Read First!)

  • Always authenticate first: Run nlm login before any operations

  • Sessions expire in ~20 minutes: Re-run nlm login if commands start failing

  • ⚠️ ALWAYS ASK USER BEFORE DELETE: Before executing ANY delete command, ask the user for explicit confirmation. Deletions are irreversible. Show what will be deleted and warn about permanent data loss.

  • --confirm is REQUIRED: All generation and delete commands need --confirm or -y (CLI) or confirm=True (MCP)

  • Research requires --notebook-id: The flag is mandatory, not positional

  • Capture IDs from output: Create/start commands return IDs needed for subsequent operations

  • Use aliases: Simplify long UUIDs with nlm alias set <name> <uuid>

  • Check aliases before creating: Run nlm alias list before creating a new alias to avoid conflicts with existing names.

  • DO NOT launch REPL: Never use nlm chat start; it opens an interactive REPL that AI tools cannot control. Use nlm notebook query for one-shot Q&A instead.

  • Choose output format wisely: Default output (no flags) is compact and token-efficient—use it for status checks. Use --quiet to capture IDs for piping. Only use --json when you need to parse specific fields programmatically.

  • Use --help when unsure: Run nlm <command> --help to see available options and flags for any command.
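The "capture IDs" and "--quiet" rules combine into a common shell pattern. The sketch below stubs out the `nlm` binary with a function so the snippet is self-contained; the real command prints an actual UUID, and the exact output format is an assumption:

```shell
# Stub standing in for the real `nlm` binary (illustration only;
# the real command prints the new notebook's actual UUID)
nlm() { echo "f47ac10b-58cc-4372-a567-0e02b2c3d479"; }

# --quiet prints only the ID, so it can be captured directly
NB_ID=$(nlm notebook create "AI Research" --quiet)

# Reuse the captured ID in follow-up commands
nlm source add "$NB_ID" --url "https://example.com" >/dev/null
echo "working in notebook: $NB_ID"
```

The same capture-then-reuse shape applies to task IDs from research commands and artifact IDs from studio commands.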

Workflow Decision Tree

Use this to determine the right sequence of commands:

User wants to...
│
├─► Work with NotebookLM for the first time
│   └─► nlm login → nlm notebook create "Title"
│
├─► Add content to a notebook
│   ├─► From a URL/webpage → nlm source add <nb-id> --url "https://..."
│   ├─► From YouTube → nlm source add <nb-id> --url "https://youtube.com/..."
│   ├─► From pasted text → nlm source add <nb-id> --text "content" --title "Title"
│   ├─► From Google Drive → nlm source add <nb-id> --drive <doc-id> --type doc
│   └─► Discover new sources → nlm research start "query" --notebook-id <nb-id>
│
├─► Generate content from sources
│   ├─► Podcast/Audio → nlm audio create <nb-id> --confirm
│   ├─► Written summary → nlm report create <nb-id> --confirm
│   ├─► Study materials → nlm quiz/flashcards create <nb-id> --confirm
│   ├─► Visual content → nlm mindmap/slides/infographic create <nb-id> --confirm
│   ├─► Video → nlm video create <nb-id> --confirm
│   └─► Extract data → nlm data-table create <nb-id> "description" --confirm
│
├─► Ask questions about sources
│   └─► nlm notebook query <nb-id> "question"
│       (Use --conversation-id for follow-ups)
│       ⚠️ Do NOT use nlm chat start - it's a REPL for humans only
│
├─► Check generation status
│   └─► nlm studio status <nb-id>
│
└─► Manage/cleanup
    ├─► List notebooks → nlm notebook list
    ├─► List sources → nlm source list <nb-id>
    ├─► Delete source → nlm source delete <source-id> --confirm
    └─► Delete notebook → nlm notebook delete <nb-id> --confirm

Command Categories

1. Authentication

MCP Authentication

If using MCP tools and encountering authentication errors:

# Run the CLI authentication (works for both CLI and MCP)
nlm login

# Then reload tokens in MCP
mcp__notebooklm-mcp__refresh_auth()

Or manually save cookies via MCP (fallback):

# Extract cookies from Chrome DevTools and save
mcp__notebooklm-mcp__save_auth_tokens(cookies="<cookie_header>")

CLI Authentication

nlm login                           # Launch browser, extract cookies (primary method)
nlm login --check                   # Validate current session
nlm login --profile work            # Use named profile for multiple accounts
nlm login --provider openclaw --cdp-url http://127.0.0.1:18800  # External CDP provider
nlm login switch <profile>          # Switch the default profile
nlm login profile list              # List all profiles with email addresses
nlm login profile delete <name>     # Delete a profile
nlm login profile rename <old> <new> # Rename a profile

Multi-Profile Support: Each profile gets its own isolated browser session (supports Chrome, Arc, Brave, Edge, Chromium, and more), so you can be logged into multiple Google accounts simultaneously.

Session lifetime: ~20 minutes. Re-authenticate when commands fail with auth errors.

Switching MCP Accounts: The MCP server always uses the active default profile. If you need to switch which Google account the MCP server is communicating with, you MUST use the CLI: run nlm login switch <name>. Your next MCP tool call will instantly use the new account.

Note: Both MCP and CLI share the same authentication backend, so authenticating with one works for both.
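Since sessions expire after roughly 20 minutes, a script can validate before doing real work. A minimal sketch, with `nlm` stubbed by a function so the logic is self-contained (treating `nlm login --check` as exiting non-zero on an expired session is an assumption here):

```shell
# Stub for the real `nlm` binary: `login --check` simulates an expired
# session, and a plain `login` simulates a successful re-auth
nlm() {
  if [ "$1" = "login" ] && [ "$2" = "--check" ]; then
    return 1
  fi
  echo "re-authenticated"
}

# Re-authenticate only when the current session is invalid
RESULT="session ok"
if ! nlm login --check >/dev/null 2>&1; then
  RESULT=$(nlm login)
fi
echo "$RESULT"
```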

2. Notebook Management

MCP Tools

Use tools: notebook_list, notebook_create, notebook_get, notebook_describe, notebook_query, notebook_rename, notebook_delete. All accept a notebook_id parameter. Delete requires confirm=True.

CLI Commands

nlm notebook list                      # List all notebooks
nlm notebook list --json               # JSON output for parsing
nlm notebook list --quiet              # IDs only (for scripting)
nlm notebook create "Title"            # Create notebook, returns ID
nlm notebook get <id>                  # Get notebook details
nlm notebook describe <id>             # AI-generated summary + suggested topics
nlm notebook query <id> "question"     # One-shot Q&A with sources
nlm notebook rename <id> "New Title"   # Rename notebook
nlm notebook delete <id> --confirm     # PERMANENT deletion

3. Source Management

MCP Tools

Use source_add with these source_type values:

- url - Web page or YouTube URL (url param)
- text - Pasted content (text + title params)
- file - Local file upload (file_path param)
- drive - Google Drive doc (document_id + doc_type params)

Other tools: source_list_drive, source_describe, source_get_content, source_rename, source_sync_drive (requires confirm=True), source_delete (requires confirm=True).

CLI Commands

# Adding sources
nlm source add <nb-id> --url "https://..."           # Web page
nlm source add <nb-id> --url "https://youtube.com/..." # YouTube video
nlm source add <nb-id> --text "content" --title "X"  # Pasted text
nlm source add <nb-id> --drive <doc-id>              # Drive doc (auto-detect type)
nlm source add <nb-id> --drive <doc-id> --type slides # Explicit type

# Listing and viewing
nlm source list <nb-id>                # Table of sources
nlm source list <nb-id> --drive        # Show Drive sources with freshness
nlm source list <nb-id> --drive -S     # Skip freshness checks (faster)
nlm source get <source-id>             # Source metadata
nlm source describe <source-id>        # AI summary + keywords
nlm source content <source-id>         # Raw text content
nlm source content <source-id> -o file.txt  # Export to file

# Drive sync (for stale sources)
nlm source stale <nb-id>               # List outdated Drive sources
nlm source sync <nb-id> --confirm      # Sync all stale sources
nlm source sync <nb-id> --source-ids <ids> --confirm  # Sync specific

# Rename
nlm source rename <source-id> "New Title" --notebook <nb-id>
nlm rename source <source-id> "New Title" --notebook <nb-id>  # verb-first

# Deletion
nlm source delete <source-id> --confirm

Drive types: doc, slides, sheets, pdf

4. Research (Source Discovery)

Research finds NEW sources from the web or Google Drive.

MCP Tools

Use research_start with:

- source: web or drive
- mode: fast (~30s) or deep (~5min, web only)

Workflow: research_start → poll research_status → research_import

CLI Commands

# Start research (--notebook-id is REQUIRED)
nlm research start "query" --notebook-id <id>              # Fast web (~30s)
nlm research start "query" --notebook-id <id> --mode deep  # Deep web (~5min)
nlm research start "query" --notebook-id <id> --source drive  # Drive search

# Check progress
nlm research status <nb-id>                   # Poll until done (5min max)
nlm research status <nb-id> --max-wait 0      # Single check, no waiting
nlm research status <nb-id> --task-id <tid>   # Check specific task
nlm research status <nb-id> --full            # Full details

# Import discovered sources
nlm research import <nb-id> <task-id>            # Import all
nlm research import <nb-id> <task-id> --indices 0,2,5  # Import specific

Modes: fast (~30s, ~10 sources) | deep (~5min, ~40+ sources, web only)
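The start → status → import flow above can be scripted end to end. The sketch stubs `nlm` with a function so it is self-contained; assuming `research start --quiet` prints a bare task ID and `research status` prints a state word (real output formats may differ):

```shell
# Stub for the real binary: start returns a task ID, status a state word
nlm() {
  case "$2" in
    start)  echo "task-42" ;;
    status) echo "completed" ;;
    import) echo "imported" ;;
  esac
}

NB="my-notebook"
TASK_ID=$(nlm research start "agentic AI trends" --notebook-id "$NB" --quiet)

# Poll with --max-wait 0 (single check per call) until the task finishes
until [ "$(nlm research status "$NB" --task-id "$TASK_ID" --max-wait 0)" = "completed" ]; do
  sleep 5
done

nlm research import "$NB" "$TASK_ID"
```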

5. Content Generation (Studio)

MCP Tools (Unified Creation)

Use studio_create with artifact_type and type-specific options. All require confirm=True.

| artifact_type | Key Options |
|---|---|
| audio | audio_format: deep_dive/brief/critique/debate, audio_length: short/default/long |
| video | video_format: explainer/brief, visual_style: auto_select/classic/whiteboard/kawaii/anime/watercolor/retro_print/heritage/paper_craft |
| report | report_format: Briefing Doc/Study Guide/Blog Post/Create Your Own, custom_prompt |
| quiz | question_count, difficulty: easy/medium/hard |
| flashcards | difficulty: easy/medium/hard |
| mind_map | title |
| slide_deck | slide_format: detailed_deck/presenter_slides, slide_length: short/default |
| infographic | orientation: landscape/portrait/square, detail_level: concise/standard/detailed, infographic_style: auto_select/sketch_note/professional/bento_grid/editorial/instructional/bricks/clay/anime/kawaii/scientific |
| data_table | description (REQUIRED) |

Common options: source_ids, language (BCP-47 code), focus_prompt

Revise Slides: Use studio_revise to revise individual slides in an existing slide deck.

- Requires artifact_id (from studio_status) and slide_instructions
- Creates a NEW artifact; the original is not modified
- Slide numbers are 1-based (slide 1 = first slide)
- Poll studio_status after calling to check when the new deck is ready

CLI Commands

All generation commands share these flags:

- --confirm or -y: REQUIRED to execute
- --source-ids <id1,id2>: Limit to specific sources
- --language <code>: BCP-47 code (en, es, fr, de, ja)

# Audio (Podcast)
nlm audio create <id> --confirm
nlm audio create <id> --format deep_dive --length default --confirm
nlm audio create <id> --format brief --focus "key topic" --confirm
# Formats: deep_dive, brief, critique, debate
# Lengths: short, default, long

# Report
nlm report create <id> --confirm
nlm report create <id> --format "Study Guide" --confirm
nlm report create <id> --format "Create Your Own" --prompt "Custom..." --confirm
# Formats: "Briefing Doc", "Study Guide", "Blog Post", "Create Your Own"

# Quiz
nlm quiz create <id> --confirm
nlm quiz create <id> --count 5 --difficulty 3 --confirm
nlm quiz create <id> --count 10 --difficulty 3 --focus "Focus on key concepts" --confirm
# Count: number of questions (default: 2)
# Difficulty: 1-5 (1=easy, 5=hard)
# Focus: optional text to guide quiz generation

# Flashcards
nlm flashcards create <id> --confirm
nlm flashcards create <id> --difficulty hard --confirm
nlm flashcards create <id> --difficulty medium --focus "Focus on definitions" --confirm
# Difficulty: easy, medium, hard
# Focus: optional text to guide flashcard generation

# Mind Map
nlm mindmap create <id> --confirm
nlm mindmap create <id> --title "Topic Overview" --confirm
nlm mindmap list <id>  # List existing mind maps

# Slides
nlm slides create <id> --confirm
nlm slides create <id> --format presenter --length short --confirm
# Formats: detailed, presenter | Lengths: short, default
nlm slides revise <artifact-id> --slide '1 Make the title larger' --confirm
# Creates a NEW deck with revisions. Original unchanged.

# Infographic
nlm infographic create <id> --confirm
nlm infographic create <id> --orientation portrait --detail detailed --style professional --confirm
# Orientations: landscape, portrait, square
# Detail: concise, standard, detailed
# Styles: auto_select, sketch_note, professional, bento_grid, editorial, instructional, bricks, clay, anime, kawaii, scientific

# Video
nlm video create <id> --confirm
nlm video create <id> --format brief --style whiteboard --confirm
# Formats: explainer, brief
# Styles: auto_select, classic, whiteboard, kawaii, anime, watercolor, retro_print, heritage, paper_craft

# Data Table
nlm data-table create <id> "Extract all dates and events" --confirm
# DESCRIPTION is required as second argument

6. Studio (Artifact Management)

MCP Tools

Use studio_status to check progress (or rename with action="rename"). Use download_artifact with artifact_type and output_path. Use export_artifact with export_type: docs/sheets. Delete with studio_delete (requires confirm=True).

CLI Commands

# Check status
nlm studio status <nb-id>                          # List all artifacts
nlm studio status <nb-id> --full                   # Show full details (including custom prompts)
nlm studio status <nb-id> --json                   # JSON output

# Download artifacts
nlm download audio <nb-id> --output podcast.mp3
nlm download video <nb-id> --output video.mp4
nlm download report <nb-id> --output report.md
nlm download slide-deck <nb-id> --output slides.pdf           # PDF (default)
nlm download slide-deck <nb-id> --output slides.pptx --format pptx  # PPTX
nlm download quiz <nb-id> --output quiz.json --format json

# Export to Google Docs/Sheets
nlm export sheets <nb-id> <artifact-id> --title "My Data Table"
nlm export docs <nb-id> <artifact-id> --title "My Report"

# Delete artifact
nlm studio delete <nb-id> <artifact-id> --confirm

Status values: completed (✓), in_progress (●), failed (✗)

Prompt Extraction: The studio_status tool returns a custom_instructions field for each artifact. This contains the original focus prompt or custom instructions used to generate that artifact (e.g., the prompt for a "Create Your Own" report, or the focus topic for an Audio Overview). This is useful for retrieving the exact prompt that generated a successful artifact.
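A generate → poll → download sequence can be sketched as below. The `nlm` function is a stub standing in for the real binary, and it collapses the status output to a single word; a real script would parse the --json output instead:

```shell
# Stub for the real binary; here every status check reports completion
nlm() { echo "completed"; }

NB="my-notebook"
# Kick off generation, then poll until the artifact finishes
nlm audio create "$NB" --confirm >/dev/null
until [ "$(nlm studio status "$NB" --json)" = "completed" ]; do
  sleep 5   # real scripts should wait a few seconds between polls
done
nlm download audio "$NB" --output podcast.mp3 >/dev/null
echo "podcast ready"
```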

Renaming Resources

Rename a Source

MCP Tool: source_rename(notebook_id, source_id, new_title)

CLI:

nlm source rename <source-id> "New Title" --notebook <notebook-id>
nlm rename source <source-id> "New Title" --notebook <notebook-id>  # verb-first

Rename a Studio Artifact

MCP Tools

Use studio_status with action="rename", artifact_id, and new_title.

CLI Commands

nlm studio rename <artifact-id> "New Title"
nlm rename studio <artifact-id> "New Title"  # verb-first alternative

Server Info (Version Check)

MCP Tools

Use server_info to get version and check for updates:

mcp__notebooklm-mcp__server_info()
# Returns: version, latest_version, update_available, update_command

CLI Commands

nlm --version  # Shows version and update availability

7. Chat Configuration and Notes

MCP Tools

Use chat_configure with goal: default/learning_guide/custom. Use note with action: create/list/update/delete. Delete requires confirm=True.

CLI Commands

⚠️ AI TOOLS: DO NOT USE nlm chat start - it launches an interactive REPL that cannot be controlled programmatically. Use nlm notebook query for one-shot Q&A instead.

For human users at a terminal:

nlm chat start <nb-id>  # Launch interactive REPL

REPL Commands:

- /sources - List available sources
- /clear - Reset conversation context
- /help - Show commands
- /exit - Exit REPL

Configure chat behavior (works for both REPL and query):

nlm chat configure <id> --goal default
nlm chat configure <id> --goal learning_guide
nlm chat configure <id> --goal custom --prompt "Act as a tutor..."
nlm chat configure &#x3C;id> --response-length longer  # longer, default, shorter

Notes management:

nlm note create <nb-id> "Content" --title "Title"
nlm note list <nb-id>
nlm note update <nb-id> <note-id> --content "New content"
nlm note delete <nb-id> <note-id> --confirm

8. Notebook Sharing

MCP Tools

Use notebook_share_status to check, notebook_share_public to enable/disable the public link, and notebook_share_invite with email and role: viewer/editor.

CLI Commands

# Check sharing status
nlm share status <nb-id>

# Enable/disable public link
nlm share public <nb-id>          # Enable
nlm share public <nb-id> --off    # Disable

# Invite collaborator
nlm share invite <nb-id> user@example.com
nlm share invite <nb-id> user@example.com --role editor

9. Aliases (UUID Shortcuts)

Simplify long UUIDs:

nlm alias set myproject abc123-def456...  # Create alias (auto-detects type)
nlm alias get myproject                    # Resolve to UUID
nlm alias list                             # List all aliases
nlm alias delete myproject                 # Remove alias

# Use aliases anywhere
nlm notebook get myproject
nlm source list myproject
nlm audio create myproject --confirm

10. Configuration

CLI-only commands for managing settings:

nlm config show                              # Show current config
nlm config get <key>                         # Get specific setting
nlm config set <key> <value>                 # Update setting
nlm config set output.format json            # Change default output

# For switching profiles, prefer the simpler command:
nlm login switch work                        # Switch default profile

Available Settings:

| Key | Default | Description |
|---|---|---|
| output.format | table | Default output format (table, json) |
| output.color | true | Enable colored output |
| output.short_ids | true | Show shortened IDs |
| auth.browser | auto | Preferred browser for login (auto, chrome, arc, brave, edge, chromium, vivaldi, opera) |
| auth.default_profile | default | Profile to use when --profile not specified |

11. Skill Management

Manage the NotebookLM skill installation for various AI assistants:

nlm skill list                              # Show installation status
nlm skill update                            # Update all outdated skills
nlm skill update <tool>                     # Update specific skill (e.g., claude-code)
nlm skill install <tool>                    # Install skill
nlm skill uninstall <tool>                  # Uninstall skill

Verb-first aliases: nlm update skill, nlm list skills, nlm install skill

Output Formats

Most list commands support multiple formats:

| Flag | Description |
|---|---|
| (none) | Rich table (human-readable) |
| --json | JSON output (for parsing) |
| --quiet | IDs only (for piping) |
| --title | "ID: Title" format |
| --url | "ID: URL" format (sources only) |
| --full | All columns/details |

12. Batch Operations

Perform the same action across multiple notebooks at once.

MCP Tools

Use batch with an action parameter. Select notebooks by notebook_names, tags, or all=True.

batch(action="query", query="What are the key findings?", notebook_names="AI Research, Dev Tools")
batch(action="add_source", source_url="https://example.com", tags="ai,research")
batch(action="create", titles="Project A, Project B, Project C")
batch(action="delete", notebook_names="Old Project", confirm=True)
batch(action="studio", artifact_type="audio", tags="research", confirm=True)

CLI Commands

nlm batch query "What are the key takeaways?" --notebooks "id1,id2"
nlm batch query "Summarize" --tags "ai,research"      # Query by tag
nlm batch query "Summarize" --all                      # Query ALL notebooks
nlm batch add-source --url "https://..." --notebooks "id1,id2"
nlm batch create "Project A, Project B, Project C"     # Create multiple
nlm batch delete --notebooks "id1,id2" --confirm       # Delete multiple
nlm batch studio --type audio --tags "research" --confirm  # Generate across notebooks

13. Cross-Notebook Query

Query multiple notebooks and get aggregated answers with per-notebook citations.

MCP Tools

cross_notebook_query(query="Compare approaches", notebook_names="Notebook A, Notebook B")
cross_notebook_query(query="Summarize", tags="ai,research")
cross_notebook_query(query="Everything", all=True)

CLI Commands

nlm cross query "What features are discussed?" --notebooks "id1,id2"
nlm cross query "Compare approaches" --tags "ai,research"
nlm cross query "Summarize everything" --all

14. Pipelines

Define and execute multi-step notebook workflows. Three built-in pipelines plus support for custom YAML pipelines.

MCP Tools

pipeline(action="list")  # List available pipelines
pipeline(action="run", notebook_id="...", pipeline_name="ingest-and-podcast", input_url="https://...")

CLI Commands

nlm pipeline list                                         # List available pipelines
nlm pipeline run <notebook> ingest-and-podcast --url "https://..."
nlm pipeline run <notebook> research-and-report --url "https://..."
nlm pipeline run <notebook> multi-format                  # Audio + report + flashcards

Built-in pipelines: ingest-and-podcast, research-and-report, multi-format

Create custom pipelines: add YAML files to ~/.notebooklm-mcp-cli/pipelines/

15. Tags &#x26; Smart Select

Tag notebooks for organization and use tags to target batch operations.

MCP Tools

tag(action="add", notebook_id="...", tags="ai,research,llm")
tag(action="remove", notebook_id="...", tags="ai")
tag(action="list")                           # List all tagged notebooks
tag(action="select", query="ai research")    # Find notebooks by tag match

CLI Commands

nlm tag add <notebook> --tags "ai,research,llm"           # Add tags
nlm tag add <notebook> --tags "ai" --title "My Notebook"  # With display title
nlm tag remove <notebook> --tags "ai"                     # Remove tags
nlm tag list                                              # List all tagged notebooks
nlm tag select "ai research"                              # Find notebooks by tag match

Common Patterns

Pattern 1: Research → Podcast Pipeline

nlm notebook create "AI Research 2026"   # Capture ID
nlm alias set ai <notebook-id>
nlm research start "agentic AI trends" --notebook-id ai --mode deep
nlm research status ai --max-wait 300    # Wait up to 5 min
nlm research import ai <task-id>         # Import all sources
nlm audio create ai --format deep_dive --confirm
nlm studio status ai                     # Check generation progress

Pattern 2: Quick Content Ingestion

nlm source add <id> --url "https://example1.com"
nlm source add <id> --url "https://example2.com"
nlm source add <id> --text "My notes..." --title "Notes"
nlm source list <id>

Pattern 3: Study Materials Generation

nlm report create <id> --format "Study Guide" --confirm
nlm quiz create <id> --count 10 --difficulty 3 --focus "Exam prep" --confirm
nlm flashcards create <id> --difficulty medium --focus "Core terms" --confirm

Pattern 4: Drive Document Workflow

nlm source add <id> --drive 1KQH3eW0hMBp7WK... --type slides
# ... time passes, document is edited ...
nlm source stale <id>                    # Check freshness
nlm source sync <id> --confirm           # Sync if stale

Pattern 5: Batch &#x26; Cross-Notebook Workflow

# Tag notebooks for organization
nlm tag add <id1> --tags "ai,research"
nlm tag add <id2> --tags "ai,product"

# Query across tagged notebooks
nlm cross query "What are the main conclusions?" --tags "ai"

# Batch generate podcasts for all tagged notebooks
nlm batch studio --type audio --tags "ai" --confirm

# Run a pipeline on a single notebook
nlm pipeline run <id> ingest-and-podcast --url "https://example.com"

Error Recovery

| Error | Cause | Solution |
|---|---|---|
| "Cookies have expired" | Session timeout | nlm login |
| "authentication may have expired" | Session timeout | nlm login |
| "Notebook not found" | Invalid ID | nlm notebook list |
| "Source not found" | Invalid ID | nlm source list <nb-id> |
| "Rate limit exceeded" | Too many calls | Wait 30s, retry |
| "Research already in progress" | Pending research | Use --force or import first |
| Browser doesn't launch | Port conflict | Close browser, retry |
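The expired-session row suggests a generic retry wrapper: run the command, and on failure re-login once and retry. A self-contained sketch with `nlm` stubbed by a function (the failure behavior simulated here is an assumption about the real binary):

```shell
# Stub: the first data call fails with an auth error; `nlm login` fixes it
AUTHED=0
nlm() {
  if [ "$1" = "login" ]; then AUTHED=1; return 0; fi
  if [ "$AUTHED" -eq 0 ]; then
    echo "Cookies have expired" >&2
    return 1
  fi
  echo "ok"
}

# Wrapper: on failure, re-authenticate once and retry the same command
run_nlm() {
  nlm "$@" 2>/dev/null || { nlm login; nlm "$@"; }
}

RESULT=$(run_nlm notebook list)
echo "$RESULT"
```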

Rate Limiting

Wait between operations to avoid rate limits:

- Source operations: 2 seconds

- Content generation: 5 seconds

- Research operations: 2 seconds

- Query operations: 2 seconds
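Those pacing guidelines translate into a simple throttled loop. Sketch below with a stubbed `nlm` function so it runs standalone (the sleep is shortened here; use the delays listed above in practice):

```shell
nlm() { echo "added: $5"; }   # stub for the real binary

# Add several sources, pausing between calls to stay under rate limits
for url in "https://a.example" "https://b.example" "https://c.example"; do
  nlm source add my-notebook --url "$url"
  sleep 1   # use 2 seconds for real source operations, per the guidance above
done
```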

Advanced Reference

For detailed information, see:

- references/command_reference.md: Complete command signatures

- references/troubleshooting.md: Detailed error handling

- references/workflows.md: End-to-end task sequences
