pytest-best-practices

Expert guidance for writing high-quality pytest tests. Use when writing tests, setting up fixtures, parametrizing, mocking, or reviewing test code.

Safety Notice

This listing is imported from the skills.sh public index metadata. Review the upstream SKILL.md and the repository's scripts before running anything.

Installation

Install the skill with:

```bash
npx skills add cfircoo/claude-code-toolkit/cfircoo-claude-code-toolkit-pytest-best-practices
```

<objective> Provide pytest best practices and patterns for writing maintainable, efficient tests. </objective>

<essential_principles>

Test Independence

  • Each test must run in isolation - no shared state between tests
  • Use fixtures for setup/teardown, never class-level mutable state
  • Tests should pass regardless of execution order
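A minimal sketch of fixture-based isolation; the `user_registry` name and its contents are illustrative:

```python
import pytest

def make_registry():
    # Illustrative setup helper: builds a fresh registry for each test.
    return {"alice": "admin"}

@pytest.fixture
def user_registry():
    registry = make_registry()
    yield registry      # value injected into the test
    registry.clear()    # teardown runs even if the test fails

def test_add_user(user_registry):
    user_registry["bob"] = "viewer"
    assert len(user_registry) == 2

def test_default_users(user_registry):
    # Passes regardless of whether test_add_user ran first:
    # each test receives its own fresh registry.
    assert user_registry == {"alice": "admin"}
```

Because every test gets a new registry from the fixture, the two tests above pass in any order.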

Naming Conventions

  • Files: test_*.py or *_test.py
  • Functions: test_<description>()
  • Classes: Test<ClassName>
  • Fixtures: descriptive lowercase_with_underscores

Directory Structure

tests/
├── conftest.py          # Shared fixtures
├── unit/
│   └── test_module.py
├── integration/
│   └── test_api.py
└── fixtures/            # Test data files
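Fixtures defined in `conftest.py` are discovered automatically, with no import needed in the test modules. A sketch with a hypothetical session-scoped config fixture:

```python
# tests/conftest.py -- fixtures defined here are visible to every test
# module in this directory and below.
import pytest

def load_config():
    # Illustrative: pretend this loads settings for the test run.
    return {"db_url": "sqlite:///:memory:", "debug": True}

@pytest.fixture(scope="session")
def app_config():
    # Built once per test session and shared across tests.
    return load_config()

# tests/unit/test_module.py -- requests the fixture by parameter name.
def test_debug_enabled(app_config):
    assert app_config["debug"] is True
```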

Core Testing Rules

  • Use plain assert statements (pytest provides detailed failure messages)
  • One logical assertion per test when practical
  • Test edge cases: empty inputs, boundaries, invalid data, errors
  • Keep tests focused and readable
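The rules above can be sketched with a hypothetical `parse_age` helper: plain `assert` statements, one logical check per test, and explicit edge cases (boundary, negative, malformed input):

```python
import pytest

def parse_age(raw: str) -> int:
    # Hypothetical function under test.
    value = int(raw)
    if value < 0:
        raise ValueError("age cannot be negative")
    return value

def test_parse_age_valid():
    assert parse_age("42") == 42

def test_parse_age_boundary():
    assert parse_age("0") == 0  # boundary value

def test_parse_age_negative():
    with pytest.raises(ValueError):
        parse_age("-1")

def test_parse_age_invalid():
    with pytest.raises(ValueError):
        parse_age("not a number")  # int() raises ValueError
```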

</essential_principles>

<quick_reference>

| Pattern | Use Case |
| --- | --- |
| `@pytest.fixture` | Setup/teardown, dependency injection |
| `@pytest.mark.parametrize` | Run a test with multiple inputs |
| `@pytest.mark.skip` | Skip a test temporarily |
| `@pytest.mark.xfail` | Expected failure (known bug) |
| `pytest.raises(Exception)` | Test exception raising |
| `pytest.approx(value)` | Float comparison |
| `mocker.patch()` | Mock dependencies (pytest-mock) |
| `conftest.py` | Share fixtures across modules |
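Several of these patterns combined in one sketch, using a hypothetical `average` function: `parametrize` supplies multiple inputs, `approx` absorbs float rounding error, and `raises` covers the empty-input edge case:

```python
import pytest

def average(values):
    # Hypothetical function under test.
    if not values:
        raise ValueError("empty input")
    return sum(values) / len(values)

@pytest.mark.parametrize("values, expected", [
    ([1, 2, 3], 2.0),
    ([0.1, 0.2], 0.15),   # exact float equality would fail here
    ([5], 5.0),
])
def test_average(values, expected):
    assert average(values) == pytest.approx(expected)

def test_average_empty():
    with pytest.raises(ValueError):
        average([])
```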

Common Commands

pytest -v                    # Verbose
pytest -x                    # Stop on first failure
pytest --lf                  # Run last failed
pytest -k "pattern"          # Match test names
pytest -m "marker"           # Run marked tests
pytest --cov=src             # Coverage report

</quick_reference>

<routing>

Based on what you're doing, read the relevant reference:

| Task | Reference |
| --- | --- |
| Setting up fixtures, scopes, factories | references/fixtures.md |
| Parametrizing tests, multiple inputs | references/parametrization.md |
| Mocking, patching, faking dependencies | references/mocking.md |
| Markers, exceptions, assertions, async | references/patterns.md |

</routing>

<dependencies>

```bash
pip install pytest pytest-asyncio pytest-mock pytest-cov pytest-xdist
```

</dependencies>
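The `mocker` fixture from pytest-mock (installed above) is a thin wrapper over the standard library's `unittest.mock` that undoes patches automatically after each test. A sketch of the same idea with stdlib `mock.patch`, using a hypothetical network-bound `fetch_price` dependency:

```python
from unittest import mock

def fetch_price(symbol: str) -> float:
    # Hypothetical dependency: imagine this makes a real HTTP call.
    raise RuntimeError("network access is not available in tests")

def latest_quote(symbol: str) -> str:
    return f"{symbol}: {fetch_price(symbol):.2f}"

def test_latest_quote():
    # pytest-mock equivalent:
    #   mocker.patch(__name__ + ".fetch_price", return_value=101.5)
    # Patch where the name is looked up, not where it is defined.
    with mock.patch(__name__ + ".fetch_price", return_value=101.5):
        assert latest_quote("ACME") == "ACME: 101.50"
```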
