python-patterns

Python Development Patterns

Safety Notice

This listing is imported from skills.sh public index metadata. Review upstream SKILL.md and repository scripts before running.

Copy this and send it to your AI assistant to learn

Install skill "python-patterns" with this command: npx skills add oldwinter/skills/oldwinter-skills-python-patterns


Idiomatic Python patterns and best practices for building robust, efficient, and maintainable applications.

When to Activate

  • Writing new Python code

  • Reviewing Python code

  • Refactoring existing Python code

  • Designing Python packages/modules

Core Principles

  1. Readability Counts

Python prioritizes readability. Code should be obvious and easy to understand.

Good: Clear and readable

def get_active_users(users: list[User]) -> list[User]:
    """Return only active users from the provided list."""
    return [user for user in users if user.is_active]

Bad: Clever but confusing

def get_active_users(u):
    return [x for x in u if x.a]

  2. Explicit is Better Than Implicit

Avoid magic; be clear about what your code does.

Good: Explicit configuration

import logging

logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
)

Bad: Hidden side effects

import some_module

some_module.setup()  # What does this do?

  3. EAFP - Easier to Ask Forgiveness Than Permission

Python prefers exception handling over checking conditions.

Good: EAFP style

from typing import Any

def get_value(dictionary: dict, key: str, default: Any = None) -> Any:
    try:
        return dictionary[key]
    except KeyError:
        return default

Bad: LBYL (Look Before You Leap) style

def get_value(dictionary: dict, key: str, default: Any = None) -> Any:
    if key in dictionary:
        return dictionary[key]
    else:
        return default
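For this particular lookup, the standard library already packages the EAFP pattern: dict.get returns a default instead of raising KeyError. A minimal illustration (the config dictionary and keys are made up for the example):

```python
config = {"host": "localhost"}

# dict.get encapsulates the try/except KeyError dance in one call
host = config.get("host", "0.0.0.0")
port = config.get("port", 8080)  # missing key falls back to the default
```

Reach for a hand-written try/except only when the fallback itself is expensive to compute, since dict.get evaluates its default eagerly.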

Type Hints

Basic Type Annotations

from typing import Any, Dict, Optional

def process_user(
    user_id: str,
    data: Dict[str, Any],
    active: bool = True
) -> Optional[User]:
    """Process a user and return the updated User or None."""
    if not active:
        return None
    return User(user_id, data)

Modern Type Hints (Python 3.9+)

Python 3.9+ - Use built-in types

def process_items(items: list[str]) -> dict[str, int]:
    return {item: len(item) for item in items}

Python 3.8 and earlier - Use typing module

from typing import List, Dict

def process_items(items: List[str]) -> Dict[str, int]:
    return {item: len(item) for item in items}

Type Aliases and TypeVar

import json
from typing import Any, TypeVar, Union

Type alias for complex types

JSON = Union[dict[str, Any], list[Any], str, int, float, bool, None]

def parse_json(data: str) -> JSON:
    return json.loads(data)

Generic types

T = TypeVar('T')

def first(items: list[T]) -> T | None:
    """Return the first item or None if the list is empty."""
    return items[0] if items else None

Protocol-Based Duck Typing

from typing import Protocol

class Renderable(Protocol):
    def render(self) -> str:
        """Render the object to a string."""
        ...

def render_all(items: list[Renderable]) -> str:
    """Render all items that implement the Renderable protocol."""
    return "\n".join(item.render() for item in items)
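Any class whose methods match the protocol's signatures satisfies it structurally; no inheritance from the protocol is needed. A minimal sketch (the Heading class is illustrative, not part of the original example):

```python
from typing import Protocol

class Renderable(Protocol):
    def render(self) -> str: ...

class Heading:
    # Satisfies Renderable structurally, without subclassing it
    def __init__(self, text: str):
        self.text = text

    def render(self) -> str:
        return f"# {self.text}"

def render_all(items: list[Renderable]) -> str:
    return "\n".join(item.render() for item in items)

output = render_all([Heading("Intro"), Heading("Usage")])
```

A static type checker such as mypy will accept Heading wherever Renderable is expected, purely on the basis of its render signature.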

Error Handling Patterns

Specific Exception Handling

Good: Catch specific exceptions

def load_config(path: str) -> Config:
    try:
        with open(path) as f:
            return Config.from_json(f.read())
    except FileNotFoundError as e:
        raise ConfigError(f"Config file not found: {path}") from e
    except json.JSONDecodeError as e:
        raise ConfigError(f"Invalid JSON in config: {path}") from e

Bad: Bare except

def load_config(path: str) -> Config:
    try:
        with open(path) as f:
            return Config.from_json(f.read())
    except:
        return None  # Silent failure!

Exception Chaining

def process_data(data: str) -> Result:
    try:
        parsed = json.loads(data)
    except json.JSONDecodeError as e:
        # Chain exceptions to preserve the original traceback
        raise ValueError(f"Failed to parse data: {data}") from e
    return Result(parsed)

Custom Exception Hierarchy

class AppError(Exception):
    """Base exception for all application errors."""

class ValidationError(AppError):
    """Raised when input validation fails."""

class NotFoundError(AppError):
    """Raised when a requested resource is not found."""

Usage

def get_user(user_id: str) -> User:
    user = db.find_user(user_id)
    if not user:
        raise NotFoundError(f"User not found: {user_id}")
    return user

Context Managers

Resource Management

Good: Using context managers

def process_file(path: str) -> str:
    with open(path, 'r') as f:
        return f.read()

Bad: Manual resource management

def process_file(path: str) -> str:
    f = open(path, 'r')
    try:
        return f.read()
    finally:
        f.close()

Custom Context Managers

import time
from contextlib import contextmanager

@contextmanager
def timer(name: str):
    """Context manager to time a block of code."""
    start = time.perf_counter()
    yield
    elapsed = time.perf_counter() - start
    print(f"{name} took {elapsed:.4f} seconds")

Usage

with timer("data processing"):
    process_large_dataset()

Context Manager Classes

class DatabaseTransaction:
    def __init__(self, connection):
        self.connection = connection

    def __enter__(self):
        self.connection.begin_transaction()
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        if exc_type is None:
            self.connection.commit()
        else:
            self.connection.rollback()
        return False  # Don't suppress exceptions

Usage

with DatabaseTransaction(conn):
    user = conn.create_user(user_data)
    conn.create_profile(user.id, profile_data)

Comprehensions and Generators

List Comprehensions

Good: List comprehension for simple transformations

names = [user.name for user in users if user.is_active]

Bad: Manual loop

names = []
for user in users:
    if user.is_active:
        names.append(user.name)

Complex comprehensions should be expanded

Bad: Too complex

result = [x * 2 for x in items if x > 0 if x % 2 == 0]

Good: Use a generator function

from collections.abc import Iterable

def filter_and_transform(items: Iterable[int]) -> list[int]:
    result = []
    for x in items:
        if x > 0 and x % 2 == 0:
            result.append(x * 2)
    return result

Generator Expressions

Good: Generator for lazy evaluation

total = sum(x * x for x in range(1_000_000))

Bad: Creates large intermediate list

total = sum([x * x for x in range(1_000_000)])

Generator Functions

from collections.abc import Iterator

def read_large_file(path: str) -> Iterator[str]:
    """Read a large file line by line."""
    with open(path) as f:
        for line in f:
            yield line.strip()

Usage

for line in read_large_file("huge.txt"):
    process(line)

Data Classes and Named Tuples

Data Classes

from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class User:
    """User entity with automatic __init__, __repr__, and __eq__."""
    id: str
    name: str
    email: str
    created_at: datetime = field(default_factory=datetime.now)
    is_active: bool = True

Usage

user = User(
    id="123",
    name="Alice",
    email="alice@example.com"
)

Data Classes with Validation

@dataclass
class User:
    email: str
    age: int

    def __post_init__(self):
        # Validate email format
        if "@" not in self.email:
            raise ValueError(f"Invalid email: {self.email}")
        # Validate age range
        if self.age < 0 or self.age > 150:
            raise ValueError(f"Invalid age: {self.age}")
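Validation runs automatically at construction time, because the generated __init__ calls __post_init__ after assigning the fields:

```python
from dataclasses import dataclass

@dataclass
class User:
    email: str
    age: int

    def __post_init__(self):
        # Validation runs on every construction, including replace()
        if "@" not in self.email:
            raise ValueError(f"Invalid email: {self.email}")
        if self.age < 0 or self.age > 150:
            raise ValueError(f"Invalid age: {self.age}")

user = User(email="alice@example.com", age=30)  # passes validation

try:
    User(email="not-an-email", age=30)
except ValueError as e:
    error = str(e)
```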

Named Tuples

from typing import NamedTuple

class Point(NamedTuple):
    """Immutable 2D point."""
    x: float
    y: float

    def distance(self, other: 'Point') -> float:
        return ((self.x - other.x) ** 2 + (self.y - other.y) ** 2) ** 0.5

Usage

p1 = Point(0, 0)
p2 = Point(3, 4)
print(p1.distance(p2))  # 5.0

Decorators

Function Decorators

import functools
import time
from collections.abc import Callable

def timer(func: Callable) -> Callable:
    """Decorator to time function execution."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"{func.__name__} took {elapsed:.4f}s")
        return result
    return wrapper

@timer
def slow_function():
    time.sleep(1)

slow_function()  # prints: slow_function took 1.0012s

Parameterized Decorators

def repeat(times: int):
    """Decorator to repeat a function multiple times."""
    def decorator(func: Callable) -> Callable:
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            results = []
            for _ in range(times):
                results.append(func(*args, **kwargs))
            return results
        return wrapper
    return decorator

@repeat(times=3)
def greet(name: str) -> str:
    return f"Hello, {name}!"

greet("Alice")  # returns ["Hello, Alice!", "Hello, Alice!", "Hello, Alice!"]

Class-Based Decorators

class CountCalls:
    """Decorator that counts how many times a function is called."""
    def __init__(self, func: Callable):
        functools.update_wrapper(self, func)
        self.func = func
        self.count = 0

    def __call__(self, *args, **kwargs):
        self.count += 1
        print(f"{self.func.__name__} has been called {self.count} times")
        return self.func(*args, **kwargs)

@CountCalls
def process():
    pass

Each call to process() prints the call count

Concurrency Patterns

Threading for I/O-Bound Tasks

import concurrent.futures

def fetch_url(url: str) -> str:
    """Fetch a URL (I/O-bound operation)."""
    import urllib.request
    with urllib.request.urlopen(url) as response:
        return response.read().decode()

def fetch_all_urls(urls: list[str]) -> dict[str, str]:
    """Fetch multiple URLs concurrently using threads."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=10) as executor:
        future_to_url = {executor.submit(fetch_url, url): url for url in urls}
        results = {}
        for future in concurrent.futures.as_completed(future_to_url):
            url = future_to_url[future]
            try:
                results[url] = future.result()
            except Exception as e:
                results[url] = f"Error: {e}"
    return results

Multiprocessing for CPU-Bound Tasks

def process_data(data: list[int]) -> int:
    """CPU-intensive computation."""
    return sum(x ** 2 for x in data)

def process_all(datasets: list[list[int]]) -> list[int]:
    """Process multiple datasets using multiple processes."""
    with concurrent.futures.ProcessPoolExecutor() as executor:
        results = list(executor.map(process_data, datasets))
    return results
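One caveat worth noting: on platforms that spawn worker processes (Windows, and macOS by default since Python 3.8), each worker re-imports the main module, so pool creation must sit behind the main-module guard or workers will recursively re-run it. A minimal sketch:

```python
import concurrent.futures

def process_data(data: list[int]) -> int:
    # CPU-bound work; must be defined at module top level so it pickles
    return sum(x ** 2 for x in data)

def process_all(datasets: list[list[int]]) -> list[int]:
    with concurrent.futures.ProcessPoolExecutor() as executor:
        return list(executor.map(process_data, datasets))

if __name__ == "__main__":
    # The guard keeps spawned workers from re-executing the pool setup
    print(process_all([[1, 2], [3, 4]]))
```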

Async/Await for Concurrent I/O

import asyncio

async def fetch_async(url: str) -> str:
    """Fetch a URL asynchronously."""
    import aiohttp
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            return await response.text()

async def fetch_all(urls: list[str]) -> dict[str, str]:
    """Fetch multiple URLs concurrently."""
    tasks = [fetch_async(url) for url in urls]
    results = await asyncio.gather(*tasks, return_exceptions=True)
    return dict(zip(urls, results))
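Outside an already-running event loop, asyncio.run is the standard entry point for a coroutine like fetch_all. A self-contained sketch with a simulated fetch (fetch_fake is a stand-in for the aiohttp call, not part of the original example):

```python
import asyncio

async def fetch_fake(url: str) -> str:
    # Simulated I/O; a real fetch would await a network library here
    await asyncio.sleep(0.01)
    return f"body of {url}"

async def fetch_all(urls: list[str]) -> dict[str, str]:
    # gather schedules all coroutines concurrently on the same loop
    results = await asyncio.gather(*(fetch_fake(u) for u in urls))
    return dict(zip(urls, results))

pages = asyncio.run(fetch_all(["a.example", "b.example"]))
```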

Package Organization

Standard Project Layout

myproject/
├── src/
│   └── mypackage/
│       ├── __init__.py
│       ├── main.py
│       ├── api/
│       │   ├── __init__.py
│       │   └── routes.py
│       ├── models/
│       │   ├── __init__.py
│       │   └── user.py
│       └── utils/
│           ├── __init__.py
│           └── helpers.py
├── tests/
│   ├── __init__.py
│   ├── conftest.py
│   ├── test_api.py
│   └── test_models.py
├── pyproject.toml
├── README.md
└── .gitignore

Import Conventions

Good: Import order - stdlib, third-party, local

import os
import sys
from pathlib import Path

import requests
from fastapi import FastAPI

from mypackage.models import User
from mypackage.utils import format_name

Good: Use isort for automatic import sorting

pip install isort

__init__.py for Package Exports

mypackage/__init__.py:

"""mypackage - A sample Python package."""

__version__ = "1.0.0"

Export main classes/functions at package level

from mypackage.models import User, Post
from mypackage.utils import format_name

__all__ = ["User", "Post", "format_name"]

Memory and Performance

Using __slots__ for Memory Efficiency

Bad: Regular class uses __dict__ (more memory)

class Point:
    def __init__(self, x: float, y: float):
        self.x = x
        self.y = y

Good: __slots__ reduces memory usage

class Point:
    __slots__ = ['x', 'y']

    def __init__(self, x: float, y: float):
        self.x = x
        self.y = y

Generator for Large Data

Bad: Returns full list in memory

def read_lines(path: str) -> list[str]:
    with open(path) as f:
        return [line.strip() for line in f]

Good: Yields lines one at a time

from collections.abc import Iterator

def read_lines(path: str) -> Iterator[str]:
    with open(path) as f:
        for line in f:
            yield line.strip()

Avoid String Concatenation in Loops

Bad: O(n²) due to string immutability

result = ""
for item in items:
    result += str(item)

Good: O(n) using join

result = "".join(str(item) for item in items)

Good: Using StringIO for building

from io import StringIO

buffer = StringIO()
for item in items:
    buffer.write(str(item))
result = buffer.getvalue()

Python Tooling Integration

Essential Commands

Code formatting

black .
isort .

Linting

ruff check .
pylint mypackage/

Type checking

mypy .

Testing

pytest --cov=mypackage --cov-report=html

Security scanning

bandit -r .

Dependency management

pip-audit
safety check

pyproject.toml Configuration

[project]
name = "mypackage"
version = "1.0.0"
requires-python = ">=3.9"
dependencies = [
    "requests>=2.31.0",
    "pydantic>=2.0.0",
]

[project.optional-dependencies]
dev = [
    "pytest>=7.4.0",
    "pytest-cov>=4.1.0",
    "black>=23.0.0",
    "ruff>=0.1.0",
    "mypy>=1.5.0",
]

[tool.black]
line-length = 88
target-version = ['py39']

[tool.ruff]
line-length = 88
select = ["E", "F", "I", "N", "W"]

[tool.mypy]
python_version = "3.9"
warn_return_any = true
warn_unused_configs = true
disallow_untyped_defs = true

[tool.pytest.ini_options]
testpaths = ["tests"]
addopts = "--cov=mypackage --cov-report=term-missing"

Quick Reference: Python Idioms

Idiom                Description
EAFP                 Easier to Ask Forgiveness than Permission
Context managers     Use with for resource management
List comprehensions  For simple transformations
Generators           For lazy evaluation and large datasets
Type hints           Annotate function signatures
Dataclasses          For data containers with auto-generated methods
__slots__            For memory optimization
f-strings            For string formatting (Python 3.6+)
pathlib.Path         For path operations (Python 3.4+)
enumerate            For index-element pairs in loops
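The last three idioms in the table appear nowhere else in this guide, so a few lines can show them together (the names and paths are illustrative):

```python
from pathlib import Path

# f-strings: expressions interpolated directly into the literal
name, count = "report", 3
label = f"{name} ({count} items)"

# pathlib.Path: composable, OS-aware path operations
config_path = Path("settings") / "app.toml"
suffix = config_path.suffix

# enumerate: index-element pairs without a manual counter
lines = ["alpha", "beta"]
numbered = [f"{i}: {line}" for i, line in enumerate(lines, start=1)]
```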

Anti-Patterns to Avoid

Bad: Mutable default arguments

def append_to(item, items=[]):
    items.append(item)
    return items

Good: Use None and create new list

def append_to(item, items=None):
    if items is None:
        items = []
    items.append(item)
    return items

Bad: Checking type with type()

if type(obj) == list:
    process(obj)

Good: Use isinstance

if isinstance(obj, list):
    process(obj)

Bad: Comparing to None with ==

if value == None:
    process()

Good: Use is

if value is None:
    process()

Bad: from module import *

from os.path import *

Good: Explicit imports

from os.path import join, exists

Bad: Bare except

try:
    risky_operation()
except:
    pass

Good: Specific exception

try:
    risky_operation()
except SpecificError as e:
    logger.error(f"Operation failed: {e}")

Remember: Python code should be readable, explicit, and follow the principle of least surprise. When in doubt, prioritize clarity over cleverness.

Source Transparency

This detail page is rendered from real SKILL.md content. Trust labels are metadata-based hints, not a safety guarantee.
