# token-usage-tracker
## Quick start
- Configure defaults in `skill-config.json` (`timezone`, `log_folder`).
- Install the scripts (examples are provided) into your workspace and wire the interceptor into your message pipeline.
- Use `scripts/context_summarizer.py` before sending large contexts to reduce token usage.
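The pre-send summarization step above can be sketched roughly as follows. The actual `scripts/context_summarizer.py` API is not documented in this README, so `summarize()` and `rough_token_count()` below are hypothetical stand-ins that illustrate the idea of trimming a context to a token budget:

```python
def rough_token_count(text: str) -> int:
    """Approximate token count as whitespace-separated words (a common heuristic)."""
    return len(text.split())

def summarize(text: str, target_tokens: int = 200) -> str:
    """Naive stand-in 'summary': keep only the first target_tokens words.

    A real summarizer would condense meaning rather than truncate.
    """
    words = text.split()
    if len(words) <= target_tokens:
        return text
    return " ".join(words[:target_tokens]) + " ..."

# Trim a large context before sending it to the model.
context = "word " * 1000
payload = summarize(context, target_tokens=50)
print(rough_token_count(payload))
```

The point of the pattern is that the payload size is bounded by the configured budget regardless of how large the original context was.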
## What this skill provides
- Logging: `token_tracker.py` writes per-call token usage to a JSONL log, with timestamp normalization.
- Interceptor: `token_interceptor.py` is an example that normalizes timestamps and forwards sanitized messages to the tracker.
- Alerts: `token_alerts.py` is an example of threshold-based alerts (no external posting by default).
- Compression: `context_summarizer.py` produces short summaries to reduce token payloads.
- Utilities: migration and cleanup scripts (convert timestamps, dedupe log entries).
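The logging pattern described above (one JSON object per call, appended to a JSONL file, with timestamps normalized to UTC) can be sketched as follows. The field names `ts`, `model`, `input_tokens`, and `output_tokens` are assumptions for illustration, not the actual `token_tracker.py` schema:

```python
import json
from datetime import datetime, timezone
from pathlib import Path

def log_usage(log_path: Path, model: str, input_tokens: int, output_tokens: int) -> None:
    """Append one usage record per call to a JSONL log file."""
    entry = {
        # Normalize the timestamp to timezone-aware UTC, ISO-8601 formatted.
        "ts": datetime.now(timezone.utc).isoformat(),
        "model": model,
        "input_tokens": input_tokens,
        "output_tokens": output_tokens,
    }
    with log_path.open("a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")

log_file = Path("usage.jsonl")
log_usage(log_file, "example-model", 1200, 350)
```

Append-only JSONL keeps each call's record self-contained, which is what makes the dedupe and timestamp-migration utilities straightforward to run over the log later.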
## When to use
- Use this skill when you want transparent per-call token accounting, need to keep token usage low, or want to protect sensitive or verbose contexts by summarizing them before they are sent to the model.
## Files
- scripts/
  - token_interceptor.py — example interceptor (normalizes timestamps)
  - token_tracker.py — logging helper
  - token_alerts.py — alert examples
  - context_summarizer.py — compression helper
  - migrate_timestamps.py — migration utility
  - dedupe_log.py — dedupe utility
- references/
  - examples/systemd/ — example unit files (install manually)
- skill-config.json — configurable defaults
- README.md — usage and install notes
## Configuration
See `skill-config.json` for defaults. The skill exposes:

- `timezone`: default `UTC`
- `log_folder`: default `./skills/logs` (relative to the OpenClaw workspace)
- compression settings: `summary_target_tokens` and `max_context_tokens`
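A possible shape for `skill-config.json`, built from the keys listed above. The exact schema is not shown in this README, and the numeric values here are illustrative defaults, not the skill's actual ones:

```json
{
  "timezone": "UTC",
  "log_folder": "./skills/logs",
  "summary_target_tokens": 200,
  "max_context_tokens": 4000
}
```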
## Security and installation
- The scripts are examples and safe by default. They do not change system state or install services automatically.
- Example systemd unit files are provided in `references/examples/systemd/`; apply them manually after review.
License: MIT (adapt as you prefer)