LLM Usage¶
Compitum plays nicely with LLMs by emitting structured JSON certificates and by keeping core logic auditable. To give an LLM rich context about the repo without noise, use the lean snapshot.
Quick: Use the Snapshot¶
Download or point models to:
https://compitum.space/docs/repo_snapshot.jsonl

Recommended note: “Use JSONL lines with `type`, `path`, `content`. Build an index by `path` and cite `path:line` in answers.”
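The note above can be sketched in a few lines of Python. This is a minimal illustration, assuming each JSONL record is an object with `type`, `path`, and `content` keys and that file records carry `type == "file"` (the exact `type` values are an assumption, not documented here):

```python
import json


def load_snapshot_index(path: str) -> dict[str, str]:
    """Build a path -> content index from a JSONL snapshot.

    Assumes one JSON object per line with "type", "path", and
    "content" keys, as the recommended note describes.
    """
    index: dict[str, str] = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            record = json.loads(line)
            # Assumption: "file" marks file records in the snapshot.
            if record.get("type") == "file":
                index[record["path"]] = record["content"]
    return index


def cite(index: dict[str, str], path: str, line_no: int) -> str:
    """Return a "path:line" citation plus the cited line's text."""
    line = index[path].splitlines()[line_no - 1]
    return f"{path}:{line_no}  {line}"
```

An LLM (or a retrieval wrapper around one) can then answer with `cite(index, "src/compitum/router.py", 42)`-style references instead of unanchored prose.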
Snapshot with Paleae¶
# In the repo root
curl -fsSL https://raw.githubusercontent.com/PaulTiffany/paleae/main/paleae.py -o paleae.py
python paleae.py --profile ai_optimized \
--include "^src/compitum/" \
--include "^(README|PHILOSOPHY|ACCESSIBILITY|SECURITY|CONTRIBUTING|SUPPORT|CHANGELOG)\.md$" \
--include "^docs/(Getting-Started|CLI|Invariants|Philosophy|API-Reference|LLM-Usage)\.md$"
Tip: Add exclusions to .paleaeignore to skip caches/artifacts: .venv*, __pycache__/, .pytest_cache/, .ruff_cache/, .mypy_cache/, .hypothesis/, data/, artifacts/, reports/.
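Collected into a file, the exclusions above would look something like this (pattern syntax assumed to be gitignore-like, which is an assumption about paleae's matcher):

```text
# .paleaeignore — skip caches and generated artifacts
.venv*
__pycache__/
.pytest_cache/
.ruff_cache/
.mypy_cache/
.hypothesis/
data/
artifacts/
reports/
```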
Useful Entry Points for LLMs¶
- `src/compitum/router.py:1` — routing loop and certificate
- `src/compitum/energy.py:1` — utility components, invariants
- `src/compitum/constraints.py:1` — feasibility and shadow prices
- `src/compitum/boundary.py:1` — tie/boundary diagnostics
- `src/compitum/control.py:1` — trust region and drift
- `PHILOSOPHY.md:1` — instantaneous RL, geometry, constraints
- `ACCESSIBILITY.md:1` — a11y standards for docs and outputs
Traces for Verification¶
compitum route --prompt "..." --trace > trace.json
Traces include `utility_components`, `constraints`, `boundary_analysis`, and `drift_status`, so an LLM can reason from mechanistic signals instead of prose.
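A quick sanity check over those fields might look like this. The field names come from the docs above; the nesting of `trace.json` is otherwise an assumption, so this sketch only reports which top-level signals are present:

```python
import json

# Field names taken from the trace description above; whether they sit
# at the top level of trace.json is an assumption of this sketch.
EXPECTED_KEYS = ("utility_components", "constraints", "boundary_analysis", "drift_status")


def summarize_trace(trace: dict) -> dict[str, bool]:
    """Report which mechanistic signals a trace carries."""
    return {key: key in trace for key in EXPECTED_KEYS}


if __name__ == "__main__":
    with open("trace.json", encoding="utf-8") as f:
        print(summarize_trace(json.load(f)))
```

Feeding the summary (or the raw trace) to an LLM lets it ground its explanation of a routing decision in the recorded signals.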
Build a Size-Budgeted Context Pack¶
Use the helper to create a compact snapshot that includes .paleaeignore and core code/docs:
python tools/build_llm_context_pack.py --out compitum_context.jsonl --target-size 1MB
Adjust --target-size (e.g., 800KB, 2MB). The script ensures .paleaeignore is present and trims low-priority files if needed.
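The trimming idea behind a size budget can be sketched greedily: keep the highest-priority files until the budget is exhausted. This is an illustration of the concept, not the actual logic of `tools/build_llm_context_pack.py`; the priority scheme below is hypothetical:

```python
def pack_within_budget(files: list[tuple[str, int, int]], budget: int) -> list[str]:
    """Greedy sketch of a size-budgeted context pack.

    `files` holds (path, priority, size_bytes) tuples, where a lower
    priority number means more important. Files are taken in priority
    order and skipped once they would overflow the byte budget.
    """
    chosen: list[str] = []
    used = 0
    for path, _priority, size in sorted(files, key=lambda f: f[1]):
        if used + size <= budget:
            chosen.append(path)
            used += size
    return chosen
```

For example, with a 1 MB budget, core docs and `src/compitum/` modules would be packed first and bulky low-priority artifacts dropped.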