decoct
Compress infrastructure configuration for LLM context windows. decoct strips platform defaults, redacts secrets, and annotates deviations from your design standards, cutting token counts by 15–57% depending on platform and tier. It ships with 25 bundled schemas covering INI, YAML, and JSON input, and auto-detects Docker Compose, Kubernetes, Ansible, cloud-init, Terraform state, GitHub Actions, Traefik, and Prometheus.
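Auto-detection of this kind usually keys off characteristic top-level fields. The sketch below is a hypothetical illustration of the idea, not decoct's actual detection logic; the marker keys and the subset of formats shown are assumptions.

```python
# Hypothetical schema detection by characteristic top-level keys.
# Only four of the supported formats are sketched here; the marker
# keys are assumptions, not decoct's published rule set.
def detect_schema(doc: dict) -> str:
    """Guess the source platform of a parsed config document."""
    if {"apiVersion", "kind"} <= doc.keys():
        return "kubernetes"          # every K8s object carries these
    if "services" in doc and ("version" in doc or "networks" in doc):
        return "docker-compose"
    if "terraform_version" in doc and "resources" in doc:
        return "terraform-state"
    if "jobs" in doc and "on" in doc:
        return "github-actions"
    return "unknown"
```

Matching on parsed structure rather than file names means the same document can be detected whether it arrives as YAML, JSON, or an in-memory dict.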
The problem
Infrastructure operations increasingly involve feeding configuration into LLM context windows — for AI-assisted troubleshooting, agent-driven operations, architecture review, and code generation against live state.
But the data is full of noise. A typical Docker Compose file is packed with platform defaults the model already knows, system-managed metadata nobody asked about, and structural boilerplate. The things that actually matter — where your configuration deviates from your standards — are buried and indistinguishable from everything else.
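The core move, stripping values the platform would assume anyway, can be sketched in a few lines. The default values below are a hand-picked subset of what Docker Compose assumes for an omitted field; decoct's bundled schemas are not reproduced here.

```python
# A minimal sketch of default-stripping for one Docker Compose service.
# COMPOSE_DEFAULTS is an illustrative subset, not a complete schema.
COMPOSE_DEFAULTS = {
    "restart": "no",
    "network_mode": "bridge",
    "read_only": False,
    "privileged": False,
    "stdin_open": False,
    "tty": False,
}

_MISSING = object()  # sentinel: key has no known platform default

def strip_defaults(service: dict, defaults: dict = COMPOSE_DEFAULTS) -> dict:
    """Drop keys whose values match what the platform assumes anyway."""
    return {k: v for k, v in service.items()
            if defaults.get(k, _MISSING) != v}
```

Everything that survives the pass is, by construction, a value someone chose deliberately, which is exactly the signal worth spending tokens on.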
The result: context windows stuffed with low-value tokens, expensive to run, and missing the intent that would make them useful.
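Redaction is the other prerequisite: even high-value tokens cannot be shipped to a model while they contain credentials. A minimal sketch of key-based redaction follows; the pattern list is an assumption for illustration, not decoct's actual rule set.

```python
import re

# Hypothetical redaction pass: keys that look secret-bearing get their
# values replaced before anything reaches a context window.
SECRET_KEY = re.compile(r"(password|secret|token|api_?key)", re.IGNORECASE)

def redact(doc):
    """Recursively replace values under secret-looking keys."""
    if isinstance(doc, dict):
        return {
            k: "<redacted>" if SECRET_KEY.search(str(k)) else redact(v)
            for k, v in doc.items()
        }
    if isinstance(doc, list):
        return [redact(item) for item in doc]
    return doc  # scalars pass through unchanged
```

Redacting by key rather than by value shape errs on the side of over-redaction, which is the safe direction when the destination is a third-party API.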
The approach
To concoct is to build something up — mixing ingredients, adding complexity. Infrastructure configuration is concocted: layers of platform defaults, boilerplate, system-managed fields, and actual intent all mixed into one document.
To decoct is the opposite. It's an old term from chemistry meaning to extract the essence by boiling something down — simmering raw material until only the concentrated, useful compounds remain. It's also, we're aware, a slightly unusual word to say out loud in a professional setting — which is fine. Memorable is good.
More details coming soon.