HeadlinesBriefing.com

Study Finds Precise AGENTS.md Patterns Boost AI Coding Agents

Hacker News

Engineers working in a large monorepo collected dozens of AGENTS.md files and ran them through their internal AuggieBench suite. Comparing agent‑generated pull requests against golden PRs showed that the best files lifted code‑generation quality as much as upgrading from a Haiku model to an Opus model, while the worst dragged performance below a baseline with no documentation at all. That variance prompted a systematic analysis.

Researchers identified seven documentation patterns that helped agents. Progressive disclosure—keeping the main AGENTS.md to 100–150 lines and offloading details to focused reference files—added 10–15% across all metrics for modules of 100 core files. Procedural, numbered workflows cut missing‑file errors from 40% to 10% and raised correctness by 25%. Decision tables that pair each “don’t” with a concrete “do” improved best‑practice scores by a quarter.
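A minimal sketch of what an AGENTS.md combining these patterns might look like; the file names, commands, and line budget below are illustrative assumptions, not examples from the study:

```markdown
# AGENTS.md  (target: ~100–150 lines; details live in linked reference files)

## Build and test (numbered workflow)
1. Install dependencies: `npm install`
2. Run unit tests before any change: `npm test`
3. Lint and type-check: `npm run lint`

## Reference files (progressive disclosure)
- Architecture overview: `docs/architecture.md`
- API conventions: `docs/api-style.md`
- Release process: `docs/release.md`
```

The point of the structure is that the top-level file stays short enough to fit in context, while the reference files are only loaded when the agent's task actually needs them.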

Over‑documentation proved harmful: AGENTS.md files surrounded by hundreds of kilobytes of specs caused agents to wander through irrelevant pages, a phenomenon the team calls “context rot.” Likewise, long lists of warnings without paired solutions made the model overly cautious and slowed progress. Keeping docs modular and pairing every prohibition with an explicit alternative yields the most reliable agent output.
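A decision table that pairs each prohibition with a concrete alternative might look like the following; the specific rules are invented for illustration and not drawn from the study:

```markdown
| Don't                                   | Do instead                                    |
|-----------------------------------------|-----------------------------------------------|
| Don't edit generated files under `gen/` | Change the source schema and re-run codegen   |
| Don't query the database directly       | Go through the repository layer               |
| Don't hardcode config values            | Read them from the environment or config file |
```

Each row gives the agent an actionable path forward, rather than a warning that only tells it what to avoid.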