7 min read · Jan 27, 2026
We expected skills to be the solution for teaching coding agents framework-specific knowledge. After building evals focused on Next.js 16 APIs, we found something unexpected.
A compressed 8KB docs index embedded directly in AGENTS.md achieved a 100% pass rate, while skills maxed out at 79% even with explicit instructions telling the agent to use them. Without those instructions, skills performed no better than having no documentation at all.
Here's what we tried, what we learned, and how you can set this up for your own Next.js projects.
The problem we were trying to solve
AI coding agents rely on training data that becomes outdated. Next.js 16 introduces APIs like 'use cache', connection(), and forbidden() that aren't in current model training data. When agents don't know these APIs, they generate incorrect code or fall back to older patterns.
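To make the gap concrete, here's a minimal sketch of those three APIs in use. This is illustrative, not canonical: some of these APIs sit behind experimental config flags depending on your Next.js version, and isAdmin() is a hypothetical auth helper standing in for your own session check.

```tsx
// app/admin/products/page.tsx — sketch of the Next.js 16-era APIs above
import { connection } from 'next/server'
import { forbidden } from 'next/navigation'

// The 'use cache' directive marks this function's result as cacheable.
async function getProducts(): Promise<{ id: string; name: string }[]> {
  'use cache'
  const res = await fetch('https://api.example.com/products')
  return res.json()
}

export default async function AdminProductsPage() {
  // connection() opts this render into request-time (dynamic) execution.
  await connection()

  // forbidden() aborts rendering with a 403, handled by a forbidden.tsx
  // boundary. isAdmin() is a hypothetical helper, not part of Next.js.
  if (!(await isAdmin())) forbidden()

  const products = await getProducts()
  return (
    <ul>
      {products.map((p) => (
        <li key={p.id}>{p.name}</li>
      ))}
    </ul>
  )
}

// Hypothetical stand-in for a real session check.
async function isAdmin(): Promise<boolean> {
  return false
}
```

An agent trained before these APIs existed tends to reach for older patterns instead, like unstable_cache or manual 403 responses.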
The reverse is also true: if you're running an older Next.js version, the model may suggest newer APIs that don't exist in your project yet. We wanted to fix this by giving agents access to version-matched documentation.
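As a preview of the approach that ended up winning, here's a sketch of what a compressed docs index inside AGENTS.md could look like. The entries and wording below are illustrative, not the actual 8KB index:

```markdown
<!-- AGENTS.md (sketch) — a compact, version-matched API index the agent
     always sees; entries are illustrative -->
## Next.js 16 API index (this project runs 16.x)

- 'use cache' — directive that marks a function, component, or route as
  cacheable; prefer it over older unstable_cache patterns.
- connection() — import from 'next/server'; await it to opt a render into
  request-time (dynamic) execution.
- forbidden() — import from 'next/navigation'; aborts rendering with a 403
  handled by a forbidden.tsx boundary.
```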
Two approaches for teaching agents framework knowledge
Before diving into results, a quick explanation of the two approaches we tested:
Skills are an open standard for packaging domain knowledge that coding agents can use. A skill bundles prompts, tools, and documentation that an agent can invoke on demand. The idea is that the agent recognizes when it needs framework-specific help, invokes the skill, and gets access to relevant docs.
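Concretely, a skill is usually a directory whose entry point is a SKILL.md file: frontmatter tells the agent when to invoke the skill, and the body points at the bundled documentation. A minimal sketch, assuming the common SKILL.md layout (field names and directory structure vary by agent implementation):

```markdown
---
name: nextjs-16-docs
description: Version-matched Next.js 16 reference. Use when writing or
  reviewing code that touches 'use cache', connection(), or forbidden().
---

# Next.js 16 documentation skill

Before writing code that uses these APIs, read the matching reference:

- references/use-cache.md
- references/connection.md
- references/forbidden.md
```

The description field is what the agent sees up front; everything else loads only if the agent decides to invoke the skill, which is exactly the step our evals showed agents skipping without explicit instructions.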
... continue reading