
An experimental guide to Answer Engine Optimization

Why This Matters

As AI-driven search and answer engines become more prevalent, traditional SEO is evolving into Answer Engine Optimization (AEO), which emphasizes making content accessible to AI models. This shift means businesses need to adapt their content pipelines so their information is easily understood and cited by AI systems, which increasingly determines visibility and referral traffic.

Key Takeaways

Nearly 60% of Google searches now end without a click. AI referral traffic to major websites grew 357% year over year in 2025. When someone asks ChatGPT or Perplexity "what's the best Canadian hosting platform," the answer doesn't come from a ranked list of blue links. It comes from whatever the model already knows, or can fetch in real time.

Being the best result on a search engine results page may no longer be sufficient. You might also need to be the source an AI model can most easily understand and cite. That's the premise behind Answer Engine Optimization (AEO): making your content legible to the systems that are increasingly mediating how people find information.

Here's the problem AEO tries to solve. A browser downloads your JavaScript, hydrates your React tree, renders your components, and gives the user a fully interactive page. An AI answer engine downloads your HTML and tries to extract meaning from it. What it gets is something like this: a <div> with a hashed class name, containing another <div>, containing a <section> with Tailwind utility classes, wrapping an <h2> that finally has the text it's looking for. If the model is doing real-time retrieval (like Perplexity or ChatGPT with browsing), it has a time budget, and the harder you make the content to find, the less likely you are to get cited.

So I rebuilt my content pipeline to fix this: moved everything into markdown, added middleware to serve it directly to AI agents, and layered in the metadata they need to cite you accurately. This post walks through each step, in case you want to try it too.
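The middleware step can be sketched as two small, framework-agnostic functions: one that guesses whether a request comes from an AI crawler by user-agent substring, and one that maps a page path to its markdown equivalent. The user-agent markers and the `/content/*.md` path convention below are illustrative assumptions, not the post's exact implementation or an exhaustive crawler registry.

```typescript
// Substrings seen in common AI crawler user-agents.
// Illustrative list only -- keep this configurable in a real deployment.
const AI_AGENT_MARKERS = [
  "GPTBot",
  "ChatGPT-User",
  "PerplexityBot",
  "ClaudeBot",
  "Google-Extended",
];

// True if the request's User-Agent looks like an AI answer engine.
export function isAIAgent(userAgent: string): boolean {
  return AI_AGENT_MARKERS.some((marker) => userAgent.includes(marker));
}

// Map a page path to the markdown file served to agents instead of HTML.
// Convention assumed here: "/pricing" -> "/content/pricing.md".
export function markdownPathFor(pathname: string): string {
  const clean = pathname === "/" ? "/index" : pathname.replace(/\/$/, "");
  return `/content${clean}.md`;
}
```

In a Next.js App Router project this logic would typically be called from `middleware.ts`, using `NextResponse.rewrite` to point matching requests at the markdown route; keeping the helpers pure makes them easy to unit-test.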

Fair warning: the standards here are drafts or proposals (llms.txt, Content-Signal), nobody knows which techniques will matter long term, and I can't yet measure direct impact on AI citations. I tried them anyway because the underlying trend feels real, it was a fun engineering challenge, and the results are useful regardless.

Step 1: put all your content in markdown

Move every page on your marketing site from JSX components into markdown files. That includes rich landing pages with hero sections, feature grids, comparison tables, and FAQ sections. The content directory becomes the single source of truth for all page content.

Markdown is the right foundation because it's the format AI models already understand best. They're trained on enormous amounts of it (docs, READMEs, blog posts, wikis), so the structure is already familiar to them. Compare that to HTML, where the same information is buried in nested <div>s, hashed CSS class names, and wrapper elements that exist only for styling.

Markdoc is Stripe's content authoring system, essentially markdown with a tag syntax for custom components. MDX solves a similar problem by letting you embed JSX directly in markdown, and either works well for this approach since LLMs handle both formats fine. If you're on Next.js App Router, use the core @markdoc/markdoc library rather than the @markdoc/next.js plugin, which targets Pages Router.

A landing page that used to be a React component full of hardcoded strings becomes something like this:

... continue reading