
Your website has two audiences now. Most agencies build for neither.

One is your team, using AI tools to work faster. The other is the AI agents visiting your site.

A marketing lead asks Claude Code to draft a landing page from their CMS. Done in five minutes, on-brand, submitted for review. That same afternoon, a potential customer asks ChatGPT to summarize the top vendors in the category. The site gets quoted with wrong information. Or not at all.

Both of those interactions happen on most websites right now. Most agencies are building for neither.

Your website has two audiences now

For a decade the website audience was singular. A human with a browser. You designed for them, measured them, optimized for them. SEO was a separate concern but the reader was still human, eventually.

That has changed. Your site now has two audiences.

One is your own team, working with AI tools to move faster. A marketing manager drafting pages with Claude Code. A content person using Lovable to prototype a microsite. A designer asking Cursor to help update a component. This is already happening in most marketing teams we talk to.

The other is the agent layer. AI search, chat assistants, research agents that crawl the web to answer questions or complete tasks. ChatGPT, Perplexity, Claude, Gemini, and a growing list of specialist agents behind them. They do not read your page the way a human does. They read the HTML, follow headers, prefer markdown, and try to figure out what your site is for.

Both audiences grew fast. Both now meaningfully shape how your business gets found and how your team executes. And most sites are invisible to both.

What this means for your team

A team using AI tools is faster and more productive, but only if the system they are building into cooperates with those tools.

If your CMS is a black box with no API surface that an AI tool can reach, your team either works around it or stops using their tools. If the components are undocumented, the model guesses and gets them wrong. If the design tokens and layout rules live in one person’s head, the output drifts off-brand within two generations.

This is fixable. It is also boring infrastructure work that most agencies do not do because nobody has asked for it yet.

What it looks like in practice:

  • An MCP server for the CMS, so an AI tool can read the content model, query existing pages, and draft new ones directly into the system
  • A DESIGN.md at the repo root that documents tokens, layout rules, and voice
  • A SKILL.md for each component describing its props, its purpose, and when to use it
  • A content schema explicit enough for a model to understand without guessing
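As a sketch of the component-level documentation, a SKILL.md can be short and still do the job. The component name, props, and rules below are invented for illustration, not taken from any real design system:

```markdown
# Hero

Purpose: full-width intro section at the top of a landing page.
Use when: the page needs one primary message and one call to action.
Do not use for: mid-page sections.

Props:
- title (string, required): max ~60 characters, sentence case
- ctaLabel (string, required): verb-first, e.g. "Book an audit"
- ctaHref (string, required): internal links preferred
- image (asset, optional): 2:1 aspect ratio, alt text required

Voice: direct, second person, no exclamation marks.
```

The point is not the format. It is that a model can read this file and produce a usable Hero without guessing.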

The payoff is concrete. A marketing team can draft a landing page in their AI tool, stay on-brand because the CMS is component-based, and publish without pulling a developer into every microsite campaign. That is a different tempo from what most companies have today.

What this means for visitors

The other half of the shift is agents visiting the site directly.

A B2B evaluator asking Claude for vendors in her category. A researcher using Perplexity to compile a briefing. An agent booking a service on behalf of its user. These are not edge cases anymore. OpenAI, Anthropic, Google, and Perplexity all publish user-agent strings for their crawlers, and that share of traffic only goes in one direction.

If your site is invisible to these agents, you are invisible to a growing share of high-intent demand. And “invisible” is close to the right word. The HTML loads. The page renders. But the site sends no signals that agents are welcome, let alone how to use it. No markdown version of the page. No bot rules in robots.txt. No Content-Signal headers. No skills index explaining what visitors can do.

The fix is not exotic either:

  • Markdown copies of every page generated at build time and served via content negotiation
  • Explicit AI bot rules in robots.txt with Content-Signal headers declaring what agents can do with the content
  • An llms.txt at the root pointing at the key pages
  • Link response headers on the homepage pointing at llms.txt, sitemap, and a skills index
  • An Agent Skills index declaring the two or three most useful actions a visitor can take, like booking an audit or browsing services

You can audit any site at isitagentready.com and see how it scores. Most marketing sites come in under 30 out of 100. That is not a technology problem. That is an agency problem.

Why most agencies are not building for this

Two reasons, both mundane.

First, most agencies heard “AI is coming for us” and went defensive. They shipped surface-level AI features into their pitch decks while quietly hoping clients would stick with the old model. They treated AI as a threat to their billable hours instead of as a change in how websites actually need to work. Defensive agencies build defensive websites.

Second, the infrastructure for both sides is specific and technical. It is not in the standard build checklist. An agency building a site the way they built sites in 2022 will not include any of this. The client does not know to ask for it because nobody has told them to.

So it falls through. The site launches, looks fine on launch day, and is quietly behind on a dimension the client did not know to evaluate. Six months later the marketing team is fighting their CMS to use their AI tools, and the SEO team is wondering why the AI-search channel is silent. Nobody traces the problem back to the build.

What to ask your next agency

If you are about to commission a new website or a migration, the specific questions worth asking:

  • Does the CMS have an MCP server, or can one be added? This is the lever that makes your team’s AI tools actually useful against your content.
  • Is every component documented in a way an AI model can consume? A SKILL.md or equivalent at the component level is the test. “We have Storybook” is not the same thing.
  • Will the site serve a markdown version of every page? Either at build time or through a request-time handler; either way, agents that prefer markdown should get markdown.
  • Is the robots.txt explicit about AI crawlers? One rule per bot is normal now. A blanket allow-all with no signals is not.
  • Is there a skills index at /.well-known/agent-skills/? It tells agents what your site can do for their users.
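There is no single settled format for that last item yet. As an illustrative sketch only, a skills index could be a small JSON file listing the handful of actions the site supports; every field name and path below is hypothetical:

```json
{
  "skills": [
    {
      "name": "book-audit",
      "description": "Book a Headless Audit for a website",
      "href": "/audit"
    },
    {
      "name": "browse-services",
      "description": "List services with scope and pricing",
      "href": "/services"
    }
  ]
}
```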

If the agency looks blank at any of those, that tells you what you need to know. An agency that has not thought about either side of the shift is building for a web that is already behind.


Agencies that see AI as a competitor are misreading the situation. AI is a force multiplier for marketing teams and a new audience for the web. Both of those are opportunities, not threats. The agencies that lose are the ones ignoring them.

We build for both sides by default. Every headless website and every headless migration we ship includes the MCP setup, DESIGN.md, component-level docs, markdown siblings, bot rules, Content-Signal headers, and an Agent Skills index. This is baseline, not an upsell. If you want to know where your current site stands on both dimensions, start with a Headless Audit.

The race between agencies that build for the agentic web and agencies that do not is easier to read than it looks.

If your website has become a bottleneck, let’s talk!

Start with an Audit, or email me directly.