How a VS Code Update Revealed the Missing Layer of the Web

Last month, Visual Studio Code shipped a major update. A “What’s New” notification popped up in my editor, the familiar pattern: new features, new capabilities, new integrations.

But it forgot to tell Claude.

Claude is a core part of the VS Code ecosystem. The Claude extension is one of the most widely used AI coding assistants in the editor. When VS Code ships new API capabilities, Claude could leverage them immediately: new debugging hooks, new terminal access patterns, new editor integrations. But Claude had no idea the update happened. The VS Code team published their changelog for humans. They updated their documentation for humans. They posted on Twitter for humans.

Nobody told the AI.

The Discovery Gap

This isn’t a VS Code problem. This is an internet problem. Every website, every platform, every service communicates exclusively with human visitors. When a company updates their product, publishes new documentation, or changes their policies, the information goes into HTML pages designed for human eyes.

AI agents — ChatGPT, Claude, Perplexity, Copilot — are left to figure it out on their own. They scrape. They guess. They hallucinate. And when they get it wrong, brands have no recourse because there was never a structured way to tell AI the right answer in the first place.

The numbers are staggering:

  • $67.4 billion — the estimated cost of AI hallucinations to businesses in 2024
  • 33-48% — hallucination rate for AI responses about companies
  • 85% — of brand mentions in AI come from third-party sites, not the brand’s own domain
  • 60-70% — of searches now end without a click to any website

The web was built for browsers. Browsers understand HTML, CSS, and JavaScript. We’ve spent 30 years optimizing for that visitor.

AI is a fundamentally different visitor. It doesn’t render your CSS. It doesn’t click your buttons. It needs structured facts — who you are, what you do, what it’s allowed to say about you. And right now, no website tells it.

From Passive to Active

The current approach to AI visibility is entirely passive. Brands publish content for humans and hope AI scrapes the right pages, interprets them correctly, and doesn’t hallucinate. The entire GEO (Generative Engine Optimization) industry — already hundreds of millions in annual spend — is built on this hope.

Companies are hiring GEO specialists to monitor what AI says about them. They’re paying for tools that track brand mentions across ChatGPT, Perplexity, and Gemini. They’re optimizing content, hoping to appear in AI-generated answers.

They’re playing defense.

What if you could play offense? What if, instead of monitoring what AI says about you and reacting, you could tell AI what you want it to know — in a format it can verify, through a standard it can discover?

That’s the question that started AI Discovery.

The /.well-known/ai Standard

The solution turned out to be surprisingly simple. Websites already have a convention for machine-readable metadata: the /.well-known/ path, standardized in RFC 8615. Apple uses it for app associations. Certificate authorities use it for domain validation. It’s the place where machines look for structured information about a domain.

AI Discovery adds one more: /.well-known/ai

It’s a JSON manifest that answers the questions every AI agent needs answered:

  • Who are you? — Organization name, description, sector, legal entity
  • What do you do? — Core concepts, products, services, knowledge domains
  • How can I verify this? — Cryptographic signatures, content hashes, Digital Name
  • What am I allowed to do? — Quoting permissions, training permissions, content policies
  • Where do I learn more? — Knowledge endpoint, AI feed, content inventory

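Put together, a minimal ai.json answering those five questions might look like this. The field names below are illustrative, not the published schema:

```json
{
  "organization": {
    "name": "Example Corp",
    "description": "Developer tools for the example industry",
    "sector": "software",
    "legal_entity": "Example Corp, Inc."
  },
  "concepts": ["code editor", "extensions", "developer tools"],
  "verification": {
    "content_hash": "sha256:…",
    "digital_name": "example.corp"
  },
  "permissions": {
    "quoting": "allowed",
    "training": "contact-required"
  },
  "endpoints": {
    "knowledge": "/.well-known/ai/knowledge",
    "feed": "/.well-known/ai/feed"
  }
}
```

Each top-level block maps to one of the questions above: identity, concepts, verification, permissions, and pointers to the deeper tiers.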
It’s not llms.txt (a flat list of URLs with no structure for identity or policies). It’s not Schema.org JSON-LD (designed for search engines, not AI agents). It’s a purpose-built three-tier architecture:

  1. ai.json — Discovery. The compact manifest every site needs.
  2. knowledge.json — Encyclopedia. Deep structured data about your organization.
  3. feed.json — Updates. AI-optimized news feed with structured metadata.

Origin, Not Just Visibility

Here’s what makes this different from SEO optimization or content marketing: origin and ownership.

When AI scrapes a Reddit thread about your company, it has no way to distinguish that from your own website. When it reads a competitor’s comparison page, it might cite their characterization of your product. 48% of AI citations come from user-generated content like Reddit and YouTube.

AI Discovery changes this. When an AI agent finds a cryptographically signed manifest at /.well-known/ai, it knows this is from the organization itself. Content hashes prove the information hasn’t been tampered with. The Digital Name links the manifest to a verified blockchain identity.

This isn’t about visibility — it’s about authority. Your domain becomes the canonical source of truth about your brand for AI.

What Would Have Happened

Back to VS Code. If Microsoft had AI Discovery configured on their VS Code documentation domain, here’s what would have happened when they shipped that update:

  1. Their feed.json would have included the new release as a structured entry
  2. Claude’s next context refresh would have discovered the update at /.well-known/ai/feed
  3. The feed entry would describe the new capabilities in machine-readable format
  4. Claude would know about the new features without any human needing to tell it

No scraping. No guessing. No waiting for someone at Anthropic to manually notice and integrate. Discovery.

The Transformation

The web is about to undergo the same transformation it went through with mobile. When smartphones arrived, websites had to become responsive — they had to work for a fundamentally different kind of visitor. Companies that adapted thrived. Companies that didn’t became invisible on the platform where their customers were spending the most time.

AI is that moment again. AI agents are the new visitors. They’re already trying to understand your site. The question isn’t whether you’ll communicate with AI — it’s whether you’ll do it passively (scraping, hoping, reacting) or actively (structured, signed, verified).

This shift from passive AI communication to active participation will reshape every digital interaction. AI Discovery is how a site offers services and information to AI. It’s how an MCP server is discovered. It’s how the web becomes AI-native.

We built a scanner so you can check your site’s AI readiness in 60 seconds. A WordPress plugin that configures everything automatically. And a standard that’s open, documented, and free to implement.

Because the next time VS Code ships an update, AI should know about it.