
The Plunder

Thirty-eight thousand pages consumed for every visitor returned. This is not an imbalance. This is what plunder looks like when it's measured.

Synthesized from 49 source documents across 4 research domains and 13 analytical lenses. Source-reviewed, fact-reviewed, and gap-reviewed before publication.

Listen:

Anthropic crawls 38,065 pages for every visitor it returns. OpenAI: 1,091 to 1. Google: 5.4 to 1.

The ratios come from Cloudflare Radar, which serves traffic for millions of domains across the web. The measurement period is July 2025. The caveats are real. Native app traffic undercounts referrals, because apps like Claude and ChatGPT don’t send standard Referer headers. Training crawls and search crawls are aggregated into a single number. And the specific Anthropic figure is volatile, ranging from 286,930:1 in January 2025 to 11,736:1 by mid-March 2026 as Claude’s web search feature began generating referral traffic. Cloudflare also has a commercial interest in the metric it publishes: it authored the Web Bot Auth IETF draft, operates the dominant CDN-level bot verification infrastructure, sells bot-blocking tools, and runs the Pay Per Crawl marketplace. The methodology is transparent and the Radar API allows independent verification. But the company’s editorial choices about which ratios to highlight should be understood in that context.
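The independent-verification claim can be exercised in miniature. The sketch below computes a crawl-to-refer ratio from a Radar-style JSON payload; the payload shape and field names are assumptions for illustration, not Cloudflare’s documented API schema.

```python
import json

# Hypothetical Radar-style payload. The shape and field names are
# assumptions for illustration -- not Cloudflare's documented schema.
payload = json.loads("""
{
  "result": [
    {"platform": "Anthropic", "crawl_requests": 38065000, "referrals": 1000},
    {"platform": "OpenAI",    "crawl_requests": 1091000,  "referrals": 1000},
    {"platform": "Google",    "crawl_requests": 5400,     "referrals": 1000}
  ]
}
""")

def crawl_to_refer(row):
    """Pages crawled per visitor referred back."""
    return row["crawl_requests"] / row["referrals"]

for row in payload["result"]:
    print(f"{row['platform']}: {crawl_to_refer(row):,.1f} : 1")
```

The metric is just a quotient of two request counts; everything contested about it lives in how those counts are attributed, which is why the caveats above matter.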

The caveats calibrate the number. They do not make it acceptable.


What they’re giving back

The replacement gap is the other side of the extraction ledger. AI referral traffic is growing — and it is a rounding error.

Contentsquare, tracking 99 billion sessions across 6,500+ websites, found AI-referred traffic represented 0.2% of overall traffic in 2026 despite 632% year-over-year growth. Conductor, benchmarking 13,770 domains across 10 industries and 3.3 billion sessions, measured AI referrals at 1.08% of all web traffic. ChatGPT drove 87.4% of those referrals. The rest of the AI field is a rounding error of a rounding error.

Put that against what’s being lost. The Reuters Institute, drawing on Chartbeat data from more than 2,500 publisher websites, found Google Search traffic to publishers declined 33% globally in the year to November 2025. In the U.S., 38%. Pew Research — 68,879 searches tracked through actual browsing behavior of 900 adults with installed tracking software, no commercial conflicts — found AI Overviews cut click-through rates by roughly 47% when they appeared. Users clicked a link only 8% of the time when an AI summary was present, versus 15% without. Only 1% clicked the sources cited within the summaries.

Zero-click searches — queries resolved on the search results page without a click to any external site — sit at 58.5% of all U.S. Google searches according to SparkToro/Datos clickstream data through May 2024. For news-specific searches, Similarweb measured 69% by May 2025. Different measurements of different query populations — not a single range, but the same direction.

The losses are measured in billions of visits. The gains are measured in millions. Similarweb data shows AI platforms drove 35.9 million global visits to news and media sites in June 2025. In the same month, search referral traffic was 11.2 billion. The ratio is approximately 300 to 1.
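The arithmetic behind that ratio is a single division, using the Similarweb figures quoted above:

```python
# Similarweb, June 2025: global visits to news and media sites.
ai_referrals = 35.9e6       # visits driven by AI platforms
search_referrals = 11.2e9   # visits driven by search, same month

gap = search_referrals / ai_referrals
print(f"replacement gap: {gap:.0f} : 1")  # ~312 : 1, i.e. roughly 300 to 1
```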

And the trajectory is accelerating in the wrong direction. TollBit — a bot-management vendor, with the commercial interest that implies — tracked AI app clickthrough rates dropping from 0.8% in Q2 2025 to 0.27% in Q4: a threefold collapse in six months. Even publishers with direct 1:1 AI licensing agreements saw CTRs drop from 8.8% in Q1 2025 to 1.33% in Q4. The deals don’t protect you either.

The replacement math does not work at any plausible growth rate. The traffic is not coming back.


The conversion question

A counterargument circulates: AI-referred visitors convert at higher rates, so the raw traffic numbers overstate the damage.

Three conversion studies disagree with each other. Amsive — the only study applying proper inferential statistics, 54 websites, six months of GA4 data — found no statistically significant difference. Organic traffic converted at 4.60%, LLM referrals at 4.87%. The p-value was 0.794. There is no signal there.

Visibility Labs, analyzing 94 e-commerce stores over twelve months, found a 31% conversion premium for ChatGPT referrals. But total ChatGPT revenue across 94 stores was $474,000, against $32.1 million from organic search. ChatGPT generated 1.48% of organic revenue.

Seer Interactive found ChatGPT converting at 15.9% versus Google Organic at 1.76% — from a single client, roughly 6,700 ChatGPT sessions. Not generalizable.

The most rigorous study found no effect. The two that found premiums come from vendor-affiliated sources measuring small samples. Even granting the most generous premium, it doesn’t close a 300:1 replacement gap.


Who’s fighting back

Seventy-nine percent of top news sites block at least one AI training bot. Seventy-one percent block at least one retrieval bot. Only 14% block everything. The numbers come from BuzzStream’s survey of 100 top news sites. The distinction matters: training bots collect content permanently into model weights and return nothing; retrieval bots at least generate citations and potentially referral traffic. Publishers are making granular decisions — 62% block GPTBot while only 40% block ChatGPT-User.
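That granular split is expressed as user-agent groups in robots.txt, and can be sanity-checked with Python’s standard-library parser. The file below is an illustrative policy, not any publisher’s real robots.txt: block the training crawler, permit the retrieval bot.

```python
from urllib.robotparser import RobotFileParser

# Illustrative policy: block the training crawler, allow the retrieval bot.
robots_txt = """\
User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow:

User-agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("GPTBot", "https://example.com/article"))        # False: training bot blocked
print(rp.can_fetch("ChatGPT-User", "https://example.com/article"))  # True: retrieval bot permitted
```

The same mechanism is what the compliance data below is measuring: the file states a preference, and nothing in the protocol enforces it.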

It is an intelligent response. It is also failing.

Robots.txt compliance collapsed through 2025. TollBit documented 30% of total AI bot scrapes in Q4 2025 violating explicit robots.txt directives, up from 3.3% in Q4 2024 — a ninefold increase. OpenAI’s ChatGPT-User had the highest non-compliance rate, at 42%.

In Ziff Davis v. OpenAI (S.D.N.Y., December 2025), Judge Sidney Stein ruled that robots.txt files carry no legal force under the DMCA. He compared them to a “keep off the grass” sign. The primary technical tool publishers use to express their blocking preferences has no independent legal teeth.

And then there is the deliberate evasion. Cloudflare documented Perplexity running a stealth crawler that impersonated Chrome on macOS alongside its declared PerplexityBot. Undeclared IP addresses. ASN rotation. Continued crawling of domains with explicit all-bot blocks. Hits on test domains set up specifically to catch unauthorized access. That is not a compliance gap. That is documented, deliberate evasion.

The cruelest data point is not about the crawlers. It is about the blockers. Zhao and Berman (Rutgers/Wharton, December 2025) studied 30 major newspaper publishers and found that those who blocked AI crawlers experienced a 23.1% decline in monthly visits and 13.9% decline in actual human browsing. The methodology uses synthetic difference-in-differences against a control group of top retail sites. The study period overlaps with Google’s AI Overviews rollout, making clean causal attribution difficult. But the finding creates a prisoner’s dilemma: each publisher who blocks unilaterally absorbs the cost while competitors who remain open capture the AI-referred traffic.
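The dilemma has the standard game-theoretic structure, which a toy payoff model makes explicit. Every number below is an illustrative assumption — only the ~23% unilateral-decline figure echoes the Zhao and Berman finding; the rest is invented to show the shape of the trap.

```python
# Toy payoff model of the blocking dilemma. Visits indexed to 100 = status quo.
# All payoffs are illustrative assumptions, not estimates from the study.
PAYOFFS = {
    # (my_choice, rival_choice): my monthly visits
    ("open",  "open"):  100,
    ("open",  "block"): 110,   # rival's forgone AI visibility flows to me
    ("block", "open"):   77,   # ~23% unilateral decline (the Zhao & Berman figure)
    ("block", "block"): 105,   # assumed: collective blocking forces licensing deals
}

def best_response(rival_choice):
    """My payoff-maximizing choice, given what the rival does."""
    return max(("open", "block"), key=lambda me: PAYOFFS[(me, rival_choice)])

# "Open" dominates: it pays more whatever the rival does...
assert best_response("open") == best_response("block") == "open"
# ...yet mutual blocking (105) would beat the all-open equilibrium (100).
print(PAYOFFS[("open", "open")], PAYOFFS[("block", "block")])
```

Under these assumed payoffs every publisher stays open, even though coordinated blocking would leave all of them better off — which is exactly the trap the study describes.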

Block, and you lose traffic from reduced AI visibility. Don’t block, and you lose content to platforms that consume 38,000 pages for every visitor they return.

Cloudflare blocked over 416 billion AI bot requests between July and December 2025 — roughly 2.7 billion per day. Over one million customers enabled one-click AI crawler blocking. Small publishers and site operators without Cloudflare or an equivalent CDN have substantially less ability to enforce their preferences. The web’s access control regime is, de facto, being privatized.

The publishers who can fight are fighting. The ones who can’t are absorbing whatever terms emerge.


The marketplace wars

Amazon sued Perplexity for scraping its platform. A federal judge granted a preliminary injunction in March 2026, blocking Perplexity’s Comet agent from accessing Amazon. Amazon’s position: third-party shopping agents should operate transparently and respect platform terms.

In the same period, Amazon’s own Buy for Me feature — powered by Amazon Nova and Anthropic’s Claude — grew from 65,000 products at beta launch to over 500,000 by November 2025. Merchants are automatically enrolled on an opt-out basis. They discover participation only after receiving orders from “buyforme.amazon” email addresses. One store owner found products on Amazon she had never authorized, including items she had deleted from her Shopify backend, with AI-generated images that did not match her actual products.

Amazon locks its own gates and kicks down everyone else’s.

This is not hypocrisy in the colloquial sense. It is a coherent strategic position. Amazon’s advertising business generated $17.7 billion in Q3 2025 alone, up 22% year over year — a roughly $70 billion annual run rate. Third-party AI agents that surface Amazon products in external interfaces, where Amazon cannot show ads or control the presentation, threaten that revenue directly. Internal AI strengthens the bundle of discovery, comparison, and checkout. External AI threatens to unbundle it.

eBay banned all buy-for-me agents in its February 2026 User Agreement update. One exception: Google’s shopping bot has explicit access to eBay’s cart subdomain while bots from Perplexity, Anthropic, Amazon, and others are explicitly blocked. Every major marketplace blocks buyer-side AI agents. Every one of them deploys AI aggressively for internal purposes — seller tools, search optimization, recommendation engines, advertising systems. The asymmetry is the strategy.

Shopify breaks the pattern. Four MCP servers shipped. ChatGPT, Perplexity, and Microsoft Copilot have live integrations with Shopify’s Checkout MCP server. The strategic logic is different: Shopify doesn’t own buyer relationships — its merchants do. Opening to AI agents expands the distribution channels available to Shopify merchants. Etsy participates in both OpenAI’s ACP and Google’s UCP — its inventory of handmade and unique goods is not easily commoditizable, so AI agents need Etsy’s supply more than Etsy needs to control the buyer experience.

The platforms whose value lies in controlling the transaction lock down. The platforms whose value lies in enabling merchants open up.


The first real test

The first real test of AI-mediated commerce ran the full cycle from launch to failure in six months.

OpenAI’s Instant Checkout launched September 2025. Walmart EVP Daniel Danker disclosed that it converted at one-third the rate of a regular click-through to Walmart’s own website. Every product was forced into its own individual transaction — its own payment, its own shipping calculation, its own delivery. Walmart excluded some products because it knew single-item checkout was “detrimental” in categories where bundling matters. The system lacked real-time inventory tracking, coupon integration, loyalty programs, customer data collection, and store pickup options. OpenAI had not built a system for collecting and remitting state sales taxes as of February 2026.

By March 2026, Instant Checkout was dead. The pivot tells you what OpenAI learned: discovery yes, transaction no. Checkout now occurs in retailer-specific apps embedded within ChatGPT or on retailers’ own websites. Walmart is embedding its Sparky AI assistant. Instacart launched as the first grocery partner. Target, Sephora, Nordstrom, Lowe’s, Best Buy, The Home Depot, and Wayfair have integrated for discovery.

The AI platform that tried to own the transaction defaulted back to the referral model — the same architecture the web has operated on for twenty-five years. Discovery through AI, transaction through the brand.

When the pivot was announced, travel intermediary stocks surged: Expedia +13.69%, Booking Holdings +8.46%. The market priced the failure as evidence that AI platforms cannot disintermediate existing commerce intermediaries. At least not yet.

Meanwhile, on different terrain entirely: Alipay processed 120 million AI agent transactions in a single week in February 2026. Completed purchases, ordered through chatbots, paid without leaving the conversation. The structural difference: Chinese tech giants control the complete stack — AI chatbot, marketplace, payment system. When your chatbot, your marketplace, and your payment system are the same company, you skip years of interoperability debates.

OpenAI’s Instant Checkout required coordinating across ChatGPT, retailer systems, Stripe, and merchant fulfillment — four or more independent entities with no shared data layer, no unified inventory view, and no common identity system. The agentic commerce that the Western market is trying to build through protocol coordination is already running at scale in China through vertical integration. The problem is architectural, not conceptual.


Where this leaves you

Thirty-eight thousand pages consumed for every visitor returned. A replacement gap of 300 to 1. Blocking that costs traffic and fails to stop crawling. A legal framework where the primary access control mechanism carries no more weight than a sign on a lawn. Marketplaces that lock their own gates and kick down everyone else’s. The first real test of AI commerce dead in six months, defaulting back to the referral model it was supposed to replace.

The plunder continues. The replacement does not arrive.

The extraction is not a bug in the system. It is the system.


If you want to understand why this keeps happening — why every intermediary disruption follows the same arc, and why the “AI SEO” industry is selling you the second trip through the same trap — read Trapped Twice. If you want to know who is most affected by these ratios and most absent from every negotiation, read Invisible. If you want to know what to do about it, The Quartermaster has the specifics for your situation.