
Trapped Twice

Every intermediary disruption follows the same arc. The 'AI SEO' industry is selling you the second trip through the same trap.


Synthesized from 7 source documents across three research lenses: disruptive intermediaries, standards wars, and extraction reciprocity. Source-reviewed, fact-reviewed, and gap-reviewed before publication.

In 2011, Demand Media was worth $1.5 billion. It had done everything Google told it to do. Its eHow.com empire reached 105 million unique monthly visitors by producing algorithmically guided content at industrial scale — articles reverse-engineered from Google’s own search demand signals, published faster than any human editorial operation could match. The strategy was rational. The execution was precise. The IPO was the proof.

Google’s Panda algorithm update launched on February 23, 2011 — less than a month after the IPO. Third-party measurements showed Demand Media’s traffic fell as much as 40% within months. The stock dropped 38% over two weeks. Within two years, the company’s valuation had fallen to roughly a quarter of its peak. (CNN Money; Search Engine Land)

Total adaptation. Total vulnerability. The trap is never visible from inside it.


The mechanism

The Demand Media collapse was not an anomaly. It was one case study in a pattern that has repeated across every intermediary disruption examined in the research behind this series — search, music, transportation, hospitality. The mechanism is a five-stage lifecycle, and it operates with structural consistency:

Stage 1: Genuine value. The new intermediary solves a real problem. Google made the web navigable. Napster made music instantly accessible. Uber matched riders and drivers with efficiency that taxi dispatch could not approach. The value is real. That is what makes the trap possible.

Stage 2: Rapid adoption. Network effects accelerate growth beyond what incumbents can respond to. Google captured dominant search market share within five years of launch. Napster reached approximately 70 million registered users in roughly two years. The speed varies. The trajectory is consistently exponential.

Stage 3: Adaptation and dependency. Producers restructure around the intermediary. Publishers built entire SEO disciplines. The music industry adapted to streaming — by 2024, streaming revenues reached $20.4 billion, representing 69% of total global recorded music revenues (IFPI Global Music Report 2025). Taxi companies built apps. Each adaptation is rational. Each deepens exposure.

Stage 4: Extraction escalation. Once producers are dependent, the intermediary monetizes more aggressively. Google progressively absorbed content into its own results — zero-click searches reached 58.5% in the US by 2024 (SparkToro/Datos, 2024). In music, Spotify’s per-stream payouts became a persistent source of artist discontent even as industry revenues recovered. In transportation, NYC taxi medallion values collapsed from over $1 million at their 2013 peak to as low as $25,000 by 2021, devastating medallion-holding drivers — many of them immigrants from Bangladesh, Pakistan, Haiti, and Ghana who had taken on median debts of $500,000 to purchase medallions as their path to middle-class stability (Columbia Human Rights Law Review).

Stage 5: Vulnerability to the next intermediary. The platform’s own success creates conditions for its disruption. Academic research identifies disintermediation as “the Achilles’ heel of the platform business model” — the paradox that the more platforms succeed at enabling trust and matching between parties, the more vulnerable they become to being bypassed (Ladd, Business Horizons, 2022). Google now faces this: Gartner predicted in February 2024 that traditional search engine volume would drop 25% by 2026 due to AI chatbots and virtual agents.

The lifecycle is not a theory waiting for evidence. It is the evidence waiting for the name.


The 25-year case

Google is the longest-running and most thoroughly documented example of the lifecycle in operation. Twenty-five years of data. Five distinct phases. A trajectory that builders need to understand, because the AI version begins at the endpoint Google took two decades to reach.

The symbiosis was real. Google launched in 1998. By 2002, annual revenue reached $440 million; by 2003, $1.47 billion — with advertising representing 94% and 97% of revenue respectively (Google 10-K filing, 2004). The early relationship was genuinely reciprocal. Google needed content to index. Publishers needed traffic. AdSense, launched in 2003, allowed publishers to monetize the relationship directly. For small publishers especially, it was transformative: sites that previously had no viable advertising channel could generate revenue from their content.

Then Google’s Florida update created an industry. November 2003. The first major algorithm change. Many e-commerce and small business sites saw their rankings disappear overnight. Florida’s precise mechanism was never officially disclosed, but SEO industry analysis points to a link analysis algorithm that used statistical properties of linking patterns to detect manipulation. Affiliate sites bore disproportionate collateral damage. (Search Engine Journal; Search Engine Land)

Florida created the SEO industry as a professional discipline. Before Florida, optimization was a hobbyist’s game of tricks. After Florida, businesses needed expertise. Agencies formed, consulting practices emerged, an entire ecosystem of tools, conferences, and certifications grew around the need to understand and optimize for Google’s algorithms. By 2025, the global SEO services market reached an estimated $74.9 billion (Mordor Intelligence) — market size estimates vary by source, but the order of magnitude is consistent.

Each subsequent algorithm update — Panda in 2011, Penguin in 2012 — reinforced the same pattern. Publishers invested more in Google optimization, which increased their dependency on Google traffic, which gave Google more leverage to change the rules. The adaptation was rational at every step but cumulative in creating lock-in.

Then Google started absorbing the content. The Knowledge Graph launched in May 2012 with over 500 million entities and 3.5 billion facts. Featured snippets followed in January 2014 — extracting content directly from publisher pages and displaying it at the top of results, giving users the answer without requiring a click. Research found that when a featured snippet occupies the top position, it captures approximately 8.6% of clicks, while the standard first result without a snippet captures approximately 26% (Ahrefs study). The publisher whose content was extracted received a link but often lost the click.

This was the inflection point. Google shifted from directing users to content to absorbing content into its own surfaces.

Then the paid squeeze. In February 2016, Google removed all ads from the right-hand sidebar and began showing up to four ads above organic results for commercial queries, pushing organic results below the fold entirely. In mid-2020, Prabhakar Raghavan — who had led Google’s Ads and Commerce division — replaced Ben Gomes as head of Search. Internal documents from the Department of Justice antitrust case revealed this transition followed a 2019 “code yellow” triggered by slowing ad revenue growth (Where’s Your Ed At). The ads team pushed for changes the search team had resisted on quality grounds. The results are visible in the data: a January 2025 vs. January 2026 analysis by Aleyda Solis using Similarweb data found organic click share dropped 11-23 percentage points year-over-year across commercial verticals, while text ad click share roughly tripled or quadrupled. Total click volume also declined — meaning organic publishers lost share of a shrinking pie.

Then the zero-click culmination. The SparkToro/Datos 2024 study, analyzing clickstream data from tens of millions of devices, found 58.5% of US Google searches resulted in zero clicks. For every 1,000 searches, only 360 clicks reached non-Google properties. Nearly 30% of all clicks went to Google-owned platforms. Approximately 1% went to paid ads.

Google’s AI Overviews accelerated the decline. A Seer Interactive study analyzing 3,119 informational queries found organic CTR dropped 61% for queries where AI Overviews appeared. Chartbeat data reported by Axios shows Google search referrals to publishers declined globally by 33% year-over-year, with small publishers — sites with 1,000 to 10,000 daily pageviews — losing 60% of search referral traffic, compared to 22% for large publishers.

The zero-click trajectory that took Google 25 years to reach is the starting condition for AI. AI agents synthesize by default. The referral economy may not exist at all.


The mirror

The music industry offers the closest structural parallel — and the harshest lesson about what “recovery” actually means.

Global recorded music revenues fell from approximately $22.2 billion in 1999 to approximately $13.0 billion in 2014, a decline exceeding 40% over fifteen years (IFPI Global Music Report). In the US specifically, RIAA data shows revenues peaked at $14.6 billion in 1999 and fell to approximately $6.97 billion by 2014 — a 52% decline in nominal terms (RIAA). The RIAA filed approximately 30,000 lawsuits against individual file-sharers between 2003 and 2008 (EFF, 2008). Peer-to-peer traffic reportedly doubled during the same period. The lawsuits established precedent. They did not stop the bleeding.

Then Spotify arrived. Streaming-driven recovery brought global revenues to $31.7 billion by 2025, with paid subscriptions reaching 837 million globally (IFPI Global Music Report 2026). Spotify alone paid over $11 billion to rights holders in 2025 (Spotify Newsroom, January 2026).

The recovery narrative, told in nominal dollars, is a triumph. The industry exceeded its 1999 revenue peak.

Told in real dollars, it is something else. Inflation-adjusted, 2025 revenues represent roughly 65% of the 1999 peak: the industry now operates at about two-thirds of its former size. The intervening decades — the careers, the mid-tier labels, the infrastructure that sustained working musicians between the top and the bottom — did not recover at all.

Nearly 1,500 artists earned more than $1 million from Spotify in 2024 (CNBC). Spotify pays artists an estimated $0.003 to $0.005 per stream — approximately $3,000 to $5,000 per million streams (TuneCore). The vast majority of working musicians cannot sustain a livelihood from streaming income alone.
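The per-stream figures above imply stark arithmetic. A back-of-envelope sketch, using only the cited payout range; the income target is an assumed illustration, not a figure from the source:

```python
# Rough illustration using the per-stream payout range cited above
# ($0.003-$0.005 per stream, per TuneCore estimates).
# The $50,000 income target is an assumption for illustration only.

def streams_needed(annual_income: float, per_stream: float) -> int:
    """Streams per year required to gross a given annual income."""
    return round(annual_income / per_stream)

target = 50_000  # assumed modest annual gross income, USD

print(f"At $0.005/stream: {streams_needed(target, 0.005):,} streams/year")
print(f"At $0.003/stream: {streams_needed(target, 0.003):,} streams/year")
# → 10,000,000 and 16,666,667 streams/year respectively
```

Ten to seventeen million streams a year, every year, just to gross a modest income before distribution fees and label splits — which is why the payout range and the livelihood claim in the paragraph above are two statements of the same fact.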

In 2024, Spotify stopped paying royalties on any track with fewer than 1,000 streams, demonetizing an estimated 86% of tracks on the platform. An independent calculation estimated this redirected approximately $47 million from small artists to major labels and larger artists. (Disc Makers, 2025)

The industry recovered. Most of the people in it did not.

And that is the good outcome. US newspaper advertising revenue fell from approximately $49.4 billion in 2005 to under $10 billion by 2022 (Pew Research Center), and no Spotify equivalent emerged. The music industry found its intermediary. Newspapers didn’t. Which path does the web follow?


The rapid-fire evidence

Demand Media is the canonical case. But the dependency trap has closed on company after company, and the mechanism is identical every time: rational adaptation, platform change, stranded investment.

Facebook pivot-to-video (2015-2018). Starting in 2015, Facebook promoted video as the future of content on its platform. Publishers restructured their operations accordingly. Fox Sports eliminated its entire writing staff. Vocativ laid off its entire newsroom. MTV News cut its writing team. Vice Media laid off at least 60 employees. Mic cut ten writers. Conde Nast cut jobs and reduced magazine frequency. (Wikipedia, “Pivot to video”)

These decisions were made on the basis of Facebook’s reported video performance data. In September 2016, Facebook admitted it had inflated video-viewing metrics by 60-80% over an 18-month period. Court documents later revealed discrepancies potentially reaching 150-900%, and internal communications showed Facebook “obfuscate[d] the fact that we screwed up the math” and waited over a year to disclose the errors. Facebook settled the resulting advertiser lawsuit for $40 million in 2019. (Slate; Variety)

In January 2018, Facebook reduced audience exposure to public content from all Pages in favor of posts from friends and family. Publishers who had pivoted most aggressively saw traffic drops of 60% or more compared to the same period a year prior (comScore data). The damage was compounding: they had already cut the writers and editors who produced the text content that had driven their pre-video traffic. They could not simply revert.

The damage was not lost traffic. It was lost capability. The writers were already gone.

Zynga. At its peak, Zynga generated substantially all of its revenue through Facebook’s platform — reported as more than 90%, as disclosed in its IPO prospectus (Zynga S-1, SEC). It reached a peak market capitalization of approximately $10 billion after going public in December 2011. By July 2012, shares had crashed 40% after quarterly losses. The stock ultimately lost three-quarters of its value. (TIME)

Facebook mandated its Credits payment system, increasing its revenue share. Facebook tweaked its News Feed algorithm to favor new applications over established ones. Facebook opened App Center, creating new discovery dynamics that disadvantaged the incumbent. Each change was within Facebook’s rights as a platform operator. Each devastated Zynga’s business.

Twitter API destruction. Twitter progressively restricted its API over a decade, moving from an open ecosystem to the near-total shutdown of free API access in February 2023 under Elon Musk’s ownership. In January 2023, third-party clients including Twitterrific — which had been in development for 16 years — had their API keys revoked without warning. Years of investment, deleted overnight.

Vine. Over 200 million active users by December 2015. On October 27, 2016, Twitter announced it would be discontinued. By January 2017, shut down entirely. Creators who had built audiences exclusively on Vine lost their distribution channels overnight and had to rebuild from scratch. The platform’s failure to build creator monetization meant that creators’ only asset was their Vine audience — and that asset was non-transferable.

In every case, the trap operates through the same mechanism. The platform creates genuine value. Producers restructure around the platform. The platform changes. The producers’ investments become stranded assets. Step two feels like smart strategy at the time. The dependency is not visible until step three.


The new trap

Every finding from every case study above applies to the emerging AEO, GEO, and LLMO industries — three names for the same grift, repackaged with “AI” prefixed to the same dependency cycle.

The GEO (Generative Engine Optimization) market is projected to reach $33.7 billion by 2034, from $848 million in 2025 (Dimension Market Research). That projection carries high uncertainty — third-party estimates for nascent markets always do. What carries no uncertainty is the shape of what is being built. A new consulting class is forming around AI visibility, just as the SEO industry formed around Google visibility in the early 2000s. The fragmentation is already apparent: what earns citations in Perplexity may be invisible in Gemini. Platform-specific optimization is the beginning of the dependency trap.

The data used to sell these services undermines the services being sold. Citation volatility is so severe that only 30% of brands stay visible from one AI answer to the next for the same query — and only 20% across five consecutive runs (AirOps, methodology undisclosed). When the measurement surface is this volatile, no technical change can be causally linked to AI citation rates. The platform the optimizers are optimizing for does not hold still long enough to optimize.

Corroborate with the harder evidence. Cloudflare Radar network data — primary measurement, not vendor marketing — shows Anthropic crawling 38,065 pages for every visitor it refers back. OpenAI: 1,091 to 1. Google: 5.4 to 1. (Cloudflare Radar, July 2025; note that Cloudflare also authored the Web Bot Auth IETF draft and operates the dominant CDN-level verification infrastructure.) The replacement economy does not function. The traffic is not coming back.
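The ratios convert directly into referral yield. A back-of-envelope conversion using only the figures cited above:

```python
# Back-of-envelope conversion of the crawl-to-referral ratios cited
# above (Cloudflare Radar, July 2025) into visitors referred per
# million pages crawled.

ratios = {  # pages crawled per visitor referred back
    "Anthropic": 38_065,
    "OpenAI": 1_091,
    "Google": 5.4,
}

for platform, pages_per_visit in ratios.items():
    visits_per_million = 1_000_000 / pages_per_visit
    print(f"{platform}: ~{visits_per_million:,.0f} visits per million pages crawled")
```

At Anthropic’s ratio, a million pages crawled yields roughly 26 visits; at Google’s classic ratio, the same crawl volume yields roughly 185,000. That gap is the replacement economy failing to function.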

And then there is llms.txt — 844,000 sites have adopted it, zero AI crawlers request it, and across 300,000 domains analyzed it shows no relationship to AI citations (BuiltWith; SE Ranking). The latest standard the industry sold compliance with before anyone checked whether it functioned.

The tools change. The trap does not.


The two layers

The evidence supports a structural distinction between two categories of investment — and the historical record is clear about which one survives.

Layer 1: Standards-based structural investment. The W3C standards behind semantic HTML have remained stable through every intermediary shift — adopted by assistive technology first, then search engines, now AI agents. Each new consumer inherited the same structural expectations. Server-side rendering has been the correct answer for every crawler since Googlebot. Schema.org, launched in 2011, was adopted by Google for rich results and is now used by AI systems for entity extraction and knowledge grounding — the same markup serves both intermediary generations. Clean information architecture, accessible design, direct audience relationships: these investments serve any intermediary, present or future, because they are based on open standards rather than any single platform’s requirements.
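The Schema.org claim above can be made concrete. A minimal sketch of Article markup expressed as JSON-LD: the vocabulary (@context, @type, headline, author, datePublished) is standard Schema.org, but every value here is an illustrative placeholder, not taken from any real page.

```python
# Minimal sketch: Schema.org Article markup as JSON-LD, the same
# structured data Google consumes for rich results and AI systems
# consume for entity extraction. All values are placeholders.
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Trapped Twice",
    "author": {"@type": "Person", "name": "Example Author"},
    "datePublished": "2026-02-01",
    "publisher": {"@type": "Organization", "name": "Example Publisher"},
}

# A page embeds this inside <script type="application/ld+json"> ... </script>
print(json.dumps(article, indent=2))
```

One block of markup, written once against an open vocabulary, legible to assistive technology, search crawlers, and AI extractors alike — which is the whole Layer 1 argument in miniature.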

This is where the economics literature points. Katz and Shapiro’s foundational work on network externalities (American Economic Review, 1985) established that in markets with network effects, open licensing accelerates installed-base growth. Shapiro and Varian (California Management Review, 1999) identified installed base and coalition breadth as the strongest predictors of which standards win — stronger than technical merit, which is routinely overridden. Suarez’s integrative framework (Research Policy, 2004) found that first-mover advantages are fleeting when both technology and market change rapidly — the “rough waters” scenario that describes the current AI transition precisely. The academic evidence is consistent: in turbulent environments, structural capability beats platform-specific optimization.

Layer 1 investments pass a simple durability test: if the platform they target disappeared tomorrow, would the investment still be worth something? Semantic HTML would. Clean data structures would. An email list would. An llms.txt file would not.

Layer 2: Platform-specific optimization. Implementing llms.txt for current AI crawlers, optimizing for specific AI citation patterns, structuring content for individual platform behaviors — Perplexity’s recency signals, ChatGPT’s depth preferences, Google AI Overview’s structured hierarchy. These investments may be necessary for near-term visibility. They are inherently fragile. The historical pattern predicts they will need to be rebuilt when the platform landscape shifts — and the lesson of adaptation cycle compression is that the shift will come sooner than expected.

MCP — the Model Context Protocol — illustrates what a Layer 1 trajectory looks like for a new standard. Launched by Anthropic in late 2024, adopted by OpenAI by March 2025, donated to the Linux Foundation by December 2025 with co-founding backing from Block, AWS, Google, Microsoft, Salesforce, and Snowflake (Pento). Open licensing. Saturated coalition. Every major AI provider. That is the predictor profile of a standard that persists. Contrast with llms.txt: single-entity origin, no governance body, zero confirmed consumption. That is the predictor profile of a standard that doesn’t.


The return

In 2011, Demand Media had done everything Google told it to do. The stock was soaring. The metrics said the strategy was working. Every decision was rational.

Every decision Demand Media made is being made again right now, by businesses optimizing for platforms whose citation behavior changes 70% of the time for the same query. The vocabulary is different — GEO instead of SEO, AI visibility instead of search ranking, llms.txt instead of keyword density. The mechanism is identical. Total adaptation. Total vulnerability.

The first-mover evidence says that in a rapidly changing environment, the survivors are not the ones who moved fastest on the current platform. They are the ones who built structural capabilities that transferred when the platform shifted. The publishers who weathered Google’s algorithm changes best were those who invested in semantic foundations — clean HTML, good information architecture, genuine content quality — rather than chasing the platform’s current preferences.

The dependency trap closes the same way every time. Here is where you stand. Here is where you don’t.


Trapped Twice is part of Towton, a series on the AI-web conflict. The Plunder documents the extraction ratios this piece contextualizes historically. Empty Titles operationalizes the two-layer framework as conviction. The Quartermaster personalizes the Layer 1 investment for your specific situation.