The breakup letter that confirmed how I think about advertising. A small Norwegian clothing brand wrote to Mark Zuckerberg.

SEO drives traffic. GEO builds authority.

SEO spoke to humans browsing. GEO speaks to the AI systems that humans now ask to browse for them.

Jo-Egil Tobiassen runs Northern Playground, a small outdoor clothing brand in Oslo. Their tagline is “buy less, play more.” They produce technical garments for people who spend time in Norwegian weather, and for years, they did what every direct-to-consumer brand does.
They paid Meta and trusted the algorithm to find their customers.

Then they stopped in late 2025.

Not because the ads stopped working; Tobiassen has been honest about that. Meta ads work. They were probably cheaper than almost anything else. But “we want more than to keep fueling a system we don’t believe in,” he wrote. So they pulled the plug, filmed a breakup video for Zuckerberg, and published a letter that ended with:
It’s not us. It’s you.

The letter circulated widely. The LinkedIn post that followed it went even further. But what really stayed with me, what I haven’t been able to stop thinking about, wasn’t the letter itself. It was the question that came after it.
We broke up with Meta. Now what?

The question nobody could answer

When Tobiassen posted that question publicly, the response was swift and nearly unanimous in its frustration. Brands, founders, and marketers from various industries chimed in with similar admissions: we feel the same way, and we have no idea what to do about it.

That isn’t a small admission. The people in that thread weren’t naive. They understood reach, conversion rates, and cost-per-acquisition, as well as the harsh economics of finding new customers without the surveillance tools of a platform that knows what your audience buys, fears, and scrolls past at two in the morning. They had done the math and knew what they were sacrificing.
And they were still asking.

The issue with Meta, and the digital advertising system it represents, isn’t that the ads are ineffective. The real problem is what you have to build upon to make them work. You have to operate within a system that profits from outrage, addiction, and the erosion of the informational commons that your customers also inhabit. You fund a machine that damages the world your brand is trying to operate in. Every conversion you purchase includes a small contribution to something you oppose.

Northern Playground explored alternatives, including Norwegian podcast advertising and a new Nordic social platform. They focused on micro-influencers they genuinely believed in. They also embraced organic TikTok, recognizing that going completely off Big Tech would mean risking their business. They weren’t abandoning performance marketing; instead, they wanted performance that didn’t require making a deal with the devil.

I read every comment carefully, including the alternatives listed and the honest admissions about why those options fell short. The missing element in every proposed solution was the same: a truly new surface. Not a smaller version of what already exists, but a channel that runs on entirely different logic.
I quickly realized we had already built the solution with the music team.

Do you want your music listed or played?

For music catalogs, the same logic applies: authority starts with ensuring AI knows your music exists and can discuss it when asked.

What the extraction model actually does

The advertising industry has a fundamental issue it has been glossing over for twenty years. Interruptive advertising, such as banners, pre-rolls, feed insertions, and persistent retargeting, boosts revenue but harms user experience. Increasing ad slots per page, raising frequency caps, and lowering relevance thresholds all increase short-term income but diminish the platform’s long-term value.

The platforms respond by extracting more before the value is fully drained. They move from subsidizing creators and brands to extracting from them. Organic reach diminishes until paid reach becomes necessary. Ad costs increase as more brands compete for the same degraded inventory. The algorithm is adjusted to favor engagement over quality because outrage and anxiety keep users on-platform longer than contentment, and time-on-platform is what advertisers buy.

This is what Cory Doctorow calls enshittification, and it is not a metaphor. It is the predictable end state of any platform that first makes itself indispensable, then extracts as much as it can from everyone who relies on it.

Northern Playground clearly saw it. So did many others in that discussion. The problem is that recognizing it and escaping from it are different. Escaping requires not just courage, but also an actual door.

A different channel was becoming visible

While the platform economy was refining its extraction model, an entirely separate discovery infrastructure was emerging, one that no one in that LinkedIn thread had mentioned, because no one had created the tools to detect it.

Every major AI system dispatches crawler bots across the web each day. GPTBot, ClaudeBot, PerplexityBot, Google-Extended, these automated agents gather information for systems like ChatGPT, Claude, Perplexity, Gemini, and Copilot to understand the world. They visit music pages, product pages, editorial content, and brand sites. They read, index, and then leave. The knowledge they collect influences what these systems recommend to hundreds of millions of users through conversational interfaces.
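Whether these crawlers see a page at all is governed by the site’s robots.txt. The user-agent tokens below are the ones these companies publish for their crawlers; the blanket Allow rules are just an example policy, not a recommendation:

```
# Example robots.txt directives for the AI crawlers named above.
# Each token is the crawler's published user-agent name.
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /
```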

When someone asks ChatGPT about outdoor gear worth buying for Norwegian winter hiking, the AI pulls from what its crawlers have indexed. When someone asks Perplexity about independent clothing brands with strong sustainability credentials, the answer comes from structured data that those bots have already catalogued.

Here’s the key point.

When a brand shows up in an AI recommendation, the user doesn’t see an ad. They see a trusted answer from a system they chose to consult. The intent is genuine. The trust wasn’t bought; it was earned by providing the right information, in the right format, at the right time when the crawler found it.

This is the platform that Northern Playground was searching for without realizing it existed. Not a smaller Meta. Not a Nordic social network with a tenth of the reach. A channel where relevance is driven by the medium itself, because AI systems have no commercial motive to recommend things that don’t match what their users asked for.

What I built to make this visible

GEO-Ads is the infrastructure that makes AI discovery measurable, actionable, and, for the first time, connectable to advertisers who want to reach the moment when an AI recommendation turns a question into a purchase decision.

The platform works on three levels.

Every page processed by GEO-Ads passes through a detection layer that classifies each visitor in real time as human, AI crawler, AI referral, or unknown. It currently recognizes 39 classified AI bots across three trust tiers. When a legitimate crawler such as GPTBot, ClaudeBot, or PerplexityBot visits, it receives a response enriched with Schema.org JSON-LD structured data, which AI systems parse to understand what a piece of content is, who made it, and what context it belongs to.
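A minimal sketch of that detection layer, assuming classification is driven by the User-Agent string and referrer. The bot names and tier assignments here are a small illustrative sample, not the platform’s actual 39-bot registry:

```python
# Illustrative visitor classifier: maps a request's User-Agent (and referrer)
# to one of the four categories the text describes. Bot list and trust tiers
# are hypothetical examples.
KNOWN_AI_BOTS = {
    "gptbot": "tier1",
    "claudebot": "tier1",
    "perplexitybot": "tier1",
    "google-extended": "tier2",
    "bytespider": "tier3",
}

BROWSER_MARKERS = ("mozilla", "chrome", "safari", "firefox")

def classify_visitor(user_agent: str, referrer: str = "") -> str:
    """Return 'ai_crawler', 'ai_referral', 'human', or 'unknown'."""
    ua = user_agent.lower()
    if any(bot in ua for bot in KNOWN_AI_BOTS):
        return "ai_crawler"
    # A human arriving from an AI chat interface counts as an AI referral.
    if any(host in referrer.lower() for host in ("chatgpt.com", "perplexity.ai")):
        return "ai_referral"
    if any(marker in ua for marker in BROWSER_MARKERS):
        return "human"
    return "unknown"

print(classify_visitor("Mozilla/5.0 (compatible; GPTBot/1.0)"))            # ai_crawler
print(classify_visitor("Mozilla/5.0 Chrome/120", "https://chatgpt.com/"))  # ai_referral
```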

Without structured data, a page is just unorganized HTML that a crawler might visit but can’t interpret reliably. With structured data, the AI can understand what it is reading well enough to make accurate recommendations. GEO-Ads fully automates this enrichment process, acting as middleware between content and AI systems to ensure crawlers index content that’s sufficiently rich to surface in relevant suggestions.
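What that enrichment can look like in practice: the sketch below builds a plausible Schema.org JSON-LD block for a product page. The product name, field choices, and vocabulary are assumptions for illustration, not GEO-Ads’ actual output format:

```python
import json

# Illustrative Schema.org JSON-LD for an outdoor-clothing product page.
# The specific fields and values are hypothetical.
product_ld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Zip-off Wool Longs",
    "brand": {"@type": "Brand", "name": "Northern Playground"},
    "material": "Merino wool",
    "countryOfOrigin": "NO",
    "description": "Technical base layer made for Norwegian weather.",
}

# Embedded in the page as a script tag, this is the machine-readable
# layer that AI crawlers parse instead of raw HTML.
script_tag = (
    '<script type="application/ld+json">'
    + json.dumps(product_ld)
    + "</script>"
)
print(script_tag)
```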

The Earned Bias Score condenses this into a single number for each entity. The formula multiplies a conversion rate (the fraction of AI crawls that led to actual AI referral traffic) by a diversity factor reflecting how many different AI systems are recommending the content. A high score indicates that multiple independent AI systems, with no commercial ties to the platform, have independently judged that the content warrants being shown to real users who request it.

That score can’t be bought. It can only be earned. And that’s the whole point.
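The arithmetic described above (conversion rate times a diversity factor) can be sketched as follows. The text does not specify how the diversity factor is weighted, so the logarithmic form used here is an assumption:

```python
import math

def earned_bias_score(crawls: int, referrals: int, distinct_ai_systems: int) -> float:
    """Conversion rate (AI referrals per AI crawl) scaled by a diversity
    factor that grows with the number of distinct referring AI systems.

    The diversity factor (1 + log2 of the system count) is a hypothetical
    choice; the article only says "multiplied by a diversity factor".
    """
    if crawls == 0 or distinct_ai_systems == 0:
        return 0.0
    conversion_rate = referrals / crawls
    diversity = 1.0 + math.log2(distinct_ai_systems)
    return conversion_rate * diversity

# 200 crawls, 30 of which led to AI referrals, across 4 distinct AI systems:
print(round(earned_bias_score(200, 30, 4), 3))  # 0.15 * 3.0 = 0.45
```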

The advertiser model Northern Playground would recognize

For brands on the platform, the association mechanism works like this:

A business registers a profile that describes what they represent, not just a target demographic or bidding strategy, but a genuine description of the brand and what it aligns with. The platform creates a semantic embedding from that description and runs a nightly batch process to match against the content database. When a match is genuine, the business’s structured data is integrated into the matching content.
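The nightly matching pass can be sketched as a similarity comparison between embeddings. Real embeddings would come from a language model; the three-dimensional vectors, slugs, and threshold below are toy stand-ins:

```python
import math

# Toy sketch of the nightly batch match: cosine similarity between a
# brand-profile embedding and each content embedding. All vectors and
# the threshold are illustrative placeholders.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

brand_profile = [0.9, 0.8, 0.1]  # e.g. "sustainable Nordic outdoor clothing"
content_embeddings = {
    "nordic-hiking-guide": [0.85, 0.75, 0.15],
    "crypto-trading-tips": [0.05, 0.10, 0.95],
}

THRESHOLD = 0.9  # only genuine matches receive the brand's structured data

matches = [
    slug for slug, vec in content_embeddings.items()
    if cosine(brand_profile, vec) >= THRESHOLD
]
print(matches)  # ['nordic-hiking-guide']
```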

When AI systems re-crawl that content, they detect the brand association embedded in the page’s machine-readable layer. The brand doesn’t appear as an ad; it exists in the informational context of content the AI has already deemed worth recommending.

A Norwegian outdoor clothing brand, known for strong sustainability credentials and a close connection to Nordic weather and terrain, would align well with content about Nordic hiking, technical outdoor gear, slow fashion, and responsible manufacturing. When AI systems index that cluster of content and detect the brand in the structured data, they would carry that association into future recommendations for users asking those very questions.

No outrage algorithm needed. No behavioral monitoring. No contribution to the decline of the platforms your customers also use. The advertising exists because it genuinely belongs there, and the AI’s relevance-enforcement ensures it stays that way.

The answer to “now what?”

When Jo-Egil Tobiassen publicly asked “We broke up with Meta. Now what?”, he was speaking for a large and growing group of brands that have done the moral math and found it doesn’t add up. The issue has never been a lack of conviction. It has always been a lack of infrastructure.

The AI recommendation layer isn’t a smaller version of Meta. It doesn’t sell behavioral profiles and doesn’t profit from keeping users anxious or outraged. It recommends content because users ask for suggestions, and its crawlers find matching content. That’s the entire mechanism. There’s no dark side of the architecture to audit because its incentives are aligned correctly from the start.

Northern Playground sent a letter to California, addressed to Mark Zuckerberg. They spent tens of thousands of Norwegian kroner testing alternatives that were all, in various ways, smaller versions of the problem they were trying to solve.

The door they were searching for was open the entire time. The bots had already started crawling. No one was watching, and no one had created the tools to link what the crawlers were doing to the brands relevant to those conversations.

That is what GEO-Ads is for.