
Most SEO Case Studies Are Lies

That case study showing 500% traffic growth? Cherry-picked data, survivorship bias, and creative accounting. Here's how the sausage gets made.

"We increased organic traffic by 847% in 6 months!"

You've seen these case studies. They're everywhere. Agency websites. Conference talks. LinkedIn posts. Twitter threads. Everyone has a success story with eye-popping numbers.

Most of them are, at best, misleading. At worst, outright fabrications.

I've been in rooms where these case studies were constructed. I've seen how the sausage gets made. Let me show you behind the curtain.

The Cherry-Picking Game

[Woman with a Parasol by Claude Monet: Shielding yourself from algorithm changes.]

Every agency has dozens of clients. Some do well. Some do okay. Some fail spectacularly.

Guess which ones make it into case studies?

You'll never see a case study titled "How We Worked For 18 Months With No Results" or "The Time Our Strategy Completely Backfired." Those clients exist. They just don't get published.

This creates massive survivorship bias. You're only seeing the wins. You're not seeing the far more numerous mediocre outcomes and failures. So you assume the wins are normal when they're actually exceptional.

If an agency publishes 5 case studies and has 50 clients, you're seeing the top 10%. The other 90% might be fine, or they might be disasters. You'll never know.

The Baseline Manipulation

Here's a trick I've seen countless times: choosing a convenient baseline.

"Traffic increased 500% after our engagement!"

What they don't tell you: the baseline was chosen at a traffic low point. Maybe the site had just recovered from a penalty. Maybe it was seasonal. Maybe they started measurement right after a technical disaster was fixed.

If a site normally gets 10,000 monthly visitors but dropped to 1,000 due to a problem, and the agency started after the problem was fixed, they can claim 900% growth. But they didn't cause that growth. The site was just returning to normal.
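
A minimal sketch, using the hypothetical numbers from that example, of how the chosen baseline dictates the headline figure:

```python
def pct_growth(baseline, current):
    """Percentage growth of `current` relative to `baseline`."""
    return (current - baseline) / baseline * 100

# Hypothetical figures from the example above.
pre_incident = 10_000    # the site's normal monthly visitors
penalty_low = 1_000      # traffic at the bottom of the dip
after_recovery = 10_000  # traffic once the problem was fixed

# Baseline cherry-picked at the low point: a 900% "win".
print(pct_growth(penalty_low, after_recovery))   # 900.0

# Baseline at the pre-incident norm: nothing actually changed.
print(pct_growth(pre_incident, after_recovery))  # 0.0
```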

Another version: measuring from before the site even existed. "From zero to 50,000 monthly visitors!" Well, yes, any new site goes from zero to something. That's not impressive, that's arithmetic.

The Attribution Shell Game

SEO doesn't happen in isolation. Companies do lots of things: PR, content marketing, paid ads, product improvements, market expansion.

But SEO case studies attribute all organic growth to SEO work. Even when other factors were clearly involved.

I've seen case studies where the "SEO success" was primarily driven by:

- A viral news story that had nothing to do with the agency's work.
- A major product launch that generated natural press.
- The end of a pandemic lockdown that shifted consumer behavior.
- A competitor going out of business.

But the case study says "our SEO strategy drove results." Because the agency was doing SEO at the time, and traffic went up, and correlation equals causation in marketing land.

The Metric Switcheroo

When traffic doesn't look good, switch to impressions. When impressions don't look good, switch to rankings. When rankings don't look good, switch to links. When links don't look good, switch to "visibility."

There's always some metric that went up. Find it. Build the case study around it. Ignore everything else.

"We achieved a 340% increase in keyword visibility!"

Cool. Did traffic go up? Did conversions go up? Did the client make money? These questions are conspicuously absent.

Or the classic: "Ranking for 5,000 more keywords!" But the keywords are all long-tail garbage that gets 10 searches per month combined. Technically true. Practically useless.
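
A toy sketch (all numbers invented) of why "ranking for 5,000 more keywords" can matter less than ranking for two terms people actually search for:

```python
# Hypothetical keyword sets: the count looks impressive, the demand tells the real story.
long_tail = [{"kw": f"obscure query {i}", "monthly_searches": 0.002} for i in range(5_000)]
head_terms = [
    {"kw": "main product keyword", "monthly_searches": 2_400},
    {"kw": "category keyword", "monthly_searches": 880},
]

def total_demand(keywords):
    """Combined monthly search volume across a keyword set."""
    return sum(k["monthly_searches"] for k in keywords)

print(len(long_tail), total_demand(long_tail))    # 5000 keywords, ~10 searches/month combined
print(len(head_terms), total_demand(head_terms))  # 2 keywords, 3280 searches/month
```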

The Time Frame Trick

[The Nightmare by Henry Fuseli: Checking rankings at 3am.]

SEO results fluctuate. A lot. Seasonality, algorithm updates, competitive changes - traffic is never a straight line.

Case studies pick the most favorable time frame. If Q1 was bad but Q2 was good, measure from Q1 start to Q2 end. If the last month tanked, end the case study one month earlier.
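
A quick sketch with invented numbers showing how the same traffic series supports opposite stories depending on the window you report:

```python
# Hypothetical monthly sessions, Jan through Dec: a dip, a recovery, then a slide.
monthly_sessions = [9_000, 7_500, 6_000, 6_500, 8_000, 10_500,
                    12_000, 11_000, 9_500, 8_000, 7_000, 6_500]

def growth_between(series, start_month, end_month):
    """Percentage change between two months (0 = January)."""
    return (series[end_month] - series[start_month]) / series[start_month] * 100

print(growth_between(monthly_sessions, 2, 6))   # Mar -> Jul: +100%, the case-study window
print(growth_between(monthly_sessions, 0, 11))  # Jan -> Dec: roughly -28% over the full year
```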

I've seen case studies published while the site was actively losing traffic. They just picked a time frame where things looked good and stopped there.

You also rarely see long-term follow-ups. "Here's what happened 2 years later" would be interesting. But often, 2 years later, the gains have reversed. Better not to mention that.

The Vanishing Context

Case studies strip out crucial context that would undermine the narrative.

"We helped this startup grow from 1,000 to 50,000 monthly visitors!"

They don't mention: the startup raised $10 million and spent heavily on brand awareness. They hired 3 content writers in-house. They launched 2 new product lines. They got featured in TechCrunch.

Maybe the SEO work helped. Maybe it was a rounding error in a much larger picture. Without context, you can't know.

The Reproducibility Problem

Even if a case study is 100% honest, it might not be reproducible. Because the case study typically doesn't include:

- The specific domain history and existing authority.
- The competitive landscape at that moment.
- The algorithm state at that time.
- The budget actually spent.
- The other marketing activities happening simultaneously.
- The luck involved.

"We did X and got Y results" doesn't mean you'll get Y results from doing X. SEO is contextual. What works for one site in one market at one time might do nothing for another.

Why This Matters

[Luncheon of the Boating Party by Pierre-Auguste Renoir: Celebrating metrics that don't matter.]

These case studies set unrealistic expectations. Clients hire agencies expecting 500% growth because they saw it in a case study. When they get 30% growth (which is actually good), they're disappointed.

They also encourage cargo cult SEO. People try to replicate tactics from case studies without understanding the context. It rarely works, and they don't know why.

And they give cover to agencies doing mediocre work. As long as there's some metric that went up, there's a case study to be written. It's all part of the same fiction as monthly SEO reports - creative writing disguised as analytics.

How to Read Case Studies

I'm not saying ignore all case studies. Some are valuable. But read them critically:

Look for absolute numbers, not just percentages. Growing from 100 to 500 visitors and growing from 500,000 to 2.5 million are both a 400% increase, but they are very different results.

Ask about the time frame. Why did they choose that start and end date?

Consider what else was happening. Was the company doing other marketing? Did the market change?

Check if they're still a client. If the relationship ended, why?

Be skeptical of perfect narratives. Real SEO is messy. If the story is too clean, details are probably missing.

The purpose of SEO case studies is to sell SEO services. They're marketing materials, not scientific research. Read them accordingly.

The best predictor of future results isn't a case study. It's understanding the actual work being proposed and whether it makes sense for your specific situation. That requires judgment, not infographics.

Disagree? Good.

These takes are meant to start conversations, not end them.

Tell me I'm wrong