
Most SEO Case Studies Are Lies

That case study showing 500% traffic growth? Cherry-picked data, survivorship bias, and creative accounting. Here's how the sausage gets made.

"We increased organic traffic by 847% in 6 months!" - you've seen these case studies everywhere, on agency websites and in conference talks and scattered across LinkedIn posts and Twitter threads, everyone with their own success story featuring eye-popping numbers that seem almost too good to be true, and most of them are, at best, misleading, and at worst, outright fabrications.

I've been in the rooms where these case studies were constructed. I've watched the spreadsheets get massaged, the timelines get adjusted, and the metrics get cherry-picked until something presentable emerged from the chaos. Let me show you what's behind the curtain.

The Cherry-Picking Game

Woman with a Parasol by Claude Monet
Shielding yourself from algorithm changes.

Every agency has dozens of clients. Some do well, some do okay, and some fail spectacularly in ways that make everyone involved cringe when they think about it. Guess which ones make it into case studies. You'll never see one titled "How We Worked For 18 Months With No Results" or "The Time Our Strategy Completely Backfired." Those clients exist. They just don't get published.

This creates massive survivorship bias. You're only seeing the wins, never the far more numerous mediocre outcomes and outright failures, so you assume the wins are normal when they're actually exceptional outliers. If an agency publishes 5 case studies and has 50 clients, you're seeing the top 10%. The other 90% might be fine, or they might be disasters. You'll never know either way.
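To see how much distortion this produces, here's a toy simulation. Every number is invented for illustration, nothing here comes from any real agency: generate 50 client outcomes, "publish" only the top 5, and compare the averages.

```python
import random

# Toy simulation of survivorship bias (all figures made up):
# 50 client outcomes, of which only the 5 best become case studies.
random.seed(42)
outcomes = [random.gauss(30, 80) for _ in range(50)]  # % traffic change per client

published = sorted(outcomes, reverse=True)[:5]  # only the wins get written up

avg_all = sum(outcomes) / len(outcomes)
avg_published = sum(published) / len(published)

print(f"Average across all 50 clients: {avg_all:.0f}%")
print(f"Average across the 5 published wins: {avg_published:.0f}%")
# Same agency, same work; the published average tells a very different story.
```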

The Baseline Manipulation

Here's a trick I've seen countless times: choosing a convenient baseline. When someone says "Traffic increased 500% after our engagement!", what they don't tell you is that the baseline was chosen at a traffic low point. Maybe the site had just recovered from a penalty, maybe it was seasonal, maybe measurement started right after a technical disaster was fixed. If a site normally gets 10,000 monthly visitors but dropped to 1,000 due to some problem, and the agency started after that problem was fixed, they can claim 900% growth even though they didn't cause that growth at all. The site was just returning to its normal state, like a spring returning to its natural position.
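A minimal sketch of that arithmetic, using the numbers from the paragraph above:

```python
def pct_growth(baseline, current):
    """Percentage growth from a chosen baseline."""
    return (current - baseline) / baseline * 100

normal = 10_000      # the site's long-run monthly traffic
penalty_low = 1_000  # the dip the agency conveniently measures from
recovered = 10_000   # the site simply returns to normal

print(pct_growth(penalty_low, recovered))  # 900.0 -> "900% growth!"
print(pct_growth(normal, recovered))       # 0.0   -> the honest number
```

Same recovery, two baselines, two completely different stories.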

Another version of this trick involves measuring from before the site even existed, leading to headlines like "From zero to 50,000 monthly visitors!" Which is technically true, because any new site goes from zero to something eventually. But that's not impressive. That's just arithmetic. That's just how numbers work.

The Attribution Shell Game

SEO doesn't happen in isolation. Companies do lots of things simultaneously: PR, content marketing, paid ads, product improvements, market expansion. But SEO case studies attribute all organic growth to SEO work, even when other factors were clearly involved, even when it's obvious to anyone paying attention that something else was driving the results.

I've seen case studies where the "SEO success" was primarily driven by a viral news story that had nothing to do with the agency's work. Or a major product launch that generated natural press. Or the end of a pandemic lockdown that shifted consumer behavior. Or a competitor going out of business and leaving a gap in the market. But the case study says "our SEO strategy drove results," because the agency was doing SEO at the time, and traffic went up, and correlation equals causation in marketing land.

The Metric Switcheroo

When traffic doesn't look good, switch to impressions. When impressions don't look good, switch to rankings. When rankings don't look good, switch to links. When links don't look good, switch to "visibility." There's always some metric that went up, always some number you can cherry-pick. Find it, build the case study around it, ignore everything else.

"We achieved a 340% increase in keyword visibility!" - cool, but did traffic go up, did conversions go up, did the client actually make any money from this visibility, because those questions are conspicuously absent from the presentation.

Or there's the classic "Ranking for 5,000 more keywords!", which sounds impressive until you realize the keywords are all long-tail garbage that gets 10 searches per month combined. Technically true but practically useless: the kind of achievement you can only celebrate if you never look too closely at what it actually means.
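The back-of-envelope math on that claim, using the figures from the paragraph above plus a deliberately generous click-through assumption:

```python
new_keywords = 5_000
combined_monthly_searches = 10  # total across all 5,000 keywords
ctr = 1.0                       # assume every search clicks through (it won't)

monthly_visits = combined_monthly_searches * ctr
print(f"{new_keywords:,} new rankings -> ~{monthly_visits:.0f} visits/month")
```

Five thousand "wins" that round to nothing.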

The Time Frame Trick

The Nightmare by Henry Fuseli
Checking rankings at 3am.

SEO results fluctuate. A lot, actually: seasonality, algorithm updates, competitive changes. Traffic is never a straight line. Case studies pick the most favorable time frame with surgical precision. If Q1 was bad but Q2 was good, they measure from Q1 start to Q2 end. If the last month tanked, they end the case study one month earlier. I've seen case studies published while the site was actively losing traffic; they just picked a time frame where things looked good and stopped there, freezing the narrative at its most flattering moment.
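To show how mechanical the trick is, here's a sketch with invented monthly traffic data that scans every possible start/end pair and reports the most flattering window, which is more or less what a motivated analyst does by eye:

```python
# Invented monthly traffic for a site that's flat-to-declining overall.
traffic = [9000, 7200, 6100, 6800, 8900, 11200, 10400, 9100, 7800, 7200]

# Try every (start, end) window and keep the most flattering one.
gain, start, end = max(
    ((traffic[j] - traffic[i]) / traffic[i] * 100, i, j)
    for i in range(len(traffic))
    for j in range(i + 1, len(traffic))
)

print(f"Case-study window (months {start}-{end}): +{gain:.0f}%")  # ~+84%
overall = (traffic[-1] - traffic[0]) / traffic[0] * 100
print(f"Full period: {overall:+.0f}%")                            # -20%
```

A declining site yields an 84% growth story if you're allowed to choose where the chart starts and ends.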

You also rarely see long-term follow-ups. "Here's what happened 2 years later" would be genuinely interesting reading, but often, 2 years later, the gains have reversed entirely. Better not to mention that. Better to let the case study exist in its own perfect little time capsule forever.

The Vanishing Context

Case studies strip out crucial context that would undermine the narrative. When they say "We helped this startup grow from 1,000 to 50,000 monthly visitors!", they don't mention that the startup raised $10 million and spent heavily on brand awareness. That it hired 3 content writers in-house. That it launched 2 new product lines. That it got featured in TechCrunch. Maybe the SEO work helped, or maybe it was a rounding error in a much larger picture of growth. Without the context, you can't possibly know which.

The Reproducibility Problem

Even if a case study is 100% honest, it might not be reproducible. The write-up typically omits the specific domain history and existing authority. It doesn't mention the competitive landscape at that moment, the algorithm state at the time, or the budget actually spent. It doesn't account for the other marketing activities happening simultaneously, and it certainly doesn't quantify the sheer dumb luck involved.

"We did X and got Y results" doesn't mean you'll get Y results from doing X, because SEO is contextual in ways that resist generalization, and what works for one site in one market at one time might do absolutely nothing for another site in a different market at a different time.

Why This Matters

Luncheon of the Boating Party by Pierre-Auguste Renoir
Celebrating metrics that don't matter.

These case studies set unrealistic expectations. Clients hire agencies expecting 500% growth because they saw it in a case study, and when they get 30% growth - which is actually good, which would be a success by any reasonable standard - they're disappointed. They've been conditioned to expect magic.

They also encourage cargo cult SEO: people try to replicate tactics from case studies without understanding the context. It rarely works, they don't know why, and they assume they must be doing something wrong rather than questioning whether the original case study was telling the whole truth.

And they give cover to agencies doing mediocre work. As long as there's some metric that went up, there's a case study to be written. It's all part of the same fiction as monthly SEO reports: creative writing disguised as analytics.

How to Read Case Studies

I'm not saying ignore all case studies. Some are valuable, genuinely instructive. But read them critically. Look for absolute numbers, not just percentages: a "400% increase" from 100 to 500 visitors is very different from 500,000 to 2.5 million. Ask about the time frame, and why they chose that particular start and end date. Consider what else was happening: was the company doing other marketing, did the market change? Check if they're still a client, and if the relationship ended, why. And be skeptical of perfect narratives, because real SEO is messy; if the story is too clean, details are probably missing.
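For the absolute-numbers point, a trivial helper makes the difference concrete (hypothetical figures matching the examples above):

```python
def describe(before, after):
    """Report a traffic change in both percentage and absolute terms."""
    pct = (after - before) / before * 100
    return f"{before:,} -> {after:,} visitors (+{pct:.0f}%, +{after - before:,} absolute)"

print(describe(100, 500))            # +400%, +400 visitors
print(describe(500_000, 2_500_000))  # +400%, +2,000,000 visitors
```

Identical percentage, four orders of magnitude apart in business impact. Always ask for the raw numbers.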

The purpose of SEO case studies is to sell SEO services. They're marketing materials, not scientific research. Read them accordingly.

The best predictor of future results isn't a case study. It's understanding the actual work being proposed and whether it makes sense for your specific situation. That requires judgment, not infographics.

Disagree? Good.

These takes are meant to start conversations, not end them.

Tell me I'm wrong