Why Your SEO Data is Misleading You

TL;DR • 12 min read
  • Average position is basically meaningless - the number hides everything useful
  • Traffic is for vanity. Revenue per landing page is what actually matters
  • Pages losing traffic are often your biggest opportunities (not problems to fix)
  • The real insights hide in the extremes of your data, not the averages
[Figure: SEO analytics - most data is noise. Signal exists only at the edges: extremely high-CTR queries, pages with unusual engagement patterns, and revenue-correlated landing pages.]

Everyone claims to do "data-driven SEO." They pull up dashboards, export CSVs, build pivot tables. They look at traffic curves and feel productive.

It's almost entirely theater.

Most SEO analytics work is pattern-matching on noise. You're looking at aggregated data that hides the actual signal, making decisions based on averages that don't represent anything real, and optimizing for metrics that have zero correlation to business outcomes.

I've been doing this for over two decades now, and I've wasted more hours staring at dashboards than I'd like to admit. Took me way too long to figure out what actually moves the needle.

The Average Position Lie

Search Console shows you "average position" for your queries. It's the first number everyone looks at. It's also nearly useless.

Think about it: a page ranking #1 for 100 impressions and #50 for 10,000 impressions shows an impression-weighted "average position" of roughly 50. What does that tell you? Absolutely nothing useful. That page is actually crushing it for the queries that matter and completely invisible for a bunch of irrelevant long-tail stuff.
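To make the math concrete, here's the impression-weighted average that Search Console reports, using the numbers from the example above:

```python
# Impression-weighted average position, the way Search Console computes it.
positions = [1, 50]          # rank for each query group
impressions = [100, 10_000]  # impressions at each rank

weighted_avg = sum(p * i for p, i in zip(positions, impressions)) / sum(impressions)
print(round(weighted_avg, 1))  # 49.5 - nowhere near either real position
```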

When I worked with OurCrowd, we didn't chase average position. We identified the specific queries where we were position 4-15 with high impressions. Those were the opportunities. Everything else was noise.

Watch out for this
Any time you see an "average" in SEO data, you're looking at a number that probably represents nothing real. The distribution is what matters. Averages just hide the interesting stuff.

So what do you actually do? Export your Search Console data. Filter to position 4-20. Sort by impressions descending. Those top 50 queries are your real opportunity list. I keep coming back to this export every few months because it consistently surfaces stuff I'd otherwise miss.
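If you want this repeatable, a minimal pandas version looks something like the sketch below. It assumes the standard Search Console "Queries" CSV export column names - adjust them to match your file:

```python
import pandas as pd

# Load a Search Console "Queries" export. Column names assume the standard
# CSV export (Top queries / Clicks / Impressions / CTR / Position) - adjust
# them if your file differs.
df = pd.read_csv("Queries.csv")

# Striking distance: bottom of page one through page two.
opportunities = df[(df["Position"] >= 4) & (df["Position"] <= 20)]

# Highest impressions first - this is the real opportunity list.
top_50 = opportunities.sort_values("Impressions", ascending=False).head(50)
print(top_50[["Top queries", "Impressions", "Position"]])
```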

Traffic is a Vanity Metric

I know. Heresy. But hear me out.

Traffic just counts eyeballs. A page getting 50,000 visits from people who immediately bounce? Worth basically nothing. Meanwhile, some ugly product page getting 500 visits might be generating half your revenue.

What I actually care about now: revenue per landing page. That's it. Everything else is just noise dressed up as insight.

When I rebuilt LonesomeLabs' site, we didn't optimize for traffic. We optimized for pages that converted. The learning academy we built drove 10x traffic growth, but more importantly, it drove qualified traffic - people who understood the product and were ready to buy.

[Figure: Traffic funnel - 50,000 visits at 0.1% conversion = 50 customers, while 2,000 visits at 5% conversion = 100 customers. Less traffic, more revenue.]

And honestly? Most organic traffic is worthless. Someone searching "what is SEO" isn't buying SEO services. They're a college student doing homework. That traffic looks great in your monthly report and does absolutely nothing for the business.

The queries that actually matter tend to be ugly. Weird long-tail stuff. Low volume. The kind of keywords that make your boss ask "why are we targeting that?" But they convert at 10x the rate of those trophy keywords everyone fights over.

The Declining Pages Opportunity

When traffic to a page drops, the standard response is panic. "We need to fix this page!" Sometimes that's right. Usually it's wrong.

A page with declining traffic often means one of three things:

  1. Seasonality - The topic has natural ebbs and flows
  2. Competition shifted - New players entered the SERP
  3. Search intent changed - Google now wants different content

Only #2 and #3 require action. And even then, the action isn't always "fix this page." Sometimes it's "build a better page targeting the new intent."
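Before touching a declining page, I rule out #1 with a crude year-over-year comparison. Here's a sketch, assuming you have a daily organic-sessions series for the page (the function and threshold are mine, not a standard API):

```python
import pandas as pd

# Crude seasonality check before "fixing" a declining page: compare the last
# 28 days of organic sessions to the same window a year earlier. Assumes a
# daily sessions Series indexed by date.
def looks_seasonal(daily: pd.Series, tolerance: float = 0.20) -> bool:
    end = daily.index.max()
    recent = daily.loc[end - pd.Timedelta(days=27):end].sum()
    prior_end = end - pd.DateOffset(years=1)
    year_ago = daily.loc[prior_end - pd.Timedelta(days=27):prior_end].sum()
    if year_ago == 0:
        return False  # no history to compare against
    # Within tolerance of last year's level = probably just seasonality.
    return abs(recent - year_ago) / year_ago <= tolerance
```

If the page is sitting roughly where it sat this time last year, leave it alone and spend the effort on causes #2 and #3.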

Here's the thing that took me years to internalize: your declining pages are often your biggest opportunities.

These pages already have authority. They've accumulated backlinks over time. Google already knows they exist and has indexed them. Refreshing one of these is so much faster than building something new from scratch - you're working with existing equity instead of starting from zero.

The Psik case study is the extreme version of this. They had thousands of pages that Google couldn't see - not declining pages, invisible pages. Once we fixed the rendering issue, those pages went from 0 to #1 overnight. The content was always good. The technical barrier was hiding it.

Why refreshes work so well
An existing page with some authority will typically recover 3-5x faster than a brand new page can climb. It's not even close - you're building on a foundation instead of starting from dirt.

The CTR Anomaly Method

There's this technique I stumbled onto a few years ago that I almost never see anyone else using: CTR anomaly detection.

In Search Console, export your query data. Calculate expected CTR based on position (roughly 30% for position 1, declining from there). Find queries where your actual CTR dramatically exceeds or underperforms the expected rate.

[Figure: Expected CTR curve by position vs. actual data points. Queries significantly above the curve indicate strong titles/snippets; queries below indicate optimization opportunities.]
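A minimal version of the analysis, again assuming a Search Console queries export. The expected-CTR curve here is illustrative, not a published benchmark - swap in whatever curve fits your vertical:

```python
import pandas as pd

# Flag CTR anomalies in a Search Console queries export. The expected-CTR
# curve is a stand-in (about 30% at position 1, decaying from there).
def expected_ctr(position: float) -> float:
    return 0.30 * position ** -0.9  # illustrative power-law decay

df = pd.read_csv("Queries.csv")
# Exports usually store CTR as text like "4.2%"; skip this line if yours is numeric.
df["CTR"] = df["CTR"].str.rstrip("%").astype(float) / 100
df["expected"] = df["Position"].apply(expected_ctr)
df["ratio"] = df["CTR"] / df["expected"]

# Well above the curve: titles/snippets worth copying elsewhere.
overperformers = df[df["ratio"] > 2.0]
# Well below it, with real impressions: traffic waiting on a title rewrite.
underperformers = df[(df["ratio"] < 0.5) & (df["Impressions"] > 1000)]
```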

High CTR anomalies tell you what's resonating. Something about that title and meta description is working for those specific queries. Figure out why - is it the number? The question format? The specificity? - and steal the pattern for other pages.

Low CTR anomalies are basically free money sitting on the table. You're already ranking, people are seeing your result, and they're scrolling right past it. Bad title, wrong intent match, or your competitors just have better snippets. Fix these and you get more traffic without moving a single ranking.

I ran this analysis for a client last year and we got a 40% traffic increase just from rewriting titles. Didn't publish anything new. Didn't build a single link. Just made the existing rankings actually get clicked.

Session Duration is a Trap

GA4 shows you average session duration. Everyone assumes longer is better.

But think about what a long session actually means. Sometimes it's engagement. Sometimes it's a confused user who can't find what they're looking for, clicking around desperately. Meanwhile, someone who finds exactly what they need and converts might be in and out in 30 seconds.

What I actually look at: scroll depth on content pages combined with exit rate to conversion pages.

Someone reads 80% of your article then clicks through to your product page? That's the behavior you want. Someone spends 5 minutes bouncing around your site looking for basic information they can't find? That's a UX failure, even though your session duration report looks fantastic.
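Here's a sketch of that signal in pandas. The event export and column names are hypothetical - map them to however your GA4 data actually lands:

```python
import pandas as pd

# The signal I actually want: deep scrolls on content pages that end in a
# click-through to a conversion page. Columns here are hypothetical.
events = pd.read_csv("page_events.csv")  # columns: page, scroll_depth, next_page

engaged = events[
    (events["scroll_depth"] >= 0.8) &
    (events["next_page"].str.startswith("/product", na=False))
]
# Content pages doing the real work, regardless of session duration.
print(engaged.groupby("page").size().sort_values(ascending=False).head(20))
```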

The Revenue Attribution Problem

This is where it gets genuinely hard, and where I see a lot of SEOs either oversimplify or give up entirely: attributing revenue to organic search.

Real customer journeys are messy. Someone sees your ad, comes back two days later via organic search, reads three blog posts over the next two weeks, then finally converts on a direct visit. So which channel gets the credit?

Last-click attribution says "direct." First-click says "paid." Linear gives everyone equal credit. They're all wrong in different ways.

I've come to accept that perfect attribution is impossible. Seriously, stop chasing it. You'll drive yourself crazy.

Instead, look at cohort behavior. When you 2x organic traffic to a content cluster, does revenue from that customer segment increase? It won't be 1:1. Some people convert on other channels. But if the correlation exists, organic is working.
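The check itself is simple. A sketch with hypothetical weekly numbers:

```python
import pandas as pd

# Directional check, not attribution: does segment revenue move with organic
# traffic to the related content cluster? Weekly figures are made up.
weekly = pd.DataFrame({
    "cluster_sessions": [1200, 1350, 1900, 2400, 2600, 3100],
    "segment_revenue":  [8000, 8400, 9900, 11800, 12100, 14000],
})

# Shift sessions forward a week or two if your sales cycle is long.
corr = weekly["cluster_sessions"].corr(weekly["segment_revenue"])
print(f"correlation: {corr:.2f}")  # strongly positive = organic is working
```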

At OurCrowd, we couldn't track every investor who found us via organic search. But we could see that as organic traffic to investment theme pages grew, investor signups from those geographic regions grew proportionally. The correlation was clear even when individual attribution wasn't.

The Dashboard Audit

Look at your SEO dashboard right now. What metrics are on it?

If you see:

  • Total organic sessions
  • Average position
  • Total keywords ranking
  • Domain authority

You're measuring vanity. These numbers make you feel good or bad without telling you what to do.

Replace them with:

  • Organic sessions to conversion-path pages
  • Revenue from organic landing pages
  • Position 4-20 queries with 1000+ impressions (opportunity list)
  • CTR anomalies (over/under performing)
  • Pages with declining traffic that have conversion history

See the difference? The first list tells you whether to feel good or bad. The second list tells you what to actually do tomorrow morning.

The Uncomfortable Truth

Look, I'm just going to say it: most SEO analytics is procrastination dressed up as productivity. You're refreshing dashboards because it feels like work. You're building reports instead of building pages. It's measuring instead of doing, and the measuring makes you feel busy.

The best SEO work I've ever done? I looked at the data once, made a decision, then executed for three months before looking at data again. No daily dashboards. No weekly reports. Just long stretches of focused work without constantly checking if the numbers moved yet.

Data should inform your strategy, but somewhere along the way it became a substitute for having one. If you need to check metrics every day, that's not diligence - that's anxiety. And anxious decision-making is usually bad decision-making.

A rule of thumb
If more than 10% of your SEO time goes to analytics, you're doing analytics instead of SEO. The people who actually move numbers tend to measure rarely and execute constantly.

What Actually Works

After all these years, the pattern that works for me is dead simple:

  1. Deep dive upfront - Really dig into the data, find what's hiding (takes me 2-3 days)
  2. Lock in the strategy - Pick what to focus on, actively ignore everything else
  3. Execute without looking back - Build, optimize, publish for 8-12 weeks with minimal peeking at metrics
  4. Review and adjust - What worked? What didn't? Update the plan, repeat

If I had to put a number on it, maybe 90% of my time is execution and 10% is analysis. Most SEO teams I've seen operate with those numbers flipped, then wonder why nothing ever seems to change.

The data absolutely matters. But it matters as input to decisions - not as the work itself. At some point you have to close the spreadsheet and actually go move the numbers.

For the specific techniques I used at OurCrowd to find opportunities in Search Console data, see The Search Console Trick That Finds Opportunities. For understanding which pages to refresh vs. leave alone, read The Content Refresh Formula.
