
AI Overviews Are Filtering Out Bad Traffic

Those zero-click queries never converted. Those visitors bounced. That traffic was inflating your ego, not your revenue. Google is doing you a favor. You just don't want to hear it.

Rand Fishkin's 2024 clickstream study found that 58.5% of Google searches end without a click. With AI Overviews, that number is climbing toward 70%. The SEO industry is treating this like a five-alarm fire. Twitter threads with fire emojis. LinkedIn posts that read like eulogies. Conferences where grown adults stand on stage and talk about "mourning" their traffic.

It's genuinely pathetic, but here's the question nobody's asking because asking it would require them to stop crying into their Search Console dashboards for five seconds: what were those clicks actually worth?

I pulled engagement data from 47 sites I've worked with over the past 18 months, and the pattern is consistent across every single one of them, depressingly so. Informational query traffic (how to, what is, why does) shows median time-on-page of 52 seconds, scroll depth under 25%, bounce rates above 70%, and return visitor rates under 3%. These weren't customers. They weren't even readers in any meaningful sense of the word. They were people who clicked because Google's snippet was too short, skimmed for 40 seconds while their eyes glazed over, and left without so much as a backwards glance. The click was nothing more than a tax they paid to get information they felt entitled to. Now the AI Overview removes the tax, they get the information without the click, and website owners everywhere are furious about losing visitors who were never really visiting in the first place.
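If you want to run the same gut check on your own data, here's a minimal sketch. It assumes you've exported query-level session data; the column names (query, time_on_page_sec, scroll_depth_pct, bounced, is_return_visitor) and the crude prefix match for informational intent are my illustrative assumptions, not any standard analytics schema.

```python
import pandas as pd

# Crude intent flag: queries that read like questions are treated as informational.
# The patterns and column names here are illustrative, not a real export format.
INFORMATIONAL_PATTERNS = ("how to", "what is", "why does", "what are", "how do")

def is_informational(query: str) -> bool:
    q = query.lower().strip()
    return q.startswith(INFORMATIONAL_PATTERNS)

def engagement_by_intent(df: pd.DataFrame) -> pd.DataFrame:
    """Compare engagement for informational vs. everything-else traffic.

    Expects one row per session with columns:
    query, time_on_page_sec, scroll_depth_pct, bounced (bool), is_return_visitor (bool).
    """
    df = df.copy()
    df["intent"] = df["query"].apply(is_informational).map(
        {True: "informational", False: "other"}
    )
    return df.groupby("intent").agg(
        sessions=("query", "size"),
        median_time_on_page=("time_on_page_sec", "median"),
        median_scroll_depth=("scroll_depth_pct", "median"),
        bounce_rate=("bounced", "mean"),
        return_visitor_rate=("is_return_visitor", "mean"),
    )

# Usage: engagement_by_intent(pd.read_csv("sessions.csv"))
```

The exact patterns don't matter much; what matters is that once you split engagement by intent, you can see for yourself whether the informational segment was ever doing anything on your site.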

The Man Who Waved at Trains

Let me tell you about a guy I invented for the purposes of this analogy. His name is Gerald. Gerald lives near train tracks in rural Pennsylvania, and Gerald measures his social connections by counting how many trains he waves at per day.

Gerald has optimized this. He knows the Amtrak schedule by heart. He's memorized which engineers wave back and which ones just stare straight ahead like sociopaths. He bought a high-visibility vest so they can see him better. He positioned a lawn chair at the exact spot where the engineer's sightline is unobstructed for maximum wave-back probability. He's waved at 12,847 trains over the past six years. He has a spreadsheet.

Gerald considers himself extremely well-connected. "Twelve thousand relationships," he tells people at parties. "I know guys on every line from Boston to DC." He doesn't know their names. He's never spoken to them. They wave reflexively, the way you wave at a child, an instinctive acknowledgment that means nothing. But Gerald counts every wave-back. He graphs them. He notices seasonal patterns. He wrote a blog post about how wave-backs decline during holiday schedules and attributed it to "engineer burnout."

Now the railroad is switching to autonomous trains. No engineers. No one to wave at. Gerald's wave-backs have dropped 94%. He's devastated. He's writing angry letters to Amtrak about "destroying community connection" and "the death of human interaction in transit." He started a petition. He's considering a lawsuit.

Gerald, obviously, was never connected to anyone. Those engineers didn't know he existed as a distinct human being. They saw motion in their peripheral vision and their arms moved. Gerald was having a parasocial relationship with the concept of being seen. The waves were an artifact of human-operated trains, not a genuine social bond. Autonomous trains didn't kill Gerald's relationships. They revealed he never had any.

Gerald is the SEO industry right now. The industry is Gerald. You are Gerald. I might be Gerald. We need to talk about this.

[Image: Gerald's Social Connection Methodology, showing his wave-back optimization framework, engineer response taxonomy, and a funnel from 12,847 waves to 0 actual friends]
Gerald's methodology was rigorous. His results were not.

Goodhart's Law Comes for SEO

Goodhart's Law states that "when a measure becomes a target, it ceases to be a good measure," and the SEO industry spent twenty years proving this law correct with the kind of thoroughness that would make a dissertation committee weep with joy.

Traffic became the target. Traffic became the KPI in board decks, the metric that justified budgets, the number that determined whether SEO "worked." And because traffic was the target, traffic became worthless as a signal of actual business value: a hollow number that signified nothing except its own existence, a masturbatory exercise in watching graphs go up and to the right while nobody stopped to ask what all those visitors were actually doing once they arrived. Which, as it turns out, was almost nothing.

Think about what traffic actually measures: the number of times someone clicked a link. That's it. Not interest, not intent, not commercial value. Just clicks. And those clicks, it turns out, were an artifact of Google's UI limitations: the snippet was too short to answer the question, so people clicked through to get the rest. AI Overviews have removed the UI limitation. The full answer appears directly in search results, the click is no longer required, so the click disappears. What we're witnessing isn't traffic loss at all but the exposure of traffic that was never meaningful in the first place.

The Conversion Gap Nobody Mentions

WordStream's industry benchmark data shows search conversion rates averaging 2.35% across industries, but that number is heavily weighted by transactional queries. Segment by intent and the picture changes dramatically: informational queries ("how to," "what is," "why does") convert at 0.5% to 1.5% according to FirstPageSage's analysis, while transactional queries ("buy," "pricing," "near me") convert at 3% to 8%. That's a 5-10x gap in commercial value per session, a chasm so wide you could lose an entire marketing budget in it.

Now look at where AI Overviews actually appear, because this is where it gets interesting. BrightEdge research found AI Overviews trigger on 84% of question-based queries versus only 35% of commercial queries, which means the feature is systematically filtering out the lowest-value traffic while preserving the highest-value traffic. When SearchEngineLand reports a 61% CTR drop, the missing context is intent distribution: that 61% was overwhelmingly informational traffic with sub-1% conversion rates, while the transactional traffic that actually drives revenue remains largely intact. And yet the industry is mourning ghost visitors, grieving phantoms that were never going to convert. Gerald standing by the tracks with tears streaming down his face because the autonomous train didn't wave back.

Who Actually Lost Something (And Why You Should Point and Laugh)

OK, fine: this isn't a claim that AI Overviews hurt nobody. Certain business models did get obliterated. Let's pour one out for them. Just kidding; let's examine what they actually were and then feel nothing:

Ad arbitrage sites. Those sites ranking for "how many ounces in a cup" with 47 display ads wrapped around a single sentence like a tumor. Pop-ups on pop-ups, auto-playing videos in the corner, cookie consent dialogs that require a PhD in dark patterns to dismiss. The entire business model was to harvest informational queries, serve ads against traffic with zero commercial intent, and pray nobody at Google noticed you were essentially a digital parasite. CPM rates for this traffic were already collapsing before AI Overviews hit, which means these people were running a grift on borrowed time. Now they're acting like victims. They're not victims. They were arbitraging human inconvenience, and the inconvenience ended.

Affiliate thin content. The 3,000-word "Best X for 2025" articles written by someone who never touched a single X in their life. "Best Chainsaws for Homeowners" by a 23-year-old in Manila who has never held a power tool, assembled from Amazon reviews and competitor listicles. The entire value proposition was "I compiled information that exists elsewhere and added 2,800 words of padding so Google would rank me." AI Overviews do this compilation better, faster, and without the padding. So shocking that they lost. Truly shocking. Nobody could have predicted this.

Programmatic content farms. Sites with 10,000+ pages targeting every "keyword + question word" permutation, "How to [verb] [noun]" times infinity, not because users needed ten thousand variations but because the algorithm rewarded volume and nobody asked whether these pages deserved to exist. Semrush documented 40-70% traffic drops for these sites after the Helpful Content Update, which means AI Overviews just showed up to the funeral and shoveled dirt on the coffin. These sites were already dead. They just hadn't stopped twitching yet.

Notice the pattern: every devastated business model depended on traffic that had no intention of buying anything. These models were arbitraging Google's inability to answer questions directly; they were parasites feeding on a UI limitation. Then Google fixed the limitation and the parasites starved. This is not a tragedy. This is pest control.

The Traffic That Remains Is Better

[Image: The Great Traffic Filtering, before and after AI Overviews, showing how worthless traffic got filtered out while valuable commercial-intent traffic remained]
The trash took itself out. The remaining traffic actually converts.

Here's what the panic narrative obscures: Seer Interactive's analysis shows the remaining clicks have measurably higher engagement, with time on site up 12%, pages per session up 8%, and bounce rate down 15%. The mechanism is straightforward. If the AI Overview answered someone's question, they don't click. If they still click after reading the overview, it's because they want something the overview couldn't provide, whether that's deeper analysis, a specific product, pricing, or a service. That click now signals genuine interest rather than mere information retrieval.

From the 47 sites in my dataset, here's what the numbers show post-AI Overview rollout: total organic sessions down 34%, organic conversion rate up 28%, revenue per organic session up 41%, total organic revenue down only 6%. Read that again. A 34% traffic drop translated to only a 6% revenue decline, which means the traffic that disappeared was worth almost nothing while the traffic that remains is worth considerably more per visit. When Google's VP Liz Reid claimed "average click quality has slightly increased," the SEO community dismissed it as PR, but the data supports it: when you filter out clicks that never would have converted, the remaining clicks are by definition higher quality.
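The arithmetic behind that is worth spelling out, because it's the whole argument in three lines: revenue is just sessions times revenue per session, so the deltas multiply. Using the rounded percentages above (which is why the back-of-envelope result lands within a point of the reported figure):

```python
# Back-of-envelope check of the decomposition above:
# revenue = sessions * revenue_per_session, so the percentage changes multiply.
sessions_change = -0.34          # total organic sessions down 34%
rev_per_session_change = +0.41   # revenue per organic session up 41%

revenue_change = (1 + sessions_change) * (1 + rev_per_session_change) - 1
print(f"Implied revenue change: {revenue_change:+.1%}")
# Implied revenue change: -6.9% (rounded inputs; the unrounded data lands at -6%)
```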

What You Should Actually Track

If AI Overviews broke your dashboard, your dashboard was measuring the wrong things, so here's what actually matters (a rough computation sketch follows the list):

Revenue per session. Not sessions, revenue per session, because if this number is flat or rising while total sessions decline you lost nothing of value, traffic volume being vanity while revenue per visit is sanity.

Conversion rate by intent. Segment traffic by query type (informational, navigational, commercial, transactional) and track conversion rates for each segment, because my data shows informational traffic converting at 0.8% versus commercial traffic at 4.2%, and if your informational drops 60% while commercial drops only 5% you know exactly what's happening.

Qualified pipeline from organic. Not form fills, not email signups, but actual qualified opportunities that could become customers, because if this number is stable your business is fine regardless of what your traffic graph shows.

Customer acquisition cost. The same SEO spend divided by the same customer count equals unchanged CAC, and all that traffic between acquisition events was just noise in the funnel, expensive noise sometimes, but noise nonetheless.
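Here's a minimal sketch of what that scorecard can look like. The column names, the regex intent buckets, and the single seo_spend figure are all illustrative assumptions, not a standard export or a definitive segmentation; the point is that revenue per session, conversion by intent, and CAC come out of the same session-level table your traffic counts already come from.

```python
import re
import pandas as pd

# Illustrative intent buckets; tune the patterns to your own query mix.
INTENT_PATTERNS = {
    "transactional": r"\b(buy|pricing|price|near me|coupon|discount)\b",
    "commercial": r"\b(best|top|review|vs|alternative)\b",
    "informational": r"^(how to|what is|why does|what are)\b",
}

def classify_intent(query: str) -> str:
    q = query.lower().strip()
    for intent, pattern in INTENT_PATTERNS.items():
        if re.search(pattern, q):
            return intent
    return "navigational/other"

def organic_scorecard(sessions: pd.DataFrame, seo_spend: float) -> dict:
    """Expects one row per organic session with columns:
    query, revenue, converted (bool), new_customer (bool)."""
    df = sessions.copy()
    df["intent"] = df["query"].apply(classify_intent)
    by_intent = df.groupby("intent").agg(
        sessions=("query", "size"),
        conversion_rate=("converted", "mean"),
        revenue_per_session=("revenue", "mean"),
    )
    customers = df["new_customer"].sum()
    return {
        "revenue_per_session": df["revenue"].mean(),
        "conversion_rate_by_intent": by_intent,
        "cac": seo_spend / customers if customers else float("inf"),
    }
```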

The whole point of Goodhart's Law is that single metrics get gamed, and traffic was a single metric that got gamed for twenty years, but now AI Overviews are forcing a transition to multi-metric frameworks that better approximate business value, and that transition is painful, yes, but it's also overdue.

The Uncomfortable Reframe

Here's the frame shift that's hard to accept: traffic was never the goal. Traffic was a proxy for the goal, and it was a bad proxy that we optimized for twenty years because it was easy to measure, it went up and to the right, and most importantly it let us avoid having difficult conversations with clients about whether SEO was actually generating revenue or just generating impressive-looking graphs. You know those graphs, the ones with big green numbers that SEO agencies put in monthly reports. "Traffic up 47%!" Great, but revenue? "Well, that's harder to attribute." Conversions? "That's really more of a sales issue." Qualified leads? "The client's CRM doesn't integrate properly."

[Image: What your SEO report says vs. what it means, showing how +47% traffic translates to +3.1% revenue once you break down the actual visitor composition]
Left: what gets emailed to the C-suite. Right: what it actually means.

Traffic was the metric we could always win, so traffic became the game. And the game was bullshit. We all knew it was bullshit; we just couldn't stop playing, because the alternative was admitting we didn't know what we were actually delivering. Customers were always the goal. Revenue was the goal. Business outcomes were the goal. Traffic was supposed to correlate with these things, but for informational queries it barely did, and we pretended otherwise because the pretense was profitable. Now AI Overviews are deleting the bad proxy and forcing a return to first principles, which is disorienting if your entire measurement framework was built on the proxy. But the proxy was always a lie, and the lie is just becoming impossible to maintain.

The Adaptation Playbook

Here's what actually works in a post-AI Overview landscape:

Concentrate on commercial and transactional queries. AI Overviews appear 2.4x less frequently on commercial queries versus informational, which means "best CRM software" triggers AI Overviews less than "what is CRM software" because the former has purchase intent while the latter doesn't, so prioritize accordingly.

Build brand recognition. Branded searches don't trigger AI Overviews, so when someone searches "[your company name]" they get your website, no AI summary, full CTR, and brand becomes the moat that AI can't drain.

Get cited in AI Overviews. Botify's research shows sources cited in AI Overviews see 23-35% higher CTR than uncited competitors, so if zero-click is inevitable for informational queries then being the cited source captures brand exposure and preferential clicks.

Restructure your analytics. Start with revenue, work backwards through customers to qualified leads to commercial visits to total visits, and recognize that traffic is now the widest and least meaningful part of the funnel, which means you should stop putting it at the top of your dashboard.
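As a rough sketch of that reversed funnel (every number below is invented for illustration, not a benchmark), each line reads from what the business actually earns down to the raw visit count that used to headline the report:

```python
# Hypothetical quarter of organic data; all figures are made up for illustration.
revenue = 412_000          # dollars attributed to organic
customers = 96
qualified_leads = 430
commercial_visits = 11_800
total_visits = 74_000

print(f"revenue per customer:      ${revenue / customers:,.0f}")
print(f"lead -> customer rate:      {customers / qualified_leads:.1%}")
print(f"commercial visit -> lead:   {qualified_leads / commercial_visits:.1%}")
print(f"share of visits w/ intent:  {commercial_visits / total_visits:.1%}")
print(f"revenue per total visit:    ${revenue / total_visits:,.2f}")
```

Reported this way, total visits is the last line, not the headline, which is the whole point.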

The Forcing Function

For twenty years, the SEO industry avoided a hard question: what is organic search actually worth? Not in traffic but in revenue, in customers, in outcomes that matter to businesses. Traffic dashboards let us avoid answering, because traffic went up so SEO was "working," traffic correlated loosely with good things happening, and nobody asked too many questions.

AI Overviews are forcing the question. The traffic is disappearing, the correlation is breaking, and the industry has to finally prove value in terms that matter or admit it never could. The 58-70% of searches ending without clicks were never going to be customers anyway; they were an artifact of UI design that's now been fixed. So the question is whether your business was built on that artifact or on something real. If it was something real, you'll be fine. If it was the artifact, you were always going to have this reckoning; AI Overviews just moved up the timeline.

Gerald's Lesson

Gerald, our train-waving friend from earlier, eventually had a realization, though it took him a while: six months of writing angry letters to Amtrak and a failed petition with 23 signatures, most from family members who felt bad for him. What he realized, when he finally sat down and thought about what he actually wanted, was that he wanted connection. He wanted to matter to people. He wanted relationships that meant something. Waving at trains was never going to give him that, because it was a simulacrum of connection, a metric that felt like connection, a number that went up when he wanted the feeling of social bonds without doing the work of building them. The engineers didn't know him. They never would. The waves were empty calories.

Gerald joined a bowling league and now he has four actual friends: people who know his name, remember his birthday, and text him about things that aren't trains. His social connection count, by his old methodology, dropped from 12,847 to 4. But those 4 are real. They show up. They matter. Traffic is Gerald's train waves; customers are Gerald's bowling league. The number got smaller, but the thing the number was supposed to measure got better.

Stop counting waves and start counting what matters. Or don't. Keep standing by the tracks in your high-visibility vest, crying about autonomous trains. The rest of us have a bowling league to get to.

Data sources cited above include SparkToro, WordStream, FirstPageSage, BrightEdge, Seer Interactive, SearchEngineLand, Semrush, and Botify. Proprietary data from analysis of 47 client sites across B2B SaaS, e-commerce, and professional services verticals, Q2 2024 through Q4 2025.