SEO Vanity Metrics Are Astrology for People With Spreadsheets
- Domain Authority is a number Moz made up. Google has never used it. Google will never use it.
- "Keywords ranked for" counts include garbage you'd be embarrassed to show a client.
- E-E-A-T is a quality rater guideline, not a score. There is no E-E-A-T score. Stop looking for it.
- The only metric that matters is money. Everything else is a proxy, and most proxies are lies.
I want to tell you about a man I knew in the industry. Let's call him Viktor. Viktor had a dashboard. The dashboard had forty-seven metrics. Viktor could tell you, at any moment, his Domain Authority, his Domain Rating, his Trust Flow, his Citation Flow, his Page Authority, his Spam Score, his "topical authority" across fourteen verticals, and the precise number of keywords his sites ranked for.
Viktor's sites made no money.
Viktor showed me his dashboard once, at a conference, with the pride of a man showing off his firstborn child. The numbers were green. The arrows pointed up. The graphs swept toward heaven like the spires of a cathedral built to worship nothing.
"What's your revenue per visitor?" I asked.
Viktor looked at me like I had asked him to calculate the weight of his soul in imperial units. He did not know. He had never checked. The dashboard did not have a field for that. The dashboard had fields for everything except the thing that mattered.
Viktor is not a real person. Viktor is all of us. Viktor is the industry.
A Brief History of Making Things Up
Around 2010, Moz invented Domain Authority, and this was a genuinely useful thing to do, because Google's PageRank toolbar was dying, SEOs needed something to look at, and Moz provided it, and they were clear about what it was: a predictive score, based on their own crawl data, that attempted to estimate how well a domain might rank, a proxy, an approximation, a guess, but somewhere between then and now the SEO industry collectively forgot the "guess" part, and Domain Authority became gospel, clients started asking for it by name, agencies started promising to increase it, link sellers started pricing by it, and an entire economy emerged around a number that Moz themselves will tell you Google doesn't use, has never used, and will never use.
This is how vanity metrics work: someone creates a proxy because the real thing is unknowable, the proxy becomes more famous than the thing it was proxying, everyone forgets it was ever a proxy at all, and the map becomes the territory, the menu becomes the meal, the dashboard becomes the business.
Domain Authority / Domain Rating
What it claims to measure: The overall "strength" or "authority" of a domain
What it actually measures: The quantity and "quality" of backlinks, as determined by a proprietary algorithm that is not Google's algorithm
Why it's useless: A DR 80 site can be outranked by a DR 20 site any day of the week if the DR 20 site has better content, better relevance, and better user signals for that specific query. I have seen this happen hundreds of times. I have caused this to happen hundreds of times.
The Keyword Count Delusion
Every SEO tool will tell you how many keywords your site "ranks for," and Ahrefs will say 50,000, and Semrush will say 73,000, and Moz will say 41,000, and they're all making this number up, they're all making it up differently, and the number is meaningless in every case, because here's what "ranking for 50,000 keywords" actually means: your site appears somewhere in the top 100 results for 50,000 search queries according to one tool's sample of one country's search data on one particular day.
Of those 50,000 keywords:
- 40,000 are position 50+, which means no human has ever seen your result for them
- 8,000 are variations of the same query ("best pizza near me" vs "best pizza close to me" vs "good pizza nearby")
- 1,500 are your brand name plus random words
- 400 are queries with zero monthly search volume
- Maybe 100 actually drive meaningful traffic
But the tool shows "50,000 keywords" in big green numbers, and that number goes up every month because the tool keeps finding more garbage to count, and the client is happy because big number go up, and nobody asks what any of this has to do with revenue.
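You can see this in your own data with a few lines of Python against a rank-tracker export. A minimal sketch, assuming columns named keyword, position, and volume, plus a hypothetical brand string; real exports vary by tool:

```python
import csv

BRAND = "acme"  # hypothetical brand string; substitute your own

def keywords_that_matter(path):
    """Filter a rank-tracker CSV export down to queries a human might
    actually see you for. Column names are assumptions; exports vary."""
    kept, total = [], 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            total += 1
            if int(row["position"]) > 20:          # page 3+ is invisible
                continue
            if int(row["volume"]) == 0:            # nobody searches this
                continue
            if BRAND in row["keyword"].lower():    # brand + random words
                continue
            kept.append(row["keyword"])
    print(f"{total:,} keywords 'ranked for', {len(kept):,} worth looking at")
    return kept
```

Run that against a "50,000 keywords" export and watch the number collapse.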
I once audited a site that "ranked for" 200,000 keywords. They were making $3,000 a month. Another site in the same niche ranked for 8,000 keywords and made $400,000 a month. The second site had fewer keywords, less "authority," and worse metrics across the board. They just happened to rank #1-3 for the queries people actually searched before buying things.
Keywords Ranked For
What it claims to measure: Your site's visibility across the search landscape
What it actually measures: The size of the tool's keyword database multiplied by your site's tendency to appear somewhere, anywhere, for anything
Why it's useless: Ranking #87 for a query is the same as not ranking. But the tool counts it. The tool always counts it. The tool is paid to count it.
The E-E-A-T Scam
This one makes me want to scream into a pillow, because E-E-A-T, which stands for Experience, Expertise, Authoritativeness, and Trustworthiness, comes from Google's Search Quality Rater Guidelines, a document Google gives to human contractors who manually evaluate search results to help train the algorithm, and it is, at its core, nothing more than a framework for human raters to think about content quality, which means it is emphatically not a score, not a metric, not a number, not a thing in Google's ranking algorithm, not a thing you can measure, not a thing you can optimize for directly, and not a thing that exists in any quantifiable form whatsoever, and yet I have seen agencies sell "E-E-A-T audits" for $5,000, I have seen tools claim to measure "E-E-A-T scores," I have seen consultants recommend specific actions to "increase your E-E-A-T," and I have seen people argue about whether a page has "enough E-E-A-T" like they're debating whether a soup has enough salt, which would all be very funny if it weren't so expensive for the people being sold this nonsense.
There is no E-E-A-T score, Google doesn't calculate one, you can't look it up, and when someone tells you they can improve your E-E-A-T score, they are lying to you, or they are confused, or both, and they might be well-intentioned, they might genuinely believe what they're saying, but it's still not true.
What you can do: make your content demonstrably created by people with real expertise. Show credentials. Link to sources. Be accurate. Have real humans with real bylines write real things about topics they actually understand. This is just "being good at content." Calling it "E-E-A-T optimization" is like calling "cooking food properly" a "thermal flavor enhancement strategy."
E-E-A-T Score
What it claims to measure: Your content's Experience, Expertise, Authoritativeness, and Trustworthiness
What it actually measures: Nothing. It doesn't exist. You're measuring a hallucination.
Why it's useless: You cannot measure the unmeasurable. You cannot optimize for the unoptimizable. You can only do good work and hope.
Backlink Counts: A Love Story
Let me tell you about my friend's backlink profile, because he has 50,000 backlinks, fifty thousand, which sounds impressive until you look at where they come from:
- 47,000 are from blog comment spam that accumulated between 2008 and 2012
- 2,000 are from a widget he made that got embedded on recipe blogs
- 800 are from scraped content on sites that no longer exist
- 150 are from legitimate sources
- 50 actually pass value
His competitor has 2,000 backlinks. But they're from the New York Times, industry publications, and respected blogs. His competitor outranks him for everything that matters.
The tools don't know the difference, the tools count links, one link from the Wall Street Journal and one link from "free-backlinks-casino-viagra-2019.blogspot.com" are both one link, the tool dutifully adds them together and presents a number, and that number is useless, and worse, having more bad links can actively hurt you, the backlink count going up can be a disaster, but nobody puts "backlink count went up (this is bad)" in their monthly report, because that's not how the game is played.
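For illustration, here is that hypothetical 50,000-link profile summarized two ways. The categories mirror the list above, not any real link classifier:

```python
# A toy illustration of why a raw backlink total is meaningless: the same
# (hypothetical) 50,000-link profile from above, summarized two ways.
profile = {
    "blog comment spam (2008-2012)": 47_000,
    "widget embeds":                  2_000,
    "scraped/dead sites":               800,
    "legitimate sources":               150,
    "links that pass value":             50,
}

raw_total = sum(profile.values())
useful = profile["links that pass value"]

print(f"dashboard says: {raw_total:,} backlinks")  # the vanity number
print(f"reality says:   {useful} that matter")     # the number nobody reports
```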
Total Backlinks / Referring Domains
What it claims to measure: The quantity of external sites linking to you
What it actually measures: The quantity of external sites linking to you, with no regard for whether those links help, hurt, or do nothing
Why it's useless: One good link beats a thousand garbage links. But "1" looks worse than "1,000" in a dashboard, so we pretend otherwise.
Trust Flow and Citation Flow: The Majestic Fantasy
Majestic invented Trust Flow and Citation Flow, and these metrics purport to measure the "trustworthiness" and "influence" of a site's backlink profile, and they have nice logos, and they come in two colors, and they look scientific, and they are completely made up, and I don't mean this in the sense that all metrics are somewhat made up, I mean that Majestic created a proprietary algorithm, assigned numbers between 0 and 100 to websites, and then convinced the SEO industry that these numbers mean something, even though Google does not use Trust Flow, Google does not use Citation Flow, Google does not know what these numbers are, and Majestic knows what these numbers are, and Majestic charges money to tell you what they are.
This is not a criticism of Majestic specifically, because they're doing what everyone does, they're selling shovels in a gold rush, and the shovels happen to be made of foam, but so are everyone else's, and at least Majestic's foam shovels have a nice grip.
Trust Flow / Citation Flow
What it claims to measure: The quality and quantity of your backlink profile's "trust"
What it actually measures: Majestic's opinion about your backlink profile, expressed as a number
Why it's useless: The only entity whose opinion about your backlink profile matters is Google, and Google isn't telling.
Topical Authority: The Newest Delusion
"Topical authority" is the SEO industry's current favorite made-up concept, built on the idea that Google rewards sites that demonstrate comprehensive expertise in a topic, which means you need to publish lots of content about related subjects to "build topical authority," and while there's probably some truth here, in the sense that Google likely prefers sites with depth over sites with random scattered content, there is no "topical authority score," you cannot measure topical authority, you cannot know when you have "enough" topical authority, and what happens in practice is that agencies recommend publishing 50 articles about tangentially related topics to "build topical authority," clients pay for 50 articles, the articles get published, nothing changes, and the agency says "we need more topical authority" and recommends 50 more articles, in a cycle that continues until the client runs out of money or patience, whichever comes first.
I have seen sites with three pages outrank sites with three hundred pages on the same topic, I have seen it happen repeatedly, and the three-page site had better content, better links to those specific pages, and better user engagement, while the three-hundred-page site had "topical authority," whatever that means, which apparently means nothing.
Topical Authority Score
What it claims to measure: How comprehensively you cover a topic
What it actually measures: Whatever the tool vendor decided it measures this week
Why it's useless: Publishing more content doesn't make your content better. It just makes more of it. These are different things.
The Core Web Vitals Fetish
Core Web Vitals are real, Google actually uses them, which makes them unusual among SEO metrics, and yet the industry has still managed to turn them into a vanity metric, because the fetish works like this: everyone obsesses over getting "green" scores in PageSpeed Insights, developers spend weeks shaving milliseconds off LCP, entire site redesigns are justified by Core Web Vitals, "we need to pass Core Web Vitals" becomes a rallying cry, and meanwhile Google has repeatedly said Core Web Vitals are a tiebreaker, not a major ranking factor, which means if your content is worse than your competitor's, passing Core Web Vitals won't save you, and if your content is better, failing Core Web Vitals probably won't kill you.
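For the record, the pass/fail check everyone agonizes over reduces to three comparisons against Google's published "good" thresholds, measured at the 75th percentile of field data. A minimal sketch; the input dict shape is my assumption, not any tool's API:

```python
# Google's published "good" thresholds for the three Core Web Vitals.
# Passing all three is a tiebreaker, not a golden ticket -- note how small
# this check is relative to the fuss made over it.
THRESHOLDS = {
    "lcp_seconds": 2.5,   # Largest Contentful Paint
    "inp_ms":      200,   # Interaction to Next Paint
    "cls":         0.1,   # Cumulative Layout Shift
}

def passes_cwv(field_data: dict) -> bool:
    """field_data is assumed to be 75th-percentile field measurements,
    e.g. {"lcp_seconds": 2.1, "inp_ms": 180, "cls": 0.05}."""
    return all(field_data[k] <= limit for k, limit in THRESHOLDS.items())

print(passes_cwv({"lcp_seconds": 2.1, "inp_ms": 180, "cls": 0.05}))  # True
```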
I know a site that fails every Core Web Vital, red scores across the board, and it ranks #1 for some of the most competitive terms in its industry, and I know another site that passes everything, green checkmarks everywhere, and it ranks on page 7 for its target keywords, and the green checkmarks are satisfying, I understand the appeal, but satisfying is not the same as useful.
Core Web Vitals Pass Rate
What it claims to measure: Your site's user experience quality
What it actually measures: Whether specific technical metrics meet specific thresholds
Why it's partially useless: These metrics are real and matter at the margins, but the industry has elevated them from "one factor among hundreds" to "the thing we obsess over while ignoring content quality"
Share of Voice: The Biggest Lie
This one is my personal favorite, because the lie is so blatant and nobody seems to care, and "Share of Voice" claims to measure what percentage of search visibility you have compared to competitors, with tool vendors calculating it by taking the keywords you track, estimating traffic from rankings, and dividing by total estimated traffic, but the problems are legion:
- You chose which keywords to track, so you're measuring your share of a set you selected
- The traffic estimates are guesses based on click-through rate models that are wildly inaccurate
- Your competitors are tracking different keywords, so their "Share of Voice" number isn't comparable
- The whole thing is circular: you're measuring your share of a market you defined
It's like asking "what percentage of the world's best pizzas do I make?" and then only counting the pizzas in your own kitchen, and agencies love Share of Voice because it always goes up if you add more keywords where you rank well, and clients love Share of Voice because it looks like a market share number, which feels businessy and important, and everyone is happy except the person who eventually asks "why isn't this translating to revenue?"
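Here is the whole calculation in miniature, with a made-up CTR curve standing in for the vendors' proprietary (and equally unverifiable) click models. Notice what the inputs are: keyword sets somebody chose.

```python
# A sketch of how Share of Voice is typically computed, to make the
# circularity visible. The CTR curve is a made-up illustration.
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def estimated_traffic(keywords):
    """keywords: list of (monthly_volume, position) tuples."""
    return sum(vol * CTR_BY_POSITION.get(pos, 0.01) for vol, pos in keywords)

def share_of_voice(your_keywords, competitor_keywords):
    """Both keyword sets were chosen by someone. The 'market' is just
    the union of those choices -- a market you defined."""
    yours = estimated_traffic(your_keywords)
    theirs = estimated_traffic(competitor_keywords)
    return yours / (yours + theirs)

# Track more keywords you already win, and your "market share" rises
# without a single extra visitor arriving:
print(share_of_voice([(1000, 1), (800, 1)], [(5000, 3)]))  # ~0.52
```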
Share of Voice
What it claims to measure: Your market share of organic search visibility
What it actually measures: Your share of a keyword set you selected, using traffic estimates that are wrong
Why it's useless: It's not a share of anything real. It's a share of an imaginary market you defined to make yourself look good.
What Actually Matters
Here is the complete list of SEO metrics that matter:
- Revenue from organic search: how much money did organic visitors generate?
- Conversions from organic search: how many organic visitors did the thing you wanted them to do?
- Revenue per organic session: how valuable is each organic visit?
That's it, that's the list, and everything else is a proxy for one of these three things, and most proxies are bad proxies, because traffic is a proxy for conversions but lots of traffic converts at 0%, and rankings are a proxy for traffic but ranking #1 for a zero-volume query means nothing, and Domain Authority is a proxy for ranking ability but the proxy was never validated against actual rankings and never will be.
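If you want the short list in code, here it is. A minimal sketch, assuming you can pull organic sessions, conversions, and revenue out of your analytics; the function is hypothetical, the arithmetic is not:

```python
def seo_metrics_that_matter(organic_sessions, organic_conversions, organic_revenue):
    """The complete dashboard. Everything else is a proxy for one of these."""
    return {
        "revenue": organic_revenue,
        "conversions": organic_conversions,
        "revenue_per_session": (
            organic_revenue / organic_sessions if organic_sessions else 0.0
        ),
    }

print(seo_metrics_that_matter(12_000, 340, 51_000.0))
# {'revenue': 51000.0, 'conversions': 340, 'revenue_per_session': 4.25}
```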
The reason we use vanity metrics is because the real metrics are hard, because you have to set up conversion tracking, you have to attribute revenue, you have to wait long enough to see results, you have to admit that sometimes nothing is working, and the vanity metrics let you avoid all of this, they let you look at green numbers and feel productive, and I understand the appeal, I've felt it myself, because looking at DA go up is satisfying in a way that looking at a revenue spreadsheet is not, but satisfaction is not the same as success, and the dashboard is not the business.
The Uncomfortable Truth
Viktor, my fictional friend from the beginning, eventually figured this out, and it took him three years and a near-bankruptcy, but he figured it out, he deleted his dashboard, he stopped checking his Domain Authority, he started asking one question, "Did we make money this month?", and his business got better, not because asking the question made money appear, but because asking the question forced him to focus on the things that actually made money appear, and he stopped publishing content for "topical authority" and started publishing content that converted visitors into customers, he stopped building links for "DR" and started building links that sent relevant traffic, and the metrics didn't matter, the outcomes mattered, and the metrics had always been a way of avoiding the outcomes.
This is the thing about vanity metrics: they're comfortable, they're safe, they give you something to do and something to show, they let you feel like progress is happening even when it isn't, they're the SEO equivalent of busywork, and the whole industry is built on them because the whole industry needs something to sell and selling "we'll try to make you money and we'll tell you honestly if it's working" is harder than selling "we'll make your DA go up," and I don't have a solution, I don't think there is one, because the incentives are too strong, the tools are too entrenched, the clients want dashboards, the agencies want deliverables, and everyone wants to feel like something is happening.
But you can, at least, stop lying to yourself, you can look at your vanity metrics and know what they are, you can ask the uncomfortable question, "Is this making us money?", you can delete the dashboard or at least stop checking it every day, and the numbers will keep going up, the numbers always go up, that's what they're designed to do, but whether your business goes up is a different question entirely.
Viktor isn't real, but Viktor's dashboard is on your screen right now. You might want to close that tab.