I've Been Doing This for Twenty Years and I Still Google How to Do Things
Parked outside a client's office, frantically searching "GTM data layer setup" before the meeting. This is what expertise actually looks like.
I'm in the parking lot of a Fortune 500 company's regional headquarters, and I have nine minutes before a meeting that I am being paid a genuinely uncomfortable amount of money to attend, and I am Googling "Google Tag Manager data layer ecommerce tracking setup." I am Googling this because the meeting is about ecommerce tracking, and while I understand ecommerce tracking conceptually - while I could draw the architecture on a whiteboard, while I could explain the philosophy of data collection to a room full of executives with the confidence of someone who has been doing this for two decades - I cannot, at this exact moment, remember the specific syntax of the dataLayer.push object for a purchase event. I know it exists. I know roughly what it looks like. I know I've implemented it before, multiple times, on multiple sites, including one site where I implemented it wrong the first time and spent three hours debugging it at two in the morning while a client in a different timezone sent increasingly anxious Slack messages that I was too deep in Chrome DevTools to answer.
I cannot remember the exact syntax the way I cannot remember whether Paraguay is east or west of Uruguay, which is to say: I knew it once, I could look it up in four seconds, and my brain has decided this is not important enough to store in permanent memory when the space could be used for song lyrics from 1997.
(It's east, by the way. Paraguay is east of - no, wait. West. Paraguay is west. I think. Don't quote me on this. Actually, don't quote me on the data layer syntax either, because I am literally in a parking lot Googling it right now.)
The search results load. I click on the Google Developer documentation. I scan the code example. Right. Yes. That's what it looks like. The transaction ID goes in the transaction_id field, not the transactionId field, because Google switched from camelCase to snake_case when they moved from Universal Analytics to GA4, which is exactly the kind of breaking change that Google makes without apparent remorse, like a person who rearranges all the furniture in your house while you're on vacation and then acts surprised when you trip over the coffee table. I read the code example twice. I screenshot it on my phone. I get out of the car. I walk into the building. I go to the meeting. I spend ninety minutes advising a team of six people on their ecommerce tracking implementation with the confidence and fluency of someone who has been doing this his entire career. Because I have been doing this my entire career. I just couldn't remember the syntax.
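For anyone curious what I was actually looking at in that parking lot: a minimal sketch of a GA4 purchase event pushed to the data layer, following the field names in Google's published GA4 ecommerce spec. The IDs, values, and items here are placeholders, not anything from a real implementation.

```javascript
// In a browser this would be: window.dataLayer = window.dataLayer || [];
const dataLayer = (typeof window !== "undefined" && window.dataLayer) || [];

// Clear any previous ecommerce object so stale item data doesn't
// leak into this event (a step the official docs recommend).
dataLayer.push({ ecommerce: null });

dataLayer.push({
  event: "purchase",
  ecommerce: {
    transaction_id: "T-12345", // snake_case in GA4, not transactionId
    value: 59.97,
    currency: "USD",
    items: [
      { item_id: "SKU-001", item_name: "Example Product", price: 19.99, quantity: 3 }
    ]
  }
});
```

Four seconds to look up, and worth every one of them, because a GA4 tag reading the old camelCase field names will silently record nothing.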
This is what expertise actually looks like, and it looks nothing like what most people think expertise looks like, and the gap between the perception and the reality is responsible for more imposter syndrome, more fraudulent confidence, and more misallocation of trust than almost anything else in professional life.
The Things I Google
In the interest of radical honesty, and because I think radical honesty about this topic is genuinely therapeutic for an industry that is drowning in performative certainty, here is a partial list of things I have Googled in the last twelve months, often while actively being paid for my expertise in the very subject I was Googling:
"hreflang tag syntax" - I have implemented hreflang tags on probably fifty websites over the last fifteen years. I cannot reliably remember whether the language code for Brazilian Portuguese is pt-BR or pt-br (it's pt-BR, but I check every time, because getting the case wrong can cause the tag to be ignored, and I once spent an entire afternoon debugging a hreflang implementation that turned out to be a casing issue, and the memory of that afternoon has left a scar but apparently not a permanent memory of the correct casing).
"robots.txt disallow syntax trailing slash" - Does the trailing slash matter? I know the answer to this. I have known the answer to this for twenty years. The answer is yes, it matters, because a Disallow directive without a trailing slash matches the path prefix, and a Disallow directive with a trailing slash matches only the directory. I know this. I have known this. But every single time I write a robots.txt file, I Google it anyway, because the consequence of being wrong is that you accidentally deindex an entire section of a website, and the cost of checking is ten seconds, and ten seconds of insurance against catastrophic error is the best trade in all of technology.
"canonical tag self-referencing best practice" - Should a page's canonical tag point to itself? Yes. I know this. Every SEO knows this. I still Google it approximately once a quarter because a client asks and I want to send them a link to an authoritative source instead of just saying "yes, trust me," because "trust me" is not a citation and I am, at my core, a person who wants to be right and can prove it.
"search console coverage report excluded by noindex" - What does this status mean, exactly? I know what it means generally. I know what it means in practice. I sometimes forget the nuances of how Google differentiates between "excluded by noindex" and "crawled - currently not indexed," because the difference is subtle and Google's documentation is written in the dialect of English that Google uses, which is technically English but has a relationship with clarity that can only be described as adversarial.
"structured data testing tool" - Not because I don't know it exists, but because Google has renamed, moved, replaced, or deprecated the structured data testing tool approximately four times in the last five years, and I can never remember whether the current version is the Rich Results Test, the Schema Markup Validator, the old Structured Data Testing Tool that they said they were sunsetting but that still works, or some fourth option that they launched during an I/O keynote and that I have since lost track of. The tool ecosystem for structured data is like a restaurant that changes its name and menu every six months but keeps the same chef and the same dining room and then acts confused when regular customers can't find the entrance.
"log file analysis screaming frog" - I use Screaming Frog constantly. I have used it for years. I periodically forget the specific steps to import server log files for analysis, because it's a feature I use maybe three or four times a year, and the interface is not intuitive (Screaming Frog's interface is, in general, what you'd get if a very competent engineer designed a UI while actively hostile toward the concept of user experience, which I say with love, because the tool is indispensable and I cannot imagine working without it).
"GA4 regex filter audience" - GA4 has a regex implementation that is slightly different from every other regex implementation I have ever used, which means I regularly write regex patterns that work in every other context and fail in GA4 for reasons that are opaque and maddening. I Google this at least once a month. I will Google it next month. I will Google it until GA4 is replaced by GA5, at which point the regex implementation will be different again, and I will Google that too.
The list goes on. It goes on for a very long time. If I were to catalog every search query I've made in the last year that relates to things I "should" know by now, the list would number in the hundreds. Maybe the thousands. And I am not ashamed of this. Or rather, I was ashamed of it, for about fifteen years, and then I stopped being ashamed, and the thing that made me stop being ashamed was realizing that every expert I respect does the same thing and is also ashamed of it, and the collective shame is creating a completely false picture of what expertise looks like, and the false picture is hurting everyone.
The Expertise Illusion
There is an image of the expert that exists in the popular imagination, and it goes something like this: the expert is a person who has memorized everything relevant to their field and can recall it instantly. The expert doesn't need to look things up. The expert doesn't hesitate. The expert speaks with fluent certainty on every topic within their domain, drawing on a vast internal database of knowledge that has been accumulated over years and is accessible at will, like a library where every book is cataloged and every page is bookmarked and the librarian never takes a day off.
This is fantasy. This has always been fantasy. But it's a particularly damaging fantasy in fields like SEO, where the body of knowledge changes constantly, where the platform we all depend on (Google) makes hundreds of changes a year to its algorithm and doesn't tell us what most of them are, where the tools we use update their interfaces and methodologies without warning, and where the "best practices" of three years ago might be the "outdated advice" of today. Nobody can memorize a body of knowledge that is being rewritten in real time. The attempt to do so is not expertise. It's an anxiety disorder with a professional veneer.
What experts actually have is not a database. It's a compass.
A database stores everything. Every fact, every syntax, every edge case, every configuration option. A database can be queried and it returns exact answers. This is what junior people think expertise is, and it's why junior people feel like imposters - because they're comparing their incomplete database to the imagined complete database of the senior person, and finding themselves lacking.
A compass doesn't store anything. A compass points you in the right direction. It tells you which way to go, not every step of the path. And when you combine a compass with the ability to read a map (which is what Google is, metaphorically - it's a map of all the knowledge you don't need to carry in your head), you can get anywhere. You don't need to memorize the route. You need to know the direction.
When I'm in that parking lot Googling the data layer syntax, I'm using Google as my map. But the compass - the thing that tells me that the data layer is what matters, that the architecture of the tracking implementation is the critical issue, that the six people in the meeting room need to hear about event-driven data collection rather than pageview-based measurement, that the specific technical decision we're about to make will have implications for their attribution model that none of them have considered - that's the expertise. The syntax is a fact. A fact I can look up in four seconds. The judgment about what to do with the syntax, when to implement it, how to implement it in a way that doesn't break their existing setup, what questions to ask the development team, what mistakes the last person who did this probably made - that's twenty years of experience, and you can't Google that.
The Dunning-Kruger Amphitheater
The SEO industry has a Dunning-Kruger problem that is, I think, worse than most industries, and it's worse because of the specific combination of two factors: the knowledge required is broad and shallow (you need to know a little about a lot of things - HTML, JavaScript, server configuration, content strategy, link building, analytics, UX, and about thirty other disciplines), and the feedback loops are long and noisy (it takes months to see whether a decision was right, and even then, attribution is murky). This combination creates an environment where it's very easy to feel confident after learning a little, and very difficult to develop the calibrated uncertainty that is the hallmark of actual expertise.
The person who has been doing SEO for six months knows what a title tag is, knows what a backlink is, has read a few blog posts about Core Web Vitals, and feels pretty good about their knowledge. They are at the peak of Mount Stupid on the Dunning-Kruger curve. They don't know what they don't know, and the things they don't know are so numerous and so consequential that their confidence is essentially random - they might be right about something, but they can't tell the difference between the things they're right about and the things they're wrong about, which means their confidence is uncalibrated, which means it's useless as a signal of competence.
The person who has been doing SEO for five years knows enough to be terrified. They've seen enough to know how much they don't know. They've made enough mistakes to have developed the healthy paranoia that comes from having personally caused a 40 percent traffic drop by deploying a robots.txt change on a Friday afternoon (never deploy anything on Friday, a lesson I learned the expensive way and that I now pass on to every junior person I mentor, along with "always check the canonical tags" and "never trust the client's developer to implement your recommendations exactly as specified"). The five-year person is in the Valley of Despair. They feel like an imposter. They Google things constantly. They second-guess themselves. They are, ironically, much more competent than the six-month person, but they feel much less competent, because they've developed the self-awareness to see their own gaps.
The person who has been doing SEO for twenty years - me, I'm talking about me - lives in a strange place on the curve. I have enough experience to have pattern recognition that is genuinely valuable. I can look at a website and identify the primary issue in minutes because I've seen the same issue hundreds of times before. I have judgment that is, I think, pretty good - I know what matters and what doesn't, I know when to push and when to wait, I know the difference between a real problem and a noisy signal. But I also know, acutely, specifically, with the precision that only comes from two decades of being wrong about things, how much I don't know. I know that Google could make a change tomorrow that invalidates a recommendation I made yesterday. I know that the "best practice" I'm recommending has a confidence interval around it, and that confidence interval is wider than I'd like. I know that my pattern recognition, while generally reliable, is also a source of bias, because patterns make you see what you expect to see, and sometimes the site in front of you is an exception to the pattern, and the cost of missing the exception can be very high.
So I Google things. I Google things I've known for twenty years. I Google things I learned last week. I Google things to confirm what I already know, because the cost of confirmation is ten seconds and the cost of being wrong is enormous. And I have come to believe that the willingness to Google - the willingness to say "I'm not sure, let me check" - is not a weakness. It's the most reliable signal of genuine expertise I know.
What Clients Are Actually Paying For
There's a moment in every client engagement - usually early, usually during the first or second meeting - when I say something that surprises the client. Not a technical insight. Not a recommendation. Something more fundamental than that. I say: "I don't know. Let me check."
The first time I said this to a client, early in my career, I panicked internally. I was twenty-seven years old and charging rates that I felt I hadn't earned (I had earned them, but imposter syndrome is not interested in evidence), and I thought that admitting ignorance on any point, no matter how minor, would destroy the client's confidence in me. I thought expertise meant having answers. I thought the consultant's job was to be the person in the room who never said "I don't know."
What actually happened was the opposite of what I feared. The client's respect for me increased. Not decreased. Increased. Because the client was a smart person - most clients are smart people, despite what some consultants seem to believe - and the client could tell the difference between a consultant who says "I don't know, let me check" and a consultant who bullshits. And the client had been burned by the bullshitter. The previous consultant, the one they'd fired before hiring me, had never said "I don't know." Had always had an answer. Had always spoken with confidence. And the answers had been wrong, and the confidence had been misplaced, and the company had spent eight months implementing wrong answers delivered with confidence before realizing that confidence and competence are different things.
"I don't know" is the most underrated sentence in consulting. It does three things simultaneously. First, it's honest, and honesty is rare enough in professional services that it functions as a differentiator. Second, it calibrates the client's trust - every time you say "I don't know" about a small thing, the client trusts you more when you say "I'm certain" about a big thing, because they know you're not the kind of person who claims certainty they don't have. Third, it reframes the value proposition from "I have all the answers" to "I know how to find the right answers," which is both more accurate and more durable, because answers change but the ability to find them doesn't.
What clients are paying for, when they hire me or any experienced consultant, is not a database of memorized knowledge. They are paying for judgment. Judgment is the ability to look at a situation and know what matters. Judgment is knowing that the canonicalization issue is critical but the missing alt text is cosmetic. Judgment is knowing that the technical audit can wait but the content strategy can't. Judgment is knowing when to follow best practices and when to break them. Judgment is knowing what to Google and what to ignore. Judgment is, fundamentally, the ability to make good decisions in the presence of uncertainty, and uncertainty is the permanent condition of everything in SEO, because Google doesn't publish the rules and the game changes while you're playing it.
Knowledge can be Googled. Judgment cannot. This is the distinction that the entire industry misunderstands, and the misunderstanding has consequences.
The Honest Version of Twenty Years
I want to give you the honest version of what twenty years of SEO expertise looks like, not the curated version, not the conference-speaker version, not the LinkedIn version where every day is a masterclass and every engagement is a success story. The honest version.
In twenty years, I have made every mistake at least once. I have recommended a site migration strategy that caused a 70 percent traffic loss (the migration was necessary, my timeline was too aggressive, and the redirect mapping had gaps that I should have caught but didn't because I was overconfident in my process). I have recommended content strategies that produced nothing. I have built link-building strategies around tactics that later became penalties. I have advised clients to ignore algorithm updates that turned out to matter, and to panic about algorithm updates that turned out to be nothing. I have been wrong about the future of SEO approximately as often as I have been right, which is to say: more often than I'd like to admit and less often than you'd think.
In twenty years, I have also gotten a lot of things right. I've recovered sites from penalties. I've turned around traffic declines. I've built organic channels that became the primary revenue driver for companies. I've identified issues in fifteen minutes that other consultants missed in three months. I've provided advice that, compounded over years, generated millions of dollars in revenue for clients. I'm not saying this to brag. I'm saying it because the mistakes and the successes exist simultaneously, in the same career, in the same person, and pretending otherwise is dishonest.
The difference between Year One Amos and Year Twenty Amos is not that Year Twenty Amos knows more facts. He knows some more facts, sure. But the real difference is that Year Twenty Amos makes better decisions. He makes better decisions because he's made more bad decisions, and he's learned from the bad decisions, and the learning has accumulated into something that looks like intuition but is actually pattern recognition built on a foundation of scar tissue.
Year One Amos would have walked into the Fortune 500 meeting without Googling the data layer syntax, would have tried to remember it from memory, would have gotten it slightly wrong, and would have spent the rest of the meeting trying to cover for the error. Year Twenty Amos sits in the parking lot and Googles it, walks in with the correct syntax on his phone, and delivers the meeting flawlessly. Year One Amos was more afraid of looking like he didn't know. Year Twenty Amos is more afraid of being wrong. The difference between those two fears is the entirety of professional development.
Imposter Syndrome Is the Industry's Silent Epidemic
I talk to a lot of SEOs. Not just the ones at conferences, who tend to be the most confident and the most polished (conference speakers are a self-selected sample of the most extroverted and self-assured people in any industry, which is why conferences give a wildly distorted picture of what the industry actually looks like). I talk to the ones in the trenches. The in-house SEOs at mid-market companies who are the only person on the team who knows anything about organic search and who carry the weight of the entire channel on their shoulders. The agency SEOs who manage ten accounts and feel like they're failing all of them. The freelancers who are good at the work but terrible at the sales and who lie awake at night wondering if they're charging too much or too little or if the client has figured out that they're not as smart as they seemed in the pitch.
Almost all of them have imposter syndrome. Almost all of them think they're the only one. Almost all of them look at the confident people on Twitter and LinkedIn and at the conferences and think: those people know things I don't, those people have a level of certainty I can't access, those people are the real experts and I'm just pretending.
They're wrong. The confident people also don't know. The confident people are also Googling things in parking lots. The confident people are also second-guessing recommendations after they send them. The difference is that the confident people have either (a) learned to perform confidence as a professional skill, which is legitimate and valuable but is a communication skill, not a knowledge skill, or (b) actually don't know what they don't know, which is Dunning-Kruger, and which makes them not more expert but less.
The healthiest, most competent SEOs I know are the ones who have made peace with not knowing. They've accepted that the field is too broad, too dynamic, and too unpredictable to master completely. They've accepted that Googling things is not a failure of expertise but a tool of expertise. They've accepted that the person who says "I don't know, let me figure it out" is more trustworthy than the person who always has an answer, because always having an answer in a field where nobody has all the answers is a red flag, not a credential.
If you're reading this and you have imposter syndrome, I want to tell you something that I wish someone had told me fifteen years ago: the feeling that you don't know enough is not evidence that you don't know enough. It's evidence that you know enough to see the gaps. The person who doesn't see the gaps is the person you should be worried about, because the gaps are there whether you see them or not, and the person who can't see them is going to fall into one eventually, and they're going to take their client with them.
The Liberation of Admitting You Don't Know
Something changed for me about five years ago. I was on a call with a client - a large retailer, complex site, significant organic revenue - and the client's head of engineering asked me a question about a specific Googlebot rendering behavior. The question was technical, specific, and at the edge of my knowledge. I could have given a plausible-sounding answer that was probably mostly right. I had enough context to fake it. The engineer would probably not have known the difference.
Instead, I said: "I'm not confident enough in my answer to give you one right now. Let me research it and get back to you by end of day."
There was a pause. And then the engineer said: "Thank you. That's the first time anyone from your side has said that, and I've been asking questions I know the answers to for the last hour just to see if you'd do it."
He'd been testing me. He was a senior engineer who had worked with multiple SEO consultants and agencies, and he'd developed a test: ask questions you already know the answer to and see if the consultant bullshits. Every previous consultant had bullshitted. Every previous consultant had given a confident, fluent, partially wrong answer to a question designed to see if they'd admit uncertainty. And every time, the engineer had lost trust, because if the consultant would bullshit about something the engineer could verify, what were they bullshitting about on the topics the engineer couldn't verify?
That moment changed how I work. Not gradually. Immediately. I started saying "I don't know" more often. I started saying "let me check" instead of going from memory. I started prefacing uncertain answers with explicit confidence levels: "I'm about 80 percent sure that's correct, but let me verify before you implement." I started treating uncertainty not as a weakness to hide but as information to communicate, the same way a weather forecaster communicates probability rather than certainty, because the consumer of the information deserves to know the confidence interval, not just the point estimate.
And my business got better. Not just marginally better. Significantly better. Clients trusted me more. They implemented my recommendations more consistently (because when I said "do this," they knew I was certain, because they'd seen me be uncertain about other things and communicate that uncertainty honestly). They referred me more. They stayed longer. The honesty wasn't a sacrifice. It was a strategy, except it wasn't a strategy at all, it was just honesty, and the fact that honesty functions as a strategy tells you something depressing about how rare honesty is in professional services.
The Things That Matter and the Things That Don't
After twenty years, I have a pretty clear picture of what matters in SEO and what doesn't, and the picture is much simpler than the industry wants it to be, because a simple picture is bad for selling tools, agencies, courses, and conference tickets. A simple picture means you don't need to spend $10,000 a year on tools and $5,000 a month on an agency and $2,000 on a conference ticket to learn the "latest tactics." A simple picture means the fundamentals are the fundamentals, and they haven't changed much in ten years, and the 95 percent of SEO Twitter activity that's about algorithm updates and ranking factors and the latest Google statement and whether links still matter is mostly noise that you can safely ignore while you focus on the stuff that actually moves the needle.
Here is what matters, as best I can tell after twenty years:
Make something worth finding. This is the content question, and it's the most important one, and it has nothing to do with word count or keyword density or TF-IDF scores or any of the pseudo-scientific metrics that the content optimization tools have invented to make their existence seem necessary. Is the page useful? Does it answer the question the searcher has? Does it do it better than the other pages that answer the same question? If yes, you have a chance. If no, nothing else matters.
Make it easy for Google to find and understand. This is the technical SEO question, and it's important, and it's also mostly straightforward. Can Google crawl the page? Can Google render the page? Does the page load in a reasonable time? Are the canonical signals consistent? Is the site architecture logical? These are not mysteries. They're engineering problems with known solutions, and the solutions haven't changed much in a decade, and the fact that the industry treats technical SEO as arcane and complex is a reflection of the industry's incentive to make things seem harder than they are so that you'll pay someone to do them.
Earn trust and authority over time. This is the link and reputation question, and it's the one that's hardest to shortcut, because trust is a function of time and consistency, and there's no tactic that replaces either of those things. You can build links, and you should build links, but the links that matter are the ones that come from genuine recognition of the value you're providing, and those links accrue to sites that have been consistently useful over time, and there's no way to fake time.
That's it. That's the framework. Three things. Everything else - the algorithm updates, the ranking factor studies, the "is this a ranking signal" debates, the tool comparisons, the technical edge cases, the structured data markup, the Core Web Vitals optimization - is either a subset of one of those three things or a distraction from them. And the fact that I can't always remember the specific syntax of the things that are subsets of those three things doesn't diminish my understanding of the three things themselves. The compass points north. The map has the details. I carry the compass. Google carries the map.
Nine Minutes in a Parking Lot
I walked out of that Fortune 500 meeting less than two hours after I Googled the data layer syntax in the parking lot. The meeting went well. The team implemented the tracking architecture I recommended. The data started flowing correctly for the first time in the company's history. They could finally see which products were being purchased by customers who arrived through organic search, which meant they could finally optimize their organic strategy around revenue rather than traffic, which meant they could finally make smart decisions about where to invest in content and where to pull back.
None of that happened because I remembered the syntax. It happened because I knew the syntax mattered, knew why it mattered, knew what questions to ask the engineering team, knew which implementation approach would work with their existing stack, knew what the common failure modes were, knew how to test the implementation, and knew how to interpret the data once it started flowing. The syntax was a fact. Everything else was judgment. The fact took four seconds to look up. The judgment took twenty years to develop.
I drove home that night feeling good about the meeting and slightly amused at myself, because I had spent nine minutes in a parking lot in a mild state of panic over a syntax I could have looked up in the meeting itself, in front of everyone, and it would have been fine. It would have been completely fine. Nobody would have judged me. The engineering team would probably have respected me more, because engineers understand that looking things up is not a failure, it's a methodology, and the best engineers in the world spend half their day on Stack Overflow and the other half pretending they don't.
But old habits die hard, and the habit of performing omniscience is the oldest habit in consulting, and it takes a long time to unlearn, and I'm not sure I've fully unlearned it even now. I still feel a small surge of anxiety when I have to say "I don't know." I still occasionally fake fluency on a minor point rather than admitting I need to check. I'm better than I was. I'm not where I want to be.
I'm still working on it. I'm still Googling things. I'll Google something tomorrow that I've Googled a hundred times before. I'll Google something next week that I should probably know by heart after twenty years but don't, because my brain has decided to use that storage space for the complete lyrics to "Semi-Charmed Life" by Third Eye Blind, a song that has never once been useful in a client meeting but that I can recite from memory with a fluency and confidence that I cannot bring to bear on the hreflang syntax for Brazilian Portuguese.
This is what twenty years of expertise looks like. Not a library. Not a database. Not a person who has all the answers. A person who has all the questions, knows which questions matter, knows where to find the answers, and has the judgment to know what to do with them. A compass and a phone with a search bar and twenty years of scar tissue that tells you which direction to point both of them.
If that's not worth paying for, I don't know what is.