The Technical SEO Checklist (The Real One)

TL;DR • 10 min read
  • Crawlability: robots.txt, XML sitemap, internal linking
  • Indexability: canonical tags, noindex usage, duplicate content
  • Speed: Core Web Vitals, image optimization, caching
  • Mobile: responsive design, mobile usability errors
Technical SEO priority stack: P1 Indexation, P2 Crawlability, P3 Page Experience, P4 Structure

Technical SEO audits have become bloated beyond all reason: 200 checkpoints, most of them irrelevant, each one generating a line item that makes the report look impressively thorough. Teams spend weeks on issues that don't affect rankings, classic bikeshedding dressed up as due diligence, while the three or four things that actually break rankings go unfixed because they're buried somewhere around page 47 of the audit deck.

Here's what actually matters: the handful of issues that determine whether Google can find, understand, and choose to rank your pages. Fix these first and ignore the rest until they're handled, because a perfectly optimized alt tag on an image Google can't even crawl to isn't helping anyone.

Priority 1: Indexation

If Google can't index your pages, nothing else matters: not your content quality, not your backlink profile, not your site speed, not your Core Web Vitals, not your schema markup. A page that isn't indexed doesn't exist as far as Google is concerned, and you can optimize a non-existent page forever without results.

Check these:

  • Noindex tags: Search Console > Indexing > Pages > "Excluded by noindex tag". Should only be pages you intentionally blocked.
  • Robots.txt: Test important URLs at /robots.txt. Is anything blocked that shouldn't be?
  • Canonical issues: Are canonical tags pointing to the right pages? Self-referencing canonicals on all indexable pages?
  • Index status: Search site:yoursite.com/important-page. Does it appear?
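The meta-level checks above can be partially automated. Here's a minimal sketch, using Python's stdlib `html.parser`, that pulls the robots meta directive and canonical link out of a page's HTML and flags the two failure modes that matter most: an unintended noindex, and a canonical pointing somewhere else. The function and class names are illustrative, not from any particular tool.

```python
from html.parser import HTMLParser

class IndexabilityChecker(HTMLParser):
    """Collects the robots meta directive and canonical link from page HTML."""
    def __init__(self):
        super().__init__()
        self.robots_content = None
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots_content = attrs.get("content", "")
        if tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonical = attrs.get("href")

def check_indexability(html, page_url):
    """Return the noindex/canonical status of a page, given its HTML and URL."""
    parser = IndexabilityChecker()
    parser.feed(html)
    noindexed = (parser.robots_content is not None
                 and "noindex" in parser.robots_content.lower())
    # A missing canonical is treated as self-referencing here; some audits
    # prefer to flag it instead.
    self_canonical = parser.canonical is None or parser.canonical == page_url
    return {"noindex": noindexed,
            "canonical": parser.canonical,
            "self_canonical": self_canonical}
```

Run this against your top pages and anything that comes back `noindex: True` unintentionally goes to the top of the fix list.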

Priority 2: Crawlability

Google needs to find and access your content, which sounds obvious but is surprisingly easy to break in subtle ways that don't throw errors: a robots.txt rule that blocks the wrong directory, an internal linking structure that leaves important pages orphaned, a JavaScript framework that renders content client-side after Googlebot has already given up and moved on.

Check these:

  • Server errors: Search Console > Indexing > Pages. Any 5xx errors? Fix immediately.
  • Crawl errors: Are important pages returning 404s?
  • Internal linking: Can Google reach all important pages through links? Any orphan pages?
  • XML sitemap: Does it exist? Is it submitted? Does it only include indexable pages?
  • JavaScript rendering: Use "URL Inspection" in Search Console. Does the rendered HTML include your content?
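The robots.txt check is easy to script with Python's stdlib `urllib.robotparser`, which applies the same matching rules a crawler would. A small sketch (the `blocked_urls` helper is illustrative):

```python
from urllib.robotparser import RobotFileParser

def blocked_urls(robots_txt, urls, user_agent="Googlebot"):
    """Return the subset of urls that robots.txt disallows for user_agent."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [u for u in urls if not parser.can_fetch(user_agent, u)]

# Example: a rule that accidentally blocks a whole directory.
robots = """\
User-agent: *
Disallow: /private/
"""
urls = ["https://example.com/", "https://example.com/private/pricing"]
# blocked_urls(robots, urls) flags the /private/ page as uncrawlable.
```

Feed it the list of URLs from your XML sitemap; anything it returns is a page you're asking Google to index while simultaneously telling it not to crawl.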

Priority 3: Page Experience

Speed and usability fall into this category. They're less critical than indexation and crawlability in the sense that a slow page that's indexed will still rank, while a fast page that isn't indexed won't rank at all. But they still matter once you've got the fundamentals handled: user experience signals do influence rankings, and a three-second load time is costing you conversions regardless of what Google thinks.

Check these:

  • Core Web Vitals: Search Console > Experience > Core Web Vitals. Fix pages failing LCP (Largest Contentful Paint) first.
  • Mobile usability: Search Console > Experience > Mobile Usability. Any errors?
  • HTTPS: Is the entire site served over HTTPS? Any mixed content warnings?
  • Intrusive interstitials: Popups blocking content on mobile? Remove them.
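Core Web Vitals scoring uses published "good" and "poor" thresholds, so triage is mechanical once you have field data. A minimal sketch using Google's documented thresholds (LCP in seconds, INP in milliseconds, CLS unitless; the function names are illustrative):

```python
# Published Core Web Vitals thresholds: (ceiling for "good", floor for "poor").
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # Largest Contentful Paint, seconds
    "INP": (200, 500),    # Interaction to Next Paint, milliseconds
    "CLS": (0.10, 0.25),  # Cumulative Layout Shift, unitless score
}

def rate(metric, value):
    """Classify a metric value as good / needs improvement / poor."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"
```

Fix the "poor" LCP pages first, since that's usually where the failing-URL groups in Search Console come from.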

Priority 4: Structure

Structure helps Google understand your content and is nice to have after the above is fixed, but I want to be clear about the priority here: if you're spending hours agonizing over the perfect title tag while your important pages are blocked by robots.txt, you're rearranging deck chairs on the Titanic, and the iceberg in this metaphor is your catastrophic indexation problem.

Check these:

  • Title tags: Unique, descriptive, under 60 characters
  • Meta descriptions: Unique, compelling, under 155 characters
  • H1 tags: One per page, describes the content
  • URL structure: Readable, includes relevant words, not too long
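These structural rules are simple enough to check in a few lines. A sketch of a checker for the list above, with the caveat that the 60/155-character limits are rough pixel-safe conventions, not hard Google rules (the `audit_head` name is illustrative):

```python
def audit_head(title, meta_description, h1_tags):
    """Flag the basic structural issues from the checklist above.

    title, meta_description: strings (empty if missing)
    h1_tags: list of H1 text values found on the page
    """
    issues = []
    if not title:
        issues.append("missing title")
    elif len(title) > 60:
        issues.append("title over 60 characters")
    if not meta_description:
        issues.append("missing meta description")
    elif len(meta_description) > 155:
        issues.append("meta description over 155 characters")
    if len(h1_tags) != 1:
        issues.append(f"expected exactly one H1, found {len(h1_tags)}")
    return issues
```

An empty return list means the page passes the structural checks; anything else is a (low-priority) line item.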

What You Can Ignore

These items appear on every audit but rarely affect rankings. This list isn't here because these things are bad; it's here because they're overemphasized by tools that need to generate long reports to justify their existence, and by consultants who need to fill deliverables with something even when the important problems have already been listed:

  • Missing alt tags: Add them for accessibility, but they won't change your rankings
  • Keyword density: Not a thing anymore
  • Exact match domains: Doesn't help
  • Schema markup: Gets you rich snippets, not better rankings
  • Word count targets: Content length doesn't directly affect rankings
  • Outbound link counts: Not a ranking factor

The 80/20 rule
Indexation and crawlability issues cause 80% of technical SEO problems. Fix those completely before touching anything else.

A 15-point focused audit beats a 200-point unfocused one every single time, because the goal is not to identify every possible issue but to identify the issues that actually matter and then fix them. Fix what breaks rankings, ignore what doesn't, resist the temptation to optimize for the sake of optimizing, and understand that a perfect score on some tool's arbitrary checklist is not the same thing as a well-optimized website. For a different perspective on whether technical SEO deserves the attention it gets, see this take: technical SEO is a distraction.
