The Redirect Audit That Matters
- Redirect chains waste crawl budget and dilute link equity
- Find chains: any redirect pointing to another redirect
- Fix by updating to point directly to final destination
- Check for redirect loops (infinite redirects)
Redirects seem simple: point the old URL to the new URL, done, move on with your life. But over time, like any form of technical debt, sites accumulate redirect problems that compound silently in the background while everyone focuses on more exciting work. Chains get longer as each migration adds another hop, links break as destinations disappear, and rankings suffer as link equity dissipates through each unnecessary redirect. Nobody notices any of this until one day traffic tanks and someone finally bothers to check what actually happens when Googlebot tries to follow a link to your site.
The Problems to Find
Chain limit
Google's documentation says Googlebot follows up to 10 redirect hops before giving up. More than 2 hops is already hurting you. Flatten every chain to a single redirect.
A redirect chain is what happens when A redirects to B, B redirects to C, C redirects to D, and so on through however many hops accumulated over years of site migrations, URL restructurings, and CMS changes. Each hop loses a small amount of link equity, which means long chains lose a lot, sometimes enough that by the time you reach the final destination the link juice has been so diluted it barely registers.
Redirect loops are the more catastrophic cousin: A redirects to B, B redirects back to A, the browser spins forever, the page never loads, Google can't crawl it, and whatever link equity was pointed at that URL is now trapped in an infinite cycle of pointing at itself, like a dog chasing its own tail, except the dog is your SEO and the tail is your rankings.
Broken redirects are when A redirects to B but B returns a 404, which means the redirect is completely useless: you've successfully redirected the user and Googlebot to a page that doesn't exist. That's somehow worse than just having the original page 404 in the first place, because at least then you'd know something was wrong.
Soft 404s are the subtle version: A redirects to B, but B is a generic page (like the homepage) that doesn't match the original content at all. Google treats this as a soft 404, because they understand that redirecting a page about "blue widgets" to a homepage that says nothing about blue widgets is not actually helping anybody.
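The first three failure modes can all be detected with the same primitive: follow the Location headers one hop at a time and record what you see. (Soft 404s need human judgment about the destination page, so no script will catch them mechanically.) Here's a minimal sketch in Python, assuming the requests library is installed; the function name, verdict labels, and hop limit are my own choices, with the limit mirroring Googlebot's documented maximum. Some servers mishandle HEAD requests, so a production version would fall back to GET.

```python
import requests
from urllib.parse import urljoin

MAX_HOPS = 10  # mirrors Googlebot's documented hop limit

def trace(url):
    """Follow a redirect chain one hop at a time.

    Returns (chain, verdict): chain is every URL visited in order,
    verdict is one of ok / chain / loop / broken / 404 / too-long.
    """
    chain, seen = [url], {url}
    for _ in range(MAX_HOPS):
        resp = requests.head(url, allow_redirects=False, timeout=10)
        if resp.status_code in (301, 302, 303, 307, 308):
            url = urljoin(url, resp.headers["Location"])  # Location may be relative
            if url in seen:
                return chain, "loop"  # A -> B -> ... -> A
            chain.append(url)
            seen.add(url)
        elif resp.status_code == 404:
            # A redirect that dead-ends in a 404 (or the original URL itself 404s)
            return chain, "broken" if len(chain) > 1 else "404"
        else:
            # Landed on real content; more than one hop means a chain
            return chain, "ok" if len(chain) <= 2 else "chain"
    return chain, "too-long"  # never resolved within the hop limit
```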
The Quick Audit
The process itself is simple:
- Export your backlink profile from Ahrefs, Semrush, or Search Console to get a list of the URLs that external sites are linking to
- Run those URLs through a bulk HTTP status checker
- Flag anything that isn't returning a 200 status
- For any redirects (301s or 302s), follow the chain all the way to the final destination to see how many hops are involved and whether that destination is even the right page
Free tools like httpstatus.io can check URLs in bulk and show you the full redirect chain, which is all you really need to identify the problems. Fixing them is the harder part, but at least you'll know what you're dealing with.
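If you'd rather script it than paste into a web tool, you can feed the export through the trace() sketch from the previous section. A hedged example, assuming the export is a CSV named backlinks.csv with the target URL in the first column (real column layouts vary by tool):

```python
import csv

# Hypothetical export file; Ahrefs/Semrush/Search Console layouts differ.
with open("backlinks.csv", newline="") as f:
    urls = {row[0] for row in csv.reader(f) if row and row[0].startswith("http")}

for url in sorted(urls):
    chain, verdict = trace(url)  # trace() from the sketch above
    if verdict != "ok":
        print(f"{verdict:8} {' -> '.join(chain)}")
```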
Priority Fixes
The highest priority fixes are redirect chains from pages with backlinks. If a page has external links pointing to it, and that page redirects through a chain of hops before reaching its destination, you're losing link equity on every single hop, which means the links you worked so hard to earn (or paid so much money to acquire) are being wasted by technical laziness. A -> B -> C -> D should become A -> D: a single hop, no intermediate steps, no unnecessary friction.
Second priority is internal link chains, because your own internal links shouldn't go through redirects at all. You control these links and can update them whenever you want, and yet most sites have thousands of internal links pointing to old URLs that redirect to new URLs that redirect to even newer URLs, all because nobody bothered to update the links when the URLs changed. The fix is simple: update each link to point to the final destination directly.
Third priority is redirect loops and broken redirects, which are complete failures where nothing is being passed at all. These should actually be fixed immediately when you find them, because they're not just inefficient, they're actively broken. I list them third only because they're usually less numerous than chains and therefore have less total impact, even though each individual instance is worse.
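If your redirects live in a rule list you control (server config, a CMS table), flattening can be done offline before anything touches production. A sketch, under the assumption that the rules fit in a simple source-to-destination dict; it also surfaces loops rather than silently dropping them:

```python
def flatten(rules: dict[str, str]) -> dict[str, str]:
    """Rewrite every rule to point straight at its final destination."""
    flat = {}
    for src in rules:
        seen, dst = {src}, rules[src]
        while dst in rules:  # follow the chain: A -> B -> C ...
            if dst in seen:
                raise ValueError(f"redirect loop through {dst}")
            seen.add(dst)
            dst = rules[dst]
        flat[src] = dst  # ... becomes A -> final in one hop
    return flat

# A -> B -> C -> D collapses to three one-hop rules, all landing on D.
print(flatten({"/a": "/b", "/b": "/c", "/c": "/d"}))
# {'/a': '/d', '/b': '/d', '/c': '/d'}
```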
The 301 vs 302 Question
The answer is deceptively simple (301 means permanent, 302 means temporary) but the implications are significant. Use 301s for anything that's a permanent change: site migrations, URL structure changes, consolidated pages, anything where the old URL is never coming back. Use 302s only for genuinely temporary moves, which are rare enough in practice that if you're not sure whether something is temporary, it's almost certainly permanent and should be a 301.
If you see 302s being used for permanent changes, convert them to 301s immediately. Google will eventually figure out that your 302 is really a permanent redirect and treat it like a 301, but why make them guess? Why introduce uncertainty into a signal that should be clear, or risk Google misinterpreting your intent, when you could just tell them explicitly what's happening?
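What this looks like in code depends entirely on your stack. As one illustration (hypothetical routes, assuming a Flask app), the distinction is a single keyword argument, and it's worth knowing that Flask's redirect() defaults to 302, so permanent moves need the code spelled out:

```python
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-pricing")
def old_pricing():
    # Permanent: the old URL is never coming back, so say so explicitly.
    return redirect("/pricing", code=301)

@app.route("/sale")
def sale():
    # Genuinely temporary: the original content returns next week.
    return redirect("/black-friday", code=302)
```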
Prevention
When creating a new redirect:
- Always redirect to the final destination, not an intermediate page
- Check if the old URL already redirects somewhere (update that redirect instead of creating a chain; see the sketch after this list)
- Update internal links to point to the new URL directly
- Document the redirect for future reference
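The first two items can be enforced mechanically rather than remembered. A sketch, again assuming the dict-of-rules representation from earlier (the helper name is mine): before writing a new rule, resolve its destination through the existing rules, then re-point any old rules that targeted the URL you just redirected, so no chain ever forms.

```python
def add_redirect(rules: dict[str, str], old: str, new: str) -> None:
    """Add old -> new without ever creating a chain."""
    # Resolve the destination first, in case `new` already redirects.
    seen = {old}
    while new in rules:
        if new in seen:
            raise ValueError(f"adding {old} -> {new} would create a loop")
        seen.add(new)
        new = rules[new]
    rules[old] = new
    # Re-point existing rules that landed on `old` so no chain forms behind it.
    for src, dst in rules.items():
        if dst == old and src != old:
            rules[src] = new

rules = {"/a": "/b"}
add_redirect(rules, "/b", "/c")
print(rules)  # {'/a': '/c', '/b': '/c'} -- no chain through /b
```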
Clean redirects are invisible, which is exactly how they should be -the user doesn't notice, Google doesn't notice, everyone gets where they're going without friction or delay. Messy redirects are invisible too, unfortunately, until they break something obvious enough that someone finally investigates, by which point you've been leaking link equity and confusing crawlers for months or years without realizing it. Fix them before they become problems, audit quarterly, and pair this work with log file analysis to see what Googlebot is actually experiencing when it tries to crawl your site, which is often quite different from what you assume it's experiencing.