Keyword Cannibalization: Why Your SEO Pages Compete (And How to Fix It)

Edward Sturm | 00:13:57 | May 1, 2026

Edward Sturm breaks down keyword cannibalization, shows how to spot it with Search Console and SERP overlap, and walks through practical fixes like 301 redirects and careful page consolidation.

Summary

Edward Sturm walks viewers through the realities of keyword cannibalization, explaining that when multiple pages compete for the same query, Google can split authority and rankings suffer. He emphasizes practical diagnostic steps, including filtering Search Console data by query and page to spot overlapping URLs, watching for volatile positions, and using site: queries to uncover duplicates. He revisits a well-known Reddit post and ties in insights from the SEO community, including debates sparked by Matt Cutts on duplicate content. Sturm then lays out a concrete remediation playbook: decide a winner based on current ranking, clicks, and backlinks; retire or consolidate the losers; implement 301 redirects; and update internal links to pass authority to the surviving page. He warns against common mistakes like adding more content before fixing cannibalization or creating year-specific URLs that compete with evergreen content. A week-by-week recovery timeline follows: verify redirects in week one, then anticipate position volatility as Google reindexes, with stabilization and higher total clicks in the longer run. Throughout, he stresses focusing on core, money-making keywords rather than chasing every potential cannibalization issue. The episode closes with reader comments from the SEO subreddit community and a plug for his compactkeywords.com course, framed as a path to purchase-intent SEO results much like the success stories he shares.

Key Takeaways

  • Use Google Search Console to flag cannibalization by filtering by query and page to reveal multiple URLs for the same keyword.
  • Look for page position volatility (20–80 range) and high impressions with low clicks across competing pages.
  • Use site:yourdomain keyword to see if multiple pages surface for the same topic and apply SERP overlap analysis (70%+ overlap in top 10 suggests same intent).
  • Choose a single survivor page by ranking position, recent click data (last 90 days), and backlink count; favor the page with fewer internal links if all else is equal.
  • When consolidating, pull unique content from losing pages into the winner, then set losers to draft/delete and implement 301 redirects to the winner.
  • Always implement proper 301 redirects to transfer authority; canonical tags alone are insufficient for a true fix.
  • Recovery typically shows initial 404 checks in week 1, followed by position volatility as Google reprocesses pages, with eventual stabilization and higher click totals for the target keyword.
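The detection flow in the takeaways above can be sketched as a short script over a Search Console export. This is a minimal illustration, not from the episode: the row format (query, page, impressions) and the 100-impression floor are assumptions, chosen only to make the example concrete.

```python
from collections import defaultdict

def flag_cannibalization(rows, min_impressions=100):
    """Group Search Console rows by query and flag queries where two or
    more of your URLs earn meaningful impressions for the same keyword."""
    pages_by_query = defaultdict(set)
    for row in rows:
        if row["impressions"] >= min_impressions:
            pages_by_query[row["query"]].add(row["page"])
    # A query mapped to 2+ distinct URLs is a cannibalization candidate.
    return {q: sorted(p) for q, p in pages_by_query.items() if len(p) >= 2}

# Hypothetical export rows:
rows = [
    {"query": "emergency plumber", "page": "/plumbing", "impressions": 900},
    {"query": "emergency plumber", "page": "/services/plumbing", "impressions": 700},
    {"query": "water heater repair", "page": "/water-heaters", "impressions": 300},
]
flags = flag_cannibalization(rows)
print(flags)  # {'emergency plumber': ['/plumbing', '/services/plumbing']}
```

A flagged query is only a candidate; as the episode stresses, you would still check position volatility and SERP overlap before consolidating anything.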

Who Is This For?

Essential viewing for SEO professionals and content editors who manage large sites with multiple pages targeting similar keywords, especially those chasing funnel or money-keyword rankings and needing a concrete consolidation workflow.

Notable Quotes

"Keyword cannibalization is when your own site competes against itself. You have two or more pages targeting the same keyword, Google can't decide which one to rank and so it splits the authority between them."
Sturm defines the core problem and its effect on rankings.
"Set up 301 redirects from the old URLs to the winner. Update any internal links across your site that were pointing to the losing pages."
Outlines the essential remediation steps.
"The 301 redirect part matters more than people think. A proper 301 is what moves the authority."
Emphasizes redirect quality in authority transfer.
"If you want to learn more SEO that makes money, SEO that specifically targets people who are looking to purchase something... compactkeywords.com."
Promotes his course as a practical follow-up.

Questions This Video Answers

  • How do I detect keyword cannibalization in Google Search Console step by step?
  • What is the difference between a canonical tag and a 301 redirect for cannibalization fixes?
  • How long does it take for rankings to stabilize after consolidating cannibalized pages?
  • Should I merge or separate pages when two pages cover the same topic?
  • What’s the best way to decide which page should survive when two pages compete for the same keyword?
Keyword Cannibalization · Google Search Console · SERP Overlap · 301 Redirects · Canonical Tags vs Redirects · SEO Recovery Timeline · Internal Link Management · Purchase-Intent SEO
Full Transcript
I've been going down the cannibalization rabbit hole lately and wanted to write up what I've learned so far. This is a mix of things I've tested myself and stuff I picked up from posts here. Happy to be corrected on anything because I'm still figuring a lot of this out. What even is cannibalization?

That is a post from the search engine optimization subreddit, "Here's what I have learned about keyword cannibalization." The whole thread is really excellent, and a few episodes ago, in episode 1026 of this show, "SEO Crawling Myths: Why Crawl Budget Isn't Your Problem" with David Quaid, we had a long discussion about keyword cannibalization and many commenters said they loved it. So I said, let's do another episode on this, and this one is more structured.

So, the short version, this is back to the post now. The short version: keyword cannibalization is when your own site competes against itself. You have two or more pages targeting the same keyword, Google can't decide which one to rank, and so it splits the authority between them. Neither page ranks well. You essentially halve your own chances.

Here's how to spot it. Open Google Search Console and pull your search analytics data, filtering by query and page. If you see the same keyword showing multiple different URLs from your site, that's a flag. Also watch for page position volatility, where a page is bouncing between positions 20 and 80. A page bouncing around wildly is often doing so because Google is confused about which of your pages is the more relevant answer to the query. Other flags: high impressions but low clicks across several pages for the same query, or running a site: search for a topic and getting three or four results back from your own domain.

The SERP overlap method is also useful here. Take two suspected competing pages and look at how much their actual search results overlap. To me, more than 70% overlap in the top 10 results usually means Google sees them as targeting the same intent.
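The 70% overlap heuristic is straightforward to compute once you have the top-10 results for each page's target query. A minimal sketch with made-up result lists; how you collect the SERPs (manually, or via a rank-tracking tool) is up to you:

```python
def serp_overlap(top10_a, top10_b):
    """Fraction of shared result URLs between two top-10 lists."""
    shared = set(top10_a) & set(top10_b)
    return len(shared) / max(len(top10_a), len(top10_b), 1)

# Hypothetical top-10 results for the two suspected competing queries:
a = [f"site{i}.com" for i in range(10)]       # site0 .. site9
b = [f"site{i}.com" for i in range(3, 13)]    # site3 .. site12
overlap = serp_overlap(a, b)
print(f"{overlap:.0%}")  # 70% — at the post's threshold for "same intent"
```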
You probably want one page, not two. That's a really important distinction, especially if you're deciding how to target keywords and you have keywords that are similar to each other. Look at the SERPs, see if there's a lot of overlap in the top 10 results, and if there is, choose the keyword with more volume to target. And shout out to legitimate salary 108, the poster who wrote this up.

So, continuing with this post: when you decide to consolidate, you need to pick which page survives. I evaluate them roughly on which is currently ranking highest, so the best existing position for the target query or keyword; which got the most clicks in the last 90 days; and which has more backlinks pointing to it. If two pages are close on all of that, I'd keep the one that has fewer incoming internal links to update, just to reduce the work.

Once you have a winner, read through all the losing pages and pull out anything unique that isn't already in the winner. Set the losing pages to draft or delete them. Set up 301 redirects from the old URLs to the winner. Update any internal links across your site that were pointing to the losing pages. The 301 redirect part matters more than people think. A proper 301 is what moves the authority.

And here are common mistakes that this poster sees and has made. Creating new content before fixing existing cannibalization: if your site has pages competing with each other, adding more content just adds more competition. Fix what you have first. Making year-specific URLs like best-tools-2024 and best-tools-2025: sometimes these compete with each other and with the evergreen version. It's better to have one URL that you update, not new URLs every year. I love that. Treating canonical tags as a real fix: they're better than nothing, but they're not the same as a redirect.

And there's a recovery timeline. Week one, you're mostly checking that redirects work and there are no 404s.
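The survivor-selection criteria above (best position, then 90-day clicks, then backlinks, with fewest incoming internal links as the final tiebreaker) map naturally onto a sort key. A hypothetical sketch; the field names are assumptions, not from the post:

```python
def pick_survivor(pages):
    """Choose the page to keep. Tuple comparison applies the criteria
    in order: each later field only matters if earlier ones tie."""
    return min(
        pages,
        key=lambda p: (p["position"],         # lower = ranks higher (best first)
                       -p["clicks_90d"],      # more recent clicks wins
                       -p["backlinks"],       # more backlinks wins
                       p["internal_links"]),  # fewer internal links to update
    )

candidates = [
    {"url": "/plumbing", "position": 14.0, "clicks_90d": 120,
     "backlinks": 8, "internal_links": 40},
    {"url": "/services/plumbing", "position": 31.0, "clicks_90d": 45,
     "backlinks": 2, "internal_links": 12},
]
winner = pick_survivor(candidates)
print(winner["url"])  # /plumbing
```

Everything else gets merged into the winner and 301-redirected to it, per the playbook above.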
Week two onwards, expect some position volatility while Google sorts things out. The winner should start stabilizing at a better position, and total clicks for that keyword should go up.

So, before we get into the comments, and the comments here are also very insightful: it's really easy to go overboard on fixing keyword cannibalization. The really important thing to note is that this matters most for keywords that are core to your acquisition, getting customers, getting users, getting leads, or keywords that are core to your strategy where you're not ranking well consistently. So it's like a bottom-of-funnel keyword. And even then, keyword cannibalization isn't necessarily responsible for the poor performance. Personally, I would only pay attention to this if these are keywords that are going to bring you money or, again, are going to be important for your strategy. Because if you're doing a lot of SEO, it's so easy to spend a ton of time fixing cannibalization for keywords that are not going to bring you money. You want to be selective with what you're doing.

The top comment on this is from the moderator of the SEO subreddit, Weblinker, who is great, who I'm always sharing on this podcast. Weblinker says, here's how it happens: Google is content agnostic. And then Weblinker shares this video from Matt Cutts, Google's previous head of webspam, who is a legend in the SEO industry and a great person to learn from. It's this video, "How does Google handle duplicate content?" It's a short video, and two or three things stand out.

One, Matt Cutts says around 25 to 30% of the web is duplicate content. There's another quote about how Google handles duplicate content. Matt Cutts says, so most of the time, suppose we're starting to return a set of search results and we've got two pages that are actually kind of identical. Typically, we would say, okay, you know what?
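The week-one check described above, that every retired URL returns a proper 301 pointing at the winner and nothing 404s, is easy to automate. A sketch with a stubbed fetcher standing in for real HTTP requests (in practice you would plug in an HTTP client); the function names and response shapes are assumptions for illustration:

```python
def verify_redirects(redirect_map, fetch):
    """Check each old URL 301s to its winner.
    fetch(url) -> (status_code, location); returns a list of problems."""
    problems = []
    for old, winner in redirect_map.items():
        status, location = fetch(old)
        if status == 404:
            problems.append((old, "404 — redirect missing"))
        elif status != 301:
            # Per the episode, a temporary redirect won't move authority.
            problems.append((old, f"status {status}, expected a permanent 301"))
        elif location != winner:
            problems.append((old, f"redirects to {location}, not {winner}"))
    return problems

# Stub responses simulating a server: one correct 301, one accidental 302.
responses = {
    "/old-guide": (301, "/guide"),
    "/2024-guide": (302, "/guide"),
}
issues = verify_redirects({"/old-guide": "/guide", "/2024-guide": "/guide"},
                          lambda url: responses[url])
print(issues)  # flags the 302; the correct 301 passes silently
```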
Rather than show both of those pages, since they're duplicates, let's just show one of those pages and we'll crowd the other result out. And Matt Cutts says duplicate content isn't treated as spam; it's just something that they need to cluster appropriately, unless you are doing the duplicate content in a manipulative or malicious way.

Weblinker's takeaways from this video were: Google can't deal with duplicate entries, not that it cares about duplicate content. It only checks the document name, like the slug, and the page title and H1 combination. It's the post-retrieval top-10 process that carves out the answer. So this is the problem, and this is from 12 years ago, showing how old the problem is and how slow Google is.

Here's another interesting comment from forwardup.de. This poster said, from my experience, this most often happens with classic local business sites strongly focused on one product or service. Let's take a plumber. The main page is mainly about plumbing. Then there are service pages under one main category page, which is plumbing. The main page has less content but is stronger. The plumbing subpage has more content but is weaker. Basically, two pages competing for rankings for the same terms, or what you could call near-duplicate content.

There are two ways to solve this: separating or merging. You either use really different titles and make sure the content is different enough, or you merge them into one and do a 301 for the one you don't want to keep. For really huge sites with some pages constantly cannibalizing and switching positions, this is far less obvious, of course. The best way to avoid it in the first place is proper documentation. Whenever you want to publish a new piece, you check if this topic wasn't already partly covered.

So, Weblinker, the moderator of this subreddit, wrote more and said, this is a very common issue, especially where domains have publishers with a half knowledge of SEO.
Like they know how important it is to target searches using the document name, which includes the slug, but they don't know they're creating duplicate content that Google cannot detect. Cannibalized content is essentially duplicate content. Another way to describe it is a duplicative entry: two or more pages in the same index with roughly the same relevance score. It becomes duplicative because of BERT, or semantics. The part of the indexing algorithm that catches duplicate pages based on the document name, the one that throws "duplicate page, different canonical chosen," does not work on semantic synonyms. Matt Cutts explains it very well in the video I just talked about, "How does Google handle duplicate content?" He says the two selective parts of the SERP builder, post-index retrieval, effectively pick two pages, which then block each other.

But a few things: it's much wider than your observation. Cannibalization happens at any position, not just between positions 20 and 80. Obviously, it's most detrimental to the top three places. If you take two suspected competing pages, there can be as many as 12 pages. Adjective phrases, things like "best" and "top" in the slug, do not differentiate because they are not part of the index name. Diagnosis can be made harder because pages only rank intermittently. On high-volume sites, privacy withholding exaggerates misdiagnosis.

And how to fix cannibalization? You can just do a manual removal or noindex of both to immediately remediate the situation, then republish under a new document name that doesn't cause a duplicate entry.

So, the original poster of this thread said, can you please elaborate on "diagnosis can be made harder because pages only rank intermittently" and "on high-volume sites, privacy withholding exaggerates misdiagnosis"? Weblinker said, here's an actual scenario around "top SEO experts." You can see that the average position was heading up.
Then I introduced a new page on February 21st using a second page with a synonym for "expert." You can see the page heads up, then a rotation battle. The rotation battle results in fewer impressions and a falling click-through rate. Because it's a low-volume keyword, and this is the exact phrase filtered, both pages lose out. And then, 2 days ago, both fall out of the visible index.

Responding to "on high-volume sites, privacy withholding exaggerates misdiagnosis": on sites with 1 million plus clicks, data from Google Search Console becomes really weird. I have no better words. We found that when the graph is at a default position for either a selected page or query, no data is shown. No clicks, no impressions. When data is filtered with a second filter, data magically appears. It seems counterintuitive, but it also mirrors the problem. As you focus on data, whatever was causing the data to be withheld suddenly stops blocking the other data, which now becomes visible. It also means that pages only rotate every few days, which means it can take a week to make a full diagnosis / impact study for remediation.

That is this entire thread, "Here's what I have learned about keyword cannibalization." It was so good. So much information here. Again, don't get caught up in the weeds on this if it's not going to make you money or if it's not core to your strategy. That's my advice, at least.

And if you want to learn more SEO that makes money, SEO that specifically targets people who are looking to purchase something, to use something, to make a discovery call, that's my SEO course, compactkeywords.com. It is specifically about doing purchase-intent SEO. I showed this testimonial yesterday and I'm going to show it again because I'm so hyped for Chris Ott, who sent me this.

Hey, what's up guys? Chris with the Brightside, owner of a pressure washing company up here in Spokane, Washington.
I just wanted to do a little video for Edward because Compact Keywords has been crucial to the growth of our business. I hate to say it, but we spent nearly $18,000 in the last year and a half on marketing and SEO through different agencies locally. Um, and [clears throat] that did nothing. Did not get the phone to ring. Nothing happened. Banging my head against the wall trying to figure out what I'm actually paying for per month. Decided to take the leap on the Compact Keywords class. Wasn't sure what to expect, but I've been listening to his podcast for over 55 days, and that's all I listen to on my headphones at work. And so decided to do it. Things are great.

The best part is going through Moz and some of the other features that Edward shows you how to use in the Compact Keywords class. We are now ranking and outranking the local SEO agency, who also runs a pressure washing company up here, in several high-intent keyword categories. We're getting about six to eight calls per day on a good day, which is just unheard of. So this course, if you're on the fence about it, check it out. I'm not smart. I'm more the type of business owner that goes out there, gets my hands dirty, as you can tell with my shirt. And the computer thing is not my thing, but man, this course, do it. I promise you you'll see results, and if you don't, then you're probably doing something wrong.

Thank you again to Chris Ott for that. That is at compactkeywords.com. This is everything for episode 1030 of the Edward show, 1030 days in a row doing this podcast. If you watched this on YouTube, thank you so much for watching. If you listened on Spotify or Apple Podcasts, thank you so much for listening. And I'll talk to you again tomorrow. Bye now.
