The SEO Strategy That Grew a 10,000+ Page Website by 27%

Exposure Ninja | 00:35:12 | May 4, 2026

A practical enterprise SEO playbook for large sites that works for both Google and AI search, with a clear audit-prioritize-fix framework and automation tools like Rank Math Pro.

Summary

Exposure Ninja’s guide to scaling SEO for large websites tackles the twin challenges of traditional Google ranking and AI-driven recommendations. The team demonstrates how a 10,000-page site boosted organic traffic by 27% and doubled AI-overview visibility through a disciplined audit-and-fix process. They outline a seven-area enterprise audit (crawlability, indexing, on-page signals, content quality, schema, structure, and internal links) and emphasize that quick wins (low effort, high impact) should precede more complex fixes like schema and architecture. Rank Math Pro is showcased as the tool enabling in-dashboard audits, metadata templates, schema templates, and automated internal linking. The talk also covers a prioritization framework (impact vs. effort), measurement setups (Semrush, Peak.AI, Google Search Console, GA4), and actionable strategies for content at scale, site architecture (pillar pages and supporting content), performance optimization, and a data-driven approach to internal linking. The video closes with real-world examples, a tour of the Rank Math AI Link Genius, and a nudge toward building an AI-ready content strategy alongside traditional SEO tactics.

Key Takeaways

  • Audit first, prioritize based on impact and effort, then fix—don’t assume visible issues are the only ones worth addressing on a 10,000-page site.
  • Rank Math Pro’s SEO Analyzer and competitor analyzer can identify gaps, guide fixes, and streamline multi-page metadata and plugin interactions on WordPress.
  • Schema is essential for AI search visibility; use global templates and conditional rules to scale schema across thousands of pages without manual entry.
  • Internal linking at scale requires a policy and automated tools like Rank Math’s Link Genius to manage and optimize PageRank flow across pillar and supporting content.

Who Is This For?

Essential viewing for enterprise SEO teams managing large, multi-territory websites, and for WordPress-based agencies looking to scale audits, schema, and internal linking with automation.

Notable Quotes

"None of these issues were the single cause of the drop, but when you combine them, they were."
Introduces the idea that multiple small issues compound on large sites.
"The first step is do the audit, get visibility on everything that's wrong with the website."
Emphasizes auditing as the foundation of prioritization.
"Schema really tells a crawler what a page is about, what a page includes."
Highlights the role of schema for AI and traditional search.
"Internal links are how PageRank... flows through a website."
Underlines the importance of a deliberate internal linking strategy.
"Link Genius... allows you to manage your internal links at scale."
Showcases Rank Math’s automation for internal linking.

Questions This Video Answers

  • How do you audit a 10,000-page site for SEO at scale?
  • What is a good impact-vs-effort framework for enterprise SEO fixes?
  • How can Rank Math Pro help with AI search visibility?
  • What are pillar pages and supporting content in an enterprise site architecture?
  • How can you automate internal linking to improve PageRank flow?
Enterprise SEO, Large Website Management, Rank Math Pro, Schema Markup, Internal Linking, Site Architecture, AI Search Visibility, Crawlability, Content Strategy, Pillar and Cluster Model
Full Transcript
If you're doing SEO on a large website, you're being judged by two algorithms. You've got Google's traditional search algorithm, and you have this new breed of AI recommendation tools like ChatGPT, Google's AI Overviews, AI Mode, Claude, and Perplexity that have their own set of criteria for the brands, businesses, and content that they want to recommend. Well, today we're going to show you an approach that you can take if you're optimizing a large website that works for both Google's traditional search and AI search. This is the approach we've used on a global brand's website with over 10,000 pages to improve their organic traffic by 27% whilst doubling their mentions in AI search. And as a little bonus, there's one SEO issue that becomes completely unmanageable at scale. We're going to show you a tool that fixes it. If you've managed a large website's SEO, you know that there are some unique challenges. The first is the compounding problem. On a small website, you can notice a problem and fix it. On a large website, by the time a problem shows up in analytics, it's usually been quietly affecting hundreds or even thousands of pages for months. And it's usually not alone. And every page or section added to a large website adds a whole bunch more opportunities for problems to creep in. Things like duplicate or missing title tags. Google has recently been testing its Gemini AI tool to generate title tags and meta descriptions, but really, as a business and as an SEO, you want to be giving the best possible titles and descriptions to Google for each of the pages on your site. Images without alt text. On some sites that we've seen, the number of images without alt text set can be into the hundreds of thousands. Pages accidentally being blocked from indexing via template-level errors. Internal links pointing to dead or inactive URLs.
Sometimes the URL of an entire section is changed, and therefore you end up with thousands and thousands of broken links. Or if they do get redirected, you end up with a whole bunch of redirect chains. Schema errors that cascade silently through every page of a post type. Orphaned content that crawlers can only find via the sitemap, if they ever discover it at all. And poor or confusing site structure that only gets worse as a site grows and grows and grows. And the challenge for enterprise SEOs is that whilst none of these issues is a huge catastrophe on its own, when they're added together and compounded over time on a very large site, they can really bog down the site's performance. But because none of these issues is catastrophic on its own, it can be quite difficult to spot them happening without the right tools. So, how is this traditionally managed? Well, because enterprise sites are constantly growing and growing and growing, and you might have people from different teams and territories adding new sections, and developers tweaking things over here not knowing how it impacts over there, what typically happens is that a full SEO audit is carried out every 6 to 12 months. The fixes are then prioritized and implemented gradually over the next 6 months, and by the time you get to the next audit, everything's broken again. So, whilst enterprise SEO of course has the SEO challenges, it also has systems challenges. And the fix isn't more hours in the day, the fix is processes and tooling that keep the website in a constant state of health. So, let me give an example of this global brand who came to us for help with their 10,000-page website. They came to us with declining organic performance, less traffic from search. They were so close to the situation, and the issues with the site had accumulated so gradually over time, that they were struggling to work out what was going on. So, we used exactly the process I've outlined. We carried out an audit.
We found that they had 349 duplicate page titles, 219 duplicate content issues, 195 completely orphaned pages, i.e. pages that existed with no links pointing to them, meaning that search crawlers would find it really difficult to discover them in the first place, 118 hreflang errors across their international markets, over 200 missing meta descriptions, more than 100 missing H1 tags, and over 1,500 images missing alt text. So, none of these issues were the single cause of the drop, but when you combine them, they were. Now, we helped them get the website sorted and saw an increase in organic traffic of 27%. Average ranking on Google for their target keywords improved by 2.7 positions, and we doubled their visibility in Google AI Overviews. But before you can start prioritizing and implementing these fixes, you need to know what's broken, and that is always the first step. When we're working on SEO for a large website, we'll always carry out an audit first. Whilst it might be tempting to go straight in and start fixing some of the most visible issues, the reality is that until you've worked out everything that's broken about a website, you don't know whether those fixes are the right place to start; you might actually be working on low-priority stuff because there's other stuff over here that's way more critical. So, the first step is do the audit, get visibility on everything that's wrong with the website, and you can then prioritize those actions and start working through the highest-value stuff first. So, the enterprise SEO audit typically covers seven areas. Technical crawlability: can the traditional search crawlers and the AI crawlers access and render every page on your site as it should be seen? Indexing: are the right pages indexed and the right ones excluded? On-page signals like metadata, hreflang, target keywords, canonicals, and headings. Content quality and intent alignment: is the content really good on the page, and is it actually addressing the topic that it should be?
Schema coverage. Which pages have structured data and is it formatted correctly? Website structure. So, are the pages ordered and categorized correctly? And internal link structure. How is authority flowing through the site and where is it leaking? And by the way, one of the tools that we use to do this work is actually the sponsor of today's video, Rank Math. Now, I'm here on the Elite Renewables website, which is one of the businesses that we own, and we've got the Rank Math plugin set up here. And this plugin allows you to do a lot of this initial auditing right inside the dashboard. Rank Math Pro has this SEO Analyzer, which carries out an initial audit of your website. If you're on WordPress, it's a huge time-saver. It gives you scores across dozens of tests. It shows you which tests you've passed, which ones you have warnings in, and then which ones you failed, and it gives you clear instructions on how to fix them as well. Now, it's designed for WordPress, so it understands things like WooCommerce product pages if you're an e-commerce business, and it understands custom post types and plugin interactions that some more generalized crawler tools won't understand. It interestingly also has a competitor analyzer where you can run an SEO audit of one of your competitor's sites to see how they're doing. Now, the outcome of this audit that you're going to carry out, whether you're using Rank Math or doing it manually, is going to become the basis of the plan that we're going to cover next. Once we've carried out the audit on the site, of course it's now time to prioritize the fixes. Not every SEO fix is going to have the same priority. Some are going to be absolutely spurting blood critical issues. For example, pages being blocked. Others are going to be in the sort of nice-to-have category, which we might get to if and when we have time because they have a much weaker impact on visibility and user experience. 
And there will also be some issues which have a knock-on impact. Let's say that you need to change the structure of your website. Well, it makes no sense to go and fix all the redirects if you're only going to then change the entire structure of the website anyway. So, generally a lot of the bigger tasks you'll want to prioritize first to prevent having to redo stuff later on. And what we like to do is use the impact-versus-effort framework. This prioritizes all of your tasks on two axes: impact on ranking and performance, and the effort required to make that change. In the rest of this video, we're going to use this structure to talk about the fixes that you'll be implementing. Obviously, the top-priority ones are the ones that are low effort and high impact. They're often the quick-win technical fixes, and we'll do them first. After that, there's typically a complex technical layer, which will have high impact but also might be high effort. Content is typically the third priority. It can often have a moderate to high impact and also a moderate to high effort level. Internal linking authority usually comes fourth. It can have a slightly smaller impact, and it can be quite a lot of work depending on how much of an issue your current internal link structure is. But the important thing is that each stage builds on the last, and that we're doing them in the order that they have an effect on your website. It's also really important to measure your site's progress as you're going through this whole journey. We need to baseline things from the start so you can see, as you're doing each fix, how it's impacting results. How do we do that? Typically, we're tracking keyword visibility in a tool like Semrush. We're also tracking AI search visibility, again in a tool like Semrush, Peak.AI, or Profound.
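To make the impact-versus-effort framework concrete, here's a minimal Python sketch of how a fix backlog could be scored and ordered. The task names and 1-5 scores are hypothetical illustrations, not from the video:

```python
def prioritize(tasks):
    """Sort fixes so high impact comes first, with low effort as the
    tie-breaker -- quick wins (high impact, low effort) float to the top."""
    return sorted(tasks, key=lambda t: (-t[1], t[2]))

# Hypothetical backlog: (name, impact 1-5, effort 1-5)
backlog = [
    ("Rewrite thin content", 3, 4),
    ("Unblock indexing in robots.txt", 5, 1),   # classic quick win
    ("Restructure site architecture", 5, 5),    # big task, do before redirects
    ("Fix duplicate title tags", 4, 2),
]

for name, impact, effort in prioritize(backlog):
    print(f"{name} (impact {impact}, effort {effort})")
```

Real prioritization also needs the knock-on logic described above (for example, doing an architecture change before redirect fixes), which a simple two-axis sort can't capture on its own.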
We'll also track traffic changes, and we'll do this on a page-specific level so we can see how fixing one page impacts the ranking visibility of that page and leads to an increase in traffic or not. If we're doing metadata fixes, the things that will show us if those are working are typically impressions and click-through rate. So, we'll measure the number of impressions that a page or a section is getting in search and what the click-through rate of those impressions is. If we see an increase in click-through rate, that's usually because the metadata that we're giving Google or the AI tool is more compelling, and therefore we're getting more clicks to that page. And of course, we'll look at crawl error trends as well. Usually, enterprise SEO teams will want to report the effects of their work to leadership, and what we'll do there is tie the effects of all of this work into revenue. So, not just looking at, "Hey, here's the number of SEO crawl errors that we're reporting," but, "Here's how this is impacting traffic and conversions." If you're using a tool like Rank Math Pro, then a lot of this analytics stuff is actually handled in the back end of the website. For example, in the analytics section, you can see the SEO performance of your site as a whole, you can see winning and losing posts, so you can monitor the performance of individual posts on your site and how your site has progressed over time. Now, this links into Google Search Console and Google Analytics 4, so you know the data is good. You can also track your keyword performance, and if you're on the agency plan, you get up to 50,000 keywords tracked. If you want to schedule email reports, you can do that from right inside the tool, and of course, having it all inside WordPress means that you don't need to go to different platforms. So, that's how to audit, and that's how to track.
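The impressions-versus-click-through-rate check described above is easy to script against an export. A minimal sketch, assuming you have (url, impressions, clicks) rows, for example from a Search Console CSV export; the URLs, numbers, and thresholds are hypothetical:

```python
def ctr_gaps(pages, min_impressions=1000, max_ctr=0.01):
    """Flag pages with high impressions but very low click-through rate,
    a common sign of weak or missing metadata.

    pages: iterable of (url, impressions, clicks) rows.
    """
    flagged = []
    for url, impressions, clicks in pages:
        ctr = clicks / impressions if impressions else 0.0
        if impressions >= min_impressions and ctr < max_ctr:
            flagged.append((url, round(ctr, 4)))
    return flagged

# Hypothetical export rows: /pricing gets seen a lot but rarely clicked.
sample = [
    ("/pricing", 12000, 40),
    ("/blog/guide", 800, 60),
    ("/features", 5000, 300),
]
print(ctr_gaps(sample))  # → [('/pricing', 0.0033)]
```

Pages flagged this way are the candidates for the metadata rewrites the video recommends; the thresholds would be tuned to your site's typical CTR.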
Now, it's time to dig into some of the individual issues, and let's start with the highest-leverage ones. When you're doing enterprise SEO, one of the things that you're actually hoping to find is one of these quick-win technical fixes. The things where just changing some settings can unlock massive growth in visibility. And sometimes enterprise sites will have some real facepalm technical issues. For example, just heading over to the robots.txt file and checking for any disallow rules that shouldn't be there. We've seen all sorts: staging-site de-indexing rules that were added years ago and should have been removed years ago, but somehow are still there and inadvertently causing the de-indexing of huge sections of the website. Sometimes there will be plugins or other functionality which has helpfully added blocking rules for AI crawlers. Things like GPTBot, ClaudeBot, PerplexityBot, or Google-Extended. All of these are bots used by AI tools to crawl your website and take that information back to the user. But sometimes these are being blocked because people think, "Oh, we don't really want AI tools crawling our website and stealing our content," not realizing that if you're blocking these bots, you're going to be impacting your visibility in the AI search results. Sometimes in the meta robots tag, you'll see a noindex applied at the wrong level, which can accidentally block an entire category of pages. This can be common after a CMS migration or a plugin update, where settings get carried over incorrectly. 404s and redirect chains can also be a silent killer that sits in this category. Now, 404s are just a natural part of running a large site. You're going to have page URLs that change, and 404s are going to happen. But the question is whether you're catching them as they happen, or waiting months to pick them up in an audit. And of course, the solution to a 404 is a redirect, where you redirect an old URL to its new version.
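The robots.txt check for accidentally blocked AI crawlers can be done with Python's standard library. A sketch using `urllib.robotparser` against a hypothetical robots.txt (the rules and example.com URL are illustrations of the kind of leftover blocking the video describes):

```python
from urllib import robotparser

# Hypothetical robots.txt: an AI crawler block someone added, plus a
# staging rule -- exactly the kind of leftover the audit should surface.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /staging/
"""

AI_BOTS = ["GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended"]

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())
parser.modified()  # mark the rules as loaded so can_fetch() evaluates them

for bot in AI_BOTS:
    allowed = parser.can_fetch(bot, "https://example.com/pricing")
    print(f"{bot}: {'allowed' if allowed else 'BLOCKED'}")
```

Run against your live robots.txt (via `set_url()` and `read()`), a loop like this quickly shows whether GPTBot, ClaudeBot, PerplexityBot, or Google-Extended are being turned away from pages you want recommended.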
But sometimes what you end up with is a bit of a redirect chain, where URL A is now pointed at URL B, which is then being pointed to URL C. If the site has been through a migration or multiple changes, you end up with this chain of redirects, each extra hop losing a bit of PageRank and slowing down crawl throughput. If you have a super long chain of more than 10 hops, Googlebot will actually lose the will to live and won't get to the end of it. Sad times. Now again, in Rank Math SEO, you can fix a lot of this actually in the platform. You can edit your robots.txt without needing any developer support. You can handle your redirections and set up rules for your redirections as well. You've also got a separate section of the tool that allows you to manage and see your current redirections. You can also see your 404s as they're happening, so you can see if you've got particular URLs that are triggering lots of 404s, and you can quickly add redirections to fix those. You can also set up auto-redirects on URL changes. So, if you change the URL of a page, it will automatically create a redirection from the old URL to the new URL. The next level of priority fixes are usually on-page SEO fixes. These are things like title tags and H1s. They're typically high volume, but we'd still call them quick wins because they can have a relatively strong impact on performance. Amongst the most common issues on large sites are duplicate titles and missing meta description tags. Typically, there wasn't a metadata system in place, or metadata wasn't being consistently implemented. So, as the site grew, you end up with a bunch of pages that never had meta descriptions, or where page titles were just duplicated from a template. Missing H1 tags are also a common issue: either H1 tags missing entirely, or a page type set up so that all of the headings on a page are H1s. Now, really you only want one H1 per page, and it should be unambiguous about what that page is about.
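The A → B → C redirect chains described above can be detected and flattened offline if you export your redirect rules as a simple old-URL-to-new-URL map. A hedged sketch (the URL paths are hypothetical, and a real export would come from your redirection plugin or server config):

```python
def resolve_chain(url, redirects, max_hops=10):
    """Follow a {old_url: new_url} redirect map and return
    (final_url, hop_count). Raises ValueError on loops or on chains
    longer than max_hops -- roughly the point at which crawlers give up."""
    chain = [url]
    while chain[-1] in redirects:
        nxt = redirects[chain[-1]]
        if nxt in chain:
            raise ValueError(f"redirect loop: {' -> '.join(chain + [nxt])}")
        chain.append(nxt)
        if len(chain) - 1 > max_hops:
            raise ValueError(f"chain longer than {max_hops} hops")
    return chain[-1], len(chain) - 1

# Two successive migrations left /a -> /b -> /c; flatten every rule so
# each old URL points straight at its final destination.
redirects = {"/a": "/b", "/b": "/c"}
final, hops = resolve_chain("/a", redirects)
print(final, hops)  # → /c 2
flattened = {old: resolve_chain(old, redirects)[0] for old in redirects}
print(flattened)  # → {'/a': '/c', '/b': '/c'}
```

Rewriting the rules with the flattened map means every legacy URL redirects in a single hop, which is the fix the video is pointing at.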
What we tend to see is that on large sites with many contributors, this is something that easily gets broken. For example, the marketers in one territory like the look of the H1, so they just turn everything into an H1. Sometimes people end up writing an entire page in H1s, and then just changing the font and sizing. Got to love them. Images without alt tags are also an area that commonly falls into this category. Now, it's a good idea to have alt tags on your images. This can impact your image visibility in search. It can mean that, if you've got alt tags set correctly, AI crawlers can understand better what's on your page, and of course, there's accessibility compliance. On the global brand that we're doing this 10,000-page audit and fixes for, there were over 1,500 images that didn't have alt tags. And that's not unusual. It's kind of the default state amongst large sites that haven't automated this. One of the telltale signs of these types of on-page issues is click-through rate gaps. So, if you go into Search Console or Rank Math SEO, and you see that a certain page has really high impressions but very low click-through rate, it can sometimes be because the metadata just isn't compelling enough to be attracting the click. And when you look into it, you'll often find that the metadata either isn't there, or isn't really relevant to the content of the page. Rank Math has some options here that can save you a ton of time, particularly with the image alt text generation. It can actually do this automatically based on the file name and the page content, and it can apply these across the whole website. That doesn't mean they're all going to be absolutely perfect, but it does mean that you've got a first pass to edit from, rather than having to start from a blank page. You can also do the on-page analysis from inside the plugin. So, looking at any particular page and grading it against its target keyword.
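The filename half of that alt-text-generation idea can be sketched in a few lines. This is not Rank Math's implementation, just an illustration of a first-pass generator; the upload paths and CMS suffix patterns (like WordPress-style `-300x200` size suffixes) are assumptions:

```python
import re

def alt_from_filename(path):
    """First-pass alt text from an image file name -- a starting point
    for a human to edit, not a finished description."""
    name = path.rsplit("/", 1)[-1].rsplit(".", 1)[0]
    # Strip common CMS suffixes like "-300x200" or "-scaled"
    name = re.sub(r"-\d+x\d+$|-scaled$", "", name)
    words = re.split(r"[-_]+", name)
    return " ".join(w for w in words if w).capitalize()

print(alt_from_filename("/uploads/air-source-heat-pump-install-300x200.jpg"))
# → Air source heat pump install
```

As the video notes, output like this isn't perfect (a filename like `IMG_4021.jpg` yields nothing useful), which is why it's a first pass to edit rather than a final answer; combining it with page content, as Rank Math is described as doing, gives better results.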
So, is that target keyword found in the H1, in the page title, in the meta description, and in the content itself? This allows contributors to the page to fix the issues before they come up in an SEO audit later on. And if you're working on a site with lots of missing meta, you can set global meta templates, i.e. what the default meta for each page should be, based on its page title and URL, so new pages have correctly structured metadata as soon as they are created. Don't forget, if you're watching this and you're thinking, "I really need some help with this from a company that really understands this stuff deeply and has done it lots of times for lots of different businesses in lots of different geographies," that is what Exposure Ninja does. We can help you implement as much of this as you need, and take things to another level to improve your AI search visibility, your traditional search visibility, and most importantly, the number and quality of leads and sales that your website is generating. To request a free digital marketing review from the team, head over to exposureninja.com/review. That's exposureninja.com/review. Please note that not everybody is eligible for this free review. They do take us absolutely ages, so we do have to set some criteria. So, you do need to apply for this. Just head over to exposureninja.com/review. Now, those are the quick wins, and you may or may not find some of them on your site, but they're typically the things that are easiest to implement and have the highest impact. But there's a second layer of technical fixes that require a bit more planning and a bit more resource. And these are often the ones that tend to have the highest compounding effect over time. And that is the complex technical layer, which includes things like schema, architecture, and site performance. This is the structural layer that determines your website's long-term visibility ceiling. So, it's really important, but it takes a little bit more work.
Let's talk about schema markup. This is now a prerequisite for good AI search visibility, as well as rich results in traditional search. Structured data, or schema, really tells a crawler what a page is about and what a page includes. So, things like product information, pricing information, review information, FAQs, articles, business location. And it does this in a standardized way. This means that whatever crawler, AI tool, or search engine is reading the page doesn't need to infer that understanding from the HTML content. It is expecting to see content in a certain format, and it finds it in exactly that format. And this allows things like rich results in search engines. So, when you see stars, or product images, or product pricing in the Google results, that's usually coming from the schema on that page. AI platforms like ChatGPT and Google AI Mode also use schema. Our observation is that pages that include schema are more likely to show up in these AI search tools' results. The problem, of course, for large sites and enterprise SEOs, is that you can't really go through a 10,000- or 100,000-page website and add schema to each page individually. You need a more systematic and global solution. Usually, this means defining the schema configuration once per post or category type. I.e., for every page that looks like this product page, here is the default schema configuration. Of course, this can also lead to its own challenges, because if that template has an error, that error can then be propagated through hundreds, thousands, tens of thousands, hundreds of thousands of pages of that type. Now, the priority schema types for most large websites are Article and BlogPosting. These are for the content pages on your website. FAQ: this maps directly to how AI tools are looking at FAQ-type content. Product and Offer: these are really essential for e-commerce pages and are actually used by Google Shopping.
How-to: this matches the sort of step-by-step structure that a lot of AI platforms use in their results for informational or instructional queries. And Organization and Website: these establish your brand as a clearly defined entity in both search engines and AI tools. Okay, so how does this work in practice? Well, again, just using Rank Math SEO as an example, you can create schema templates, which allow you to create a structure once and then apply it conditionally across post types, categories, or custom fields. This means every new page of that type gets that schema template applied automatically. You can also create conditional rules. Let's say, for example, that you want to add FAQ schema on some of your post types, but only if they are categorized as guides. Or maybe you want to add some video schema, but only on pages or posts that have an embed in them. By spending a bit of time creating these rules, you can implement schema across dozens, hundreds, or even thousands of pages with relatively little effort and a really good degree of accuracy. You can also validate your schema live. You can test it against Google's rich results to see how your schema will actually show up in the wild. This can catch errors in your schema templates before they cascade through all of the pages on your site, meaning you don't have to discover them later in Google Search Console after everything's gone live and been running for a bit and it's completely broken. There's also a WooCommerce module that handles all of the correct schema for products as well. Price, SKU, availability, brand: all of this can be populated from the WooCommerce product data automatically. This means no more manual schema entry per product, which can be a massive time-saver. And because it's using global product identifiers, that makes all of your products eligible for Google Shopping automatically.
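The template-plus-conditional-rules approach described above can be illustrated with a small JSON-LD generator. This is a sketch, not Rank Math's code: the post field names (`title`, `published`, `faqs`, etc.) are hypothetical, while `@context`, `Article`, and `FAQPage` are real schema.org vocabulary:

```python
import json

def article_schema(post):
    """Render Article JSON-LD from post fields -- defined once, then
    applied to every post of the type, like a schema template."""
    return {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": post["title"],
        "datePublished": post["published"],
        "author": {"@type": "Organization", "name": post["author"]},
        "mainEntityOfPage": post["url"],
    }

def schema_for(post):
    """Apply the base template, plus a conditional rule: add FAQ schema
    only when the post actually has FAQs."""
    blocks = [article_schema(post)]
    if post.get("faqs"):
        blocks.append({
            "@context": "https://schema.org",
            "@type": "FAQPage",
            "mainEntity": [
                {"@type": "Question", "name": q,
                 "acceptedAnswer": {"@type": "Answer", "text": a}}
                for q, a in post["faqs"]
            ],
        })
    return blocks

# Hypothetical post record from the CMS
post = {
    "title": "How Heat Pumps Work",
    "published": "2026-05-04",
    "author": "Elite Renewables",
    "url": "https://example.com/heat-pumps",
    "faqs": [("Do heat pumps work in winter?", "Yes, down to low temperatures.")],
}
print(json.dumps(schema_for(post), indent=2))
```

The payoff of generating from one template is also the risk the video flags: a bug in `article_schema` would propagate to every page of the type, which is why validating the template output against Google's rich results test before rollout matters.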
Okay, so let's talk about one of the least sexy but most important issues in enterprise SEO: site architecture and internal authority. This is the structural layer that underpins everything on your website. How your pages link to each other is how PageRank flows through your website. But the problem is that on most large websites, that flow is accidental rather than intentional and designed. The consequence is that sometimes your most important and strategic pages receive a fraction of the PageRank that they actually should, while some other thin or byproduct pages end up inadvertently getting a whole bunch of attention. We'll commonly see authority spread across dozens or even hundreds of pages of a similar structure with no clear canonicalization. That's a waste, because that authority could be directed at some key pages and give them a lot more visibility in search, rather than being spread really thin across pages that are essentially very similar. Large e-commerce sites, particularly ones with products that have lots of variations, are particularly vulnerable to this. So, what does a well-structured architecture really look like? Well, it's got three properties. Firstly, pillar pages. These are the most comprehensive and authoritative pages on a topic. They should be the ones that have the most internal links on a site, i.e. other pages linking to them. Secondly, you'll have supporting content. These pages link to the pillar pages and reinforce topical authority signals to the search engines. The third property is that when new content is added to the website, it should fit within this structure, and that should happen from the moment it's published. It shouldn't just be published and then, oh, we'll figure out how to structure this later on.
So, your website's structure should be designed to take in new content as it's published, rather than being treated as the finished state of the website where anything published later has to find a home. Another really important factor in enterprise SEO, or SEO for larger sites, is page speed and performance. Now, most SEOs know that Core Web Vitals, which is Google's particular way of measuring page speed, has an impact on ranking. But actually, page performance is more important and runs deeper than this. Google allocates a finite amount of crawl budget to each website. And if your pages load slower, your clock is ticking and you might have fewer pages crawled and indexed as a result. This is particularly an issue for larger sites that have maybe thousands of URLs that need indexing. Even worse, these AI tools have even less patient crawlers. They will give your site less time. They won't wait for JavaScript to load. They won't retry slow pages. They'll move on and find your competitors' pages instead. They've got enough problems with their data center costs, so they're not hanging around waiting for your website to load. And of course, slow pages impact user experience and have a measurable impact on conversion rate amongst other metrics, which has been very well documented. Okay, so let's talk about the content layer. It might be a bit more work, and might involve other teams, but it can be incredibly important to your visibility if you have a large website. The problem is that high-quantity content without sufficient quality doesn't compound, it dilutes. Or rather, it compounds the dilution. Now, content at scale does become a bit more straightforward if you've solved the architecture piece first and you're using that topic cluster approach with the pillar content and the supporting content. That makes creating content at scale a bit more straightforward because you're not having to worry about duplication of information.
Every page has a clear role to play. It also helps if you've got really clear intent for each page, so you know what the search intent is going to be for that page, because you can then write or create content for that intent. And of course, if you've got your keyword and query research done, you know which keywords for search, or which queries, topics, or prompts for AI search, you're targeting with that content, which helps to give you a bit more structure. Another consideration that the team at Exposure Ninja makes whenever they're creating content is: is this content worthy of external links? There'll be certain times when we'll create pages on a website whose main goal is actually to attract links from other websites. For example, pages about statistics and data. We'll create content on our websites and our clients' sites full of data and information, knowing that writers and journalists on other publications will want to cite good-quality sources of data when they're writing their own content. This can be great if you've got proprietary data or some unique first-hand experience inside your business that you can share through your content. This is also really useful for AI, because AI tools will often give a perspective or an opinion on something to their user, and they'll then search for an online source to back up that perspective. That might mean an expert who's got a hot take on something that matches the perspective the AI tool has just shared. Or it might be data that backs up a particular position that the AI tool has given. The AI tools want to cite external sources because they don't necessarily have their own data or their own understanding, and showing that a human expert agrees with their perspective makes their answers stronger.
So, it can be a great way of getting branded mentions. It also pays to create pages targeting every aspect of a certain topic, because this is how those AI tools work: if you ask Perplexity a question, it will run a whole bunch of searches in the background for different subtopics and then compile the results of all of those into an answer. If you match that structure with your content, you can cover the topic from every angle and essentially increase the number of tickets you have in the raffle to get mentioned in that response. The Rank Math SEO plugin actually has a topic research tool built in, which can help you come up with topics for your content and research them in this query fan-out style. For example, here I've used the topic research section of the plugin to come up with some heat pump trends, which could form the basis of a much more detailed article. If I want to, I can then copy those and have the tool create the blog post via the blog post wizard. It can then create optimized titles and descriptions for me and give me feedback and scoring as I write, on things like keyword usage, the strength and quality of the headings, and internal linking coverage. If you're a really small team trying to cover content across a very large website, something like this can give you a lot more leverage for your time. Here I'm using the blog post wizard to come up with some ideas and then write those blog posts. So, we've got some blog post ideas here. I'm not sure about that one, so I'm going to click regenerate, and it comes up with another idea. Let's try another one. Once I've got something I like, I click next step, give the tool what is essentially a brief for my blog post, and click generate. It then generates the entire post for me. Now, I may not want to use this exactly as is.
I may want to tweak it, improve it, and add some of my specialist knowledge to the post, but it gets me to a really good starting point rather than leaving me with the dreaded blank page. Now, finally, let's talk about internal linking. This is one of the most underutilized levers on large websites. Internal links are how PageRank, Google's measure of authority, flows through a website, and as we said before, on a larger website that flow is almost entirely accidental. We also talked earlier about how your site architecture should have this structure of pillar content and supporting content, and all of that is communicated and implemented via internal links. If you've got that page structure but you don't have the internal links to support it, all you end up with is a bunch of isolated pages covering the same topics and competing with each other, rather than supporting and amplifying each other. Now, on a small site with a few pages, you kind of know all of the internal links. Right? If you've got 10 pages, you can probably keep all of the internal links between each page in your head. But as the site grows larger and larger, you need an internal linking strategy instead. And if you've got multiple teams and multiple contributors, sometimes across multiple regions and territories, you need to be able to communicate that internal link strategy very clearly. Traditionally, what would happen is that somebody does an audit, identifies a bunch of fixes that need to be implemented, and then goes and does those fixes. That's okay, but you don't want to rely on that for all of the internal linking across your site. You really need something that can be updated continuously, ideally as you're building and adding to the site, rather than having to go back periodically and fix all the new stuff.
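The claim that internal links are how PageRank flows through a site can be made concrete with a toy power-iteration sketch. This is the textbook simplified PageRank, not Google's actual algorithm, and the page URLs here are hypothetical:

```python
def pagerank(links: dict, damping: float = 0.85, iters: int = 50) -> dict:
    """Toy PageRank by power iteration over an internal-link graph.
    links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across the site.
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
        rank = new
    return rank

# Hypothetical topic cluster: two supporting guides link up to the pillar.
site = {
    "/pillar/heat-pumps": ["/guide/installation", "/guide/costs"],
    "/guide/installation": ["/pillar/heat-pumps"],
    "/guide/costs": ["/pillar/heat-pumps"],
    "/blog/orphaned-post": [],  # no inbound or outbound internal links
}
ranks = pagerank(site)
```

Run on this cluster, the pillar page accumulates the highest rank while the orphaned post stays near the baseline, which is exactly the "isolated pages" failure mode described above.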
Now, there's no single solution we can recommend here, because it will depend on your business, but how we'd typically handle this is to create an internal link policy, which is then distributed to your teams. Let me show you how Rank Math handles this, because you'll see some of the components that could ideally be included in that link policy. Rank Math has a new feature called AI Link Genius, which allows you to manage your internal links at scale. You get a dashboard showing all of your links in one place, and it works for both internal links, i.e. links between different pages on your website, and external links, where you're linking out to someone else's website. Here we've got the link section. We can filter by internal links, the links between different pages on our website, or external links, the links from our website to other websites. We can see the ones that are working successfully, and the ones that are broken (thankfully we don't have any of those). We can see their anchor text and the destination page. So, it's a really nice way of analyzing all of the internal links on your site. If you ever need to bulk update lots of links, say for example the decision has been made to move the knowledge base content on your website from /knowledge-base to /guides, you'd want every link that pointed to the knowledge base section updated to point to the guides section. This is where you'd do that, without having to find each individual knowledge base link on your site and update it manually. So, it's a huge time saver to have something like this.
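A bulk update like that knowledge-base-to-guides move boils down to a prefix rewrite across stored content. Rank Math does this from its dashboard; this small Python sketch just illustrates the underlying operation, with hypothetical paths and markup:

```python
import re

def migrate_links(html: str, old_prefix: str, new_prefix: str) -> str:
    """Rewrite internal hrefs that start with old_prefix to use new_prefix.
    Illustrative only: a real migration would also handle absolute URLs,
    canonical tags, sitemaps, and server-side 301 redirects."""
    pattern = re.compile(r'href="' + re.escape(old_prefix))
    return pattern.sub('href="' + new_prefix, html)

post = '<a href="/knowledge-base/ducted-systems">Read the guide</a>'
print(migrate_links(post, "/knowledge-base/", "/guides/"))
# <a href="/guides/ducted-systems">Read the guide</a>
```

The point is that one rule applied across the whole content store replaces thousands of manual edits.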
There are also some automation options. When you're writing content, it can suggest links to other pages, and if you set up global rules using AI, it can suggest and insert links to the related product or service page when you mention certain topics. It can also automatically track redirects: when you change a URL, it goes and updates the links to that URL. It's a really smart way of handling a lot of this internal link work automatically, which, let's be fair, is pretty low-value work for an SEO team to be doing by hand, so it's nice to automate as much of it as possible. And of course, you can use this alongside Content AI: if you're generating your posts with Content AI, you can then use Link Genius to add the internal links, which removes another task from your to-do list. You're still going to want to edit this content, but it's much better to work from a draft with those internal links already in place than to start from a blank page. So, there you have it: the enterprise SEO playbook that works for traditional search and AI search, and hopefully a useful prioritization framework you can use when working out what to do first. As we've seen in this video, Rank Math Pro is a great plugin for doing this at scale inside WordPress, with tools like Content AI and Link Genius that automate a lot of the work. For me, the Link Genius piece is the standout, because managing links at scale becomes incredibly tedious and very difficult to do, and this puts most of it on autopilot, which is a big tick in my box.
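That "track redirects and update links" automation reduces to a simple idea: resolve each internal link through the site's redirect map and point it at the final destination instead of hopping through redirects. Here is a minimal sketch with a hypothetical redirect map; the video doesn't describe Rank Math's internals, so this is just the general technique:

```python
def resolve(url: str, redirects: dict, max_hops: int = 10) -> str:
    """Follow a redirect map to its final destination, so internal links
    can point straight at the live URL instead of through redirect chains.
    Guards against loops and runaway chains via seen-set and max_hops."""
    seen = set()
    for _ in range(max_hops):
        if url not in redirects or url in seen:
            break
        seen.add(url)
        url = redirects[url]
    return url

# Hypothetical redirect map accumulated over two site restructures.
redirects = {
    "/knowledge-base/heat-pumps": "/guides/heat-pumps",
    "/guides/heat-pumps": "/learn/heat-pumps",
}
print(resolve("/knowledge-base/heat-pumps", redirects))  # /learn/heat-pumps
```

Rewriting every stored link through a resolver like this is what keeps a large site's internal links pointing at live pages even after multiple migrations.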
Now, of course, the SEO layer we've covered here is just one piece of the puzzle, and if you want to improve your visibility in AI search by building an AI search strategy on top of this, then check out this video, which will show you exactly how to do that.
