
How To Avoid Ruining SEO During A Website Redesign

Learn strategies to protect your website's SEO assets during a redesign to prevent detrimental impacts on search engine rankings.


If you have devoted resources to SEO and are contemplating a website hosting upgrade or migration, it’s important to engage an SEO expert who has experience in website SEO migration before you start the process.

We recently worked on a fashion website that lost roughly $95,000 in revenue after a sharp decline in organic visibility following a website redesign. Losses like this can be avoided with strategic SEO consultation and proper planning.

In this article, we explore a reliable strategy for protecting SEO assets during a website redesign: the most common causes of SEO loss, how to select and prioritize URLs, and which tools streamline the process.

How To Avoid Ruining SEO During A Website Redesign

Below are the factors most likely to cause a decline in your website’s SEO performance after a redesign:

  • Domain change
  • Adding new URLs without proper 301 redirects
  • Changes in page content
  • Changes in on-site keyword targeting
  • Unintentional blocking of search engine crawlers
  • Changes in website performance metrics, such as Core Web Vitals and page speed

These elements are important because they directly impact the indexability and keyword relevance of your site. Conducting a comprehensive audit of internal links, backlinks, and keyword rankings before the redesign is equally essential, because those are the factors that determine which pages carry the most ranking value.

The Effect of Domains and URLs on Your Rankings

During a website redesign, it’s typical for URLs to change. The critical aspect is ensuring the implementation of proper 301 redirects. A 301 redirect signals to Google that a page has permanently moved to a new URL, so the old address’s ranking signals should follow it.


Every URL that is deleted and results in a 404 error poses a risk of losing organic rankings and valuable traffic. Google tends to avoid ranking pages that lead to dead ends. Landing on a 404 page after clicking a Google result is a bad user experience.

Where possible, retaining the original URL structure is advisable: it minimizes the need for 301 redirects and reduces the number of pages dropping from Google’s index.

If URL changes are unavoidable, tools like Screaming Frog can crawl and organize all website URLs, making it easier to map old URLs to their new counterparts. Many SEO tools and Content Management System (CMS) platforms allow you to import a CSV file containing the redirect list, sparing you from adding each redirect manually.
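As a rough illustration, here is a minimal Python sketch of that mapping step. It assumes two Screaming Frog exports (old_urls.csv from the live site, new_urls.csv from staging) using Screaming Frog’s default “Address” column, and it pairs pages by URL slug; real migrations usually need a manual review pass on top of this.

```python
import csv

def load_addresses(path):
    """Read the "Address" column from a Screaming Frog export."""
    with open(path, newline="", encoding="utf-8") as f:
        return [row["Address"] for row in csv.DictReader(f)]

def slug(url):
    """Last path segment of a URL, used as a crude matching key."""
    return url.rstrip("/").rsplit("/", 1)[-1].lower()

old_urls = load_addresses("old_urls.csv")
new_by_slug = {slug(u): u for u in load_addresses("new_urls.csv")}

# Write an old -> new mapping ready for CMS or SEO-tool import.
# A blank new_url means no automatic match was found: map it by hand.
with open("redirects.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["old_url", "new_url"])
    for old in old_urls:
        writer.writerow([old, new_by_slug.get(slug(old), "")])
```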

While this process of retaining SEO assets can require considerable effort, it’s the most reliable method to ensure Google connects the old and new URLs. Note that intentionally letting low-value pages 404 can prompt Google to remove them from its index, making a website redesign an opportune moment for a thorough cleanup.

Even though Google’s Change of Address Tool in Google Search Console offers a way to communicate domain changes, improperly set up redirects can still jeopardize performance.
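Once the redirects are live, a quick script can confirm they behave as intended. This sketch assumes the redirects.csv produced by the mapping step above plus the requests library, and flags any old URL that doesn’t answer with a 301 or doesn’t land on its mapped new URL.

```python
import csv

import requests

# Check each mapped redirect: the old URL should answer with a 301,
# and following the chain should end at the new URL with a 200.
with open("redirects.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        old, new = row["old_url"], row["new_url"]
        if not new:
            continue  # unmapped rows still need manual attention
        first = requests.get(old, allow_redirects=False, timeout=10)
        final = requests.get(old, allow_redirects=True, timeout=10)
        ok = (first.status_code == 301
              and final.status_code == 200
              and final.url.rstrip("/") == new.rstrip("/"))
        if not ok:
            print(f"CHECK {old}: got {first.status_code}, "
                  f"landed on {final.url} ({final.status_code})")
```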

Content on Webpages and Keyword Targeting

Google’s index primarily consists of content gathered from crawled websites, which is then subjected to ranking algorithms to produce organic search results. The ranking of a page heavily relies on the relevance of its content to specific keyword phrases.

Website redesigns often involve restructuring and rewriting content, which can lead to changes in relevance and subsequent alterations in rank positions. For instance, a page initially optimized for “best pepperoni pizza” might become more pertinent to “best pizza restaurant,” resulting in a decline in its ranking for the original keyword.


Occasionally, content modifications are unavoidable and may be needed to improve a website’s overall effectiveness. However, it’s essential to know that the more significant the changes to your content, the greater the potential for volatility in your keyword rankings. Expect to lose some rankings and gain others as Google reassesses your website’s new content comprehensively.

As website content changes, metadata frequently changes inadvertently. Components such as title tags, meta descriptions, and alt text play an important role in aiding Google’s comprehension of your page’s content.

When word choices change in the headers, body text, or metadata of the revamped site, on-page SEO elements can be inadvertently removed. The resulting shifts in keyword relevance lead to fluctuations in rankings.

Performance and Core Web Vitals

Several factors contribute to website performance, ranging from your chosen CMS or website builder to design elements such as image carousels and video embeds.

Modern website builders provide extensive flexibility and features, helping the average marketer create the website they want. However, feature-heavy pages often perform poorly. Balancing your specific requirements against Google’s performance metric standards while selecting the appropriate platform can pose a significant challenge.
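One way to keep this measurable is to test the old and new versions of a page against Google’s public PageSpeed Insights API (the v5 runPagespeed endpoint, which allows light keyless usage). The response field names below are assumptions to verify against the raw JSON.

```python
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def check(url):
    """Print the Lighthouse performance score and any Core Web Vitals
    field data (real-user metrics) PageSpeed Insights returns."""
    data = requests.get(API, params={"url": url, "strategy": "mobile"},
                        timeout=60).json()
    score = data["lighthouseResult"]["categories"]["performance"]["score"]
    print(f"{url}: Lighthouse performance {score:.0%}")
    metrics = data.get("loadingExperience", {}).get("metrics", {})
    for name, metric in metrics.items():
        print(f"  {name}: {metric['percentile']} ({metric['category']})")

# Run against comparable pages on the old site and the staging site.
check("https://www.example.com/")
```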

Unintentionally Blocking Google Crawlers

A best practice among contemporary web designers involves setting up a staging environment, allowing them to design, build, and test your new website in a production-like setting without affecting the live site.

To prevent Googlebot from crawling and indexing the staging environment, you can block crawlers with a disallow rule in the robots.txt file. Alternatively, a noindex meta tag directs Googlebot not to index the content on the page.

Despite the apparent simplicity of this step, websites are frequently launched without these directives removed. Webmasters are then left perplexed as to why their site promptly vanishes from Google’s search results.


Removing these directives is a crucial step before launching your new site. If they remain in place, Google can stop crawling your pages or drop them from organic search results entirely.
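A small pre-launch script can automate this check. The sketch below, with placeholder URLs, verifies that robots.txt no longer blocks Googlebot and that key pages carry no noindex directive; the noindex check is deliberately crude and only scans the response headers and the page head.

```python
import requests
from urllib import robotparser

SITE = "https://www.example.com"          # placeholder: your new site
PAGES = [SITE + "/", SITE + "/about/"]    # placeholder: pages to spot-check

# Parse the live robots.txt and ask whether Googlebot may fetch each page.
rp = robotparser.RobotFileParser(SITE + "/robots.txt")
rp.read()

for page in PAGES:
    if not rp.can_fetch("Googlebot", page):
        print(f"BLOCKED by robots.txt: {page}")
    resp = requests.get(page, timeout=10)
    head = resp.text.lower().split("</head>")[0]  # crude: head section only
    if ("noindex" in resp.headers.get("X-Robots-Tag", "").lower()
            or "noindex" in head):
        print(f"NOINDEX found: {page}")
```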

Tools for Website SEO Asset Migration

Here are factors that determine which pages of your website qualify as SEO assets: links, traffic, and top keyword rankings. Pages that receive backlinks, consistent organic traffic, or rank well for multiple keywords should be replicated on the new website as closely as possible. In some cases, there will be pages that meet all these criteria.

Consider these pages as invaluable assets. Often, you’ll need to weigh how much traffic loss you’re willing to accept by removing certain pages. If those pages have never contributed significant traffic to the site, the decision becomes much simpler.

Below is a list of tools you can use to efficiently audit large numbers of pages.

Internal Links and External Links

  • Semrush
  • Google Search Console
  • Screaming Frog

Website Traffic

  • Google Analytics 4
  • Google Search Console

Keyword Rankings

  • Semrush
  • Google Search Console

Information Architecture

  • Octopus.do (low-fidelity wireframing and sitemap planning)

How to Find Valuable SEO Assets on Your Website

Any webpage currently attracting backlinks, generating organic traffic, or ranking prominently for multiple keywords qualifies as an SEO asset, particularly pages that meet all three criteria. These pages represent concentrated SEO value and should be migrated to the new website carefully.

How To Locate and Organize Pages with Backlinks

Start by retrieving a comprehensive list of URLs and their corresponding backlink counts from your preferred SEO tool. For instance, in Semrush, use the Backlink Analytics tool to export a list of your top backlinked pages. Since any single SEO tool has an incomplete dataset, it’s advisable to gather the same data from a second source.

You can extract the same data type from Google Search Console, as it provides a broader dataset for analysis. Next, cross-check your data, identifying any additional pages overlooked by the tool and eliminating any duplicates.

Additionally, you can aggregate the number of links between the two datasets to determine which pages have the highest overall backlink count. This will aid in prioritizing URLs with the most backlinks across your site.
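As an illustration, here is a minimal pandas sketch of that cross-check. The file and column names are assumptions; rename them to match your actual Semrush and Search Console exports. Summing counts across two tools double-counts overlapping links, but the combined total still works as a prioritization signal.

```python
import pandas as pd

# Assumed layout for both exports: a "url" column and a "backlinks" count.
semrush = pd.read_csv("semrush_backlinks.csv")
gsc = pd.read_csv("gsc_top_linked_pages.csv")

# Stack the two datasets, merge duplicates, and rank by total links.
combined = (
    pd.concat([semrush[["url", "backlinks"]], gsc[["url", "backlinks"]]])
    .groupby("url", as_index=False)["backlinks"].sum()
    .sort_values("backlinks", ascending=False)
)
combined.to_csv("backlink_priority.csv", index=False)
print(combined.head(20))
```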

Internal Link Value

Once you have identified the pages receiving the most external links, it’s important to also assess which pages on your website have the highest concentration of internal links from other pages within your site.

Pages with a greater number of internal links accumulate more link equity, improving their ranking potential. You can obtain this information from a Screaming Frog crawl in the URL Details or Inlinks report. Consider which internal links you intend to retain: internal links are Google’s primary method of navigating your website and passing link equity from one page to another. Removing internal links changes your site’s crawlability and can impact its overall indexability.
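If you prefer raw data, the same counts can be derived from Screaming Frog’s “All Inlinks” bulk export. The column names below (“Type”, “Destination”) match the default export at the time of writing; verify them against your file.

```python
import pandas as pd

links = pd.read_csv("all_inlinks.csv")

# Keep only true hyperlinks (the export also lists images, canonicals,
# etc.) and count how many internal links point at each destination page.
inlink_counts = (
    links[links["Type"] == "Hyperlink"]
    .groupby("Destination").size()
    .sort_values(ascending=False)
    .rename("inlinks")
)
print(inlink_counts.head(20))  # your most internally linked pages
```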

Compile a List of The Top Organic Traffic Pages

It’s important to acknowledge that webpages attract visitors from various channels. Just because a page doesn’t drive significant organic traffic doesn’t mean it lacks value for referral, social, or email visitors.

The Landing Pages report in Google Analytics 4 offers insights into how many sessions originate from a specific page. Access it by navigating to Reports > Engagement > Landing Page. These pages play an important role in attracting visitors to your website, whether through organic means or other channels.


Depending on your website’s monthly visitor count, consider extending your date range to analyze a larger dataset.
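When the UI export is too limited, the same report can be pulled programmatically through the GA4 Data API. This sketch assumes the google-analytics-data Python package, application-default credentials, and a placeholder property ID.

```python
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Metric, RunReportRequest,
)

client = BetaAnalyticsDataClient()

# Sessions per landing page over the last year, mirroring the
# Reports > Engagement > Landing page view in the GA4 UI.
request = RunReportRequest(
    property="properties/123456789",  # placeholder property ID
    dimensions=[Dimension(name="landingPage")],
    metrics=[Metric(name="sessions")],
    date_ranges=[DateRange(start_date="365daysAgo", end_date="today")],
    limit=1000,
)

for row in client.run_report(request).rows:
    print(row.dimension_values[0].value, row.metric_values[0].value)
```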

List Out Pages with Top Rankings

Semrush offers a convenient method to compile a spreadsheet of your webpages that rank for keywords within the top 20 positions on Google. Use the Organic Research tool and choose Pages. From there, you can export a list of your URLs with keyword rankings in the top 20.

By integrating this data with your top backlinks and primary traffic sources, you can compile a comprehensive list of URLs that meet one or more criteria for being deemed an SEO asset.

You can adjust thresholds for the number of backlinks, minimum monthly traffic, and keyword rank position to alter the stringency of the criteria for identifying pages as SEO assets.
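Putting it all together, a short script can merge the three exports and apply those thresholds. Every file name, column name, and threshold here is an assumption to adjust for your own site and traffic levels.

```python
import pandas as pd

backlinks = pd.read_csv("backlink_priority.csv")   # url, backlinks
traffic = pd.read_csv("ga4_landing_pages.csv")     # url, sessions
rankings = pd.read_csv("semrush_top20_pages.csv")  # url, keywords_top20

# Outer-join so pages that appear in only one dataset are kept.
pages = (backlinks.merge(traffic, on="url", how="outer")
                  .merge(rankings, on="url", how="outer")
                  .fillna(0))

# A page qualifies as an asset if it clears any one threshold;
# tighten or loosen these to change how strict the criteria are.
pages["is_asset"] = (
    (pages["backlinks"] >= 5)
    | (pages["sessions"] >= 100)
    | (pages["keywords_top20"] >= 3)
)
pages[pages["is_asset"]].to_csv("seo_assets.csv", index=False)
```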

Avoid Ruining SEO During A Website Redesign

The key to SEO success in a website redesign project lies in careful planning. Redesign your new website layout around the existing assets instead of forcing them into a new design. Even with thorough preparation, there’s no assurance of completely avoiding drops in rankings and traffic.

Avoid placing blind trust in your web designer’s assurances that everything will work out smoothly. Take charge of creating the plan yourself, or enlist the expertise of someone who can. The potential consequences of inadequate planning are too significant to overlook.

