A Reddit user asked about implementing a sitewide code change on a website that supports ten languages. Google’s John Mueller offered broad guidance on the risks of sitewide alterations and emphasized the importance of simplicity. Although the question concerned hreflang, Mueller’s answer applies universally, giving it broader significance for SEO.
How Long Does It Take To Process URL Changes?
The time it takes to process URL changes varies with factors such as the size of the website, how frequently Google crawls it, and the complexity of the changes. Generally, it can take anywhere from a few days to a few weeks for Google to process URL changes and reflect them accurately in search results. Here is the question the Reddit user posted:
“I am working on a website that contains 10 languages and 20 culture codes. Let’s say blog-abc was published on all languages. The hreflang tags in all languages are pointing to blog-abc version based on the lang. For en it may be en/blog-abc
They made an update to the one in English language and the URL was updated to blog-def. The hreflang tag on the English blog page for en will be updated to en/blog-def. This will however not be dynamically updated in the source code of other languages. They will still be pointing to en/blog-abc. To update hreflang tags in other languages we will have to republish them as well.
Because we are trying to make the pages as static as possible, it may not be an option to update hreflang tags dynamically. The options we have is either update the hreflang tags periodically (say once a month) or move the hreflang tags to sitemap.
If you think there is another option, that will also be helpful.”
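To make the problem concrete, here is a minimal sketch (in Python, as a hypothetical illustration rather than code from the site in question) of the reciprocal-update issue the Reddit user describes:

```python
# One logical blog post, with one URL per language version.
# The URLs and the render_hreflang() helper are assumptions for illustration.
translations = {
    "en": "https://example.com/en/blog-abc",
    "de": "https://example.com/de/blog-abc",
    "fr": "https://example.com/fr/blog-abc",
}

def render_hreflang(translations: dict[str, str]) -> str:
    """Build the <link rel="alternate"> tag set that every language version embeds."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(translations.items())
    )

# A URL change in one language...
translations["en"] = "https://example.com/en/blog-def"

# ...invalidates the tag set baked into every other static page: the German and
# French pages still carry en/blog-abc until they are re-rendered and republished.
print(render_hreflang(translations))
```

Moving the hreflang annotations into an XML sitemap, one of the options the user mentions, centralizes this mapping in a single file, so a URL change means regenerating one sitemap rather than republishing every static page.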
Processing sitewide changes takes time. I came across an intriguing point in a research paper that resonated with John Mueller’s comments about how long it takes Google to understand the relationships between updated pages and the rest of the web.
The research paper highlighted the need to recalculate the semantic embeddings of updated webpages and extend this process to other documents. In passing, the paper discussed the inclusion of new pages in a search index.
“Consider the realistic scenario wherein new documents are continually added to the indexed corpus. Updating the index in dual-encoder-based methods requires computing embeddings for new documents, followed by re-indexing all document embeddings.
In contrast, index construction using a DSI involves training a Transformer model. Therefore, the model must be re-trained from scratch every time the underlying corpus is updated, thus incurring prohibitively high computational costs compared to dual-encoders.”
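The re-indexing cost the paper describes can be sketched in a few lines. The embed() function below is a stand-in for a learned dual-encoder model, not Google’s actual system:

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Hypothetical stand-in for a learned document encoder."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    vec = rng.standard_normal(8)
    return vec / np.linalg.norm(vec)

# Initial index: one embedding per document.
corpus = ["en/blog-abc", "de/blog-abc", "fr/blog-abc"]
index = {doc: embed(doc) for doc in corpus}

# A changed URL means computing an embedding for the new document and
# updating the index so retrieval reflects the change everywhere.
index.pop("en/blog-abc")
index["en/blog-def"] = embed("en/blog-def")

# Retrieval scores a query against every stored embedding, which is why
# bulk URL changes across a large site take time to settle.
query = embed("blog def")
scores = {doc: float(vec @ query) for doc, vec in index.items()}
print(max(scores, key=scores.get))
```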
In 2021, John Mueller remarked that Google may require several months to evaluate both the quality and relevance of a website. He also highlighted Google’s endeavor to grasp a website’s contextual fit within the broader web ecosystem.
Here’s what he said:
“I think it’s a lot trickier when it comes to things around quality in general where assessing the overall quality and relevance of a website is not very easy.
It takes a lot of time for us to understand how a website fits in with regards to the rest of the Internet.
And that’s something that can easily take, I don’t know, a couple of months, a half a year, sometimes even longer than a half a year, for us to recognize significant changes in the site’s overall quality.
Because we essentially watch out for …how does this website fit in with the context of the overall web and that just takes a lot of time.
So that’s something where I would say, compared to technical issues, it takes a lot longer for things to be refreshed in that regard.”
The point about evaluating how a website fits within the context of the broader web is an intriguing and rarely made observation.
His remarks closely echo the research paper’s description of a search index “computing embeddings for new documents, followed by re-indexing all document embeddings.”
Below is John Mueller’s response on Reddit regarding the challenges associated with updating numerous URLs:
“In general, changing URLs across a larger site will take time to be processed (which is why I like to recommend stable URLs… someone once said that cool URLs don’t change; I don’t think they meant SEO, but also for SEO). I don’t think either of these approaches would significantly change that.”
What does Mueller mean when he says significant changes take time to process? It may align with his 2021 comments about the comprehensive reassessment of a website’s quality and relevance. That notion of relevance, in turn, parallels the research paper’s discussion of computing embeddings: creating vector representations of the words on a webpage in order to capture their semantic meaning.
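A toy example (assumed for illustration, not drawn from the article or the paper) shows why those vectors matter: relevance is judged by similarity between embeddings, so a changed page must be re-embedded before its relationship to other pages can be re-measured:

```python
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity: 1.0 means identical direction, 0.0 means unrelated."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

old_page = np.array([0.9, 0.1, 0.0])    # pretend embedding of en/blog-abc
new_page = np.array([0.2, 0.8, 0.1])    # pretend embedding of en/blog-def
related  = np.array([0.85, 0.15, 0.0])  # a page that referenced the old version

print(round(cosine(old_page, related), 2))  # high: the established relationship
print(round(cosine(new_page, related), 2))  # lower: must be re-learned over time
```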
The costs of complexity endure over the long term, as Mueller explained further in his response:
“A more meta question might be whether you’re seeing enough results from this somewhat complex setup to merit spending time maintaining it like this at all, whether you could drop the hreflang setup, or whether you could even drop the country versions and simplify even more.
Complexity doesn’t always add value, and brings a long-term cost with it.”
For over two decades, I have prioritized building websites as simply as possible. Mueller’s observation is spot on: simplicity makes updates and overhauls far easier.