While SEO is now heavily interwoven with Google, the practice began long before the launch of the world’s most popular search engine, co-founded by Larry Page and Sergey Brin.
Although it is fair to say that SEO and the broader field of search engine marketing date back to 1991, with the launch of the first website, or perhaps more aptly to the debut of the first web search engine, the story of SEO as we know it really picks up around 1997.
As Bob Heyman relates in his book “Digital Engagement”, the birth of what we now call “search engine optimization” can be traced, somewhat ironically, to the manager of the rock band Jefferson Starship.
This manager was unhappy that the official Jefferson Starship website was showing up on the fourth page of search engine results at the time, rather than in the much more desirable first position on the first page. While the veracity of this tale is open to interpretation, the general consensus is that the term SEO originated around 1997.
Adding further evidence to the origins of SEO, later research shows that John Audette of Multimedia Marketing Group was using the term as early as February 15, 1997.
The Evolution of Search Engine Ranking in 1997
In 1997, ranking well in search engines was still a new idea, and the world of search was very directory-driven.
At that time, before DMOZ (the Open Directory Project) became the main directory resource for Google, a number of other directories played important roles: LookSmart was powered by Zeal, Go.com had its own directory, and the Yahoo Directory had a great deal of influence over Yahoo Search.
If you are not familiar with DMOZ, it was essentially the Yellow Pages of websites: a centralized site where websites were listed and organized. In its early days, Yahoo had a similar mission, surfacing the best websites out there as curated by human editors, and the Yahoo Directory was the foundation of that effort.
In 1998, when I realized that the websites I was building needed traffic, I began implementing SEO practices on behalf of our clients. Little did I know that SEO would one day become a lifestyle for me.
Back then, few people gave much thought to search rankings, which is understandable considering the World Wide Web was still a relatively novel idea for most people. Fast forward to today, and it’s a different story: everyone aspires to dominate the search engine results pages (SERPs).
Search Engine Optimization vs. Search Engine Marketing
Before Search Engine Optimization became the official name, other terms were used as well:
- Search engine placement.
- Search engine positioning.
- Search engine ranking.
- Search engine registration.
- Search engine submission.
- Website promotion.
However, any comprehensive discussion on this topic would be incomplete without acknowledging another significant term.
Search Engine Marketing
In 2001, an influential writer in the industry proposed “search engine marketing” as a potential successor to “search engine optimization.” However, it is evident that this transition did not occur.
Be prepared to encounter numerous false claims such as “SEO is dead” or the emergence of supposed new forms of SEO. There will also be attempts to rebrand SEO under different names like “Search Experience Optimization.”
While the term SEO may not be flawless since we are not optimizing search engines themselves but rather our online presence, it has remained the preferred and widely adopted term within our industry for over two decades. It is likely to continue as such in the foreseeable future.
Search Engine Marketing (SEM) is still used, but it has become more closely linked with paid search marketing and advertising. Both terms, Search Engine Optimization (SEO) and Search Engine Marketing (SEM), coexist harmoniously today, each serving distinct yet interconnected purposes.
Search Engine History
Search engines have revolutionized the way we discover information, conduct research, engage in online shopping, find entertainment, and connect with others. They serve as the backbone for nearly every online platform, be it websites, blogs, social networks, or applications. Search engines have become an essential navigational tool, guiding us through the complexities of our daily lives.
But let’s delve into the origins of this transformative technology. To understand the roots of search engines and search engine optimization, we have compiled a timeline of significant milestones that have shaped this crucial aspect of our modern world.
The Start of SEO
During the 1990s, the search engine arena was intensely competitive. Users had a multitude of search engines to choose from, ranging from human-powered directories to crawler-based listings. Notable names in this space included AltaVista, Ask Jeeves, Excite, Infoseek, Lycos, and Yahoo.
At the outset, SEO focused primarily on on-page activities, as that was the only viable way to optimize websites for search engines.
This included optimizing for factors such as:
- Relevant, high-quality content.
- A sufficient amount of text.
- Accurate HTML tags.
- Internal and outbound links.
During this era, achieving high rankings in search results primarily relied on a simple yet somewhat dubious tactic: repeating keywords abundantly throughout web pages and meta tags.
The prevailing belief was that if a competing page used a particular keyword 100 times, one could surpass them by incorporating the same keyword 200 times. However, this practice, now known as keyword stuffing, has since been recognized as spam.
Let’s explore some key highlights from this period:
1994 History
In a humble campus trailer, Yahoo was founded by Stanford University students Jerry Yang and David Filo. Initially, Yahoo served as an Internet bookmark list and directory, featuring a compilation of intriguing websites.
During this era, webmasters had to undertake the manual task of submitting their web pages to the Yahoo directory for indexing. This ensured that their pages would be accessible to Yahoo’s search engine when users performed searches.
In addition to Yahoo, other prominent search engines like AltaVista, Excite, and Lycos also made their debut, contributing to the rapidly evolving landscape of online search.
1996 History
At Stanford University, Larry Page and Sergey Brin, both students at the time, embarked on the development and testing of a novel search engine called Backrub. This innovative search engine utilized the concepts of inbound link relevancy and popularity to rank websites.
As Backrub evolved and underwent further refinement, it eventually transformed into the widely recognized search engine we know today as Google. Simultaneously, HotBot, another search engine powered by Inktomi, was also introduced to the online search landscape.
1997 History
Building upon the achievements of “A Webmaster’s Guide to Search Engines,” Danny Sullivan took a significant step forward by establishing Search Engine Watch. This website served as a valuable resource for the latest news in the search industry, web searching tips, and insights on improving website rankings.
It is of note that ten years later, long after he moved on from Search Engine Watch, Sullivan founded another highly respected search publication called Search Engine Land. Today, he is employed by Google, where he is contributing to the search engine community once again.
During this time, Ask Jeeves was also launched, bringing another new search engine into the market. The domain name Google.com was registered, and the nascent stages of what became one of the world’s most preeminent and influential search engines began.
1998 History
Goto.com introduced sponsored links and paid search advertising, a major development in the search engine landscape. Advertisers could bid on Goto.com for top placement above the organic search results, which were actually powered by Inktomi. Yahoo eventually acquired Goto.com, a deal that further shaped the search engine landscape.
This was also when DMOZ, the Open Directory Project, became the go-to place for SEO practitioners to get their web pages listed and indexed.
MSN also ventured into the search engine field by launching MSN Search, powered at first by Inktomi, further diversifying the field of search engine providers.
1999 History
A notable milestone in the search engine industry was the inaugural Search Engine Strategies (SES) conference, the first comprehensive gathering focused on search marketing; Sullivan later published a retrospective on the event.
It’s worth mentioning that the SES conference series continued for several years, operating under different names and parent companies, until its final edition in 2016, after which it ceased to run.
The Google Revolution
In the year 2000, Yahoo made a strategic move that would go down in history as one of the worst decisions in the search engine industry. They partnered with Google and allowed Google to power their organic search results instead of relying on Inktomi.
At that time, Google was a little-known search engine, nowhere near as recognizable as it is today. But the deal with Yahoo proved to be a turning point: every Yahoo result carried the label “Powered by Google,” making Google an overnight sensation and a household name.
Prior to Google, search engines mainly relied on on-page content, domain names, directory listings, and other basic site structures to rank websites. But Google’s web crawler and PageRank algorithm introduced revolutionary changes in the way information was retrieved.
Google’s algorithm considered both on-page and off-page factors, but it placed a very strong emphasis on the number and quality of links to a website, as well as anchor text. This focus on links led to the rise of an entire sub-industry of link building, as SEO practitioners recognized their significance in Google’s ranking algorithm.
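To make the idea concrete, here is a minimal, purely illustrative sketch of the power-iteration form of PageRank as described in the original paper; it is not Google’s actual production algorithm, and the tiny four-page link graph is hypothetical.

```python
# Illustrative sketch of the PageRank idea (power iteration), not Google's
# production algorithm. The tiny link graph below is purely hypothetical.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}

    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:             # dangling page: spread its rank evenly
                share = damping * rank[page] / n
                for p in pages:
                    new_rank[p] += share
            else:                        # pass rank along each outbound link
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical four-page site: pages that attract more links earn more rank.
graph = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
    "orphan": ["home"],
}
print(pagerank(graph))
```

Running the sketch shows “home” and “about” accumulating the most rank because more pages link to them, which is the intuition behind link-based ranking.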
The race was now on for the next decade to acquire as many links as possible in hopes of higher rankings. In the process, links became a heavily abused tactic, one that Google would have to rectify in the years that followed.
In the year 2000, Google also launched the Google Toolbar that enabled SEO practitioners to actually see their PageRank score, a number between 0 and 10 that suggested the perceived importance of a webpage. This led to an extreme number of unsolicited link exchange requests.
Meanwhile, in 2000, Google launched AdWords advertising next to its organic results, a pay-per-click model now called paid search. Around that same time, a few webmasters met up casually in a London pub to discuss various SEO-related topics.
That meeting, now considered the first informal SEO gathering, later led to the formation of Pubcon, one of the biggest and most successful search conference series today.
Over the coming months and years, the SEO community came to expect the monthly “Google Dance,” when Google updated its index and rankings changed, sometimes wildly.
While Google co-founder Sergey Brin had once stated that Google did not believe in web spam, his perspective likely evolved by 2003. Updates like “Florida” made SEO much more challenging, as it became evident that repeating keywords a certain number of times was no longer sufficient for optimal rankings.
Monetizing SEO Content on Google AdSense
In 2003, Google acquired Blogger.com and then launched AdSense, allowing publishers to show contextually targeted Google ads against their sites.
With AdSense and Blogger.com, simple, monetized web publishing mushroomed, creating nothing short of a blogging phenomenon. Unfortunately for Google, this probably brought more problems than it bargained for.
AdSense’s introduction also brought spammy tactics along with it, filling the internet with low-quality websites full of thin, poor, or stolen content. These sites existed solely to rank, attract clicks, and generate ad revenue.
Local SEO and Personalization
Around 2004, the search landscape changed. Google and other major search engines began to enhance their results for queries with geographic intent. This meant that if a user was searching for local businesses or service providers in their city or town, they would receive more relevant and location-specific results.
In 2006, Google introduced the Maps Plus Box, which, at the time, was a killer addition. The feature married mapping technology with relevant information about businesses, giving users a more complete view of local search results.
It was during this time that search engines also started using data from the end user, such as search history and interests, to personalize search results. Personalization means different people may see different results for the same query, marking a new era in which search became more personal and more relevant.
In 2005, the “nofollow” attribute was introduced as a defense against spam. SEO practitioners soon began using it to control the flow of PageRank and optimize a website’s internal linking. The practice became known as “PageRank sculpting,” allowing webmasters to strategically allocate how link authority flows within a site.
Two other milestones in the history of search engine optimization from this period were the Jagger and Big Daddy updates:
- Jagger, an algorithm update, is believed to have reduced the unsolicited link exchanges that were happening everywhere, devaluing heavy link exchanges as a manipulative link building tactic. Jagger also exposed how easily anchor text could be manipulated as a ranking factor, diminishing its importance from then on.
- Big Daddy, named by Jeff Manson of RealGeeks, brought architectural enhancements to Google’s infrastructure. These moved it to a far more sophisticated level of analysis concerning the value and relationships between links on different sites. The Big Daddy update improved the way Google understood the link graph and could better detect the authority and relevance of a website based on its linking profile.
Both Jagger and Big Daddy played important roles in shaping the fate of search engine optimization, enabling much more advanced ranking algorithms while encouraging a shift in mindset toward a more integrated view of website quality and authority.
YouTube, Google Analytics and Webmaster Tools
In October 2006, Google bought YouTube, a user-generated video-sharing network, for a whopping $1.65 billion. YouTube would go on to become the second most used search platform in the world.
Fast forward to today, YouTube has a massive user base of 2 billion people worldwide. With such huge popularity, video SEO became an important approach for every brand, business, and person who wanted to be more discoverable and visible.
In 2006, Google introduced two innovative tools that have changed the face of website analytics and optimization:
- Google Analytics: the free, web-based tool was so popular at launch that webmasters experienced frequent downtime and “under maintenance” warnings due to the demand. Google Analytics provides detailed insight into a website’s traffic, usage, and conversions, helping you make prudent, data-driven decisions to refine your online presence.
- Google Webmaster Tools, now called Search Console, gave webmasters genuinely useful information about how well (or poorly) their sites were performing in search. It offered features such as crawl-error notifications, data about the search queries a site appeared for, and the ability to request recrawling after a site had been removed or penalized.
Both Google Analytics and Google Webmaster Tools have become indispensable for website owners and marketers, offering the insights and tools needed to optimize an online presence and improve visibility in search engine results.
In 2006, an important development took place: the universal adoption of XML sitemaps by search engines.
XML sitemaps gave webmasters a way to communicate with search engines by presenting them with a comprehensive list of all the URLs on their websites that are available for crawling.
These sitemaps are more than just a list of URLs: they also carry metadata that helps search engines crawl a site intelligently. A sitemap can indicate each URL’s relative importance, when it was last modified, and how frequently it changes, allowing search engines to prioritize their crawling efforts.
The widespread adoption of XML sitemaps dramatically improved the speed and accuracy with which search engines index sites, enhancing a website’s visibility and accessibility in search results.
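As a concrete illustration, here is a minimal sketch of generating such a sitemap with Python’s standard library; the URLs, dates, and frequency values are hypothetical placeholders.

```python
# Minimal sketch of generating an XML sitemap with Python's standard library.
# The URLs, dates, and frequencies below are hypothetical placeholders.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

pages = [
    {"loc": "https://www.example.com/", "lastmod": "2024-01-15",
     "changefreq": "daily", "priority": "1.0"},
    {"loc": "https://www.example.com/blog/seo-history", "lastmod": "2024-01-10",
     "changefreq": "monthly", "priority": "0.8"},
]

urlset = ET.Element("urlset", xmlns=NS)
for page in pages:
    url = ET.SubElement(urlset, "url")
    for tag, value in page.items():
        # Emits <loc>, <lastmod>, <changefreq>, and <priority> child elements.
        ET.SubElement(url, tag).text = value

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

The resulting sitemap.xml would typically be placed at the site root and referenced from robots.txt or submitted via Search Console.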
Universal Search
Beginning in 2007, the structure of search results began to change dramatically with the introduction of exciting new features.
One such development was Google’s Universal Search, which aimed to provide users with a better way of searching. Before this change, search results were typically a list of ten blue links.
In contrast, Google transformed the way searches were conducted by integrating conventional organic search results along with multiple verticals like news, video, and images.
This was a milestone moment, the biggest change in the way Google delivered results and the biggest shake-up for SEO since the Florida update.
While Universal Search introduced variety to the search result landscape, it also brought a host of new opportunities and challenges for SEO practitioners.
Websites could now hope to appear in many types of search results beyond the traditional web page listings, which demanded an altogether new optimization approach to encompass a variety of media formats.
2007 set the stage for continued innovation in the years that followed, introducing new dimensions to search and underscoring the need for practitioners to keep adapting to an ever-changing landscape.
The Cesspool
In 2008, the then-CEO of Google, Eric Schmidt, made a notable statement, referring to the Internet as a “cesspool” and emphasizing the role of brands as a solution to this issue. Schmidt argued that brands serve as a means to navigate through the overwhelming amount of online content.
Shortly after Schmidt’s remark, Google released an update, called Vince, which seemed to give prominent brands a boost in the SERPs. Google explained, though, that the Vince update was not actually meant to give a boost to brands.
Rather, it was intended to make trust a major element of the algorithm, and large, established brands tended to have more trust attached to them.
Shortly thereafter, Google released another update, Caffeine, which placed a greater emphasis on speeding up and streamlining its indexing process. Caffeine was a next-generation search architecture designed to provide faster and more accurate results while crawling a larger portion of the web.
In 2010, Google announced that site speed would be taken into account as a ranking factor, meaning that how quickly a website loads now influences where it appears in search results. The move on site speed underscored Google’s commitment to a good user experience.
These efforts throughout the late 2000s and early 2010s reflected Google’s ongoing push to build search results around trust, speed, and relevance.
Embracing these trends has been essential for brands and website owners who want to stay optimized and improve their rankings as search continues to evolve.
Bing and The Search Alliance
In 2009, Microsoft Live Search was rebranded as Bing. The name change was meant to give the search engine a new identity and a new way of doing things.
Also in 2009, Yahoo and Microsoft struck a deal in a bid to challenge Google’s hegemony over the U.S. search market. The two formed the Search Alliance, a 10-year search deal that merged their resources and technologies in an effort to mount serious competition. The partnership was, however, overhauled five years later.
Through the Search Alliance, Bing would power all of Yahoo’s organic and paid search results. This deal, of course, cemented Bing as the undisputed number two search engine. Even with this combined effort, though, Bing and Yahoo simply couldn’t seem to break Google’s tight grip on the market – both domestically and internationally.
Bing went through another rebranding in October 2020, officially becoming Microsoft Bing and aligning it even more closely with its parent company. For all that, Bing remains a distant second in search; the dominant player is still Google.
Social Media and Its Rise
In the late 2000s, social networks became one of the biggest phenomena on the web. Google invested substantially in YouTube and later bet on Google+, which ultimately failed, while Facebook, Twitter, and LinkedIn were just some of the other names taking off.
Of course, this was just a starting point; over the coming years, there would be literally hundreds of social networking sites taking off and then sinking without a trace.
With the emergence of social media, speculation began about whether it would affect search rankings. While social media can indirectly help SEO by driving traffic to websites, increasing brand awareness, and engaging customers, whether social signals directly affect search rankings has been a hot topic of debate.
While Google has repeatedly denied that social shares (likes, tweets, and +1s, for example) are a ranking factor in themselves, study after study on ranking factors has found strong correlations between social signals and search rankings.
The Schema
In 2011, Schema markup was created and implemented: a form of structured data markup (microdata) that helps search engines understand the context of a page’s content. Schema.org lists all of the available schema markup types.
It should be noted, however, that Schema markup is not a direct ranking factor, and there is little evidence that it improves search performance on its own. That said, Schema markup can help your website stand out in the SERPs by enabling rich results and featured snippets.
To have your structured data properly set up, you can always test it with Google’s Structured Data Testing Tool, which will give great insights and feedback.
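For illustration, here is a minimal sketch of building Schema.org structured data in its JSON-LD form using Python’s standard json module; the article details are hypothetical, and the generated snippet would normally be embedded in the page’s HTML.

```python
# Minimal sketch: building Schema.org structured data as JSON-LD with Python.
# The article details are hypothetical; the output would be embedded in a
# <script type="application/ld+json"> tag in the page's HTML.
import json

schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "The History of SEO",
    "author": {"@type": "Person", "name": "Jane Doe"},          # hypothetical author
    "datePublished": "2024-01-15",
    "publisher": {"@type": "Organization", "name": "Example Blog"},
}

snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(schema, indent=2)
    + "\n</script>"
)
print(snippet)
```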
Panda and Penguin in the Google Zoo
In two consecutive years, 2011 and 2012, Google introduced two big algorithm updates that completely changed the face of SEO practices and whose reverberations are still being felt today. Both updates were aimed at cleaning up the search results in favor of high-quality websites.
In 2011, Google came under criticism for allowing “content farms” churning out reams of low-quality content to dominate its search results.
Unoriginal or auto-generated content filled the SERPs, and scraper sites would often outrank the original creators of the content. Such sites relied heavily on organic traffic from Google and generated significant advertising revenue through programs such as AdSense.
With the advent of the Panda update in 2011, though, many sites suddenly saw their traffic decline. Panda was designed to deal with low-quality or thin content, and Google published guidance on what it considered a quality site.
Since its introduction, the Panda update had periodic refreshes in subsequent years before finally being merged into Google’s core algorithm in 2016.
While websites were still reeling from the effects of Panda, Google unleashed another much-anticipated update, which came to be known as Penguin. This update specifically targeted link schemes and keyword stuffing, regarded as aggressive spam tactics.
Penguin targeted sites with unusual linking patterns, especially those using excessive exact-match anchor text in alignment with keywords for ranking purposes.
Unlike Panda, though, the Penguin update was refreshed far less often, sometimes with long gaps between updates. Like Panda, Penguin became part of Google’s real-time core algorithm in 2016.
These algorithmic updates were designed to favor high-quality content and discourage spammy practices, creating a much more reliable and trustworthy search experience for users. Panda and Penguin continue to shape how SEO strategies are conceived today, with emphasis on creating useful, unique content and earning organic, natural link profiles.
The Knowledge Graph
In May 2012, Google introduced the Knowledge Graph, one of its major leaps away from keyword-based search toward understanding semantics and user intent.
Amit Singhal, then Google’s SVP of engineering, described the Knowledge Graph as searching for “things, not strings”: entities such as landmarks, celebrities, cities, sports teams, and movies carried meaning that extended well beyond the keyword terms used to describe them.
The Knowledge Graph, in a way, tapped the collective intelligence of the web, modeling how real people understand things and surfacing that understanding directly in search results.
Google integrated this information into its search results through knowledge panels, boxes, and carousels that appear whenever users search for one of the billions of entities and facts stored in the Knowledge Graph.
In September 2013, Google launched Hummingbird, a new algorithm that would better address natural language queries and conversational search. With the increasing popularity of mobile and voice search, Google needed to revamp its algorithm to cater to the needs of modern searchers.
Hummingbird was a major update to the core algorithm, considered to be the most significant one since 2001. It was intended to return results faster and more accurately, especially for mobile users.
The update demonstrated Google’s interest in moving and changing with the times to provide an increasingly better search experience for all users on different devices.
Mobile-First
The industry had been asking when the “Year of Mobile” would arrive since about 2005, but every year it seemed not to be the year. From 2005 to 2014, mobile was consistently talked about and hyped because of its rapid growth. As more people started using smartphones, they increasingly searched for businesses and information on the go.
Well, the Year of Mobile finally did happen, in 2015. It was the first year in which more searches were performed on mobile devices than on desktop devices.
Although that milestone marked a big leap in search volume, user intent and conversion rates still tended to differ significantly depending on whether a user was on mobile or desktop.
In 2015, comScore also reported that mobile-only internet users outnumbered desktop-only users. Google responded to the rapidly increasing adoption of mobile with the long-awaited release of its mobile-friendly algorithm update that same year.
The update helped to surface the most relevant results in a timely manner for the user, either from mobile-friendly web pages or mobile apps.
To further enhance the mobile experience, Google introduced Accelerated Mobile Pages in 2016, which focused on the instant loading of content. Many news media outlets and publishers embraced this AMP technology.
Then, in January 2017, Google began devaluing pages featuring intrusive pop-ups, with a focus on delivering a better mobile experience. A year later, in January 2018, Google officially announced that page speed would become a ranking factor for mobile searches, further emphasizing the importance of fast-loading pages.
Beginning in July 2019, mobile-first indexing was enabled by default for all new websites, and by March 2021 it had been rolled out to all websites, making mobile optimization an even more prominent part of search ranking.
Machine Learning and Intelligent Search
In 2017, Google’s orientation shifted from being a mobile-first company to a machine-learning-first company, according to CEO Sundar Pichai.
Today, Google Search is trying to give information and answers instead of giving a list of results to the user. Machine learning has been used in all of Google’s products, including Search, Gmail, Ads, Google Assistant, and many more.
One prominent application of machine learning in search is Google’s RankBrain. When it launched in October 2015, RankBrain’s original goal was to interpret the roughly 15% of searches that were entirely new to Google, based on the words and phrases entered. Its role subsequently grew to encompass every search done on Google.
Although it impacts ranking, RankBrain is not a keyword-style ranking factor in that doing X, Y, and Z will result in better rankings. It’s instead part of how the overall algorithm understands the intent behind the query.
But wait, there’s more to intelligent search. Much more, in fact.
- Voice search is on the rise.
- Visual search has become remarkably good.
- Users and brands alike continue to adopt chatbots and personal assistants such as Apple’s Siri, Amazon’s Alexa, and Microsoft’s Cortana.
In other words, this all means even more exciting times ahead for those doing SEO.
Google’s Core Updates
Google rolls out algorithm updates daily in its effort to continuously improve search. Several times a year, however, it also releases broad core updates that make significant changes to its algorithm.
This is done with the aim of offering users a better search experience through the delivery of more relevant and reliable search results.
Rather than targeting a specific page or site, Google’s core updates focus on improving how its systems evaluate content overall. Google has compared these updates to refreshing a list of the best movies of a given year: as new movies come out and perspectives change, the list changes too.
In March 2018, Google confirmed a broad core algorithm update was rolled out to benefit pages that were previously under-rewarded. Several months later, another broad core update targeted content relevance.
Then in August, another broad core update rolled out, widely (if somewhat misleadingly) referred to as the “Medic” update, which mainly affected sites with low-quality content.
The March 2019 update, dubbed Florida 2, was a core change that some SEO experts presumed was a rollback to older algorithms. In June 2019, another broad core update placed the emphasis on the quality, authority, and trustworthiness of a site and the links pointing to it, exposing gaps in E-A-T.
Periodically, Google rolls out broad core updates that affect search results across the globe. For example, a broad core update in September 2019 focused on promoting sites with overall optimal performance, while another in January 2020 targeted websites specifically in the YMYL categories.
It’s important to note that analyzing your website as a whole, rather than specific pages, is crucial when evaluating the impact of broad core updates. In the most recent update in May 2020, Google targeted thin content landing pages and gave a boost to local search results.
BERT
BERT was a major algorithm update from Google, its biggest development since the introduction of RankBrain. BERT is short for Bidirectional Encoder Representations from Transformers, a method for improving natural language processing.
The main purpose of BERT is to make Google understand the context of search queries better. It solves the problem of words having multiple meanings by understanding the surrounding words and the context in which they are used to give more accurate search results. For example, “bat” can mean a flying mammal or the action of a baseball player.
With BERT, Google can interpret the meaning of a word from the other words around it in a sentence, which means it understands your content far more precisely.
Suppose a query or sentence contains phrases like “I went to the bat cave” or “After my bat, I went into the dugout”: Google can now disambiguate “bat” by considering the other words in the sentence. This is a critical step toward natural language processing that truly understands human communication.
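To see this effect for yourself, here is a small sketch using the open-source bert-base-uncased model from the Hugging Face transformers library (not Google’s production search system); the example sentences are made up, and it assumes the transformers and torch packages are installed.

```python
# Demo of contextual word representations with the open-source bert-base-uncased
# model (via Hugging Face transformers), not Google's production search system.
# Requires: pip install transformers torch
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embedding_of(sentence, word):
    """Return the contextual vector BERT assigns to `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]      # (tokens, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

animal = embedding_of("I went to the bat cave to photograph the colony.", "bat")
baseball = embedding_of("After my bat, I walked back into the dugout.", "bat")
animal_2 = embedding_of("A bat flew out of the cave at dusk.", "bat")

cos = torch.nn.functional.cosine_similarity
# The two animal senses should sit closer together than the animal vs. baseball pair.
print("animal vs animal:  ", cos(animal, animal_2, dim=0).item())
print("animal vs baseball:", cos(animal, baseball, dim=0).item())
```

The higher similarity between the two animal-sense sentences illustrates how the same word receives different representations depending on context, which is the core idea behind BERT’s contribution to search.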
Featured Snippets
You might have seen featured snippets, yet not really appreciate their importance. Featured snippets are small text pieces, bullet points, numbers, or tables that appear at the top of Google’s search results.
The main purpose of the featured snippet is to answer the user’s query directly on the search engine results page by eliminating the need to actually click through to a website. But let’s just say that featured snippets can sometimes be very unpredictable and change daily.
Featured snippets aren’t new; in fact, they’ve been around since 2014. They let a page claim the coveted “position zero” above the fold on the SERP while still showing its listing organically.
In January 2020, Google deduplicated featured snippets from the search results: a website can now appear either in the featured snippet or in the organic results, but not in both simultaneously.
In June 2020, Google went a step further, pointing users to exactly where their answer is located on the page, with the relevant text highlighted in yellow. As voice search continues to evolve, optimizing content for featured snippets is a great opportunity to enhance organic visibility.
In Conclusion
Since the early 1990s, search engines and SEO have gone through many changes, and this post has barely scratched the surface.
From the rise and fall of various search engines to new SERP features, new algorithms, and constant testing and updates, the history of SEO has been an exciting journey. Along the way, many useful SEO publications, conferences, tools, and experts have emerged.
Despite the immense changes it has undergone, one fundamental truth remains: as long as search engines exist, SEO will matter.
Would you like to read more articles about the evolution of search engine optimization? If so, we invite you to take a look at our other tech topics before you leave!