Google has had several algorithm leaks over the years, but none quite as shocking as the recent leak of internal Google Search API documentation.
Over the last month, the algorithm leak has been a hot topic for SEO agencies and digital marketers alike, unveiling insights into the tech giant’s potential ranking mechanisms.
Estimated reading time: 13 minutes
Table of contents
- What We Know About the Leak So Far
- Google’s Response to the Leak
- Insights from the Leak
- What Should Marketers Take from this Data Leak?
- Prioritise Links from High-Traffic Sites
- The Impact of Click-Through Rate (CTR) on Ranking
- Balancing Zero-Click Strategies with CTR
- Take on a Holistic Approach to SEO and Marketing
- Leverage Entity Mentions
- Should You Invest in Press Releases?
- Keep Optimising for E-E-A-T
- Enhance User Experience with Strategic Website Design
- Optimise Your Topical Authority
- Publish Valuable Content to Stay on Top of Search Rankings
- How Does This Leak Affect Pure SEO’s Approach to SEO?
What We Know About the Leak So Far
Experts discovered the leaked documents after they were inadvertently made available on GitHub between March and May 2024 by an automated code tool. The leak revealed numerous features related to Google’s content storage API, providing insights into Google Search and its internal ranking features and signals. The documents include thousands of pages detailing both active ranking features and deprecated ones.
SEO practitioner Erfan Azimi was among the first to discover the leak, sharing it with Rand Fishkin (SparkToro cofounder and CEO), who contacted technical SEO expert Mike King (iPullRank founder and CEO). Mike had learned of the leak from Dan Petrovic (highly regarded search industry expert and director of Dejan Marketing) and conducted an extensive analysis.
In his initial article, Rand Fishkin commented on the significance of the leak: “In the last quarter century, no leak of this magnitude or detail has ever been reported from Google’s search division.”
In a live webinar on Friday, June 7th, Mike King shared, “It is the clearest indication that we have gotten of what Google is actually looking at for computing results for organic search.” The documents provide a rare glimpse into the intricacies of Google’s search engine operations and have significant implications for businesses relying on SEO for visibility.
Experts agree that the leak seems to contradict statements Googlers have made publicly over the years, “particularly the company’s repeated denial that click-centric user signals are employed, denial that subdomains are considered separately in rankings, denials of a sandbox for newer websites, denials that a domain’s age is collected or considered, and more,” Rand states.
Google’s Response to the Leak
Roughly 32 hours after the leak was reported, Google confirmed its authenticity in communications with Barry Schwartz (Executive Editor at Search Engine Roundtable), but cautioned against making assumptions about Google Search’s internal workings based on the documents: “We would caution against making inaccurate assumptions about Search based on out-of-context, outdated, or incomplete information. We’ve shared extensive information about how Search works and the types of factors that our systems weigh, while also working to protect the integrity of our results from manipulation.”
The online consensus from experts and marketers seems to be to take everything Google states with a grain of salt. In a poll on X (formerly Twitter) by Barry Schwartz, 89.8% of the 1,709 respondents answered “No” to the question, “Will you trust Google going forward?”
Mike King also explained that while he doesn’t fault Google representatives for protecting their proprietary information, he does disagree with the company’s previous discrediting of experts who have shared reproducible insights around the workings of Google. “My advice to future Googlers speaking on these topics: sometimes it’s better to simply say, ‘we can’t talk about that.’ Your credibility matters….”
Insights from the Leak
Experts like Rand Fishkin, Mike King, Barry Schwartz, Aleyda Solis, Kevin Indig, and Andrew Ansley have summarised findings from the leak so far.
The documents revealed over 14,000 potential ranking features associated with Google Search, Maps, YouTube, Lens, and more.
Many of the features in the leak confirm long-held suspicions within the SEO community about what Google considers for Google Search. These include features related to PageRank, user behaviour signals, content relevance, and the importance of backlinks.
Additionally, insights have surfaced around Chrome clickstream data, quality rater feedback, and the impact of entity mentions. There’s a multitude of insights experts have found from the leak; here are some of the most notable ones:
- Chrome Clickstream Data: Google can use data from the Chrome browser to help determine which links count, how to weigh certain signals, and when to show videos and images. This includes analysing user behaviour such as clicks and engagement, meaning paid clicks could indirectly boost organic rankings.
- NavBoost (a.k.a. Glue): NavBoost is one of the ranking features mentioned most frequently post-leak. NavBoost and Glue first surfaced during the Google antitrust trial; both use click data (including Chrome data) to boost, demote, or reinforce rankings and help determine what surfaces in the SERPs. NavBoost measures click signals such as the longest click (the result users stay on the longest) and the last good click (the last time a user visited a site and stayed) – see the illustrative sketch after this list.
- Quality Rater Feedback: Feedback from quality raters is more directly used in ranking systems than previously anticipated.
- Whitelists: Google maintains whitelists (a.k.a. ‘safelists’) for specific topics, such as travel, pandemics, and elections, to filter out misinformation and problematic sources.
- Site Type Limitations: Google may limit the number of sites of a given type (e.g., blogs, commercial sites) from surfacing in search results.
- Entity Mentions: Mentions of entities, including brand names, are believed to have more weighting on rankings than previously understood.
- Page Titles: The leak mentioned Google’s use of a ‘titlematchScore’, highlighting the significance of page titles in alignment with page content and their potential site-wide impact.
- PageQuality for Article Pages: According to Andrew Ansley, Google has a measurement called ‘pageQuality’ (PQ) that utilises a large language model (LLM) to determine effort for article pages. Content uniqueness, images, tools, videos, and article comprehensiveness all influence how much a page stands out in ‘effort’ calculations.
- Website Anchor Text: Over-optimising website anchor text can harm your site and lead to spam demotions for links, particularly if you have a large number of irrelevant or spammy links from third-party sites.
- Site Age Matters: Older sites are trusted more by Google; however, if older sites want to maintain authority, they need to keep producing fresh content and maintaining their websites to ensure that content is current, helpful, and relevant.
- E-E-A-T is somewhat ambiguous in Google’s system: When evaluating the quality of search results, the importance of E-E-A-T (Experience, Expertise, Authoritativeness, and Trustworthiness) as a scoring factor remains vague. E-E-A-T is not a direct ranking factor but remains influential in assessing web page quality. According to the leak, authorship is highlighted as a significant signal that Google tracks. Google monitors and stores information about authors associated with page content and determines if an entity on a page is the page’s author.
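To make the “longest click” and “last good click” ideas in the NavBoost bullet above more concrete, here is a minimal Python sketch of how such click signals could be aggregated from a click log. The records, field names, and 120-second “good click” threshold are purely illustrative assumptions on our part; the leak names the signals but does not describe Google’s implementation.

```python
from collections import defaultdict

# Hypothetical click-log records: (query, url, dwell_seconds, iso_timestamp).
# Fields, values, and the threshold below are illustrative assumptions,
# not Google's actual schema or logic.
click_log = [
    ("running shoes", "https://example.com/shoes", 310, "2024-06-01T10:02:00"),
    ("running shoes", "https://example.com/shoes", 45, "2024-06-02T09:15:00"),
    ("running shoes", "https://other.com/guide", 600, "2024-06-02T11:40:00"),
]

GOOD_CLICK_DWELL = 120  # assumed dwell time (seconds) for a "good" (satisfied) click


def summarise_click_signals(log):
    """Aggregate, per (query, url): the longest dwell and the most recent 'good' click."""
    longest_click = defaultdict(int)
    last_good_click = {}
    for query, url, dwell, ts in log:
        key = (query, url)
        longest_click[key] = max(longest_click[key], dwell)
        if dwell >= GOOD_CLICK_DWELL:
            # ISO 8601 timestamps compare correctly as strings
            last_good_click[key] = max(last_good_click.get(key, ts), ts)
    return longest_click, last_good_click


longest, last_good = summarise_click_signals(click_log)
print(longest[("running shoes", "https://other.com/guide")])      # 600
print(last_good[("running shoes", "https://example.com/shoes")])  # 2024-06-01T10:02:00
```

Whatever Google’s actual mechanics are, the practical takeaway is the same: dwell time and repeat satisfied visits are signals worth measuring on your own pages.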
What Should Marketers Take from this Data Leak?
Prioritise Links from High-Traffic Sites
Focus your link-building efforts on acquiring links from high-traffic sites. If you have loads of links but no one clicks them, where’s the value? Instead of using spammy tactics such as acquiring numerous low-quality or irrelevant links to manipulate search results, focus on getting fewer, high-quality links. Google recognises relevant links that offer real value, strengthening site authority. Conversely, low-quality links may damage your ranking and website reputation. It’s worth noting that even no-follow links have value if you acquire them from a reputable source with high traffic.
The Impact of Click-Through Rate (CTR) on Ranking
Despite Google’s previous denials, CTR does impact ranking. So, what should you do, knowing that CTR is a ranking factor?
“The bottom line here is that you need to drive more successful clicks using a broader set of queries and earn more link diversity if you want to continue to rank.” (Mike King).
Driving more qualified traffic and improving user experience on your site will help signal to Google that your page is worth ranking. Additionally, instead of just throwing content on your website and leaving it, consider whether the content offers a sticky user experience. Will users see the content, stay on the page, and click through to another page on your website, or pogo-stick straight back to Google’s search results for a different answer? Ensure your content answers their questions fully so they don’t return to Google for a similar search.
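As one practical way to act on this, here is a minimal Python sketch that flags queries whose CTR falls short of a rough expectation for their ranking position, suggesting the title or snippet may need work. The queries, numbers, and benchmark CTRs are hypothetical values of our own, not published figures.

```python
# Hypothetical per-query performance rows (query, average position, impressions, clicks),
# e.g. exported from a Search Console-style report. Numbers and the expected-CTR
# benchmarks below are illustrative assumptions.
rows = [
    ("seo audit checklist", 3, 4200, 210),
    ("what is navboost", 2, 1800, 45),
    ("local seo tips", 5, 900, 80),
]

# Rough expected organic CTR by ranking position (assumed benchmark values).
EXPECTED_CTR = {1: 0.28, 2: 0.15, 3: 0.11, 4: 0.08, 5: 0.07}

for query, position, impressions, clicks in rows:
    ctr = clicks / impressions
    expected = EXPECTED_CTR.get(position, 0.05)
    verdict = "review title/snippet" if ctr < expected else "ok"
    print(f"{query}: CTR {ctr:.1%} vs ~{expected:.0%} expected at position {position} -> {verdict}")
```

Queries flagged for review are good candidates for rewriting page titles and meta descriptions so they better match what searchers are actually looking for.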
Balancing Zero-Click Strategies with CTR
Zero-click strategies aim to keep users engaged on a platform without having to click away from it. In the SERPs, zero-click results may include featured snippets or answers in the “People also ask” section. You can also implement zero-click strategies in blog articles and marketing email newsletters (e.g., creating content-rich newsletters that engage readers without requiring them to click away from the email).
To balance the benefits of zero-click strategies with the need to drive clicks and traffic to your site, consider combining these approaches with tactics that encourage user engagement and clicks. For instance, share excerpts of blog posts in your newsletter or on social media to entice users to click through to your website and read the full article.
Take on a Holistic Approach to SEO and Marketing
All your marketing efforts online and offline impact your website ranking and click-through rate. For example, paid search is interlinked with organic search – if you rank high in the SERPs both organically and via paid ads, your chances of clicks increase significantly.
Various marketing channels, including paid ads, radio, billboards, podcast advertising, event sponsorship, and TV commercials, are useful for enhancing brand awareness, recognition, and preference. A holistic marketing approach will help you build brand credibility by speaking to your audience where they naturally congregate. In Google’s eyes, a holistic marketing strategy done right means users favour your site thanks to positive brand exposure, leading to increased click-through rates and higher rankings.
Leverage Entity Mentions
While not directly associated with ranking scores, brand and entity mentions help reinforce your website. They assist Google in navigating and understanding your site and in connecting related entities when surfacing you in the SERPs. Increase efforts to earn entity and brand-name mentions to enhance your SEO. What does this look like in practice? Essentially, create great content worthy of mentions, create it consistently (because Google likes consistency), and leverage your newsletter campaigns and social channels to share it.
Should You Invest in Press Releases?
The answer to this depends on several factors. Press releases can be effective if they’re picked up by high-authority sites, generate clicks, and lead to other sites referencing them. However, getting articles published on major sites is rare, as they are very selective.
On the other hand, if your PR strategy involves using PR tools to distribute the same press release across multiple small, low-traffic sites, it’s likely not worth your time. Such releases often count as duplicate content, which Google may discount.
Mike King shared his perspective: “If I’m going to spend time and money on a press release, I’d rather spend that on creating a piece of content that’s going to naturally just get great links.” Instead of issuing a press release for every new feature, focus on creating high-quality content that can attract valuable links organically and redistribute link equity throughout your site.
Keep Optimising for E-E-A-T
Even though E-E-A-T is not an official ranking factor, creating content that aligns with E-E-A-T guidelines is good for your website and visitors. Ultimately, even if Google doesn’t prioritise E-E-A-T as a ranking factor, it’s crucial because users value it. For instance, mentioning that this blog article has been thoroughly edited and reviewed by our experts enhances the credibility of this post. You should also write author bios and update your content regularly to stay relevant.
Enhance User Experience with Strategic Website Design
We highly recommend investing from the outset in a well-designed website with intuitive architecture. This will enhance the user experience on your site and help it surface more often in the SERPs.
Optimise Your Topical Authority
Ensure you have strong authority in your primary topic. If your website has old content that is not topically relevant to your business, you should remove or block it to boost authority in your primary topic.
Conversely, if you have a subtopic that isn’t well connected to your primary topic, solidify it as a subcategory by creating content around it (i.e., topic pillar content) and ensuring your website offers smooth navigation to it (e.g., a link on your homepage and on your services page).
Publish Valuable Content to Stay on Top of Search Rankings
To stay on top, you need to continuously publish fresh content that generates engagement and meets user search intent. Rather than optimising only for traffic, focus on providing valuable content that encourages users to return to your site. This is crucial for several reasons:
- Freshness Twiddler: Google prioritises fresh content with a re-ranking system. When search results surface old, outdated content, that’s an opportunity: publishing fresh content with recent information for that query or topic gives you a chance to rank. Additionally, if you have old content that is no longer relevant or driving traffic but addresses a key customer pain point, rewrite and update it as well. Google also considers an article’s published date, so ensure the publication date is consistent across page titles, XML sitemaps, on-page content, and structured data.
- Author Bios: Google explicitly stores the author of a document and looks to identify whether an entity is the author. For example, if we list our CEO, Richard Conway, as an author, Google will try to connect this with other mentions of Richard Conway across the web, such as LinkedIn profiles and mentions in articles. This can help reinforce E-E-A-T. Using schema markup helps connect authors accurately with other web entities (see the structured data sketch after this list).
- Content Decay: Users are voters, and Google monitors and stores their clicks. Google segments the number of bad clicks by country and device. If your webpage is not getting the expected clicks for its ranking position or is experiencing low engagement, your ranking position will decline. When you see these signals, it is time to refocus your content.
- Backlinks: The quality of your backlinks matters. Quality backlinks are measured by freshness, relevance, and whether they come from high-tier sites.
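Tying the freshness and author-bio points above together, here is a minimal sketch of Article structured data using the schema.org vocabulary, generated as JSON-LD from Python. The headline, dates, and profile URL are placeholder assumptions; the point is to declare the same author and publication date that appear on the page, in the XML sitemap, and in the page title.

```python
import json

# Minimal Article structured data (schema.org vocabulary) serialised as JSON-LD.
# The headline, dates, and profile URL are placeholders; swap in real values and
# keep the dates consistent with your sitemap and on-page content.
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "What the Google Search Data Leak Means for SEO",  # placeholder headline
    "datePublished": "2024-06-14",  # placeholder; match the visible on-page date
    "dateModified": "2024-06-14",
    "author": {
        "@type": "Person",
        "name": "Richard Conway",
        "sameAs": ["https://www.linkedin.com/in/your-profile"],  # placeholder profile URL
    },
}

# Embed the output in a <script type="application/ld+json"> tag on the article page.
print(json.dumps(article_schema, indent=2))
```

Keeping the structured-data author and dates in lockstep with what users actually see on the page avoids sending Google conflicting signals about who wrote the content and when.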
How Does This Leak Affect Pure SEO’s Approach to SEO?
When asked how these updates could affect Pure SEO’s approach to SEO, Prabin Yonzon, Head of Organic Search and CRO at Pure SEO, shared, “Our focus day in and day out is still the same—continuously improve, track outcomes, optimise based on those outcomes, test new theories, and improve further”.
Our team is constantly moving forward, innovating our practices while adhering to proven methods. As far as digital marketing and SEO go, our strategy remains consistent: to create high-quality digital marketing strategies for our clients that align with their brands and users.
Changes resulting from this data leak will be slow and gradual over time. Be cautious of implementing strategies that veer away from current ranking best practices! While this leak reveals very interesting insights, the underlying strategy for ranking remains the same: create great content that benefits users.
If you’re looking to maintain or improve your website ranking and online authority, sign up for our newsletter. We’re always monitoring these updates and will keep you in the loop with any big industry updates or changes to Google’s algorithm.
For more detailed insights on the Google Search data leak, check out the following: