I. Introduction
A set of confidential documents presented as part of the U.S. antitrust trial against Google has revealed surprising details about how its search algorithm actually works. Despite the rise of artificial intelligence and language models like BERT, Google’s ranking still heavily relies on hand-crafted signals designed by its engineers.
For those of us working in SEO, these leaks offer a rare opportunity to better understand which factors still matter most when it comes to ranking a website. In this article, we break down the key revelations, explain what truly influences rankings, and what this means for your SEO strategy in 2025.
II. Google’s Ranking System Is Still Hand-Crafted
Despite all the talk about AI and automation, the core of Google’s algorithm still depends on manually tuned signals. These aren’t automated or generated by AI models — each one has been defined, tested, and calibrated by real engineers using actual user data.
One of the leaked documents confirms that most top-level ranking signals are still combined linearly: each signal produces a score, each score is multiplied by a hand-set weight, and the weighted scores are summed. No black-box magic.
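To make that concrete, here is a minimal sketch of a linear combination of ranking signals. The signal names (borrowed from the ABC signals discussed below) and the weights are invented for illustration; the leaks describe the shape of the computation, not these values.

```python
# Hypothetical sketch of a hand-tuned linear ranking score.
# Signal names and weights are invented for illustration only.

SIGNAL_WEIGHTS = {
    "anchors": 0.35,   # link-based signal
    "body": 0.40,      # on-page content signal
    "clicks": 0.25,    # user-behaviour signal
}

def ranking_score(signal_scores: dict[str, float]) -> float:
    """Combine per-signal scores (each in 0..1) into one weighted sum."""
    return sum(SIGNAL_WEIGHTS[name] * signal_scores.get(name, 0.0)
               for name in SIGNAL_WEIGHTS)

# Example: a page strong on content but weak on links.
print(ranking_score({"anchors": 0.2, "body": 0.9, "clicks": 0.6}))  # 0.58
```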
In short: Google hasn’t handed the keys to AI. Machine learning supports the process, but traditional signals still dominate. What worked 10 years ago — good content, quality links, and solid user experience — still works today.
III. The ABC Signals: Still the Core of Ranking
Among the 100+ signals Google uses, three continue to form the backbone of ranking — the so-called ABC signals:
• A – Anchors: These are the links from other sites pointing to yours. It’s “what the web says about you.” The more high-quality links, the better.
• B – Body: This is the content of your page. Google analyzes the structure, clarity, and how well it addresses the user’s query — including titles, text, and formatting.
• C – Clicks: Google measures user behavior. If users click and stay on your page, that’s a positive signal. If they bounce quickly, not so good.
These three together determine topical relevance. They help Google decide if your page truly answers the user’s search intent.
Bottom line: links, content, and user satisfaction remain the heart of SEO in 2025.
IV. What Is Q* (Q-Star)?
One of the lesser-known signals mentioned in the leaks is Q* (pronounced “Q-Star”). It’s an internal quality score used by Google to assess the overall trust and authority of a website.
Unlike other signals, Q* is not query-specific. It doesn’t depend on what the user searches for. It evaluates whether your site is reliable and valuable in general.
Another key detail: Q* does not change easily. If a website has a low score, improving a single article or making minor tweaks is not enough. You need to raise the quality of the whole site: structure, editorial standards, user experience, update frequency, and so on.
This also explains why some new websites take longer to rank, or why certain domains never take off, even with optimised content. If the system detects low overall quality, Q* penalises it.
For those of us who work in SEO, this underlines an important lesson: it is not enough to optimise page by page. The perceived quality of the entire site directly influences the results.
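To make the query-independent idea concrete, here is a minimal sketch: a site-level quality score applied as a multiplier to every page score from that domain, whatever the query. The domains, numbers and the multiplicative form are all assumptions for illustration; the leaks describe what Q* measures, not its exact maths.

```python
# Hypothetical sketch: a query-independent, site-level quality score.
# Domains, values and the multiplicative form are invented.

SITE_QUALITY = {          # one score per domain, not per query
    "trusted-publisher.example": 0.9,
    "thin-affiliate.example": 0.3,
}

def adjusted_score(domain: str, page_score: float) -> float:
    """Scale a page's query-specific score by its site-wide quality."""
    return page_score * SITE_QUALITY.get(domain, 0.5)  # 0.5 = neutral default

# The same on-page score lands very differently depending on the domain:
print(adjusted_score("trusted-publisher.example", 0.8))  # 0.72
print(adjusted_score("thin-affiliate.example", 0.8))     # 0.24
```

This is why page-by-page optimisation cannot compensate for a weak domain: the site-level factor follows every page around.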
V. Navboost: Not AI, Just a Giant Click Table
One of the most surprising points in the leaked documents is the detailed explanation of Navboost, one of the most influential signals in Google’s algorithm. Despite the technical name, Navboost is not an artificial intelligence system. It is literally a huge table that collects click data from users over the past 13 months.
Here’s how it works: for every search someone does on Google, Navboost stores how many times each result has been clicked. For example, if for the search ‘best running shoes’ one page has received 3 clicks and another 10, that difference is stored and used to adjust the ranking.
No machine learning or complex modelling. Just a massive record of human behaviour, aggregated and classified by device type and location.
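A minimal sketch of what such a click table could look like, aggregating counts by query, result, device and country. The schema is an assumption based on the description above; the real table is obviously vastly larger.

```python
from collections import Counter

# Hypothetical sketch of a Navboost-style click table: no model, just
# aggregated counts keyed by (query, url, device, country).
click_table: Counter = Counter()

def record_click(query: str, url: str, device: str, country: str) -> None:
    click_table[(query, url, device, country)] += 1

# The example above: 3 clicks for one page, 10 for another.
for _ in range(3):
    record_click("best running shoes", "https://site-a.example", "mobile", "US")
for _ in range(10):
    record_click("best running shoes", "https://site-b.example", "mobile", "US")

def clicks_for(query: str, url: str) -> int:
    """Total clicks for a (query, url) pair across devices and countries."""
    return sum(n for (q, u, _d, _c), n in click_table.items()
               if q == query and u == url)

print(clicks_for("best running shoes", "https://site-b.example"))  # 10
# That 10-vs-3 gap is exactly what gets fed back into the ranking.
```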
This has two important implications for SEO:
1. User behaviour matters a lot. If your title and description don't earn clicks, you will struggle to move up the rankings.
2. History counts. Google doesn't just rely on what's happening today, but on click patterns accumulated over more than a year.
Navboost demonstrates that user interaction remains one of the most powerful signals, ahead of many of Google's newer, more sophisticated systems.
VI. AI Signals: DeepRank, RankEmbed and Twiddlers
Although traditional signals remain the core of the algorithm, Google has been layering artificial intelligence on top to refine the results. Three systems stand out: DeepRank, RankEmbed and the so-called Twiddlers.
DeepRank
Based on the BERT model, DeepRank attempts to better understand the relationship between a user’s query and the content of a page. It does not replace traditional signals, but helps to better interpret the context. For example, it differentiates between an informational and a transactional search, or between two terms that sound the same but mean different things.
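DeepRank itself is not public, but a common way to apply a BERT-style model to ranking is a cross-encoder: the model reads the query and the page text together and outputs a relevance score. A sketch with an open-source re-ranking model as a stand-in, not Google's actual system:

```python
# Sketch of BERT-style query/page relevance scoring, using a public
# cross-encoder as a stand-in for DeepRank (which is not public).
# Requires: pip install sentence-transformers
from sentence_transformers import CrossEncoder

model = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2")

query = "best running shoes"
pages = [
    "Our lab-tested picks for the best running shoes of the year.",
    "Running a business on a shoestring budget: startup tips.",
]

# The model reads query and page together, so it can tell that the
# second page uses a similar word ('running') with a different meaning.
scores = model.predict([(query, page) for page in pages])
print(scores)  # the first page should score clearly higher
```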
RankEmbed
This is one of the most recent and advanced systems. It uses models trained on large amounts of text to represent pages and queries as points in a shared vector space. In plain terms: Google can measure how similar a query and a page are even when they share no words, which helps it return relevant results for paraphrased or loosely worded searches.
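Again, RankEmbed is internal, but the underlying idea of embedding queries and pages as vectors and comparing them geometrically can be sketched with an open-source embedding model. The model and code below are a stand-in, not Google's implementation:

```python
# Sketch of embedding-based matching: query and pages become vectors,
# and similarity is measured geometrically, not by shared words.
# Requires: pip install sentence-transformers
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

query = "shoes for jogging"
pages = [
    "A buyer's guide to the best running sneakers.",   # no word overlap
    "How to polish leather dress shoes at home.",      # shares 'shoes'
]

q_vec = model.encode(query)
p_vecs = model.encode(pages)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

for page, vec in zip(pages, p_vecs):
    print(f"{cosine(q_vec, vec):.2f}  {page}")
# The running-sneakers page scores higher despite matching no query words.
```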
Twiddlers
Twiddlers are internal tools that let Google reorder the results after all the initial scoring has been applied. For example, they can demote a page considered too commercial or unhelpful, even if it scores well on other signals. They are also used to temporarily adjust results after major events or updates.
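Conceptually, a twiddler is just a re-ranking rule applied after the main scoring. A minimal sketch of the pattern, with rules invented purely for illustration:

```python
# Sketch of the twiddler pattern: small rules that reorder an already
# scored result list. The rule below is invented for illustration.

results = [
    {"url": "https://guide.example", "score": 0.80, "commercial": False},
    {"url": "https://shop.example", "score": 0.85, "commercial": True},
]

def demote_overly_commercial(result: dict) -> float:
    """Example twiddler: apply a penalty to overly commercial pages."""
    return result["score"] * (0.8 if result["commercial"] else 1.0)

TWIDDLERS = [demote_overly_commercial]  # Google chains many such rules

for twiddle in TWIDDLERS:
    for r in results:
        r["score"] = twiddle(r)

results.sort(key=lambda r: r["score"], reverse=True)
print([r["url"] for r in results])  # the guide now outranks the shop
```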
VII. How does Google's search engine work internally?
The leaked documents also include a previously unpublished diagram detailing how information flows from the moment a web page is discovered until it appears in the search results. This structure, which Google calls its ‘Search Stack’, is made up of several layers (a toy code sketch follows the list):
1. Multiverse – Data acquisition
This is the starting point. Here Google collects information from everywhere: web pages, structured databases, external feeds and other formats. Googlebot crawls the web and Multiverse organises the raw content.
2. Preparation and indexing
The content is cleaned, normalised and structured. This part includes adding information to the Knowledge Graph, which feeds the rich results (such as panels, accordions or information cards).
3. Understanding the query
Google uses components such as the Query Understanding Service (QUS) and a system called Superroot to interpret what the user really means. It’s not just finding keywords, but understanding the intent behind the search.
4. GWS – Results server
Once the query has been processed, Google Web Server (GWS) is responsible for displaying the results to the user, also taking into account personal context: location, history, type of device, etc.
5. Logging and adjustment
Everything is recorded in the Logging Stack, which allows engineers to review interactions, detect patterns and adjust signals if necessary.
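Taken together, the five layers behave like a pipeline. Here is a toy sketch of that flow; every function body is a placeholder, and only the layer names and their ordering come from the leaked diagram:

```python
# Toy sketch of the Search Stack as a pipeline. All function bodies are
# placeholders; only the layer names and ordering come from the leak.

def multiverse_acquire() -> list[str]:          # 1. data acquisition
    return ["raw page A", "raw page B"]

def prepare_and_index(raw: list[str]) -> dict:  # 2. cleaning + indexing
    return {doc: doc.lower() for doc in raw}

def understand_query(query: str) -> str:        # 3. QUS / Superroot
    return query.strip().lower()                # stand-in for intent analysis

def serve_results(index: dict, query: str) -> list[str]:  # 4. GWS
    return [doc for doc in index if query in index[doc]]

def log_interaction(query: str, shown: list[str]) -> None:  # 5. Logging Stack
    print(f"log: {query!r} -> {len(shown)} results")

index = prepare_and_index(multiverse_acquire())
q = understand_query("  Raw PAGE a ")
shown = serve_results(index, q)
log_interaction(q, shown)
```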
This process confirms what many already suspected: Google’s search engine is not just an index, but a complex and dynamic system that adjusts results in real time according to thousands of variables.
VIII. What does all this mean for SEO in 2025?
Despite the growth of artificial intelligence and advanced models, the basic rules of SEO remain in place. The leaks from the trial confirm what many professionals already sensed: Google has not replaced its classic system, but supplemented it.
This is what really matters today:
1. Content is still king
It is not enough to mention keywords. Google evaluates whether your content is useful, clear, well-structured and relevant to the search intent. The Q* signal penalises sites of low overall quality.
2. Links are still important
The link profile remains one of the key signals. It is not just about quantity, but about quality and context. Links from relevant sites still make a difference.
3. User experience counts more than ever
Navboost makes it clear: if users click on your page and stay, that works in your favour. If they bounce straight back out, your ranking drops. Attractive titles, fast loading, readability and responsive design are essential.
4. Artificial intelligence does not do everything
RankEmbed, DeepRank and Twiddlers help interpret and refine results, but they are not a substitute for traditional signals. If you don’t have a solid foundation, AI is not going to rescue you from the bottom of the rankings.
5. Don’t think about gimmicks, think about consistency
Signals like Q* are based on the overall quality of the site. You can’t fix your ranking with a couple of technical tweaks. You need to build a useful, reliable and consistent site for the long term.
These revelations reinforce a simple truth: real SEO is not magic, it is work well done.
IX. Summary
The leaked documents from the antitrust case against Google have given us a rare glimpse into the inner workings of the world’s most influential search algorithm.
Far from being a 100% AI-driven system, Google continues to rely on a combination of classic signals, user behaviour data and some new layers of processing with models such as RankEmbed and DeepRank. Content quality, site structure, links and user behaviour remain the cornerstones of ranking.
For those who manage websites or do serious SEO work, the message is clear:
✅ Don’t get hung up on technical gimmicks or fads.
✅ Invest in useful content.
✅ Make sure your users find what they are looking for.
✅ And maintain a consistent long-term strategy.
Because, in the end, the algorithm changes… but the logic behind what Google rewards remains the same: quality, relevance and real utility for the user.