6 Takeaways From Semrush’s 2024 Ranking Factors Study

28 Feb 2024


Semrush’s 2024 Ranking Factors Study offers insight into the different factors that may influence a webpage's ranking on search engines.
It’s a promising resource for content creators and publishers looking to optimize their SEO strategies.
As with any study, though, it’s important to approach the findings with a discerning eye: this way, we can tease out the most important details and turn them into actionable, practical insights for improving performance.
We’ve gone through the study and Semrush’s data, and have come up with six key takeaways that we feel provide the greatest value for content marketers to focus on. As always, the key lies in applying these insights thoughtfully, collecting data, and seeing how they impact real-life performance and results.
Different factors will impact different types of content in different ways, so experimentation is key to making the most of this valuable study.
Let’s dive in!

1. Content is (still) king


The fact that text relevance is a key ranking factor is unsurprising. SEO is all about content and links—and in this case, we’re talking about high-quality, relevant content. Its importance has been further reinforced recently by the Helpful Content Update and Google’s revised guidelines for creating “helpful, reliable, people-first content.”
Incidentally, we were particularly impressed with Semrush’s approach of using BERT (Bidirectional Encoder Representations from Transformers) to benchmark content, as this is a key component of how Google describes the inner workings of Search.
But what does “Text Relevance” actually mean?

Semrush includes this technical-sounding description in the report:
“We used embeddings (numerical representations of text) for this factor. By using BERT model embeddings to compare content on a page to content on the SERPs, we were able to assess the similarity (or "relevance") between all content on the SERPs and content on specific URLs. This allowed us to explore content factors beyond the usage of exact match keywords but also based on semantics and context.”
The process described above closely mirrors how Google Search works. Google has been using Natural Language Processing (NLP) and BERT in search results since 2019 to understand the overall context in which words appear and to better match users’ search intent to online content.
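To make the idea concrete, here is a minimal sketch of what that kind of embedding comparison could look like. This is not Semrush’s actual pipeline: the sentence-transformers library, the model name, and the example texts below are our own assumptions for illustration only.

# Minimal sketch of embedding-based "text relevance" (not Semrush's pipeline).
# Assumes the open-source sentence-transformers library and a small BERT-family model.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # hypothetical model choice

# Hypothetical inputs: your page's text vs. text from pages already on the SERP.
page_text = "How to store and reconcile business receipts in your accounting software."
serp_texts = [
    "A step-by-step guide to organizing receipts for accurate financial reporting.",
    "Ten tips for faster month-end bookkeeping.",
]

# Encode everything into embeddings (numerical representations of text).
page_vec = model.encode(page_text, convert_to_tensor=True)
serp_vecs = model.encode(serp_texts, convert_to_tensor=True)

# Cosine similarity approximates semantic relevance, beyond exact-match keywords.
scores = util.cos_sim(page_vec, serp_vecs)[0]
for text, score in zip(serp_texts, scores):
    print(f"{float(score):.2f}  {text}")

Higher scores suggest your page covers the same semantic territory as pages that already rank, even where the exact keywords differ.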
What this means for marketers:
For content creators, this underscores the importance of creating content that is not just keyword-focused but also contextually rich and topically comprehensive. In many ways, it looks like topical coverage is more important than focusing on individual keywords.
Leveraging tools like Surfer or Frase can be beneficial as they incorporate aspects of Google's own NLP entities to gauge content quality. But use these tools to assist you in improving your content—not as the end-all-be-all. The most important benchmark is still whether the content is “helpful, reliable, and people-first.”
Finally, marketers should aim to create original content that’s unique and stands out from existing SERPs. Google is starting to prioritize uniqueness and originality in the search results. Gone are the days when all ten results look exactly the same, with the same titles and similar meta descriptions and content. This is evident in the inclusion of more User Generated Content (UGC) and aligns with the principles set forth in the Helpful Content Update.

2. Including “A URL’s Organic Traffic” is a surprise

The fact that Semrush chose to include a URL’s organic traffic in second position (or as a ranking factor at all) came as a surprise to us—and to others in the SEO community.

Like Ben Luong, we would argue that a URL’s organic traffic is more of a result of page rank and not necessarily a main contributing factor to ranking.
In Semrush’s defense, they do address this “chicken and egg” scenario in the study’s notes.
Nonetheless, even seasoned SEOs and content marketers may have a difficult time interpreting its meaning, or deciding whether it’s meaningful at all in the context of the study’s actionable outcomes. In this case, we would say that correlation doesn’t equal causation.
And here’s a cheeky image to drive that point home.

3. Domain Authority is a ranking factor that evades Google’s “relevancy” algorithm

Domain Authority Score comes in as the sixth most important ranking factor in Semrush’s study.
Based on our own observations of search results over the last few years, this is something we wholeheartedly agree with.
Furthermore, it seems to be something Google’s “relevancy” algorithm is really struggling with. Websites with weak or irrelevant “E-E-A-T” signals (or none at all) are ranking for terms well outside their core content.
For example, the medical website WebMD published an article titled “How to Recycle Tires”—which featured on the first page of Google’s search results!
Some SEOs on Twitter pointed out how strange this was, particularly as the article was “Medically Reviewed by Mahammad Juber, MD.” It wasn’t long before the article was removed from the website, but you can still find the page on the Internet Archive.


This will come as no surprise to anyone who has followed Forbes’ SEO strategy over the last few years. The over-100-year-old business magazine, now online, has diversified away from personal finance content into reviewing consumer goods, covering everything from pots and pans to baby clothes.


Forbes does this in a section of the website called “Vetted”. Their description of what this section covers is extremely broad and gives them license to write about any subject.

Powered by the reach and scale of the world's largest business media brand, Forbes Vetted curates the best products and services for influential innovators who want to live their most satisfying lives without wasting a penny or sacrificing quality.

Why is this a problem?

It has been a trend over the last few years for large publishers to squeeze out smaller, niche sites and use affiliate marketing to increase their revenue.

Source: https://twitter.com/thesearchonion

It doesn’t matter if other content is more in depth, more helpful, and written by true topic experts. The search results suggest that Google “trusts” brands like Forbes and WebMD more because of their domain authority.
This really drives home the importance of backlinks and how they can create a moat that is hard for small publishers to compete with.
Some websites with high domain authority sell guest posts, and this has given rise to something called Parasite SEO. Third parties can now place content (with their affiliate links) on these domains, and it will rank quickly in Google.
This is why we believe Google is likely to tweak its search algorithm in the future; hopefully, domain authority will become less of a ranking factor, especially for content that’s not relevant to a company’s primary mission and position.

4. Content quality should be your “North Star”


Ultimately, ‘quality’ has to do with bringing substantial value to your readers. Review your content with this as your North Star.
We love this. It’s a sentiment we put into every brief we create and every piece of content we write.
How can we provide value that other search results do not? How can we give the reader a return on their attention? 
At Eleven, our solution to this is often to provide actionable advice and insights. What we call “use-today” information. So often, we find that our clients’ competitors cover a topic at a fairly superficial level. It’s noted that something is important, but not how to do it.
For example, while preparing a brief for an article on Accurate Financial Reporting, our account manager left this comment for the writer: 
Consider the "Save your receipts" section in the [REDACTED] reference article.

It does a good job of saying which ones... But where do I store them? What best practices should I use? (Should I scan them? Reconcile them immediately in software? Any software recommendations for it? Etc.)

The reader should never be left wondering, "But hoowww do I do it? Can somebody please just tell me!"

99% of content on the web doesn't tell us how, just that "it's important to" and "we should do." Imagine how frustrating this must be for readers ;)
Another important note we give to writers is to avoid fluff, complicated phrasing, and awkward syntax—to instead “write naturally.”
We emphasize that, while most readers can probably understand what you’re saying, if it’s too much work, they’ll just find a simpler explanation elsewhere.
Plus, removing fluff and repetition and using clear, “common-language” wording helps NLP models like BERT better discern the context. As seen above, that feeds directly into text relevance, one of the most important ranking factors.
Pro Tip: Need some help eliminating fluff from your writing? Check out our 9 Tips to Eliminate Fluff in Your Writing.

5. Speed was not found to be a ranking factor—but it’s still very important!

Largest Contentful Paint (LCP) is a performance metric that measures the time it takes for the largest content element on a page to load, which is critical for understanding user experience.
Despite none of the speed metrics showing significant correlation with higher rankings (“minimal or no correlation to higher ranking”, page 20), LCP stands out as particularly interesting because it’s one of the speed metrics that many companies struggle with the most.
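If you want a quick way to check LCP for a page, one option is Google’s public PageSpeed Insights API. Here is a minimal sketch: the target URL is a placeholder, and the response fields reflect the Lighthouse audit format as we understand it, so verify them against the API documentation before relying on this.

# Quick sketch: pull a page's lab LCP from Google's PageSpeed Insights API (v5).
# The target URL is a placeholder; an API key is optional for occasional use.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

resp = requests.get(
    PSI_ENDPOINT,
    params={"url": "https://example.com/", "strategy": "mobile"},
    timeout=60,
)
resp.raise_for_status()

# Lighthouse reports LCP under the "largest-contentful-paint" audit (milliseconds).
audit = resp.json()["lighthouseResult"]["audits"]["largest-contentful-paint"]
print(f"LCP: {audit['numericValue'] / 1000:.1f}s ({audit['displayValue']})")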
SEOs often debate whether speed really matters, and this study seems to suggest that it doesn’t.
But that would be the wrong takeaway. Search ranking is just one of several metrics you should be watching, and there are very compelling reasons to monitor speed, notably its potential to improve conversion rates.
How significant is that potential? Let’s take a look:

  • Walmart found a two-to-one relationship between page speed and conversions: for every one second a page loaded faster, conversions increased by 2% (see the quick sketch after this list).
  • Mozilla experimented with page speed and found that shaving 2.2 seconds off the average load time increased download conversions by 15.4%.
  • According to AliExpress, an additional 2 seconds of page loading during transactions increases cart abandonment by 87%.
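To make the Walmart figure concrete, here is a back-of-the-envelope sketch. Every input (traffic, conversion rate, order value, seconds saved) is hypothetical; only the roughly +2%-per-second relationship comes from the Walmart finding above.

# Back-of-the-envelope sketch of the revenue upside of faster pages, using the
# Walmart figure above (~2% more conversions per second of load time saved).
# All inputs below are hypothetical.
monthly_visits = 100_000
baseline_conversion_rate = 0.02   # 2% of visitors convert today
average_order_value = 80.0        # dollars per order
seconds_saved = 1.5               # hypothetical speed improvement
uplift_per_second = 0.02          # +2% relative conversions per second saved

new_rate = baseline_conversion_rate * (1 + uplift_per_second * seconds_saved)
extra_orders = monthly_visits * (new_rate - baseline_conversion_rate)
extra_revenue = extra_orders * average_order_value
print(f"Extra orders per month:  {extra_orders:.0f}")     # ~60
print(f"Extra revenue per month: ${extra_revenue:,.0f}")  # ~$4,800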

Page speed may not matter for SEO and ranking, but it definitely matters for conversions and sales revenue.
That’s why we chose to highlight this finding: to demonstrate that the results of the Semrush study must be understood in context.

6. E-E-A-T only gets a brief mention

This came as a surprise to us, given how often E-E-A-T is mentioned in the Search Quality Rater Guidelines.
There could be a few reasons for this. For example, it may be due to the types of pages Semrush tested. That is, they mainly tested home pages and service pages—where author profiles are less important.

So does E-E-A-T matter?
Definitely, yes.
With the rise of AI, we expect to see E-E-A-T signals become more and more important with subsequent Google updates.
Very often, Google paints a “theoretical picture” of what it believes the web should look like before implementing updates that push brands and publishers in that direction. We believe Google wants to drive home the point that Experience, Expertise, Authoritativeness, and Trustworthiness are essential—but its ability to evaluate this is still evolving.
That doesn’t mean it’s unimportant, though. On the contrary: now is the time to start building E-E-A-T signals across your website to avoid a crash when the next Google update comes along.
Plus, just like page speed, E-E-A-T can impact other (more) important outcomes like conversions, sales revenue, and brand trust and loyalty.
This is why E-E-A-T is a core component of our own approach to SEO and content marketing.
It’s one of the primary reasons we work exclusively with topic-expert writers with authoritative bylines.
It’s also why, at the start of a collaboration, we place so much emphasis on building out communications between a company’s marketing department and other critical parts of the business, such as sales, operations, product, customer success, and leadership.
This is where the company’s most valuable knowledge lies, and getting it into the hands of marketers in an efficient, effective way is key to producing high-quality content that’s truly relevant to your audience.
Pro Tip: Not sure how to start adding E-E-A-T signals to your website? Check out our article on How To Use Google’s Guidelines To Improve Your Website’s E-E-A-T.

What we’d like to see added in the next rankings report

The Semrush study is a goldmine of useful information, and they’ve done a great job of categorizing a wide range of ranking factors.
Here’s what we’d like to see in their next ranking report:

Percentage of pages that get traffic

We did wonder about this takeaway from Semrush:

Moving from position 2 to position 1 can lead to an increase of +50% in organic traffic. Start with pages on your website with which you’re already ranking on the first page of Google. Invest time into updating and improving them in order to improve their organic performance.

Taken at face value, this seems like sound advice.

However, the documentation around the Helpful Content Update (HCU) includes mention of a sitewide “classifier” that can affect how all content on a website ranks.

Any content — not just unhelpful content — on sites determined to have relatively high amounts of unhelpful content overall is less likely to perform well in Search, assuming there is other content elsewhere from the web that's better to display. For this reason, removing unhelpful content could help the rankings of your other content.

This classifier process is entirely automated, using a machine-learning model. It is not a manual action nor a spam action. Instead, it's just a new signal and one of many signals Google evaluates to rank content.
This has led many SEOs to speculate about how exactly Google judges whether content is helpful or not. One logical suggestion is to look at how many pages on a website get no search impressions or clicks at all. If the number (or ratio) of such pages were too high, a sitewide classifier could be applied and all of the site’s published content could be affected.
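One rough way to sanity-check this theory on your own site is to measure the share of pages that get zero impressions and zero clicks. The sketch below assumes a page-level CSV export (for example, from Google Search Console’s Pages report); the column names and file name are hypothetical, so adjust them to match your actual export.

# Rough sketch: what share of a site's pages get no search impressions or clicks?
# Assumes a CSV export with hypothetical columns: page, clicks, impressions.
import csv

def zero_traffic_ratio(csv_path: str) -> float:
    total_pages = 0
    zero_pages = 0
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            total_pages += 1
            if int(row["clicks"]) == 0 and int(row["impressions"]) == 0:
                zero_pages += 1
    return zero_pages / total_pages if total_pages else 0.0

ratio = zero_traffic_ratio("pages_export.csv")  # hypothetical file name
print(f"{ratio:.0%} of pages get no impressions or clicks")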

If Semrush were able to add this data to the next report, it could help prove or disprove the theory. This would be incredibly helpful, as websites are loath to remove published content regardless of the number of clicks and impressions it gets. It would also help SEOs know what to focus on following significant traffic drops.

Duration and consistency of ranking

It’s common for many sites to get started, build traffic, get on Google’s radar… and then fall afoul of the algorithm.
Anecdotal evidence suggests that sites started after the last HCU (September 2023) are doing well. But in our experience, it’s common for SEO issues to arise at around the 18-month mark.
Essentially, what we’re hoping for here is longitudinal rather than cross-sectional data. The current Semrush report provides a snapshot of the Internet. This is in contrast to the data that most marketers use in their day-to-day work: we look at changes over time and try to attribute them to our own activities. Having this data on a very large scale would be invaluable.
