Google Penalty Archives | Improve My Search Ranking

Google announces June 2024 spam update (26 June 2024)

Search engine optimisation (SEO) is a constant dance between providing valuable content to users and adhering to the ever-evolving algorithms of search engines like Google.

On June 20th, 2024, Google announced the rollout of its June 2024 Spam Update. This update aims to further refine search results by targeting websites that violate Google’s Webmaster Guidelines.

The announcement itself was brief, delivered via Google’s Search Central Twitter account: “Today we released the June 2024 spam update. It may take up to 1 week to complete, and we’ll post on the Google Search Status Dashboard when the rollout is done.” While concise, the message carries significant weight for website owners and SEO professionals.

Google’s Search Liaison says this update is not the algorithmic component of the site reputation abuse update.

Understanding Google spam updates

Google regularly rolls out spam updates throughout the year. These updates are crucial for maintaining the integrity of search results by identifying and penalising websites that employ deceptive or manipulative tactics to gain higher rankings.

Spam updates typically target a variety of techniques, including:

  • Keyword stuffing: Unnaturally cramming keywords into content to the point of sacrificing readability.
  • Cloaking: Serving different content to search engines than what users see, often with the goal of manipulating search rankings.
  • Hidden text or links: Placing invisible text or links on a webpage to manipulate search engine algorithms.
  • Automated content: Using software to generate low-quality, irrelevant content.
  • Link schemes: Building unnatural or paid links to a website to artificially inflate its authority.

By identifying and penalising such practices, Google ensures that search results prioritise high-quality, informative websites that offer genuine value to users.
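To make “keyword stuffing” a little more concrete: Google has never published an official density threshold, but a crude check like the sketch below can flag copy that leans too hard on a single term. The function and sample text here are invented purely for illustration, not any measure Google uses.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of all words in the text, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100 * hits / len(words)

page_copy = ("Cheap shoes. Buy cheap shoes online. Our cheap shoes are "
             "the cheapest cheap shoes you can buy.")
print(f"'cheap' makes up {keyword_density(page_copy, 'cheap'):.1f}% of the copy")
```

If a single commercial keyword accounts for a large share of the words on a page, it is usually a sign the copy was written for search engines rather than readers.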

The potential impact of the June 2024 spam update

While Google hasn’t released specific details regarding the June 2024 update’s focus, it’s likely to target a combination of the aforementioned spam tactics. This could lead to a shakeup in search rankings, potentially impacting websites that have been relying on outdated or manipulative techniques.

Here’s a breakdown of the potential outcomes:

  • Visibility loss for spammy websites: Websites heavily reliant on spam tactics might see a significant drop in search rankings, potentially disappearing from search results altogether.
  • Rise of high-quality content: This update can be seen as a positive development for websites that prioritise user experience and publish valuable, informative content. Such websites might see a rise in their organic search visibility.
  • Fluctuations for grey-area sites: Websites that fall in a grey area, employing tactics that might not be explicitly classified as spam but push boundaries, could experience ranking fluctuations. This may prompt them to re-evaluate their SEO strategies.

Safeguarding your website from the update’s impact

Here’s what you can do to ensure your website remains unscathed by the June 2024 Spam Update:

  • Review Google’s Search Essentials: Familiarise yourself with the latest guidelines to ensure your website adheres to Google’s best practices.
  • Focus on user experience: Prioritise creating high-quality content that provides genuine value and addresses your target audience’s needs.
  • Build links naturally: Earn backlinks organically through high-quality content and collaboration rather than resorting to paid or manipulative link-building schemes.
  • Monitor your website: Utilise tools like Google Search Console to monitor your website’s health and identify any potential spam flags raised by Google (a small monitoring sketch follows below).
  • Stay updated: Keep yourself informed about the latest SEO trends and Google algorithm updates to adjust your strategy accordingly.

By following these steps, you can build a website that is not only Google-friendly but also genuinely valuable to your users.
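If you want to automate the monitoring step, the Search Console API exposes the same performance data as the web interface. Below is a minimal, hedged sketch of pulling daily clicks and impressions with a service account; the property URL and key-file path are placeholders you would swap for your own.

```python
# Sketch: pull daily clicks/impressions from Google Search Console so a
# sudden drop stands out. Assumes a service-account JSON key that has been
# granted access to the property in Search Console.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE = "https://www.example.com/"    # your verified property (placeholder)
KEY_FILE = "service-account.json"    # hypothetical key file path

creds = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/webmasters.readonly"]
)
gsc = build("searchconsole", "v1", credentials=creds)

report = gsc.searchanalytics().query(
    siteUrl=SITE,
    body={"startDate": "2024-06-01", "endDate": "2024-06-30",
          "dimensions": ["date"], "rowLimit": 31},
).execute()

for row in report.get("rows", []):
    print(row["keys"][0], "clicks:", row["clicks"], "impressions:", row["impressions"])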

Conclusion

The June 2024 Spam Update serves as a reminder that sustainable SEO success lies in providing valuable content and building trust with your audience. While short-term tactics might yield temporary gains, focusing on long-term value creation will ensure your website thrives in the face of algorithm updates and shifting trends.

Regularly review your SEO strategy, stay informed, and prioritise user experience to navigate the ever-evolving SEO landscape with confidence.


The great H1 vs. Title tag debate: Does Google even care? (17 June 2024)

Title tag optimisation has been a hot topic among SEO professionals for years, particularly regarding the importance of matching a web page’s H1 tag with its title tag. In the age of relentless keyword stuffing, it seemed like a no-brainer: cram those keywords into both spots for maximum ranking power!

But with Google’s ever-evolving search algorithms, is this method of title tag optimisation even relevant anymore? In this article, we find out what H1s and title tags actually mean, explore how Google uses them, and see what the search engine giant itself has to say about this SEO mystery.

What is a title tag?

The title tag, nestled within the <head> section of your web page’s code, is like a billboard for your content. It’s the concise description that appears in search engine results pages (SERPs) when someone searches for a term related to your web page’s topic. Think of it as a first impression, a quick and informative blurb that entices users to click and explore further.

What is an H1?

The H1 tag, the largest heading on your webpage, acts as a headline. It introduces the main topic of your content and gives visitors a clear idea of what they’re about to read.

Unlike the title tag, the H1 tag isn’t directly visible in SERPs, but it plays a crucial role in structuring your webpage and making it easier for users to navigate.
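To make the distinction concrete, here is a small sketch (using the BeautifulSoup library, with invented sample markup) that pulls both elements out of a page’s HTML:

```python
# The title tag lives in <head>; the H1 lives in the visible <body>.
from bs4 import BeautifulSoup

html = """
<html>
  <head>
    <title>Running Shoes for Beginners | Example Store</title>
  </head>
  <body>
    <h1>How to Choose Your First Pair of Running Shoes</h1>
    <p>...</p>
  </body>
</html>
"""

soup = BeautifulSoup(html, "html.parser")
print("Title tag:", soup.title.get_text(strip=True))
print("H1 tag:  ", soup.h1.get_text(strip=True))
```

Notice the two describe the same page without matching word for word, which is exactly the point of the discussion that follows.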

How does Google use H1s and title tags?

Google uses both H1s and title tags to understand the content of your webpage and determine its relevance to search queries. They also play a role in crafting the title link – the clickable text displayed in SERPs that users see before clicking through to your page.

Here’s the interesting part: Google doesn’t necessarily prioritise matching the title tag with the H1 tag.

Twenty years ago, keyword-stuffed title tags were the norm, but with advancements in natural language processing (NLP) and machine learning, Google can now grasp the meaning and context of your content without needing an exact keyword match.

Is it important for H1 and title tags to match for title tag optimisation?

According to Google’s Gary Illyes, the answer is a resounding no.

In a Google Office Hours podcast, Illyes stated that you should prioritise user experience: “No, just do whatever makes sense from a user’s perspective.”

This makes perfect sense.

While keyword relevance is still important, cramming your H1 and title tag with the same keywords can sound unnatural and off-putting to users. The goal is to create clear, concise, and informative descriptions that accurately reflect your content and entice users to click.

Here’s a breakdown of how Google uses H1s and title tags, and why you shouldn’t get hung up on an exact match:

  • Title tags: Google prefers titles that are descriptive and concise, accurately reflecting the content of your webpage. Keywords are still valuable, but prioritise user clarity over stuffing.
  • H1 tags: The H1 tag is your chance to go a bit deeper than the title tag. Think of it as a more specific introduction to your content.
  • Title links: Google prioritises using the title tag for the title link displayed in SERPs. However, if the title tag isn’t descriptive enough, Google might use the H1 tag or even pull content from elsewhere on your page to create a better title link.

Conclusion

The age-old question of matching H1s and title tags can finally be put to rest. While both elements remain crucial for SEO, Google prioritises user experience and content clarity over an exact match. Understanding how Google utilises H1s and title tags empowers you to craft a winning strategy that benefits both search engines and users.

The key lies in prioritising the user. Write clear, concise, and informative H1s and title tags that accurately reflect your content and grab user attention. Keyword relevance is important, but prioritise natural language over keyword stuffing.

View your H1 and title tag as complementary pieces, not identical twins. The title tag offers a general overview, while the H1 delves a little deeper into the specific content of your webpage.

Finally, prioritise your title tag (but have a backup plan). Google prefers to use the title tag for the title link displayed in SERPs. However, if your title tag isn’t descriptive enough, Google might use the H1 tag or even pull content from elsewhere on your page. Ensure both your title tag and H1 are clear and informative.

Remember, the ultimate goal is to provide valuable content that keeps users engaged and coming back for more. By following these principles and prioritising user intent, you can create a user-friendly and SEO-friendly experience that lets your fantastic content shine.

Google warns: Fact-check AI-generated content before publishing (30 May 2024)

The increasing capabilities of artificial intelligence have sparked a content creation revolution. AI-powered tools can analyze data, generate online content, and even translate languages – all at lightning speed. This has led many to believe that AI could revolutionize content marketing, churning out high-quality content in a fraction of the time.

However, a recent episode of Google’s “Search Off The Record” podcast served as a reality check, highlighting the potential pitfalls of AI-generated content.

Their key message?

Human oversight remains crucial to avoid spreading misinformation.

Key points from the Google podcast

  • AI-generated content may contain factual errors: Google highlights a significant risk – factual inaccuracies in content created by generative AI tools.
  • Outdated SEO advice a real threat: The podcast showcases a specific example: an AI-suggested social media post promoting an outdated SEO practice (using rel=”prev/next” for pagination) that Google no longer supports.
  • Human fact-checking is essential: Google emphasizes the necessity of human review before publishing AI-generated content.

The experiment: AI & outdated SEO advice

The Google Search Relations team decided to put AI-generated content to the test.

They used Gemini, an in-house large language model (LLM), to create content on technical SEO concepts for social media posts. While exploring Gemini’s capabilities, they discovered a major limitation:

Gary Illyes on AI and factuality

“My bigger problem with pretty much all generative AI is the factuality – you always have to fact-check whatever they are spitting out. That kind of scares me that now we are just going to read it live, and maybe we are going to say stuff that is not even true,” says Gary Illyes.

Illyes’ quote underlines a critical concern – AI-generated content requires verification to ensure accuracy. Especially in technical fields like SEO, factual errors can have significant consequences. Incorrect information can mislead readers and damage your website’s credibility with search engines.

Outdated advice

The experiment highlighted a potential pitfall: outdated information can be woven into AI-generated content.

Lizzi Sassman on outdated training data

“If there’s enough myth circulating or a certain thought about something or even outdated information that has been blogged about a lot, it might come up in our exercise today, potentially,” according to Lizzi.

Sassman’s point proved true almost immediately. Gemini recommended using rel=”prev/next” for pagination, a practice Google no longer supports, as confirmed by John Mueller:

John Mueller on rel=”prev/next”: “It’s gone. It’s gone. Well, I mean, you can still use it. Don’t get me wrong. But it has no effect.”

This isn’t just a minor technical detail. Following outdated SEO advice can negatively impact your website’s ranking in search results.
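If you want to check whether your own templates still carry the deprecated markup, a quick audit is easy to sketch. The snippet below (the URL is a placeholder) flags any rel="prev"/"next" link tags it finds; per Mueller they are harmless, but they are pointless to keep.

```python
# Sketch: flag deprecated rel="prev"/"next" pagination hints on a page.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/blog/page/2/"  # hypothetical paginated URL

soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

for link in soup.find_all("link"):
    rels = link.get("rel") or []          # bs4 returns rel as a list of values
    if "prev" in rels or "next" in rels:
        print("Deprecated pagination hint:", rels, "->", link.get("href"))
```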

The takeaway: human oversight is still critical

The Google Search Relations team’s discussion underscores the importance of human oversight, echoing wider concerns about responsible AI adoption. While AI tools can be a valuable asset for content creation, they shouldn’t replace human expertise.

Why we should care

While AI tools can be helpful for content creation and analysis, it’s crucial to approach their output with a critical eye. Content creators and marketers should use AI as a tool to brainstorm ideas, generate drafts, and conduct research, but verification and revision by human experts remain essential.

Blind trust can be costly

Relying solely on AI-generated content can lead to publishing inaccurate or misleading information, potentially damaging your SEO and reputation. Furthermore, plagiarism can be a concern, as AI may unintentionally copy phrases or sentence structures from its training data.

Conclusion

While AI-powered content creation tools hold promise, careful human review remains essential to ensure factual accuracy and avoid spreading misinformation. Additionally, human creativity and editorial judgment are irreplaceable in crafting content that is not only informative but also engaging and persuasive.

The ideal scenario involves a symbiotic relationship between AI and human content creators, where AI augments human capabilities to produce high-quality content at scale.

You can watch the full podcast episode here:

https://youtu.be/AGoN0oeuZ2k

Understanding Google’s New Guidelines on Website Ranking Drops (7 May 2024)

Google recently updated its guidelines on diagnosing and managing website ranking drops, providing a deeper insight into the potential causes of traffic decreases and offering advice on recovery strategies.

This update is crucial for webmasters and SEO professionals as it sets new expectations and clarifies several aspects of site management and search engine optimization.

Let’s decode and understand what changed:

Understanding Google’s new approach to ranking drops

The latest documentation from Google introduces a more realistic approach toward the recovery from ranking drops.

Unlike the previous guidance, which hinted that most ranking declines could be remedied, the new update stresses the complexity involved in identifying specific causes.

This change indicates a shift towards accepting that not all ranking drops are easily reversible, highlighting the intricate nature of SEO.

Clarifications in graphical representations

Significant improvements have been made in how information is presented in the guide. The new documentation includes enhanced graphics with clearer labels, making it easier to understand the factors contributing to traffic declines.

For example, what was previously categorized under “Site-level technical issue” has now been split into more precise categories like “Large drop from an algorithmic update” or “site-wide security or spam issues.”

This delineation is vital as it helps differentiate between issues caused by external changes and those stemming from internal site problems.

Expanded section on algorithmic changes

A notable enhancement in the guide is the expanded section dedicated to algorithmic changes.

This section merges what were previously separate discussions on policy violations and algorithm updates into a comprehensive analysis. It helps website owners understand the impact of Google’s core and minor updates on their site’s visibility and ranking.

The guide now divides the impact of these updates into two categories: small drops and large drops in position.

This is particularly useful as it offers tailored advice on how to respond based on the severity of the ranking change.

Practical advice on dealing with traffic drops

Google’s revised guidelines provide actionable advice for webmasters experiencing drops in search rankings.

  • For smaller, less significant position changes (e.g., dropping from position 2 to 4), Google suggests monitoring without making drastic changes, as these fluctuations can be normal and might self-correct over time.
  • For more substantial declines (e.g., dropping from position 4 to 29), a thorough reassessment of the entire site is recommended, focusing on enhancing the user experience, reliability, and content quality.

It’s important to note that changes to a website might not yield immediate results. Some updates may take effect quickly, while others could require months to influence search rankings positively.

Google also emphasizes that no changes guarantee a ranking improvement, as the search algorithm prioritizes content that it deems most relevant and useful to users.

Revised recommendations for webmasters

The updated documentation includes revised recommendations for addressing ranking drops due to algorithmic updates. It advises webmasters to first analyze the performance of their top pages in the Search Console to determine prior rankings. Depending on whether the drop is small or large, the approach should vary.

As explained above, small drops typically do not require drastic actions, whereas significant drops might necessitate a comprehensive site review to ensure that the content aligns well with user needs and Google’s quality expectations.

Subtle adjustments for enhanced understanding

In addition to major updates, Google has made several minor tweaks for clarity.

These include the rewording of certain headings for better precision, like changing “You recently moved your site” to “Site moves and migrations,” which provides clearer guidance on specific scenarios.

Conclusion

Overall, Google’s updated documentation on website ranking drops serves as a detailed guide for navigating the complexities of SEO with a more nuanced and realistic approach. It underscores the importance of adapting to changes within the search landscape and setting appropriate expectations for traffic recovery and website performance.

You can check out the full updated guidance here: https://developers.google.com/search/docs/monitor-debug/debugging-search-traffic-drops

How Much Duplicate Content is Acceptable for Google? (5 October 2022)

It is estimated that approximately 25 to 30 percent of all web content is duplicate. That’s because different quotes, case studies, and tips are shared across the Internet on different websites.

But that’s not usually the kind of “duplicate content” that gets penalised by Google. It’s important to look at it in full context.

However, if you duplicate content with the intent of plagiarising it, that usually leads to Google penalties. In addition to plagiarism, creating content that’s not unique to each page on your own website can also lead to the same problem.

The second scenario is something that local businesses often run into.

When local businesses have to create different product or service pages, or different location-specific landing pages, they run into the potential problem of duplicating content.

The general rule is that such duplicate content must be avoided.

But … what percentage equals duplicate content for Google?

Is it 10%? 20%? 50%?

This was the question that Bill Hartzer recently asked Google’s John Mueller on Twitter.

“Is there a percentage that represents duplicate content?

For example, should we be trying to make sure pages are at least 72.6 percent unique than other pages on our site?

Does Google even measure it?”

Google’s John Mueller responded that there is no fixed percentage or number that Google uses to measure duplicate content.

“There is no number (also how do you measure it anyway?),” replied John Mueller.

So, how does Google detect duplicate content?

If there is no specific number, how exactly does Google detect duplicate content?

We have two quotes from Google search experts that offer some insight into it.

The first bit of information came from Matt Cutts in 2013. According to Matt:

“[When Google finds bits and pieces of duplicate content on a web page, they] try to group it all together and treat [it as] if it is just one piece of content.

“It’s just treated as something that we need to cluster appropriately. And we need to make sure that it ranks correctly.”

Matt also explained that Google first (1) chooses which pages to show on the SERPs for a specific query and (2) then filters out duplicate pages so the user experience can be improved.

The second bit of information came from Gary Illyes in 2020 during the Search Off the Record podcast, while discussing whether duplicate content detection and canonicalisation are the same thing.

Gary Illyes explained:

“First, you have to detect the dupes, basically cluster them together, saying that all of these pages are dupes of each other, and then you have to basically find a leader page for all of them.

And that is canonicalization.

So, you have the duplication, which is the whole term, but within that, you have cluster building, like dupe cluster building, and canonicalization.”

Then Gary Illyes explained how Google detects duplicate content:

“So, for dupe detection, what we do is, well, we try to detect dupes.

And how we do that is perhaps how most people at other search engines do it, which is, basically, reducing the content into a hash or checksum and then comparing the checksums.”
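Gary’s description is easy to illustrate in miniature. The toy sketch below hashes each page’s normalised text and flags exact matches; it is a gross simplification of whatever Google actually runs, with invented URLs and copy, but it captures the checksum idea.

```python
# Toy duplicate detection: reduce each page's text to a checksum, then
# pages sharing a checksum are treated as duplicates of each other.
import hashlib

def content_checksum(text: str) -> str:
    normalised = " ".join(text.lower().split())   # crude normalisation
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

pages = {
    "/services/london": "We offer boiler repair in London. Call us today.",
    "/services/leeds":  "We offer boiler repair in London. Call us today.",
    "/about":           "Founded in 2005, we are a family-run business.",
}

seen = {}
for url, text in pages.items():
    checksum = content_checksum(text)
    if checksum in seen:
        print(f"{url} duplicates {seen[checksum]}")
    else:
        seen[checksum] = url
```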


More importantly, however, local businesses should focus on avoiding duplicate content in the first place.

Here are a few tips to help you with that:

  • Organise your content in thematically related topic clusters with unique H1 headings and meta titles.
  • Redirect duplicate pages if you only want to keep one version.
  • If you want to keep both versions on your website for some reason but want Google to only index one page, use canonicalisation.
  • For the master versions, always use a self-referential canonical tag (a quick check is sketched after this list).
  • Avoid parameterised URLs, as Google can index them alongside the main URL, creating duplicate versions of a page in the Google index.
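As a quick follow-up to the canonical tips above, here is a minimal sketch (with a placeholder URL) that fetches a page and reports whether its canonical tag is self-referential:

```python
# Sketch: check which canonical URL a page declares.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/services/boiler-repair/"  # placeholder

soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
canonical = soup.find("link", rel="canonical")

if canonical is None:
    print("No canonical tag found")
elif canonical.get("href") == url:
    print("Self-referential canonical – good")
else:
    print("Canonical points elsewhere:", canonical.get("href"))
```

Note that trailing slashes and http/https variants count as different URLs here, so normalise both sides before comparing in real use.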

 


Google’s John Mueller on the importance of Core Web Vitals (16 August 2021)

We knew that Core Web Vitals (CWV) would play a role as a search engine ranking factor, but how big a role? And is it really as big a deal as some SEOs — and, especially, Google — have been making it?

Google’s John Mueller recently confirmed the importance of Core Web Vitals, describing it as “more than a tie-breaker.”

 

Is CWV a big deal?

 

There has been some scepticism about how important Core Web Vitals would be. This started because John Mueller once said about Core Web Vitals that “relevance is still by far much more important.”

A Core Web Vitals FAQ published by Google also downplayed the importance of CWV. The FAQ page mentioned the following:

“Page experience is just one of many signals that are used to rank pages.

Keep in mind that intent of the search query is still a very strong signal, so a page with a subpar page experience may still rank highly if it has great, relevant content.”

 

More than a tie-breaker

 

A post on Reddit recently questioned the importance of Core Web Vitals by asking:

“Anyone else not buying Core Web Vitals?

I just find it hard to believe that this actually becomes a greater part of the ranking algo. Has anyone seen dramatic gains or decreases based on it so far?”

Another poster agreed with the concern and replied:

“I believe Google admitted it’s basically just a tie breaker.”

That’s when Google’s John Mueller chimed in and confirmed that Core Web Vitals is indeed important and worth paying attention to.

“It is a ranking factor, and it’s more than a tie-breaker,” says John Mueller. 

“But it also doesn’t replace relevance. Depending on the sites you work on, you might notice it more, or you might notice it less.

As an SEO, a part of your role is to take all of the possible optimizations and figure out which ones are worth spending time on. Any SEO tool will spit out 10s or 100s of “recommendations”, most of those are going to be irrelevant to your site’s visibility in search.

Finding the items that make sense to work on takes experience.”

 

User experience

 

John also highlighted that CWV helps improve user experience, apart from being an important search engine ranking factor.

“The other thing to keep in mind with core web vitals is that it’s more than a random ranking factor, it’s also something that affects your site’s usability after it ranks (when people actually visit).

If you get more traffic (from other SEO efforts) and your conversion rate is low, that traffic is not going to be as useful as when you have a higher conversion rate (assuming UX/speed affects your conversion rate, which it usually does).

CWV is a great way of recognizing and quantifying common user annoyances.”
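If you want to see your own site’s field Core Web Vitals, the Chrome UX Report (CrUX) API exposes the real-user data Google collects. A hedged sketch follows: you would need your own API key, the origin is a placeholder, and the metric names reflect the API at the time of writing.

```python
# Sketch: query the CrUX API for an origin's 75th-percentile CWV metrics.
import requests

API_KEY = "YOUR_API_KEY"   # placeholder: your own Google API key
endpoint = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"

resp = requests.post(endpoint, json={"origin": "https://www.example.com"}, timeout=10)
resp.raise_for_status()
metrics = resp.json()["record"]["metrics"]

for name in ("largest_contentful_paint", "first_input_delay", "cumulative_layout_shift"):
    if name in metrics:
        print(name, "p75:", metrics[name]["percentiles"]["p75"])
```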

 

Conclusion

 

As Google focuses more and more on user experience, we expect Core Web Vitals to play a bigger part in the future. We also expect Google to introduce similar factors that focus on user experience, beyond just being search ranking factors.

Google clarifies that spam reports represent only a small portion of manual actions (22 July 2020)

Do spam reports lead to manual actions by Google? Nobody knew about it precisely, but many SEO professionals believed that the two are correlated.

However, Google’s Gary Illyes recently clarified that spam reports only represent a small portion of manual actions. Instead, these reports are generally used to improve Google’s spam detection capabilities and search results.

“Thanks to our users, we receive hundreds of spam reports every day. While many of the spam reports lead to manual actions, they represent a small fraction of the manual actions we issue. Most of the manual actions come from the work our internal teams regularly do to detect spam and improve search results.”

Google’s spam detection capabilities are very effective. On average, it filters out 25 billion spam pages every day. Google search results are 99% spam-free, but there is “always room for improvement.”

“The reality is that while our spam detection systems work well, there’s always room for improvement, and spam reporting is a crucial resource to help us with that. Spam reports in aggregate form help us analyse trends and patterns in spammy content to improve our algorithms,” said Gary Illyes.

Content quality and spam detection

Google’s immense focus on high-quality content has been pivotal in improving its spam-detection capabilities in the SERPs.

According to Gary Illyes:

“Overall, one of the best approaches to keeping spam out of Search is to rely on high quality content created by the web community and our ability to surface it through ranking.”

A clear distinction

Google has also made a clear distinction between manual actions against websites and spam reports in its Webmaster Guidelines.

A new paragraph in the Webmaster Guidelines explicitly states:

“If you believe that another site is abusing Google’s quality guidelines, please let us know by filing a spam report. Google prefers developing scalable and automated solutions to problems, and will use the report for further improving our spam detection systems.”

In summary, a spam report against a website will not necessarily lead to a manual action. 

You can find more information about this in Google’s blog post.

5 SEO tactics you should focus on in 2020 (6 November 2019)

2020 is here. Have you prepared a list of what you are going to do better this year in terms of search engine optimisation?

As we move forward, SEO professionals and webmasters will continue to face new and exciting challenges. In a dynamic SEO world, you need to be fully prepared.

To help you do that, here are 5 SEO tactics that we believe could play an important role in 2020.

 

1. Create excellent content

There really is no alternative to great content, and that will remain the case in 2020. If you want to rank higher in the search engine results pages (SERPs), keep creating high-quality, informative, and useful content for your readers.

Here are a few tips you should remember when approaching content creation:

  • Put your readers at the centre of every content piece. Write for human readers — not for search engines.
  • Make sure your content is accurate and trustworthy (more on this in the next point).
  • Include exclusive and helpful information that readers aren’t likely to find anywhere else on the web.
  • Make sure that the tone and style of your writing resonate with your readers.

 

2. Focus on E-A-T

Google’s E-A-T has been in the spotlight ever since its August 2018 update. E-A-T stands for expertise, authoritativeness, and trustworthiness, and it helps Google penalise websites with inaccurate content.

The August 2018 “medic update” penalised a lot of websites offering health and medical advice. But that crackdown wasn’t limited to just health-related websites. Google also penalised several other websites with inaccurate content and information.

The idea is that Google doesn’t only want relevant content to rank higher in the SERPs. The search engine also wants that content to have accurate information. This trend is only just getting started and will become a major deciding factor in the coming years, with 2020 expected to be a pivotal point in this next SEO evolution.

Therefore, if your content is well-optimised and has loads of links, that’s not enough. Your site’s content also needs to have 100% accurate and credible information — especially if you are in the medical, health, finance, safety, and similar niches.

 

3. Provide a great user experience

User experience continues to gain importance in overall SEO. Engagement metrics — which are often directly impacted by the user experience — help Google determine which pages should rank higher.

Engagement metrics such as bounce rate, dwell time, and organic click-through rate are becoming more and more crucial.

In 2020, make sure that you continue to focus on these engagement metrics and the overall user experience. Otherwise, no matter how awesome your content is, you will have a tougher time ranking your web pages on top if they do not offer a great user experience and/or have poor engagement metrics.

 

4. Experiment with content clusters

Content clusters, or topic clusters, are fast becoming a popular SEO strategy. And in 2020, this approach is expected to play an even bigger role.

As per this strategy, you create “pillar content” on the basis of broad topics. Then you create smaller, narrower web pages that focus on long-tail keywords and sub-topics around that pillar content. In the end, you interlink the smaller cluster pages with the pillar content.

HubSpot recently shifted to the topic cluster strategy and experienced significantly better results in terms of rankings. This is how HubSpot’s content map used to look before implementing the topic cluster strategy:

 

[Image: HubSpot’s content map before adopting the topic cluster strategy]

 

And this is how it looks after they started using the content cluster SEO strategy:

 

[Image: HubSpot’s content map after adopting the topic cluster strategy]

 

As you can see, HubSpot’s website now tackles different topics as different clusters. They create pillar content for each main topic and then build several smaller web pages for each sub-topic. In the end, all the web pages covering sub-topics are interlinked with the main pillar content.

HubSpot found that “the more interlinking they did, the better the placement in search engine results pages (SERPs). Impressions (or views) also increased with the number of links they created.”
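The structure is easy to model. The toy sketch below represents one cluster and checks that every cluster page links back to its pillar; the URLs and link map are invented for illustration (in practice you would build the link map from a crawl of your site).

```python
# Toy model of a pillar/cluster structure and its internal links.
cluster = {
    "pillar": "/guide-to-seo/",
    "pages": ["/seo/keyword-research/", "/seo/link-building/", "/seo/site-speed/"],
}

# links[page] = set of internal URLs that page links to
links = {
    "/guide-to-seo/": {"/seo/keyword-research/", "/seo/link-building/", "/seo/site-speed/"},
    "/seo/keyword-research/": {"/guide-to-seo/"},
    "/seo/link-building/": {"/guide-to-seo/"},
    "/seo/site-speed/": set(),   # missing its link back to the pillar
}

for page in cluster["pages"]:
    if cluster["pillar"] not in links.get(page, set()):
        print(f"{page} does not link back to the pillar {cluster['pillar']}")
```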

 

5. Speed up your website

As more and more SEO professionals adopt the best practices recommended by Google, the differences between websites’ SEO prowess continue to diminish.

More webmasters and content marketers are creating useful content now than 15 years ago. Therefore, factors such as page loading speed will continue to become more important differentiators.

Every second counts: each extra second of load time has a negative impact on conversions, traffic, search engine rankings, your website’s credibility, and the overall user experience.

In 2020, make sure your website loads as quickly as possible — ideally within a second.

For more information on how to optimise the loading speed of your website, read our free guide on website speed optimisation.

How Big Was Google’s June 2019 Core Update? (17 June 2019)

Google’s June 2019 core update is a big one, and the effects of that update were felt across the web by large and small websites alike.

When Google announced the core update, it was mentioned that the update wouldn’t be a “big” one when compared to previous core algorithm updates.

However, the data that has been gathered so far shows a slightly different picture. Some websites gained traffic and rankings, while for others traffic dropped significantly.

Sistrix shared the data they got in a blog post. Here is what it shows.

[Image: Sistrix daily Visibility Index chart showing the impact of the June 2019 core update]

As you can see, there is a jump in traffic for some websites. According to Sistrix, “today’s data clearly shows the impact of the core update. In the daily Visibility Index of the Toolbox, you can see changes from 05.06. on 06.06.”

According to data presented by Sistrix, here are the top 10 UK websites that gained more visibility after the core algorithm update.

 

[Image: Top 10 UK websites that gained visibility after the core algorithm update]

 

On the other hand, the following are the top 10 websites that significantly lost visibility after Google rolled out the June 2019 update.

 

[Image: Top 10 UK websites that lost visibility after the core algorithm update]

Steve Paine from Sistrix mentioned, “We are seeing many YMYL websites, but there are also classical news sites, retail, and many others. It appears that this Google Core Update is broader than the last updates.”

 

How to fix the problem?

If you have a website that is affected by this core algorithm update, Google’s John Mueller has some advice for you. However, it is not very specific.

John mentioned that core algorithm updates are not about a specific issue (relevance, for example) that the webmaster can fix. Here is what he said:

“With a lot of the relevance updates, a lot of the kind of quality updates, the core updates that we make, there is no specific thing where we’d be able to say you did this and you should have done that, and therefore we’re showing things differently.”

He also added, “Sometimes the web just evolved. Sometimes what users expect evolves and similarly, sometimes our algorithms are, the way that we try to determine relevance, they evolve as well.”

 

Focus on quality

At the end of the day, quality is all that matters. Make sure you are focusing on quality as much as you can, and your website will be safe from most core algorithm updates.

For more detailed information on what ‘quality’ means, refer to this article by Google in which the author poses several questions that you should ask yourself to determine if you have a “quality” website as per Google’s standards.

5 big reasons why your traffic might drop (and how to deal with them) (28 May 2019)

However good your website and content are, you may still lose search engine rankings and traffic. After all, there are many factors involved — from content quality to competitor performance and algorithm changes — that can have an impact.

When that happens, it is important to identify the reason so you can take the necessary steps and resolve the problems.

Here are five big reasons why your traffic might drop and what to do about it.

1. Lost backlinks

Backlinks not only help websites gain better rankings in the search engine results pages, but they are also often responsible for driving referral traffic.

If you lose a bunch of important backlinks, your website may lose a lot of valuable traffic, because:

  • The referral traffic coming from those backlinks will stop, and
  • A drop in the number of backlinks may also result in your website losing search engine rankings. For instance, if your website moves from the 1st page to the 2nd page of the SERPs, you are likely to lose a substantial amount of organic traffic.

Use an SEO tool like Ahrefs or Moz to identify whether there has been a drop in backlinks. When it comes to winning back those backlinks (and earning new ones), make sure you only target relevant, high-quality links for your website. A simple way to spot lost links is sketched below.
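A tool-agnostic approach is to export your backlink list on two dates and diff them. In the sketch below, the CSV file names and the source_url column are assumptions you would adapt to your tool’s export format.

```python
# Sketch: diff two backlink exports to list links that disappeared.
import csv

def backlink_set(path: str) -> set:
    with open(path, newline="", encoding="utf-8") as f:
        return {row["source_url"] for row in csv.DictReader(f)}

old_links = backlink_set("backlinks-2019-04.csv")  # hypothetical exports
new_links = backlink_set("backlinks-2019-05.csv")

for lost in sorted(old_links - new_links):
    print("Lost backlink:", lost)
```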

2. Changes in search algorithms

Google regularly changes and improves its search engine algorithm. Depending on that update, you may gain or lose some traffic.

In all honesty, there isn’t much you can do to avoid a drop in traffic if it is because of a change in search algorithms. Just continue following Google’s best practices and guidelines, and your site is unlikely to be negatively affected by algorithm updates.

In case your website does get affected by an algorithm update, learn more about the algorithm. An algorithm update usually targets a specific aspect, e.g., the relevancy of backlinks, user experience, content quality, etc.

Once you learn what caused the drop in traffic, you can take appropriate steps to resolve the underlying issues.

3. Manual penalties

A sudden drop in search engine rankings and traffic may be the result of a manual penalty. Unlike algorithmic drops, manual actions are not the result of an algorithm update; they are applied when Google determines that a site violates its guidelines.


Another sign of a manual penalty is that your website will lose traffic from one search engine and will continue to perform well in others.

Your goal should be to get that penalty removed as soon as possible.

Log in to your Google Search Console account and check the notifications. The Manual Actions section will contain all the details you need if any of your web pages do not comply with Google’s guidelines. Follow the suggestions mentioned there to fix the problem.

4. Competition from other websites

There are only so many websites that can appear on Google’s first page. This means that if competitors improve their content and websites, they may overtake you and push you onto the next pages.

That will result in a loss of traffic.

When that happens, know that the loss of traffic may not be a result of a manual action or a search engine algorithm. It just means that your competitor(s) outdid you.

Run a detailed analysis of what your competitors are doing: what type of content they are creating, how fast their websites load, what the user experience on their sites is like, and more.

Then compare those metrics with your website and try to improve on the fronts you are lacking.

5. Website loading speed

Website loading speed is a relatively new but very important search engine ranking factor.

Over time, websites tend to slow down as they accumulate more resources, images, and plugins. If your website becomes too slow, you may lose higher search engine positions and, therefore, a good amount of traffic.

Make sure to keep track of your website’s loading speed over time to see if it’s slowing down. You can use free tools like Pingdom, GTmetrix, and Google PageSpeed Insights for that, or query the PageSpeed Insights API directly, as sketched below.
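The PageSpeed Insights v5 API returns the same Lighthouse data as the web tool, which makes spot checks easy to script. A minimal sketch, with a placeholder URL (for regular use you would add your own API key):

```python
# Sketch: fetch a page's Lighthouse performance data via PageSpeed Insights.
import requests

endpoint = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
resp = requests.get(endpoint,
                    params={"url": "https://www.example.com/", "strategy": "mobile"},
                    timeout=60)
resp.raise_for_status()
result = resp.json()["lighthouseResult"]

print("Performance score:", result["categories"]["performance"]["score"])
print("First Contentful Paint:",
      result["audits"]["first-contentful-paint"]["displayValue"])
```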

For more information on how to improve the loading speed of your website, read our free guide on speed optimisation.

Conclusion

When you lose rankings and traffic, it is easy to panic, but panicking doesn’t solve anything. Instead, take it step by step and identify the main reason.

Once you know the reason, you can easily take the appropriate steps to resolve those issues and get your traffic back.
