Zentury Spotlight – June 2024 Spam Update Rollout Is Complete

For this Zentury Spotlight, we have several significant movements in the digital landscape to cover. The first is Google’s completion of its June 2024 Spam Update. The update marks a significant step in Google’s ongoing effort to improve search quality by reducing spammy content. Because it carries broad implications, website owners and SEO experts have been monitoring it closely.

It is important to know that the update can heavily impact the ranking and visibility of affected websites. Fair practitioners stand to benefit, since only websites that directly violate Google’s spam policies are targeted. Users benefit as well, because the update pushes high-quality, relevant content to rank better on search engine results pages.

The June 2024 Spam Update uses Google’s spam-detection systems to better identify and demote websites that engage in manipulative practices such as cloaking, keyword stuffing, and link schemes. Reception has been mixed, as some websites have experienced significant traffic fluctuations. Even so, the overall results are positive, and as these systems are refined, Google moves closer to its long-term vision of a spam-free web.


30th Anniversary of Robots.txt – Happy Birthday!

For the 30th anniversary of robots.txt, Google has taken the opportunity to emphasize the file’s importance with a commemorative post. Although it was first introduced back in 1994, robots.txt has remained relevant, allowing website owners to give instructions directly to search engine crawlers. This brings several benefits, chiefly keeping crawlers away from sections of a site that should not be crawled and helping optimize crawl budget.

Over time, robots.txt has evolved to accommodate the growing complexity of the web, and Google has highlighted its continued relevance to current and future SEO strategies. Google has even encouraged webmasters to review and update their robots.txt files regularly. With a well-configured file, webmasters can keep crawlers away from private or low-value sections of a site and make better use of crawl budget (though robots.txt alone does not guarantee a page stays out of the index). Overall, robots.txt is here to stay and may play an even bigger role in future SEO strategies, so it is worth keeping an eye on.
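To illustrate, here is a minimal, hypothetical robots.txt; the paths and sitemap URL are placeholders, and the right rules depend entirely on your own site structure:

```
# Apply to all crawlers
User-agent: *
# Keep crawlers out of internal or low-value sections (hypothetical paths)
Disallow: /admin/
Disallow: /cart/
# Everything else may be crawled
Allow: /

# Point crawlers at the sitemap (placeholder URL)
Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the root of the domain (e.g. /robots.txt), and each crawler reads the group of rules that matches its user agent.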


Google’s EEAT and the Story of the Perfect Ranking Signal!

SEO specialists have never known exactly how Google’s search algorithms work, as they are shrouded in mystery, and uncovering the perfect ranking signal remains one of the field’s ultimate goals. Over the past year, somewhat surprisingly, Google has shed some light on its quality criteria through the EEAT framework.

EEAT stands for Experience, Expertise, Authoritativeness, and Trustworthiness. These four factors form the foundation of Google’s assessment of content quality and relevance.

This has had major implications for the world of SEO, as it has confirmed several long-held theories. Furthermore, EEAT’s focus on credible, well-researched, and relevant content has made it clearer which elements matter most.

Websites that demonstrate high levels of these factors usually rank well. Trustworthiness stands out in particular, as it is bolstered by signals such as secure connections (HTTPS) and transparent information about who is creating the content.

These factors hint at what Google wants to see from webmasters. In the current SEO landscape, the safest course is to focus on content quality and fair practices across your websites.

In the end, we may never know exactly what the perfect ranking signal is. However, as time goes on, we are slowly getting closer to well-tuned SEO strategies. Naturally, these strategies and parameters will shift as new Google updates arrive, which makes keeping up with new and relevant information paramount for success.


Soft 404 Errors and How They Impact SEO

Another important topic in SEO this month has been soft 404 errors. These occur when a server returns a 200 OK status code for a page that does not actually exist, typically while showing a ‘page not found’ message to users. Soft 404s differ from traditional 404 errors in an important way: because the status code says the page is fine, they confuse search engines and can negatively impact SEO.

The impact of these ‘broken’ pages is significant and should be addressed promptly. The hit to SEO performance comes from Google treating such URLs as very low-quality pages.

To mitigate the impact of soft 404 errors, make sure that any non-existent page returns a proper 404 status code. Additionally, create a custom 404 page that guides users back to relevant content and keeps bounce rates down. Together, these steps eliminate soft 404s and remove their drag on your rankings.
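As a minimal sketch, here is how a proper 404 with a custom page might look in a Python Flask app; the framework choice and the page content are assumptions for illustration, not part of Google’s guidance:

```python
from flask import Flask, render_template_string

app = Flask(__name__)

# Hypothetical custom 404 page that points users back to useful content.
NOT_FOUND_PAGE = """
<h1>Page not found</h1>
<p>Try our <a href="/">homepage</a> or <a href="/blog">latest articles</a>.</p>
"""

@app.errorhandler(404)
def page_not_found(error):
    # Return the friendly page *and* a real 404 status code,
    # so crawlers don't mistake this for a live page (a soft 404).
    return render_template_string(NOT_FOUND_PAGE), 404

if __name__ == "__main__":
    app.run()
```

The key detail is the explicit 404 in the return value: a pretty “not found” page served with 200 OK is exactly what creates a soft 404.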

Furthermore, regular site audits can help you identify and fix soft 404 errors quickly. Doing so maintains the integrity of a website’s SEO health and gives you additional insight into how individual pages are performing. Since Google has been warning about these errors, we may well see a more direct statement from them in the coming months.
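One simple audit check, sketched below assuming the `requests` library is available, is to request a URL that is guaranteed not to exist and confirm the server answers with a 404 rather than a 200; the domain and path are placeholders:

```python
import requests

# Placeholder site; a path like this should never exist on a healthy site.
BOGUS_URL = "https://www.example.com/this-page-should-not-exist-12345"

response = requests.get(BOGUS_URL, timeout=10)

if response.status_code == 200:
    # 200 for a missing page is the classic soft 404 signature.
    print("Possible soft 404: missing page returned 200 OK")
elif response.status_code == 404:
    print("OK: missing page correctly returns 404")
else:
    print(f"Unexpected status code: {response.status_code}")
```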


Google Search Console Bug – Performance Reports Delayed (90+ Hours for Now)

Finally, the last piece of news we have for you today is a bug in Google Search Console. It has caused significant delays in performance report updates, with some users seeing lags of 90+ hours. The bug has raised many concerns in the SEO world, especially among those who rely on near-real-time data, and because it is so widespread, many organizations are struggling to make informed decisions about their digital strategies.

This is a major issue, and Google has acknowledged the bug, assuring users that its engineering team is working diligently to resolve it. If you are impacted, it may be best to pause decisions that depend on this data until the issue is resolved.


Alternatively, you can use other analytics tools to monitor website performance and keep executing your strategies. While the numbers may not match Search Console exactly, you will still get relevant information right away instead of waiting 90+ hours.
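As a rough stopgap while reports lag, you can also pull basic traffic counts straight from your own server logs. The sketch below assumes a standard combined-format access log; the file path is a placeholder:

```python
from collections import Counter

# Placeholder path to a standard access log (combined log format).
LOG_PATH = "/var/log/nginx/access.log"

hits = Counter()
with open(LOG_PATH) as log:
    for line in log:
        parts = line.split('"')
        if len(parts) < 2:
            continue  # skip malformed lines
        request = parts[1]          # e.g. 'GET /blog/post HTTP/1.1'
        fields = request.split()
        if len(fields) >= 2:
            hits[fields[1]] += 1    # count hits per requested path

# Show the ten most-requested paths as a rough performance snapshot.
for path, count in hits.most_common(10):
    print(f"{count:6d}  {path}")
```

This is no substitute for Search Console data, but it gives a same-day view of which pages are actually being requested.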

The issue should not persist for long; Google’s prompt response and commitment to transparency are reassuring for everyone affected. We can expect a resolution fairly soon, and the long-term impact should be limited.
