Zentury Spotlight – Elon Musk is Rebranding Twitter to X

Google Shares Whether It Is a Good Idea to Copy Its Way of Rewriting Titles

Is it a good idea to copy the title tags that Google modifies for the SERPs?

John Mueller from Google responded to the question of whether it makes sense to write titles that are similar to how Google rewrites them in search engine results pages (SERPs).

On Mastodon, someone noticed that Google was altering the title elements of their webpages, frequently removing the site name.

They took this to mean that Google possibly considers the site name unnecessary and that they should perhaps remove the site name entirely from the title tag.

Mueller suggests keeping the site's name in the title, since it makes it simpler to verify the site name that Google displays above the title. He would not presume that a rewritten version is superior (for SEO or for users), and he mentions that it is a well-established pattern that shouldn't be altered only for Google.

As for title elements, Google advises that they should be concise and descriptive, avoiding vague wording.

Finally, Google suggests keeping titles concise and not overusing the site's name, meaning that a marketing slogan should not be repeated across the entire site.


Elon Musk is Rebranding Twitter to X

Elon Musk shared that over the past weekend, Twitter started rebranding as X in an effort to build an all-encompassing platform with unlimited interactivity.

The recognizable bird design was swapped out for the new X logo across the social network, and the official Twitter account's name was also changed to X.

Elon Musk stated that Twitter was acquired by X Corp primarily to ensure freedom of speech, but also to accelerate the progress of X, the "everything" app. He wrote on his (now) X profile that the "Twitter" name made sense when there were only short messages going back and forth, comparing them to birds tweeting. Over time, the app evolved, and now you can post anything from short messages to long videos. Since new options, such as comprehensive communications and the ability to manage your entire financial world, will be added to the app, the name Twitter no longer makes much sense to Musk.

Those who follow Musk are not surprised by the renaming because he has already spoken about his intentions for a full social platform that provides more than just text-based updates.

The new, long-envisioned platform incorporates a credibility score to make it simpler to identify bots and false material.

According to Musk, acquiring Twitter sped up the plans for X by three to five years compared to starting from scratch.

X CEO Linda Yaccarino claims that X wants to develop into a global marketplace for AI-powered concepts, goods, and services.


Google Warns Against Low-Cost TLDs Due to Spam Risks

Google warns against low-cost TLDs and claims that TLD keywords do not boost SEO.

In the Internet's hierarchical Domain Name System (DNS), a TLD, short for top-level domain, is the most general level of the hierarchy. It is the last component of a domain name, for example, "io" in "zentury.io".
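As a quick illustration (not something from Google), the last label can be read off a hostname with a few lines of Python; the helper name and example domains below are just placeholders.

```python
# Minimal sketch: the TLD is the last dot-separated label of a hostname.
# The helper name and example domains are illustrative placeholders.
def get_tld(hostname: str) -> str:
    """Return the last label of a hostname, e.g. 'io' for 'zentury.io'."""
    return hostname.rstrip(".").rsplit(".", 1)[-1].lower()

print(get_tld("zentury.io"))        # io
print(get_tld("example.com"))       # com
print(get_tld("shop.example.xyz"))  # xyz
```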

Google’s Search Relations Team recently highlighted, in their podcast, how website owners should approach selecting a top-level domain (TLD).

The conversation focused on the need to balance cost, credibility, and spam risk, weighing the trade-offs between pricey ".com" domains and inexpensive options like ".xyz".

Googlers Mueller, Illyes, and Splitt cautioned against selecting the least expensive TLDs, noting that spammers usually target free and inexpensive domains, which can negatively impact search rankings.

Google's Search Relations team also dispelled the idea that a TLD matching your keywords automatically gives you a ranking benefit.

Splitt questioned whether a coffee business might get any SEO advantages from having a domain like fantastic.coffee, and Illyes replied clearly “No.”

The final conclusion of the podcast was to avoid cheap or free TLDs, as they are prone to spam, which can harm your reputation and search ranking.


Google Shares Which Is Less Bad: 404 Error Pages or 301 Redirects

Gary Illyes from Google responded to a query about which is less bad to use: 301 redirects or 404 error pages. Illyes clarified what to consider for both status codes.

He clearly stated that both status codes are completely harmless, and that you should decide for yourself which status code is better for your scenario and go with it.

Illyes refers to the 404 and 301 responses as “status codes”. Both of them are responses by a server to a request for a webpage. 

A browser visiting a webpage is actually requesting that webpage from a server. In response to the browser's request, the server reports the status of that request through a message. That is why Illyes references the 404 and 301 codes as status codes.

They are also known as response codes since they are essentially the server's responses to the browser.

Technically, 301 and 404 are defined as status codes in the HTTP specifications maintained by the IETF.

The specification distinguishes five classes of status codes:

  • 1xx (Informational): The request was received, continuing process
  • 2xx (Successful): The request was successfully received, understood, and accepted
  • 3xx (Redirection): Further action needs to be taken in order to complete the request
  • 4xx (Client Error): The request contains bad syntax or cannot be fulfilled
  • 5xx (Server Error): The server failed to fulfill an apparently valid request
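As a rough illustration of what these responses look like in practice, here is a small Python sketch (using the third-party requests library; the URL is a placeholder, and the snippet is not part of Illyes' answer) that fetches a page without following redirects and inspects the status code the server returned.

```python
# Sketch: inspecting the status code a server returns for a request.
# The URL is a placeholder; requests is a third-party library (pip install requests).
import requests

response = requests.get("https://example.com/old-page", allow_redirects=False)

print(response.status_code)  # e.g. 200, 301, or 404
if 300 <= response.status_code < 400:
    # 3xx responses include a Location header pointing to the new URL.
    print("Redirects to:", response.headers.get("Location"))
elif response.status_code == 404:
    print("Page not found")
```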

If a page is missing as a result of merging two sites, a publisher can 301 redirect old or out-of-date pages to new pages that are comparable in content.

However, if the new site doesn't cover a comparable topic, the old pages can return 404 errors, indicating that the page isn't there. Therefore, Illyes suggests deciding which scenario is more suitable for you.
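A minimal sketch of the two options, assuming a small Flask app with hypothetical URL paths, might look like the following: old URLs that have a close match on the merged site get a permanent 301 redirect, while old URLs with no comparable content simply return a 404.

```python
# Sketch of both options after a site merge; the Flask setup and paths are hypothetical.
from flask import Flask, redirect, abort

app = Flask(__name__)

# Old page that has a comparable page on the merged site: permanent redirect (301).
@app.route("/old-blog/seo-tips")
def old_seo_tips():
    return redirect("/blog/seo-tips", code=301)

# Old page with no comparable content on the new site: report it as missing (404).
@app.route("/old-blog/unrelated-topic")
def old_unrelated():
    abort(404)
```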

New Findings Show That the Quality of ChatGPT Has Worsened

Researchers have been evaluating ChatGPT over the past several months and found that output quality has declined.

Given that OpenAI's chatbots are continuously updated rather than static technologies, such fluctuations are possible.

OpenAI does not announce many of the modifications made to GPT-3.5 and GPT-4, let alone detail what changes were made. The result is that users sense a difference but are unsure of what has changed, which has led to open discussions in ChatGPT groups on Twitter and Facebook as well as on OpenAI's community platform.

The researchers, who are affiliated with UC Berkeley and Stanford University (one is also the CTO of Databricks), wanted to see how well GPT versions 3.5 and 4 performed over time.


The researchers said there is no clear explanation for why the models appeared to deteriorate over time, because OpenAI doesn't explain every change it makes to the system.

One of the researchers suggested potential explanations on Twitter, speculating that the training technique known as Reinforcement Learning from Human Feedback (RLHF) may be reaching its limit.

In the end, the researchers concluded that, because of the lack of consistency in the output, businesses that rely on OpenAI should consider implementing routine quality assessments in order to keep an eye out for unforeseen changes.
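One way to set up such a routine check, sketched below, is to periodically re-run a fixed set of prompts with known expected answers and track the pass rate over time; call_model() is a hypothetical wrapper around whichever model API a business uses, and the test prompts are purely illustrative.

```python
# Sketch of a routine quality check: re-run fixed prompts with known answers
# and log the pass rate over time. call_model() is a hypothetical placeholder
# for the actual model API call; the prompts and expected answers are illustrative.
from datetime import date

TEST_CASES = [
    ("Is 17077 a prime number? Answer yes or no.", "yes"),
    ("What is 12 * 12? Answer with the number only.", "144"),
]

def call_model(prompt: str) -> str:
    """Placeholder for the real model API call."""
    raise NotImplementedError

def run_quality_check() -> float:
    passed = 0
    for prompt, expected in TEST_CASES:
        answer = call_model(prompt).strip().lower()
        if expected in answer:
            passed += 1
    score = passed / len(TEST_CASES)
    print(f"{date.today()}: {passed}/{len(TEST_CASES)} checks passed ({score:.0%})")
    return score
```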