Posted on August 10, 2024

During a session with Google's Gary Illyes and Lizzi Sassman, the panelists explained that there are three signals that prompt the search engine to send Googlebot to crawl a site more heavily. Although they downplayed the importance of being crawled constantly, they acknowledged that there are ways to encourage Googlebot to revisit a website.

Influence Of Quality Content On Crawling Frequency

One thing mentioned about website quality is that it can change. Many people continue to struggle with the "discovered - currently not indexed" problem, and that is sometimes caused by SEO techniques they have learned and believe are good for a website. I have 25 years of SEO experience, and one constant in this industry is that what the industry deems best practice is often years behind what Google is actually doing. If someone is convinced they are doing everything right, it is very hard to persuade them that something is wrong.

At the 4:42 mark of the session, Gary Illyes gave a reason for a high crawl frequency: Google's algorithms pick up on signs of high quality.

"…often if the site has good quality, if the viewers find it useful and in general they like it, then Googlebot, well Google, prefers to return more from this site…"

There is a lot of nuance packed into that statement that was left unsaid, such as: what are those signs of high quality and helpful content that signal Google to check back more often?

Anyway, here are some of my speculations on that subject:

We know there are patents in the area of branded search that count branded searches made by users as implied links. It is worth noting that some people confuse "implied links" with "brand mentions," but brand mentions are not what that patent describes at all.

Next there is the Navboost patent, which dates back to 2004. Some people associate Navboost with clicks, but if you read through the actual patent filed in 2004 you will not find the term click through rate (CTR). It talks about user interaction signals. Clicks were a hot topic in the early part of this century, but if you look at the actual research papers and patents it becomes clear that it was never as neat as "the monkey clicks the website listed in the SERPs, Google ranks it higher, monkey gets banana."

All in all, I believe that signals showing people visit a site and consider it useful can contribute to how that site ranks. And sometimes being useful simply means giving people what they expect to see, not necessarily what they ought to be getting.

Site owners will tell me that Google is ranking all sorts of rubbish, and when I sit down and look at those sites I can half see their point. At the same time, that content is giving people exactly what they expect to see, because they don't know how to tell the difference between what they expect and genuinely high quality content (the Froot Loops theory).

What is the Froot Loops theory? It's a consequence of Google's tendency to use user satisfaction signals to judge whether its search results are making users happy. Here's what I previously published about Google's Froot Loops algorithm:

"Have you ever walked down the supermarket cereal aisle and noticed how many of the boxes are loaded with sugar? That is what user satisfaction looks like. People expect to see sugar bomb cereals in the cereal aisle, and supermarkets satisfy that expectation.

Sometimes I stand in front of the cereal shelf, staring at the Froot Loops and wondering who actually eats that stuff. Apparently plenty of people do, because the box sits right there on the supermarket shelf where people expect to find it.

Google is in the same position as the supermarket in this scenario. Google delivers the results that are most likely to satisfy users as quickly as possible, the same way the cereal aisle delivers the cereals people want."

An example of a garbagey site that nevertheless satisfies users is one of the most visited recipe sites, which churns out easy, inauthentic recipes that rely on shortcuts like canned cream of mushroom soup. I have spent plenty of time in the kitchen, and those recipes are off-putting to me. Meanwhile, people I know genuinely love that site, simply because they don't know any better and just want an easy recipe.

What the helpfulness conversation is really about is understanding a site's audience and giving them what they want, even if what they want isn't necessarily what they should be getting. Providing what people want is, I believe, what searchers will judge as useful, and that is what rings Google's helpfulness signal bells.

Increased Publishing Activity

Illyes and Sassman said that another thing that can make Googlebot crawl more is publishing frequency, for example when a site suddenly publishes a lot more pages. Illyes made the point in the context of a hacked site that suddenly started pumping out more web pages, but the mechanism is the same: a hacked site that is publishing many pages would indeed cause Googlebot to crawl more often.

If we look at that statement from the perspective of the forest rather than the trees, it's clear he is hinting that a surge in publishing activity can trigger a surge in crawl activity. The point is not that your site was hacked; it's that Googlebot crawls more because the rate of publishing has grown.

Here is where Gary cites a burst of publishing activity as a Googlebot trigger:

“…it can also mean that, I don’t know, the site was hacked, then a bunch of new URLs excite Googlebot and then out it goes and crawls like a madman.”

The moral of the story is that a lot of new pages is what makes Googlebot happy, gets it to "go bananas," and crawl a site. There's not much more to add on that point, so let's move on to the next one.

Consistency Of Content Quality

Finally, Gary Illyes said that Google may take the overall quality of a site into consideration, and that a reassessment of that quality can cause a decrease in crawling frequency.

Here's what Gary said:

“…if we are not crawling much or we are gradually slowing down with crawling, that might be a sign of low quality content or that we rethought the quality of the site.”

What does Gary mean when he says Google "rethought the quality of the site"? My understanding is that it is a loose way of saying that the overall quality of a site can be dragged down by sections that are not up to the same standard as the rest of the site. That matches my own view: at some point the amount of low quality content can outweigh the good content and drag everything else down with it.

Categories: Information Technology
