
What is GoogleOther – Google’s New Crawler?

Pushkar Sinha
Apr 26, 2023
4 mins read
Google has introduced GoogleOther to optimize Googlebot’s performance and take some strain off it.

We all know about Googlebot – the crawler that discovers, crawls, and documents our sites and any updates we make, so that the updated pages get indexed and ranked in Google Search.

So, what is a Crawler?

As per Google’s Documentation –

“Google uses crawlers and fetchers to perform actions for its products, either automatically or triggered by user request.

“Crawler” (sometimes also called a “robot” or “spider”) is a generic term for any program that is used to automatically discover and scan websites by following links from one web page to another. Google’s main crawler is called Googlebot.”

As per Google’s documentation, there are three types of crawlers –

  • Common crawlers – e.g., Googlebot
  • Special-case crawlers
  • User-triggered fetchers

Google’s documentation explains in detail what these are and what they do.

Google has introduced a new crawler called GoogleOther to reduce the burden on Googlebot and streamline its tasks. GoogleOther will mainly handle non-essential crawling, such as R&D crawls.

In his LinkedIn post, Gary Illyes mentioned:

“We added a new crawler, GoogleOther to our list of crawlers that ultimately will take some strain off of Googlebot. This is a no-op change for you, but it’s interesting nonetheless I reckon.

As we optimize how and what Googlebot crawls, one thing we wanted to ensure is that Googlebot’s crawl jobs are only used internally for building the index that’s used by Search. For this we added a new crawler, GoogleOther, that will replace some of Googlebot’s other jobs like R&D crawls to free up some crawl capacity for Googlebot.

The new crawler uses the same infrastructure as Googlebot and so it has the same limitations and features as Googlebot: hostload limitations, robots.txt (though different user agent token), http protocol version, fetch size, you name it. It’s basically Googlebot under a different name.”
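
Since GoogleOther obeys robots.txt under its own user agent token, site owners who want to treat it differently from Googlebot can give it a separate group of rules. A minimal sketch is below; the disallowed paths are placeholders for illustration, not a recommendation to block anything:

    # Rules for Googlebot's Search indexing crawls
    User-agent: Googlebot
    Disallow: /private/

    # Rules for GoogleOther's non-Search (e.g., R&D) crawls
    User-agent: GoogleOther
    Disallow: /experiments/

If no GoogleOther group exists, the crawler falls back to the generic rules, so most sites need no changes at all.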

For SEO teams and webmasters, the introduction of another crawler should not cause any major changes, but it would be prudent to keep an eye on your site and periodically check the following:

  • Robots.txt file – Ensure it is up to date and, if needed, add a separate group for GoogleOther (as in the example above).
  • Search Console statistics – Keep tabs on crawl stats in Search Console to ensure that crawl budget and crawl frequency stay under control, and make changes as necessary.
  • Website – Monitor user engagement, page speed, time to load, and bounce rate as usual.
  • Server logs – Check server logs to keep track of the requests made by Googlebot and GoogleOther and to spot any discrepancies (see the sketch after this list).
  • Ranking drops – Other reasons and updates could be responsible, but keeping a lookout also gives SEO teams peace of mind, as any of the above factors might cause rankings to fluctuate.
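
For the server log check above, a short script can tally requests by user agent. This is a minimal sketch, assuming a local access.log file whose entries contain the raw user agent string; the file name and log format are assumptions about your setup:

    # count_crawler_hits.py – rough sketch; adjust the log path and format to your server
    from collections import Counter

    counts = Counter()
    with open("access.log", encoding="utf-8", errors="ignore") as log:
        for line in log:
            # GoogleOther and Googlebot identify themselves with distinct tokens
            if "GoogleOther" in line:
                counts["GoogleOther"] += 1
            elif "Googlebot" in line:
                counts["Googlebot"] += 1

    print(counts)

Keep in mind that user agent strings can be spoofed, so verify any suspicious traffic against Google’s published crawler IP ranges before acting on it.
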
About Pushkar Sinha


Pushkar Sinha is the Head of Digital Marketing at FirstPrinciples Growth Advisory. With 15+ years of expertise, he specializes in SEO for European, American, and Indian markets, both in agency and in-house roles. His holistic skill set encompasses Google Ads, Affiliate Marketing, SEO, SEM, PPC, E-Commerce, and Project Management. Pushkar is...

