Google Updates Googlebot File Size Limit Docs

Google updated its Googlebot documentation to clarify information about file size limits.

The change involves moving information about default file size limits from the Googlebot page to Google’s broader crawler documentation. Google also updated the Googlebot page to be more specific about Googlebot’s own limits.

What’s New

Google’s documentation changelog describes the update as a two-part clarification.

The default file size limits that previously lived on the Googlebot page now appear in the crawler documentation. Google said the original location wasn’t the most logical place because the limits apply to all of Google’s crawlers and fetchers, not just Googlebot.

With the defaults now housed in the crawler documentation, Google updated the Googlebot page to describe Googlebot’s specific file size limits more precisely.

The crawling infrastructure docs list a 15 MB default for Google’s crawlers and fetchers, while the Googlebot page now lists 2 MB for supported file types and 64 MB for PDFs when crawling for Google Search.

The crawler overview describes a default limit across Google’s crawling infrastructure, while the Googlebot page describes Google Search–specific limits for Googlebot. Each resource referenced in the HTML, such as CSS and JavaScript, is fetched separately.
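For a rough sense of how those separate fetches add up, here is a minimal Python sketch (not an official Google tool) that downloads a page and reports the size of the HTML document and of each stylesheet and script it references. The URL is a placeholder and the parsing is deliberately simplified.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class ResourceCollector(HTMLParser):
    """Collect URLs of stylesheets and scripts referenced by a page."""

    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
            self.resources.append(attrs["href"])
        elif tag == "script" and attrs.get("src"):
            self.resources.append(attrs["src"])


def report_fetch_sizes(page_url):
    """Print the size of the HTML document and of each referenced CSS/JS file,
    since each resource is fetched (and size-limited) as a separate file."""
    with urlopen(page_url) as resp:
        html = resp.read()
    print(f"{page_url}: {len(html) / 1_048_576:.2f} MB (HTML)")

    collector = ResourceCollector()
    collector.feed(html.decode("utf-8", errors="replace"))
    for ref in collector.resources:
        resource_url = urljoin(page_url, ref)
        with urlopen(resource_url) as resp:
            size = len(resp.read())
        print(f"  {resource_url}: {size / 1_048_576:.2f} MB")


report_fetch_sizes("https://example.com/")
```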

Why This Matters

This fits a pattern of documentation changes Google has been making since late 2025. In November, Google migrated its core crawling documentation to a standalone site, separating it from Search Central. The reasoning was that Google’s crawling infrastructure serves products beyond Search, including Shopping, News, Gemini, and AdSense.

In December, more documentation followed, including faceted navigation guidance and crawl budget optimization.

The latest update continues that reorganization. The 15 MB file size limit was first documented in 2022, when Google added it to the Googlebot help page. Google’s John Mueller confirmed at the time that the limit wasn’t new; it had been in effect for years, and Google was simply putting it on the record.

If you manage crawl budgets or troubleshoot indexing on content-heavy pages, note that Google’s docs now describe the limits differently depending on where you look.

The crawling infrastructure overview lists 15 MB as the default for all crawlers and fetchers. The Googlebot page lists 2 MB for HTML and supported text-based files, and 64 MB for PDFs. Google’s changelog does not explain how these figures relate to one another.

Default limits now live in the crawler overview documentation, while Googlebot-specific limits are on the Googlebot page.
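To spot-check a specific URL against the figures quoted above, the following Python sketch compares a fetched file’s size to the limits as this article describes them. The thresholds are hard-coded from the documentation figures cited here and may change, so treat them as assumptions rather than guarantees.

```python
import urllib.request

# Limits as quoted in this article (bytes). These are assumptions based on the
# documentation described above; verify against Google's current docs before
# relying on them.
HTML_LIMIT = 2 * 1024 * 1024       # 2 MB for HTML and supported text-based files
PDF_LIMIT = 64 * 1024 * 1024       # 64 MB for PDFs
DEFAULT_LIMIT = 15 * 1024 * 1024   # 15 MB default across crawlers and fetchers


def check_fetch_size(url):
    """Fetch a URL and report its size against the documented limits."""
    with urllib.request.urlopen(url) as resp:
        body = resp.read()
        content_type = resp.headers.get("Content-Type", "")

    size = len(body)
    if "pdf" in content_type:
        limit = PDF_LIMIT
    elif "html" in content_type or "text" in content_type:
        limit = HTML_LIMIT
    else:
        limit = DEFAULT_LIMIT

    status = "over" if size > limit else "within"
    print(f"{url}: {size / 1_048_576:.2f} MB ({content_type or 'unknown type'}), "
          f"{status} the {limit // 1_048_576} MB limit")


if __name__ == "__main__":
    # Example usage with a placeholder URL.
    check_fetch_size("https://example.com/")
```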

Looking Ahead

Google’s documentation reorganization suggests more updates to the crawling infrastructure site are likely in the coming months. By separating crawler-wide defaults from product-specific documentation, Google can more easily document new crawlers and fetchers as they are introduced.

