Google can render JavaScript. That’s not up for debate. But that doesn’t mean it always does, or that it does so immediately or completely.
Since Google’s 2024 comments suggesting it renders all HTML pages, many developers have questioned whether no-JavaScript fallbacks are still necessary. Two years later, the answer is clearer and more nuanced.
Google’s stance on JavaScript rendering
In July 2024, Google sparked debate during an episode of Search Off the Record titled “Rendering JavaScript for Google Search.” When asked how Google decides which pages to render, Martin Splitt said:
- “If it’s so expensive, how do we decide which page should get rendered and which one doesn’t?”
Zoe Clifford, from Google’s rendering team, replied:
- “We just render all of them, as long as they’re HTML, and not other content types like PDFs.”
That comment quickly led developers, especially those building JavaScript-heavy or single-page applications, to argue that no-JavaScript fallbacks were no longer necessary.
Many SEOs weren’t convinced. The remark was informal, untested at scale, and lacking detail. It wasn’t clear:
- How rendering fit into Googlebot’s process.
- Whether pages were queued for later execution.
- How the system behaved under resource constraints.
- Whether Google might fall back to non-rendered crawling under load.
Without clarity on timing, consistency, and limits, removing fallbacks entirely still felt risky.
What Google’s documentation actually says
Google’s documentation now gives us a much clearer picture of how JavaScript rendering actually works. Let’s start with the “JavaScript SEO basics” page:


What Google says:
- “Googlebot queues all pages with a 200 HTTP status code for rendering, unless a robots meta tag or header tells Google not to index the page. The page may stay on this queue for a few seconds, but it can take longer than that. Once Google’s resources allow, a headless Chromium renders the page and executes the JavaScript. Googlebot parses the rendered HTML for links again and queues the URLs it finds for crawling. Google also uses the rendered HTML to index the page.”
Google clearly states that JavaScript rendering doesn’t necessarily happen on the initial crawl. Once resources allow, a headless browser is used to parse JavaScript.
Googlebot likely won’t click on all JavaScript elements, so this probably only includes scripts that don’t require user interactions to fire.
This is important because it tells us Google can make some basic determinations before JavaScript is rendered, via subsequent execution queues.
If content is generated behind elements (content tabs, etc.) that Google doesn’t click, it likely won’t be discovered without no-JavaScript fallbacks.
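As a hypothetical illustration (the markup and the `discoverable` helper below are invented for this sketch, not anything from Google), compare a tab whose panel is only injected on click with one whose panel ships in the initial HTML and is merely toggled:

```javascript
// Two tab implementations, shown as the HTML a crawler receives before
// any JavaScript runs. Both snippets are invented examples.

// Pattern 1: the panel is filled in by a click handler, so the raw HTML
// contains only an empty placeholder.
const clickToLoad = `
  <button onclick="loadSpecs()">Specifications</button>
  <div id="specs"></div>`;

// Pattern 2: the panel content ships in the HTML and JavaScript only
// toggles the "hidden" attribute, so the text exists with or without JS.
const toggleOnly = `
  <button onclick="specs.hidden = false">Specifications</button>
  <div id="specs" hidden>Weight: 1.2 kg. Battery: 10 h.</div>`;

// A renderer that never clicks can only see what was served.
const discoverable = (html, text) => html.includes(text);

console.log(discoverable(clickToLoad, "Battery")); // false
console.log(discoverable(toggleOnly, "Battery")); // true
```

Google may well render the first pattern’s shell, but only the second guarantees the text is in the DOM without any interaction.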
Looking at Google’s “How Search works” documentation:


The language is much simpler. Google states it will attempt, at some point, to execute any discovered JavaScript. There’s nothing here that directly contradicts what we’ve seen so far in other Google documentation.
On March 31, Google published a post titled “Inside Googlebot: demystifying crawling, fetching, and the bytes we process,” which further clarifies JavaScript crawling.


The notes on partial fetching are particularly interesting. Google will only crawl up to 2MB of HTML. If a page exceeds this, Google won’t discard it entirely, but instead examines only the first 2MB of returned code.
Google explicitly states that excessive resource bloat, including large JavaScript modules, can still be a problem for indexing and ranking.
If your JavaScript approaches 2MB and appears at the top of the page, it can push HTML content far enough down that Google won’t see it. The 2MB limit also applies to individual resources pulled into a page. If a CSS file, image, or JavaScript module exceeds 2MB, Google will ignore it.
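The effect can be sketched in a few lines (this only simulates the documented 2MB cap; `partialFetch` and the sample page are invented, not Googlebot’s real implementation):

```javascript
// Simulate partial fetching: keep only the first 2MB of response bytes.
const FETCH_LIMIT = 2 * 1024 * 1024; // 2MB, per Google's documentation

function partialFetch(html) {
  return Buffer.from(html, "utf8").subarray(0, FETCH_LIMIT).toString("utf8");
}

// A huge inline script at the top of the page pushes the real content
// past the cap, so the truncated fetch never contains it.
const bloatedPage =
  "<script>" + "x".repeat(FETCH_LIMIT) + "</script>" +
  "<main>Product description the crawler should index</main>";

console.log(partialFetch(bloatedPage).includes("Product description")); // false
```

The same oversized script served near the bottom of the page, or split out and kept under the cap, would leave the `<main>` content inside the fetched bytes.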
We’re starting to see that Google’s claim that it renders all pages comes with important caveats.
In practice, it seems unlikely that a page built with no consideration for server-side rendering (SSR) or no-JavaScript fallbacks would be handled optimally. This highlights why it’s risky to take comments from Googlers at face value without following how the details evolve over time.
The question we opened with is also evolving. It’s less “Do I need blanket no-JavaScript fallbacks in 2026?” and more “Do I still need critical-path fallbacks and resilient HTML within my application?”
Google’s recent search documentation updates add more context:


Google has recently softened its language around JavaScript. It now says it has been rendering JavaScript for “several years” and has removed earlier guidance that suggested JavaScript made things harder for Search.
It also notes that more assistive technologies now support JavaScript than in the past.
Within that same documentation, Google still recommends pre-rendering approaches, such as server-side rendering and edge-side rendering.
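The core idea behind those recommendations can be sketched briefly (the data and markup here are placeholders, not a framework endorsement): the server sends complete HTML, and any client-side JavaScript only enhances it.

```javascript
// Minimal server-side rendering sketch: all critical content is present
// in the HTML string before any browser JavaScript runs.
const products = [
  { name: "Widget", price: "$10" },
  { name: "Gadget", price: "$25" },
];

function renderPage(items) {
  const rows = items.map((p) => `<li>${p.name}: ${p.price}</li>`).join("");
  return `<!doctype html><html><body><ul>${rows}</ul></body></html>`;
}

// A server would return this string directly, e.g. with Node's http module:
// require("http").createServer((req, res) => {
//   res.setHeader("Content-Type", "text/html");
//   res.end(renderPage(products));
// }).listen(3000);

console.log(renderPage(products).includes("Widget")); // true
```

Edge-side rendering applies the same principle, with the HTML assembled at a CDN edge node instead of the origin server.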


So while the language is softer, Google isn’t suggesting developers can ignore how JavaScript impacts SEO.
Looking again at the December 2025 updates:


Google states that non-200 pages may not receive JavaScript execution. This means no-JavaScript fallbacks for internal linking within custom 404 pages may still be important.
Google also notes that canonical tags are processed both before and after JavaScript rendering. If source HTML canonicals and JavaScript-modified canonicals don’t match, this can cause significant issues. Google suggests either omitting canonical directives from the source HTML so they’re only evaluated after rendering, or ensuring JavaScript doesn’t modify them.
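One way to catch this mismatch in an audit is to extract the canonical URL from the raw HTML and from the rendered HTML and compare them. The helper below is hypothetical: the naive regex assumes `rel` appears before `href` and is no substitute for a real HTML parser.

```javascript
// Pull the canonical URL out of an HTML string (rough sketch; assumes
// the tag is written with rel before href).
function extractCanonical(html) {
  const match = html.match(
    /<link[^>]+rel=["']canonical["'][^>]+href=["']([^"']+)["']/i
  );
  return match ? match[1] : null;
}

// Compare the canonical seen before rendering with the one seen after.
function canonicalsMatch(rawHtml, renderedHtml) {
  return extractCanonical(rawHtml) === extractCanonical(renderedHtml);
}

const raw = `<link rel="canonical" href="https://example.com/page">`;
const rendered = `<link rel="canonical" href="https://example.com/page?v=2">`;
console.log(canonicalsMatch(raw, rendered)); // false
```

In practice, `rawHtml` would come from a plain HTTP fetch and `renderedHtml` from a headless browser; any `false` result is a page worth investigating.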
These updates reinforce an important point: even as Google becomes more capable at rendering JavaScript, the initial HTML response and status code still play a critical role in discovery, canonical handling, and error processing.
Dig deeper: Google removes accessibility section from JavaScript SEO section
What the data shows
JavaScript rendering is introducing new inconsistencies across the web, according to recent HTTP Archive data:


We can see that since November 2024, the share of crawled pages with valid canonical links has dropped.
Via the HTTP Archive’s 2025 Almanac:


About 2-3% of rendered pages exhibit a “modified” canonical URL, something Google’s documentation explicitly states can be confusing for its indexing and ranking systems. That 2-3% doesn’t explain the larger drop in valid canonical deployment since November 2024.
Other factors are likely at play, such as the adoption of new CMS platforms that don’t properly handle canonicals. The rise of vibe-coded websites built with tools like Cursor and Claude Code may also be contributing to these issues across the web.
In July 2024, Vercel published a study to help demystify Google’s JavaScript rendering process:


It analyzed more than 100,000 Googlebot fetches and found that all of them resulted in full-page renders, including pages with complex JavaScript. However, 100,000 fetches is a relatively small sample given Googlebot’s scale.
The study was also limited to sites built on specific frameworks, so it’s unwise to assume Google always renders pages perfectly. It’s also unclear how deeply those renders were analyzed.
It does suggest that Google attempts to fully render most pages it encounters. Broadly speaking, Google can generate JavaScript-modified renders, but the quality of those renders is still up for debate. As noted earlier, the 2MB page and resource limits still apply.
Because this study dates to mid-2024, any contradictions with Google’s updated 2025-2026 documentation should be resolved in the documentation’s favor.
Vercel also revealed a notable finding:
- “Most AI crawlers don’t execute JavaScript. We tested the major ones (ChatGPT, Claude, and others), and the results were consistent: none of them render client-side content. If your Next.js site ships critical pages as JavaScript-dependent SPAs, those pages are inaccessible to the systems shaping how people discover information.”
So even if Google is far more capable with JavaScript than it once was, that isn’t true across the broader web ecosystem. Many systems still rely on HTML-first delivery. That’s why you shouldn’t rush to remove no-JavaScript fallbacks: they may still be critical to your future visibility.
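A simple audit follows from this: fetch the page without rendering, the way a non-JavaScript crawler would, and check that the critical content is already in the markup. This is a sketch under assumptions: Node 18+ for the built-in `fetch`, and the URL, phrase, and `fallback-audit` user agent are placeholders.

```javascript
// Pure check: is the phrase present in the un-rendered markup?
function contentPresent(rawHtml, phrase) {
  return rawHtml.toLowerCase().includes(phrase.toLowerCase());
}

// Fetch like a non-rendering crawler would (no JavaScript executed).
async function auditUrl(url, phrase) {
  const res = await fetch(url, {
    headers: { "User-Agent": "fallback-audit/1.0" },
  });
  return contentPresent(await res.text(), phrase);
}

// A JS-dependent SPA shell: the critical phrase is nowhere in the HTML,
// so a crawler that doesn't render JavaScript never sees it.
const spaShell = `<div id="root"></div><script src="/bundle.js"></script>`;
console.log(contentPresent(spaShell, "Add to cart")); // false

// auditUrl("https://example.com/product", "Add to cart")
//   .then((ok) => console.log(ok ? "HTML-first" : "JS-dependent"));
```

Running a check like this against your key templates approximates what ChatGPT- and Claude-style crawlers will actually receive.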
Cloudflare’s 2025 review is also worth noting:


Cloudflare reported that Googlebot alone accounted for 4.5% of HTML request traffic. While this doesn’t directly explain how Google handles JavaScript, it does highlight the scale at which Google continues to crawl the web.
Dig deeper: How the DOM affects crawling, rendering, and indexing
No-JavaScript fallbacks in 2026
The question we set out to answer was whether no-JavaScript fallbacks are required in 2026.
Google is far more capable with JavaScript than in previous years. Its documentation shows that pages are queued for rendering, and that JavaScript is executed and used for indexing. For many sites, heavy reliance on JavaScript is no longer the red flag it once was.
Still, the details of Google’s rendering process matter. Rendering isn’t always immediate. There are resource constraints, and not all behaviors are supported.
At the same time, the broader web ecosystem hasn’t necessarily kept pace with Google. The risk of removing all no-JavaScript fallbacks hasn’t disappeared; it has just changed shape.
Key takeaways:
- Google doesn’t necessarily render JavaScript on the first crawl. There’s a rendering queue, and execution happens when resources allow.
- Technical limits still exist, including a 2MB HTML and resource cap, and limited interaction with user-triggered elements.
- Non-200 responses may not receive rendering treatment, which keeps basic HTML and linking important in some cases.
- Differences between raw HTML and rendered output still exist at scale across the web.
- Google’s guidance still leans toward SSR (server-side rendering), pre-rendering, and resilient HTML for critical content.
- Other crawlers, especially AI-driven ones, often don’t execute JavaScript at all. As these systems become more important, the need for fallbacks may increase again.
- Blanket, site-wide no-JavaScript fallbacks aren’t universally required in 2026, but critical content, links, and signals shouldn’t depend solely on JavaScript. Many modern crawlers still rely on HTML-first delivery.
For now, no-JavaScript fallbacks for critical structure, links, and content are still strongly recommended, if not required going forward.
Contributing authors are invited to create content for Search Engine Land and are chosen for their expertise and contribution to the search community. Our contributors work under the oversight of the editorial team and contributions are checked for quality and relevance to our readers. Search Engine Land is owned by Semrush. Contributor was not asked to make any direct or indirect mentions of Semrush. The opinions they express are their own.