In a podcast interview, Google VP of Search Liz Reid described two ways LLMs are changing what Google can index and how it ranks results for individual users.
Reid told the Access Podcast that multimodal AI models now allow Google to understand audio and video content at a deeper level than was previously possible. She also pointed to a future where search results adapt based on a user's paid subscriptions.
What’s New
Multimodal Understanding Is Expanding What Google Can Index
Reid said that because LLMs are multimodal, they have opened up content formats that Google previously struggled to process.
Reid told the hosts:
"The beauty of LLM is that they're multimodal. So we can actually understand audio content and video content actually at a level we couldn't years ago."
She went further, describing how Google can now go beyond basic transcription when analyzing video.
"Now you can understand audio much better. Now you can understand video much better. Now you can understand not just the video transcription but like what's the video more about or what's the style or other things like that."
Reid connected this to a long-standing gap in how search works for non-English speakers. For users in India who speak Hindi or other languages, the web often lacks the information they need in their language. Previously, translating all web content into every language wasn't scalable. LLMs changed that.
"Now with an LLM, you can take information in one language, understand it, and then output in another language. Like that opens up information."
Google has been moving in this direction for some time. In October 2025, Reid told the Wall Street Journal that Google had adjusted ranking to surface more short-form video, forums, and user-generated content.
The comments also add context to Google's Audio Overviews experiment, launched in Search Labs last June, which generates spoken AI summaries of search results.
That wasn't possible a few years ago. In 2021, Google and KQED tested whether audio content could be made searchable and found that speech-to-text accuracy wasn't high enough, particularly for proper nouns and regional references. Reid's comments suggest that barrier has fallen.
Subscription-Aware Search Could Change How Results Are Personalized
Reid also outlined a direction for personalization that goes beyond Google's current Preferred Sources feature.
She told the hosts Google wants to surface content from outlets a user pays for, not paywalled results from sources they can't access.
"If you love this source and you do have a relationship with it then that content should surface more easily for you on Google."
Reid gave a practical example. Say 20 interviews on a topic are paywalled but a user subscribes to one outlet. Google should make it easy to find the one they can read.
"We should surface the one that they're paying for and not the six that they can't get access to more."
She suggested the company has "taken small steps so far but want to do more" to strengthen how audiences and trusted sources connect through search. She also mentioned the possibility of micropayments for individual articles, though she acknowledged that model hasn't taken off historically.
Google expanded Preferred Sources globally for English-language users in December, and introduced a feature that highlights links from users' paid news subscriptions. Google said it would prioritize those links in a dedicated carousel, starting in the Gemini app, with AI Overviews and AI Mode to follow. At the time, Google said users who select a preferred source click through to that site twice as often on average. Reid's comments suggest the company sees subscription-aware search as a broader evolution of that same direction.
Why This Matters
The multimodal capabilities Reid pointed to expand which content formats get discovered through search. Podcasts, video series, and audio-first content have historically been harder for Google to evaluate beyond metadata and transcripts. Google's growing ability to assess relevance and depth from audio and video directly changes who can be found through search, and how.
For brands and creators investing in non-text formats, Google's ability to surface that work is catching up to where the audience already is.
The subscription-aware personalization direction matters for any publisher with a paywall or membership model. Search results that adapt to what individual users pay for would tighten the connection between subscriber retention and search visibility. Paywalled content could perform better for the audience that matters most to the publisher, rather than being deprioritized because most users can't access it.
Looking Ahead
Reid didn't attach timelines to either development. The multimodal indexing capabilities she mentioned appear to be current, while subscription-aware personalization is a stated direction with some existing features already in place.
Google I/O is scheduled for May 19-20. Reid said on the podcast that the company is "actively building" but that the pace of AI development means some features could come together as late as April and still make it to the stage.
Featured Image: Mawaddah F/Shutterstock

