Advertisers are leaning into AI to help optimize pay-per-click (PPC) campaigns. But how reliable is that AI guidance?
Data insights company WordStream asked five popular AI tools – ChatGPT, Google AI Overviews, Google Gemini, Perplexity and Meta AI – 45 PPC-related questions.
The result? One in five answers was flat-out wrong.
The experiment. The hypothesis was that LLM-based AI tools can’t consistently provide accurate PPC advice. Sample queries included:
- “What bidding strategies are available in Google Ads in 2025?”
- “How do I improve my Quality Score?”
- “Write a Google Ads script to pause high CPC keywords.”
The results. Twenty percent of all AI answers were inaccurate.

Breakdown by tool:
- Google AI Overviews: 26% incorrect
- ChatGPT: 22% incorrect
- Meta AI: 20% incorrect
- Perplexity: 13% incorrect
- Google Gemini: 6% incorrect
Why we care. While widely used in PPC, AI tools are often wrong enough to be concerning. That’s not a small error rate – it’s a potential revenue loss. As AI becomes more embedded in marketing workflows, knowing which tools to trust (like Gemini over others) and when to verify answers is essential for protecting ad spend and optimizing performance.
Responses by tool. The type of AI tool often influenced the nature of the errors.
- Meta AI often framed answers through a Facebook Ads lens – even when asked about Google Ads strategy.
- ChatGPT gave performance feedback that leaned encouraging over critical, a possible reflection of its recent history of being overly agreeable.
- Gemini offered more precise and current information, aligning closely with updated 2025 benchmark data.
Coding capability. Accuracy isn’t the only issue. Gemini and AI Overviews declined to offer coding support when asked to generate a script that pauses high-cost keywords, citing complexity and potential for misuse.
- That kind of refusal raises questions about how much autonomy these tools are willing to give advertisers – and whether it’s influenced by internal company policies rather than user need.
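For context, the task the tools declined is not especially exotic. Below is a minimal, hypothetical sketch of the core decision rule such a script needs – flag enabled keywords whose average CPC exceeds a threshold – written as a plain JavaScript function so the logic is easy to inspect. In a live Google Ads script this would run against the AdsApp keyword selector instead of plain objects; the `selectKeywordsToPause` helper, field names, and sample stats here are illustrative assumptions, not code from the report.

```javascript
// Hypothetical sketch: identify keywords whose average CPC exceeds a ceiling.
// A real Google Ads script would iterate an AdsApp keyword selector and call
// keyword.pause(); this version works on plain stat objects for clarity.
function selectKeywordsToPause(keywords, maxCpc) {
  return keywords
    .filter((kw) => kw.status === "ENABLED" && kw.avgCpc > maxCpc)
    .map((kw) => kw.text);
}

// Example usage with made-up keyword stats and a $2.50 CPC ceiling:
const stats = [
  { text: "cheap running shoes", avgCpc: 1.4, status: "ENABLED" },
  { text: "insurance quotes", avgCpc: 8.9, status: "ENABLED" },
  { text: "old promo", avgCpc: 9.2, status: "PAUSED" }, // already paused, skipped
];
console.log(selectKeywordsToPause(stats, 2.5)); // → [ 'insurance quotes' ]
```

The filtering itself is a few lines; the judgment call an advertiser still owns is the threshold and whether pausing (rather than bid adjustment) is the right response.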
Keyword research. Many tools also failed to deliver on keyword research requests. Instead of recommending niche, cost-effective terms, most listed broad, high-competition keywords despite being asked for the opposite.
- Only Gemini provided recommendations that showed an understanding of CPC dynamics.
Self-promotion. ChatGPT was the only tool to promote itself as a Google Ads helper. When asked who could assist with account management, it listed “ChatGPT (me!)” alongside actual PPC platforms and agencies. Whether that’s helpful or self-serving depends on the user’s expectations, but it stood out in a test focused on neutrality and accuracy.
What to do:
- Use the right AI for the right platform. Gemini is best for Google Ads; Meta AI for Facebook Ads. No one-size-fits-all solution here.
- Craft better prompts. AI accuracy often depends on context. The more precise your question, the more useful the answer.
- Trust, but verify. AI can support your PPC work – but don’t rely on it blindly. Cross-reference with real benchmarks and trusted experts.
The report. Can You Trust What AI Tells You About PPC? We Tested It!
Search Engine Land is owned by Semrush. We remain committed to providing high-quality coverage of marketing topics. Unless otherwise noted, this page’s content was written by either an employee or a paid contractor of Semrush Inc.