Recently, I’ve been spending most of my day inside Cursor working with Claude Code. I’m not a developer. I run a digital marketing agency. But Claude Code inside Cursor has become the fastest way for me to handle many tasks I need to do, including pulling and analyzing data from Google Search Console, GA4, and Google Ads.
The setup takes about an hour. After that, you can ask things like “which keywords am I paying for that I already rank for organically?” and get an answer in seconds instead of spending a day with spreadsheets. (I wouldn’t have been the one spending a day with spreadsheets anyway, but now nobody has to.)
Here’s the step-by-step process I developed while analyzing data for our agency clients. If this seems too technical, paste the URL of this article into Claude and ask it to walk you through it step by step.
What you’re building
What you end up with is a project directory where Claude Code has access to Python scripts that pull live data from your Google APIs. You fetch the data, it lands in JSON files, and then you just talk to it.
No dashboards to build. No Looker Studio templates to maintain. You’re basically giving Claude Code the same data your team would look at, and letting it do the cross-referencing.
seo-project/
├── config.json # Client details + API property IDs
├── fetchers/
│ ├── fetch_gsc.py # Google Search Console
│ ├── fetch_ga4.py # Google Analytics 4
│ ├── fetch_ads.py # Google Ads search terms
│ └── fetch_ai_visibility.py # AI search data
├── data/
│ ├── gsc/ # Query + page performance
│ ├── ga4/ # Traffic by channel, top pages
│ ├── ads/ # Search terms, spend, conversions
│ └── ai-visibility/ # AI citation data
└── reports/ # Generated analysis
Step 1: Set up Google API authentication
Everything runs through a Google Cloud service account. One service account covers both GSC and GA4, which is nice. Google Ads needs its own OAuth setup, which is less nice but manageable.

Service account (for GSC + GA4)
- Create a project in Google Cloud Console.
- Enable the Search Console API and Google Analytics Data API.
- Create a service account under IAM & Admin > Service Accounts.
- Download the JSON key file.
- Add the service account email as a user on your GSC property (read access is enough).
- Add it as a Viewer on your GA4 property.
The service account email looks like [email protected]. You’ll add this email address to each client’s GSC and GA4 properties, the same way you’d add any team member.
For agencies: one service account works across all clients. Add it to each property, update a config file with the property IDs, and you’re set.
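A quick way to confirm the sharing actually worked (my own sketch, not one of the article’s fetchers) is to ask the Search Console API which properties the service account can see:

```python
# Sketch: confirm the service account was added to the right properties.
# The key filename matches the fetchers below; the function names are mine.

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]

def extract_site_urls(sites_response):
    """Pull the property URLs out of a sites().list() response dict."""
    return sorted(entry["siteUrl"] for entry in sites_response.get("siteEntry", []))

def list_accessible_properties(key_path="service-account-key.json"):
    """Return every GSC property the service account can read."""
    # Imports kept local so the file loads even before dependencies are installed.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    credentials = service_account.Credentials.from_service_account_file(
        key_path, scopes=SCOPES
    )
    service = build("searchconsole", "v1", credentials=credentials)
    return extract_site_urls(service.sites().list().execute())
```

If a client’s property is missing from the returned list, the service account email wasn’t added (or hasn’t propagated yet).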
Google Ads authentication
Google Ads is different. You need:
- A developer token from the Google Ads API Center (under Tools & Settings > Setup > API Center).
- OAuth 2.0 credentials from Google Cloud (not the service account, a separate OAuth client).
- A one-time browser authentication to generate a refresh token.
The developer token requires an application. For agency use, describe it as “automated reporting for marketing clients.” Approval usually takes 24-48 hours.
If you’re using a Manager Account (MCC), one developer token and one refresh token cover all sub-accounts. You just switch the customer ID per client.
If you don’t have API access or an MCC, maybe it’s a new client and you’re still getting set up, you can skip the API entirely. Download 90 days of keyword and search terms data as CSVs from the Google Ads UI, drop them in your data directory, and Claude Code will work with those just as well. That’s how we handle clients who aren’t in our MCC yet.
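If you go the CSV route, a small helper can normalize the UI export into the same JSON shape the API fetchers produce, so Claude Code sees one consistent format. A sketch, assuming the export uses the Ads UI’s “Search term”, “Impr.”, “Clicks”, and “Cost” column headers (check your own file, the headers vary by locale and report):

```python
import csv
import json

def ads_csv_to_json(csv_path, json_path):
    """Convert a Google Ads search-terms CSV export to JSON rows.

    Column names here are assumptions based on the Ads UI export;
    adjust them to match whatever headers your download contains.
    """
    rows = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            rows.append({
                "search_term": row.get("Search term", ""),
                "impressions": int(row.get("Impr.", "0").replace(",", "") or 0),
                "clicks": int(row.get("Clicks", "0").replace(",", "") or 0),
                "cost": float(row.get("Cost", "0").replace(",", "") or 0),
            })
    with open(json_path, "w", encoding="utf-8") as f:
        json.dump(rows, f, indent=2)
    return rows
```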
Install the Python dependencies
All of the examples below assume you’re working in the terminal on a Mac or Linux machine. If you’re on Windows, the easiest path is Windows Subsystem for Linux (WSL).
pip install google-api-python-client google-auth google-analytics-data google-ads

Step 2: Build the data fetchers
Each fetcher is a short Python script that authenticates, pulls data, and saves JSON. I didn’t write these from scratch. I described what I wanted to Claude Code and it wrote them.
One thing that genuinely surprised me: I never had to read the API documentation. Not for GSC, GA4, or Google Ads.
I’d say something like “I want to pull the top 1,000 queries from Search Console for the last 90 days,” and Claude Code would figure out the authentication, endpoints, and query parameters. It already knows these APIs. You just tell it what data you want.
Here’s what the scripts look like.
Google Search Console fetcher
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ['https://www.googleapis.com/auth/webmasters.readonly']

def get_gsc_service():
    credentials = service_account.Credentials.from_service_account_file(
        'service-account-key.json', scopes=SCOPES
    )
    return build('searchconsole', 'v1', credentials=credentials)

def fetch_queries(service, site_url, start_date, end_date):
    response = service.searchanalytics().query(
        siteUrl=site_url,
        body={
            'startDate': start_date,
            'endDate': end_date,
            'dimensions': ['query'],
            'rowLimit': 1000
        }
    ).execute()
    return response.get('rows', [])

You get back queries with clicks, impressions, CTR, and average position. Save it as JSON.
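To land those rows in the data/ directory from the project layout above, a small save helper (my naming, not from the article) can stamp each pull with its date range:

```python
import json
from pathlib import Path

def save_rows(rows, source, start_date, end_date, base_dir="data"):
    """Write fetched rows to e.g. data/gsc/gsc_2025-01-01_2025-03-31.json."""
    out_dir = Path(base_dir) / source
    out_dir.mkdir(parents=True, exist_ok=True)
    out_path = out_dir / f"{source}_{start_date}_{end_date}.json"
    out_path.write_text(json.dumps(rows, indent=2))
    return out_path
```

Usage would be something like save_rows(fetch_queries(service, site, start, end), "gsc", start, end), and the same helper works for the GA4 and Ads output.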
GA4 fetcher
from google.oauth2 import service_account
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    RunReportRequest, DateRange, Metric, Dimension
)

def get_ga4_client():
    credentials = service_account.Credentials.from_service_account_file(
        'service-account-key.json',
        scopes=['https://www.googleapis.com/auth/analytics.readonly']
    )
    return BetaAnalyticsDataClient(credentials=credentials)

def fetch_traffic_by_channel(client, property_id, start_date, end_date):
    request = RunReportRequest(
        property=f"properties/{property_id}",
        date_ranges=[DateRange(start_date=start_date, end_date=end_date)],
        dimensions=[Dimension(name="sessionDefaultChannelGroup")],
        metrics=[
            Metric(name="sessions"),
            Metric(name="totalUsers"),
            Metric(name="bounceRate"),
        ]
    )
    return client.run_report(request)

Google Ads fetcher
Google Ads uses something called Google Ads Query Language (GAQL). If you’ve ever written a SQL query, this will look familiar. If you haven’t, don’t worry, Claude Code will write it for you:
from google.ads.googleads.client import GoogleAdsClient

client = GoogleAdsClient.load_from_storage("google-ads.yaml")
ga_service = client.get_service("GoogleAdsService")
query = """
    SELECT
      search_term_view.search_term,
      metrics.impressions,
      metrics.clicks,
      metrics.cost_micros,
      metrics.conversions
    FROM search_term_view
    WHERE segments.date DURING LAST_30_DAYS
    ORDER BY metrics.impressions DESC
"""
response = ga_service.search(customer_id="1234567890", query=query)

This pulls the same data as the Search Terms report you’d download from the Google Ads UI: impressions, clicks, cost, conversions, match type, campaign, and ad group.
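One detail worth knowing before you save this to JSON: the API reports cost in micros (millionths of the account currency). A small flattening helper, with hypothetical naming, keeps the output spreadsheet-friendly:

```python
def micros_to_currency(micros):
    """Google Ads reports cost in micros: 1,000,000 micros = 1 currency unit."""
    return micros / 1_000_000

def summarize_term(search_term, impressions, clicks, cost_micros, conversions):
    """Flatten one search-term row into a plain dict for JSON output."""
    cost = micros_to_currency(cost_micros)
    return {
        "search_term": search_term,
        "impressions": impressions,
        "clicks": clicks,
        "cost": round(cost, 2),
        "conversions": conversions,
        "cpc": round(cost / clicks, 2) if clicks else 0.0,
    }

# Hypothetical usage against the `response` iterator above:
# rows = [summarize_term(r.search_term_view.search_term,
#                        r.metrics.impressions, r.metrics.clicks,
#                        r.metrics.cost_micros, r.metrics.conversions)
#         for r in response]
```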
Step 3: Create a client config
One JSON file per client. Nothing fancy, just the property IDs and some context:
{
  "name": "Client Name",
  "domain": "example.com",
  "gsc_property": "https://www.example.com/",
  "ga4_property_id": "319491912",
  "google_ads_customer_id": "9270739126",
  "industry": "Higher Education",
  "competitors": [
    "https://competitor1.com/",
    "https://competitor2.com/"
  ]
}

Step 4: Ask cross-source questions
So now you’ve got JSON files from GSC, GA4, and Ads sitting in your project directory. Claude Code can read all of them at once and answer questions that would normally mean a lot of tab-switching and VLOOKUP work.
The paid-organic gap analysis
The single most valuable question I’ve found:
- “Compare the GSC query data against the Google Ads search terms. Find keywords where we’re paying for clicks but already have strong organic positions. Also, find keywords where we’re spending on ads with zero organic visibility. Those are content gaps.”

When I ran this for a higher education client, it identified:
- 2,742 search terms with wasted ad spend (impressions, zero clicks).
- 351 opportunities to reduce paid spend on terms where organic was already strong.
- 33 high-performing organic queries that paid could amplify.
- 41 content gaps where paid was the only presence (no organic).
That analysis took about 90 seconds. The equivalent manual process (downloading CSVs from GSC and Ads, VLOOKUPing across them, categorizing the overlaps) takes most of a day.
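The core of that cross-reference is just a join on the query string. A simplified sketch of the logic, with my own bucket names and an illustrative “strong position” threshold (Claude Code’s actual analysis is more nuanced):

```python
def paid_organic_gaps(gsc_rows, ads_rows, strong_position=5.0):
    """Bucket paid search terms by their overlap with organic queries.

    gsc_rows: [{"query": ..., "position": ...}]
    ads_rows: [{"search_term": ..., "cost": ...}]
    The threshold is illustrative, not from the article.
    """
    organic = {r["query"]: r["position"] for r in gsc_rows}
    buckets = {"reduce_paid": [], "content_gap": [], "overlap": []}
    for r in ads_rows:
        term = r["search_term"]
        if term not in organic:
            buckets["content_gap"].append(term)   # paid is the only presence
        elif organic[term] <= strong_position:
            buckets["reduce_paid"].append(term)   # organic already strong
        else:
            buckets["overlap"].append(term)       # both present, neither dominant
    return buckets
```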
Other questions worth asking
Once you have GSC + GA4 + Ads data loaded:
- “Which pages get the most impressions in GSC but have low CTR? What’s the traffic from GA4 for those same pages?” (identifies meta description/title opportunities)
- “What are the top 20 organic queries by impressions that we’re not running ads against?” (paid amplification candidates)
- “Group the GSC queries by topic cluster and show me which clusters have the most impressions but lowest average position.” (content investment priorities)
- “Which pages in GA4 have high bounce rates but strong GSC positions? Those might need content improvement.”
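The first question in that list reduces to a simple filter over the GSC rows. A sketch with made-up thresholds, to show how little code these questions actually require:

```python
def low_ctr_opportunities(gsc_rows, min_impressions=1000, max_ctr=0.02):
    """Find queries with plenty of impressions but weak CTR.

    Thresholds are illustrative; tune them per site.
    """
    return sorted(
        (r for r in gsc_rows
         if r["impressions"] >= min_impressions and r["ctr"] < max_ctr),
        key=lambda r: r["impressions"],
        reverse=True,
    )
```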
Claude Code isn’t doing anything a human couldn’t do with spreadsheets. It’s doing it in seconds, and you can follow up with another question without rebuilding the whole analysis from scratch.
Step 5: Add AI visibility tracking
Traditional SERP positions aren’t the whole picture anymore. Between Google’s AI Overviews, AI Mode, Copilot, ChatGPT, and Perplexity, you need to know whether AI systems are citing your content.
This is especially true in verticals like higher education, where prospective students increasingly start their research in AI search tools.
If you have a tracking platform
Tools like Scrunch, Semrush’s AI Visibility toolkit, or Otterly.ai will track your brand’s presence across ChatGPT, Perplexity, Gemini, Google AI Overviews, and Copilot.
Export the data as CSV or JSON and drop it in your data directory. Claude Code can then cross-reference AI citations against your GSC and Ads data.

When I did this for our own website, we discovered two blog posts competing for the same AI citations on GEO-related queries.
One had 12 times as many Copilot citations as the other, despite both targeting similar intent. That led to a consolidation decision we wouldn’t have made based solely on traditional rank data. This kind of AI search cannibalization is something most SEO teams aren’t yet checking for.
If you don’t have a tracking platform
You don’t need an enterprise tool to start. There are several APIs that let you pull AI search data directly, and the costs are lower than you’d think.
DataForSEO AI Overview API: The most accessible option. Pay-as-you-go at about $0.01 per query, with a $50 minimum deposit. You send a keyword, and it returns the full AI Overview content from Google SERPs, including which URLs are cited. It also has a separate LLM Mentions API that tracks how LLMs reference brands across platforms.
# DataForSEO AI Overview — simplified example
import requests

# auth_headers: your Basic auth header built from DataForSEO credentials
payload = [{
    "keyword": "best higher education marketing agencies",
    "location_code": 2840,  # US
    "language_code": "en"
}]
response = requests.post(
    "https://api.dataforseo.com/v3/serp/google/ai_overview/live/advanced",
    headers=auth_headers,
    json=payload
)
# Returns: AI Overview text, cited URLs, references

SerpApi: Starts at $75/month for 5,000 searches. Returns structured JSON for the full Google SERP, including AI Overviews. Good documentation, a Python client library, and a free tier for testing.
SearchAPI.io: Similar to SerpApi, starts at $40/month. Also offers a separate Google AI Mode API that captures AI-generated answers with citations.
Bright Data SERP API: Pay-as-you-go starting around $1.80 per 1,000 requests. Set brd_ai_overview=2 to increase the likelihood of capturing AI Overviews. Also has an MCP server if you want tighter agent integration.
Bing Webmaster Tools: Free, and the only first-party AI citation data available from any major platform right now. Shows how often your content appears as a source in Copilot and Bing AI responses, with page-level data and the “grounding queries” that triggered citations. No API yet (Microsoft says it’s on the backlog), but you can export CSVs.
DIY: Direct LLM API calls: The cheapest approach for small-scale monitoring. Write a Python script that sends a consistent set of prompts to the OpenAI, Anthropic, and Perplexity APIs, then parses responses for brand mentions. Perplexity’s Sonar API is especially useful here because it includes web citations in responses, and citation tokens are free. Total cost: under $20/month for a modest prompt library.
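For the Perplexity piece, a sketch of the DIY approach. I’m assuming Sonar’s OpenAI-style chat-completions endpoint and a top-level citations field in the response here; check Perplexity’s current API docs before relying on either:

```python
def extract_citations(api_response):
    """Pull cited URLs out of a Sonar response dict (assumed shape)."""
    return api_response.get("citations", [])

def ask_sonar(prompt, api_key):
    """Send one monitoring prompt to Perplexity's Sonar model."""
    # Local import so the helper above works without the requests package.
    import requests

    resp = requests.post(
        "https://api.perplexity.ai/chat/completions",
        headers={"Authorization": f"Bearer {api_key}"},
        json={"model": "sonar",
              "messages": [{"role": "user", "content": prompt}]},
        timeout=30,
    )
    resp.raise_for_status()
    data = resp.json()
    return data["choices"][0]["message"]["content"], extract_citations(data)
```

From there, brand-mention parsing is just string matching over the returned answer and citation URLs.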
The general pattern: Pick one SERP API for Google AI Overview data, use Bing Webmaster Tools (it’s free), and supplement with direct LLM API calls or a dedicated tracker if budget allows.
The workflow in practice
So what does this actually look like on a Tuesday morning?
Setup: Once per client, ~15 minutes
- Add the service account email to the client’s GSC and GA4
- Get their Google Ads customer ID (or export search terms if they’re not in the MCC)
- Create a config.json with the property IDs
Monthly data pull: ~5 minutes
python3 run_fetch.py --sources gsc,ga4,ads

Analysis (as needed): Open Claude Code in the project directory and ask questions. The data is right there.
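The article doesn’t show run_fetch.py itself; a minimal dispatcher matching that command-line shape might look like this (the per-source fetcher wiring is assumed, not shown in the original):

```python
import argparse

def parse_sources(arg):
    """Split the --sources value into a validated list of known sources."""
    known = {"gsc", "ga4", "ads", "ai-visibility"}
    sources = [s.strip() for s in arg.split(",") if s.strip()]
    unknown = [s for s in sources if s not in known]
    if unknown:
        raise ValueError(f"unknown sources: {unknown}")
    return sources

def main(argv=None):
    parser = argparse.ArgumentParser(description="Fetch client data sources")
    parser.add_argument("--sources", default="gsc,ga4,ads")
    args = parser.parse_args(argv)
    for source in parse_sources(args.sources):
        # Each source would dispatch to its fetcher module here, e.g.
        # fetchers/fetch_gsc.py — wiring omitted in this sketch.
        print(f"fetching {source}...")

if __name__ == "__main__":
    main()
```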
Output: Claude Code generates a markdown report. When I need something client-facing, I push it to Google Docs using a separate tool I built called google-docs-forge. It converts markdown into a properly formatted Google Doc, so the output doesn’t look like it came from a terminal.

The whole process takes about 35 minutes for a new client: setup, fetch, analysis. Monthly refreshes take about 20 minutes, including analysis time. Compare that to the manual alternative of downloading CSVs from three different platforms, cross-referencing in spreadsheets, and writing up findings.
What this doesn’t replace
I don’t want to oversell this. Claude Code is reading your data and finding patterns across sources faster than you could manually. It’s not telling you what to do about those patterns. You still need someone who understands the client’s business, their competitive situation, and what they’re actually trying to accomplish. The tool finds the interesting data. The strategist decides what to do with it.
You also have to verify what it gives you. LLMs can hallucinate, and that includes data analysis. I’ve seen Claude Code confidently report a number that didn’t match the JSON file. It’s rare, but it happens.
Treat the output like you’d treat work from a new analyst: trust but verify, especially before anything goes to a client. Spot-check the numbers against the source data. If something looks too clean or too dramatic, go look at the raw file.
It also doesn’t replace your existing platforms. If you need historical trend data, automated alerts, or a client-facing dashboard, you still want a Semrush or an Ahrefs. What this gives you is the ability to ask ad hoc questions across multiple data sources, which none of those platforms does well on its own.
And the GEO/AI visibility tracking space is still immature. The data from AI citation tools is directionally useful. Wind sock, not GPS. Google doesn’t publish AI Overview or AI Mode citation data through any official API, so every third-party tool is approximating. Bing’s Copilot data is the most reliable because it’s first-party, but it only covers the Microsoft ecosystem.
Start with GSC, layer in the rest
If you want to give this a shot:
- Start with GSC only. It’s the easiest API to connect (service account, read-only access, free). Fetch your queries and pages for the last 90 days. Ask Claude Code to group queries by topic, identify page-2 ranking opportunities, and find pages with high impressions but low CTR.
- Add GA4 second. Same service account. Now you can ask cross-source questions: “Which pages rank well in GSC but have high bounce rates in GA4?”
- Add Google Ads when you’re ready. The OAuth setup is more involved, but the paid-organic gap analysis alone justifies the effort.
- Layer in AI visibility last. Start with Bing Webmaster Tools (free) and one SERP API for AI Overview data.
Each layer builds on the last. You don’t need all four to get value. The GSC + GA4 combination alone surfaces insights that take hours to find manually.
Contributing authors are invited to create content for Search Engine Land and are chosen for their expertise and contribution to the search community. Our contributors work under the oversight of the editorial staff and contributions are checked for quality and relevance to our readers. Search Engine Land is owned by Semrush. The contributor was not asked to make any direct or indirect mentions of Semrush. The opinions they express are their own.

