Search API Connector Documentation
Import Google Search Console Data to Sheets
In this guide, we’ll walk through how to pull Google Search Console data directly into Google Sheets, using the API Connector add-on for Sheets.
Contents
- Before You Begin
- Part 1: Connect to the Google Search Console API
- Part 2: Fetch Data from Google Search Console
- Part 3: Create a Custom API Request
- Part 4: Handle Pagination
- Part 5: API Documentation
Before You Begin
Click here to install the API Connector add-on from the Google Marketplace.
Part 1: Connect to the Google Search Console API
The easiest way to get started with the Google Search Console API is through API Connector’s built-in integration.
- Select Google Search Console from the drop-down list of applications
- Under Authorization, click Connect to Google Search Console
- You will be directed to google.com and asked to allow API Connector to view your Google Search Console data. Click Allow.
- You’ll then be returned to your Google Sheet, and can verify that your Google Search Console connection is active.
Part 2: Fetch Data from Google Search Console
Now that we’re connected, let’s pull some data into Sheets.
- Under Endpoint, choose the “Get search analytics” endpoint.
- Under siteUrl, fill in your site domain using the syntax https%3A%2F%2Fmixedanalytics.com. Make sure to URL-encode it or the API won’t recognize it. If you’re using a domain property, enter it like sc-domain:mixedanalytics.com. If you aren’t sure of your exact URL, run a request to the /sites endpoint first, as that endpoint will return the correct URL.
- Now fill in a date range (required) and any optional parameters. Here I’ve selected query to retrieve a list of keywords for my site.
- Set a destination sheet, name your request, and hit Run.
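The URL-encoding requirement above can be checked with Python’s standard library. This is just a quick sketch showing how the percent-encoded form is produced from the plain site URL used in this article:

```python
from urllib.parse import quote

# Percent-encode a URL-prefix property so it can be placed in the API path.
# safe="" forces ':' and '/' to be encoded as well.
site = "https://mixedanalytics.com"
encoded = quote(site, safe="")
print(encoded)  # https%3A%2F%2Fmixedanalytics.com
```

Domain properties (sc-domain:mixedanalytics.com) are entered as-is, per the step above.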
Part 3: Create a Custom API Request
Alternatively, you can run a custom request instead of using API Connector’s built-in integration, using any of the data points shown in Google Search Console’s API documentation. Here is an example setup:
- Request method: POST
- Request URL: https://searchconsole.googleapis.com/webmasters/v3/sites/sc-domain:mixedanalytics.com/searchAnalytics/query
- OAuth: Google Search Console
- Headers: Key = Content-Type, Value = application/json
- Request body: {"dimensions":["QUERY"],"startDate":"2021-12-01","endDate":"2021-12-31","rowLimit":1000}
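Outside of API Connector, the same request can be assembled with Python’s standard library. This sketch only builds and inspects the request rather than sending it; ACCESS_TOKEN is a placeholder for a valid OAuth 2.0 token (API Connector handles that step for you):

```python
import json
import urllib.request

# Assemble the custom request from Part 3. ACCESS_TOKEN is a placeholder,
# not a real credential.
url = ("https://searchconsole.googleapis.com/webmasters/v3/sites/"
       "sc-domain:mixedanalytics.com/searchAnalytics/query")
body = {
    "dimensions": ["QUERY"],
    "startDate": "2021-12-01",
    "endDate": "2021-12-31",
    "rowLimit": 1000,
}
req = urllib.request.Request(
    url,
    data=json.dumps(body).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer ACCESS_TOKEN",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send it; here we just inspect it.
print(req.get_method())  # POST
```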

Part 4: Handle Pagination
By default, Google Search Console returns a maximum of 1,000 records unless you use the rowLimit and startRow parameters as described in their documentation.
In API Connector you can loop through these pages automatically with the following settings:
- Pagination type: offset-limit body
- Offset body parameter: startRow
- Limit body parameter: rowLimit
- Limit value: 25000
- Run until: choose when to stop fetching data
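The settings above can be pictured as a simple loop: keep advancing startRow by rowLimit until a page comes back with fewer rows than the limit. This is an illustrative sketch with a stubbed fetch function standing in for the real API call (the 60,000-row total is made up):

```python
ROW_LIMIT = 25000

def fetch_page(start_row, row_limit):
    # Stand-in for the real API call; pretend the site has 60,000 rows total.
    total = 60000
    return [{"row": i} for i in range(start_row, min(start_row + row_limit, total))]

def fetch_all():
    rows, start_row = [], 0
    while True:
        page = fetch_page(start_row, ROW_LIMIT)
        rows.extend(page)
        if len(page) < ROW_LIMIT:  # short (or empty) page: no more data
            return rows
        start_row += ROW_LIMIT

print(len(fetch_all()))  # 60000
```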
Part 5: API Documentation
Official API documentation: https://developers.google.com/webmaster-tools/v1/api_reference_index
Hi Ani,
It seems the connector is not working with Domain level authorization in Google Search Console.
Can you confirm whether we’re doing something wrong? 🙂
I just tested and it worked on my domain property. You’ll need to change “https://” to “sc-domain”.
Is it possible to have relative start and end dates based on the current date?
Sure, you can make a Sheets formula like =today()-1, and then use that cell in your requests.
Hi there, just purchased and this looks great!
With my first test I noticed that only the top 1000 keywords / rows popped into Sheets. Is it possible to get the complete data set and not just the first 1000 rows? (still learning)
Thanks for your time.
Figured it out by checking the Google resource page listed and testing some different options… however a new wrinkle has popped up.
We work with some large websites that often go well beyond 25,000 rows of data in a given day / month. Some sites have over 100,000 pages.
Can you help me to understand how to run the following:
All Queries (by Page) with a start and end date… but pulling ALL rows … even if it goes beyond the 25,000 row limit.
I am hoping a multi query type call is the answer.. but again.. I am a beginner and still learning the ropes 🙂
Thank you for your time and patience. This is super cool.
That’s awesome you figured it out, I just edited the article to make it more clear, too (previously, this article had a screenshot showing how to set the row limits, but no text).
Google limits each batch to 25,000 records, so to get more you need to pull multiple batches. Each batch has a start and end point defined by the “startRow” and “rowLimit” parameters. API Connector doesn’t currently have a preset pagination option for this, so we’ll use the “Multiple request bodies” function to achieve something similar. You’d set it up by pasting the following into the Request body input box:
{"startDate":"2021-01-01","endDate":"2021-10-31","dimensions":["query"],"startRow":0,"rowLimit":25000}
:::BREAK:::
{"startDate":"2021-01-01","endDate":"2021-10-31","dimensions":["query"],"startRow":25000,"rowLimit":25000}
:::BREAK:::
{"startDate":"2021-01-01","endDate":"2021-10-31","dimensions":["query"],"startRow":50000,"rowLimit":25000}
(and so on…)
By doing this you can theoretically get all rows, but as you can see you have to manually add in the request bodies, for now there’s no “get all” option. I also think you might have trouble because Sheets slows down once you start pulling in so many records, I’m not sure it can really handle multiple sets of 100,000+ records. But hopefully this helps you test what’s possible. Just shoot me a message here or over at support and I’ll be happy to help further.
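For many batches, the request bodies shown above can be generated rather than typed by hand. A small sketch (the dates and batch size are the ones from the example; adjust to taste):

```python
import json

# Generate the :::BREAK:::-separated request bodies from the example above.
base = {"startDate": "2021-01-01", "endDate": "2021-10-31", "dimensions": ["query"]}

def make_bodies(total_rows, batch=25000):
    bodies = []
    for start in range(0, total_rows, batch):
        bodies.append(json.dumps({**base, "startRow": start, "rowLimit": batch}))
    return "\n:::BREAK:::\n".join(bodies)

# Three batches covering up to 75,000 rows:
print(make_bodies(75000))
```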
Update: You can now do this automatically with our offset-limit body pagination option.
This worked great as a starting point! For very large sites I can pull a shorter time period. I tried 100k by Page, Query and it seemed to work ok.
Thanks again.
We’ve added an option to cycle through automatically using offset-limit body pagination handling, I’ve updated the article to show how it works.
Great extension!
I want to build a sheet that looks up how much Discover traffic, Search traffic, and News traffic a URL has gotten. I want to be able to paste a list of URLs into the sheet and get the data for each URL.
I think I’m halfway there, but I don’t know how to tell the API to look for the URL in a specific cell of my sheet.
I guess I need to replace "expression":"jul" in the code below…
{
  "startDate": "2021-12-01",
  "endDate": "2021-12-31",
  "dimensions": [
    "page"
  ],
  "type": "discover",
  "dimensionFilterGroups": [
    {
      "filters": [
        {
          "dimension": "page",
          "operator": "contains",
          "expression": "jul"
        }
      ]
    }
  ],
  "rowLimit": 3000
}
Hey there, please see this article, it shows how to use cell values in your request: https://mixedanalytics.com/knowledge-base/api-connector-create-api-request-based-on-cell/
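For intuition, cell references like +++QuerySheet!A2+++ work as a simple text substitution before the request is sent. This sketch mimics that behavior in plain Python; the cell value is hypothetical:

```python
import re

# Swap each +++SheetName!Cell+++ placeholder for its cell value.
# The cell value below is made up for illustration.
cells = {"QuerySheet!A2": "/2021/12/jul-post/"}

def substitute(body, cells):
    return re.sub(r"\+\+\+(.+?)\+\+\+", lambda m: cells[m.group(1)], body)

body = '{"dimension":"page","operator":"contains","expression":"+++QuerySheet!A2+++"}'
print(substitute(body, cells))
```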
I read through your post, but I don’t understand how to reference a URL or list of URLs in the sheet. Tried this, but it didn’t work:
{
  "startDate": "2021-12-01",
  "endDate": "2021-12-31",
  "dimensions": [
    "page"
  ],
  "type": "discover",
  "dimensionFilterGroups": [
    {
      "filters": [
        {
          "dimension": "page",
          "operator": "contains",
          "expression": "+++QuerySheet!A2+++"
        }
      ]
    }
  ],
  "rowLimit": 3000
}
What am I doing wrong?
That looks fine, are you referencing the correct location? (cell A2 in a tab called QuerySheet)
Feel free to contact support if you’d like me to take a look at your sheet.
Yay! It worked. Thank you!
Hi,
I would like to be able to import Google search results for shared Google Sheets (for example, shared Google stock sheets). When I use the URL https://www.google.com/search?q=docs.google.com/stock+spreadsheet/edit+site:docs.google.com&lr&safe=active&as_qdr=all&start=0&ved=2ahUKEwjG67S_x6n1AhUhUGwGHSLbCl04ZBDy0wN6BAgBEDY&dpr=1.25, it displays “About 3,520 results (1.26 seconds)”. How do I use this connector to import those results into Google Sheets?
Google Search Console is for getting search performance statistics for your own website. If you’re looking for Google search results for a keyword you should check out the Google Search API instead. This article provides a tutorial on how to access it: https://mixedanalytics.com/knowledge-base/build-your-own-search-engine-results-pages-serp-dashboard-sheets/
Hey! I’ve tried the multiple request bodies as outlined in the conversation with Daryl above but I continue to be limited to 5000 records in return. Has Search console changed their record limits?
You should be able to get 25,000 records in one go. Can you add in a row limit of say 6000 to your request? If you still only get 5000 records then, it may be that your data doesn’t contain more records than that.
Thank you so much for this! This is so cool and fun to play around with. I have a question about aggregating queries per page, but limiting the number of rows returned per page based on click count.
I figured out the aggregation per page part (thanks to your guide!) but I want to only return rows with clicks > or = 1 (so no queries with 0 clicks come in, even if they get impressions). Is there a way to do that?
Here’s what I’ve got so far:
{
  "startDate": "2021-12-01",
  "endDate": "2022-02-28",
  "dimensions": [
    "query",
    "page"
  ],
  "dimensionFilterGroups": [
    {
      "filters": [
        {
          "dimension": "page",
          "operator": "contains",
          "expression": "/path/page-example/"
        }
      ]
    }
  ],
  "rowLimit": 3000
}
Where can I put the click value > 0 (if it’s possible)? Thanks for this amazing guide!
Thank you, I’m glad you are having fun! 🙂 And that’s such a good question. I thought there must be a way to do this since you can add a > 0 filter in the interface, but I just checked and as far as I can tell, the API only allows filtering on dimensions (query, page, device, or country). That’s too bad, but you still have some options:
1) simply filter out those “0” rows with a Google Sheets filter.
2) use a JMESPath filter. Just copy/paste this expression into the JMESPath field to include only those records where clicks are greater than 0:
rows[?clicks > '0']
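If you’d rather not use JMESPath, the same filter is a one-line list comprehension once the response is in Python. The sample data below just mimics the shape of the API’s response:

```python
# Keep only rows with at least one click, equivalent to the JMESPath filter
# above. The response data here is made up for illustration.
response = {"rows": [
    {"keys": ["keyword a"], "clicks": 12, "impressions": 340},
    {"keys": ["keyword b"], "clicks": 0, "impressions": 51},
]}
with_clicks = [r for r in response["rows"] if r["clicks"] > 0]
print(len(with_clicks))  # 1
```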
Awesome, thanks so much for the quick response!
I have hundreds of domains in my GSC account that I need to delete. Doing it manually is PAINFUL. What is the best way to do this with API Connector from Mixed Analytics?
Hmm, unfortunately our preset connection only enables the /auth/webmasters.readonly scope. You would need to create a custom OAuth connection that enables the /auth/webmasters scope, i.e. write access. From there you could cycle through a list of websites and delete them all with a request like:
DELETE https://www.googleapis.com/webmasters/v3/sites/+++siteUrls!A1:A300+++