Import Google Search Console Data to Sheets

In this guide, we’ll walk through how to pull Google Search Console data directly into Google Sheets using the API Connector add-on for Sheets.

Before You Begin

Click here to install the API Connector add-on from the Google Marketplace.

Part 1: Connect to the Google Search Console API

If you haven’t connected API Connector to Google Search Console before, you’ll first need to initiate the connection as follows:

  1. Open up Google Sheets and click Extensions > API Connector > Manage Connections.
  2. In the list of available connections, find “Google Search Console” and click Connect.
  3. You will be directed to google.com and asked to allow API Connector to view your Google Search Console data.
  4. You’ll then be returned to your Google Sheet, and can verify that your Google Search Console connection is active in the Connections screen.

Part 2: Create a Google Search Console API Request

For our first request, we’ll pull in summary search performance statistics for your website. Search analytics data from the Google Search Console API is accessed by sending POST requests to this API Request URL:

https://searchconsole.googleapis.com/webmasters/v3/sites/https%3A%2F%2Fmixedanalytics.com/searchAnalytics/query

Notes:

  • Where it says https%3A%2F%2Fmixedanalytics.com, substitute in your own property URL. Make sure to URL-encode it or the API won’t recognize it (see the example below).
  • If you’re using a domain property, you’ll need to change https%3A%2F%2F to sc-domain:, like this:
https://searchconsole.googleapis.com/webmasters/v3/sites/sc-domain:mixedanalytics.com/searchAnalytics/query
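
For example, assuming your site is a URL-prefix property like https://www.example.com (a placeholder; substitute your own property), encoding the colon and slashes gives https%3A%2F%2Fwww.example.com, so the full request URL would look like this:

https://searchconsole.googleapis.com/webmasters/v3/sites/https%3A%2F%2Fwww.example.com/searchAnalytics/query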

Breakdown dimensions and dates are specified in the POST body, like this:

{
  "dimensions": [
    "QUERY"
  ],
  "startDate": "2021-12-01",
  "endDate": "2021-12-31",
  "rowLimit":1000
}
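
You can also break results down by more than one dimension at a time. As a sketch (the dates are just examples), this body would return one row per date and query combination:

{
  "dimensions": [
    "date",
    "query"
  ],
  "startDate": "2021-12-01",
  "endDate": "2021-12-31",
  "rowLimit": 1000
}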

Part 3: Pull Google Search Console API Data into Sheets

We can now copy our request settings into API Connector.

  1. In API Connector, click Create, choose POST from the dropdown menu, and paste in the Request URL we created above.
  2. Choose ‘Google Search Console’ from the authentication dropdown.
  3. Under headers, enter Key = content-type, Value = application/json.
  4. In the POST Body section, paste in the POST body we created above.
  5. Create a new tab and click Set current to use that tab as your data destination.
  6. Name your request and click Run. A moment later you’ll see search analytics stats for each of your keywords populate your sheet (a sketch of the raw response these values come from is shown below).
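
For reference, the raw JSON the API sends back, which API Connector then flattens into rows and columns, looks roughly like this (values are illustrative):

{
  "rows": [
    {
      "keys": ["api connector"],
      "clicks": 120,
      "impressions": 2400,
      "ctr": 0.05,
      "position": 4.2
    }
  ],
  "responseAggregationType": "byProperty"
}

Each entry in keys corresponds to one of the dimensions in your request (here, just the query).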

Part 4: Other Google Search Console API Data

View the full list of available search analytics parameters in the documentation. One easy way to get started is to use their API Explorer tool, which provides an interface to build your request.

Once you’ve created your request, you can copy and paste it into API Connector to get the response data in Sheets. For example, the following query includes both an increased row limit and a keyword filter:

{
   "startDate":"2021-08-01",
   "endDate":"2021-08-31",
   "dimensions":[
      "query"
   ],
   "dimensionFilterGroups":[
      {
         "filters":[
            {
               "dimension":"query",
               "operator":"contains",
               "expression":"mixed analytics"
            }
         ]
      }
   ],
   "rowLimit":3000
}
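
Filters aren’t limited to queries; they can also target the page, device, and country dimensions. As a sketch, assuming the same date range, this body would restrict results to mobile traffic from the United States (country values are three-letter ISO 3166-1 alpha-3 codes, e.g. usa):

{
   "startDate":"2021-08-01",
   "endDate":"2021-08-31",
   "dimensions":[
      "query"
   ],
   "dimensionFilterGroups":[
      {
         "filters":[
            {
               "dimension":"device",
               "operator":"equals",
               "expression":"MOBILE"
            },
            {
               "dimension":"country",
               "operator":"equals",
               "expression":"usa"
            }
         ]
      }
   ],
   "rowLimit":3000
}

Filters placed in the same filter group are combined with AND, so only rows matching both conditions are returned.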

Part 5: Handle Pagination

By default, the Google Search Console API returns a maximum of 1,000 records per request unless you use the rowLimit and startRow parameters as described in their documentation.

In API Connector you can loop through these pages automatically with the following settings (a sketch of the equivalent request bodies follows the list):

  • Pagination type: offset-limit body
  • Offset body parameter: startRow
  • Limit body parameter: rowLimit
  • Limit value: 25000
  • Run until: choose when to stop fetching data
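
With these settings, API Connector effectively sends a sequence of request bodies like the following (a sketch; the exact startRow values depend on how many pages are fetched before your Run until setting stops the loop):

{"startDate":"2021-12-01","endDate":"2021-12-31","dimensions":["query"],"startRow":0,"rowLimit":25000}
{"startDate":"2021-12-01","endDate":"2021-12-31","dimensions":["query"],"startRow":25000,"rowLimit":25000}
{"startDate":"2021-12-01","endDate":"2021-12-31","dimensions":["query"],"startRow":50000,"rowLimit":25000}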

Part 6: API Documentation

Official API documentation: https://developers.google.com/webmaster-tools/v1/api_reference_index


23 thoughts on “Import Google Search Console Data to Sheets”

    • I just tested and it worked on my domain property. You’ll need to change “https://” to “sc-domain:” so the whole request URL looks like this:
      https://searchconsole.googleapis.com/webmasters/v3/sites/sc-domain:mixedanalytics.com/searchAnalytics/query

  1. Hi there, just purchased and this looks great!

    With my first test I noticed that only the top 1000 keywords / rows popped into Sheets. Is it possible to get the complete data set and not just the first 1000 rows? (still learning)

    Thanks for your time.

    • Figured it out by checking the Google resource page listed and testing some different options… however a new wrinkle has popped up.

      We work with some large websites that often go well beyond 25,000 rows of data in a given day / month. Some sites have over 100,000 pages.

      Can you help me to understand how to run the following:

      All Queries (by Page) with a start and end date… but pulling ALL rows … even if it goes beyond the 25,000 row limit.

      I am hoping a multi query type call is the answer.. but again.. I am a beginner and still learning the ropes 🙂

      Thank you for your time and patience. This is super cool.

      • That’s awesome you figured it out! I just edited the article to make it clearer, too (previously, this article had a screenshot showing how to set the row limits, but no text).

        Google limits each batch to 25,000 records, so to get more you need to pull multiple batches. Each batch has a start and end point defined by the “startRow” and “rowLimit” parameters. API Connector doesn’t currently have a preset pagination option for this, so we’ll use the “Multiple request bodies” function to achieve something similar. You’d set it up by pasting the following into the Request body input box:
        {"startDate":"2021-01-01","endDate":"2021-10-31","dimensions":["query"],"startRow":0,"rowLimit":25000}
        :::BREAK:::
        {"startDate":"2021-01-01","endDate":"2021-10-31","dimensions":["query"],"startRow":25000,"rowLimit":25000}
        :::BREAK:::
        {"startDate":"2021-01-01","endDate":"2021-10-31","dimensions":["query"],"startRow":50000,"rowLimit":25000}

        (and so on…)
        By doing this you can theoretically get all rows, but as you can see you have to manually add in the request bodies; for now there’s no “get all” option. I also think you might have trouble because Sheets slows down once you start pulling in so many records, and I’m not sure it can really handle multiple sets of 100,000+ records. But hopefully this helps you test what’s possible. Just shoot me a message here or over at support and I’ll be happy to help further.
        Update: You can now do this automatically with our offset-limit body pagination option.

  2. This worked great as a starting point! For very large sites I can pull a shorter time period. I tried 100k by Page, Query and it seemed to work ok.

    Thanks again.

  3. Great extension!
    I want to build a sheet that looks up how much Discover traffic, Search traffic, and News traffic a URL has gotten. I want to be able to paste a list of URLs into the sheet and get the data for each URL.
    I think I’m halfway there, but I don’t know how to tell the API to look for the URL in a specific cell of my sheet.
    I guess I need to replace "expression":"jul" in the code below…

    {
      "startDate":"2021-12-01",
      "endDate":"2021-12-31",
      "dimensions":[
        "page"
      ],
      "type": "discover",
      "dimensionFilterGroups":[
        {
          "filters":[
            {
              "dimension":"page",
              "operator":"contains",
              "expression":"jul"
            }
          ]
        }
      ],
      "rowLimit":3000
    }

  4. Hey! I’ve tried the multiple request bodies as outlined in the conversation with Daryl above, but I continue to be limited to 5000 records in return. Has Search Console changed their record limits?

    • You should be able to get 25,000 records in one go. Can you add a parameter like "rowLimit":6000 to your request body? If you still only get 5000 records, it may be that your data doesn’t contain more records than that.

  5. Thank you so much for this! This is so cool and fun to play around with. I have a question about aggregating queries per page, but limiting the number of rows returned per page based on click count.

    I figured out the aggregation per page part (thanks to your guide!) but I want to only return rows with clicks > or = 1 (so no queries with 0 clicks come in, even if they get impressions). Is there a way to do that?

    Here’s what I’ve got so far:
    {
      "startDate":"2021-12-01",
      "endDate":"2022-02-28",
      "dimensions":[
        "query",
        "page"
      ],
      "dimensionFilterGroups":[
        {
          "filters":[
            {
              "dimension":"page",
              "operator":"contains",
              "expression":"/path/page-example/"
            }
          ]
        }
      ],
      "rowLimit":3000
    }

    Where can I put the click value > 0 (if it’s possible)? Thanks for this amazing guide!

    • Thank you, I’m glad you are having fun! 🙂 And that’s such a good question. I thought there must be a way to do this since you can add a > 0 filter in the interface, but I just checked and as far as I can tell, the API only allows filtering on dimensions (query, page, device, or country). That’s too bad, but you still have some options:
      1) simply filter out those “0” rows with a Google Sheets filter.
      2) use a JMESPath filter. Just copy/paste this expression into the JMESPath field to include only those records where clicks are greater than 0: rows[?clicks > '0']

    • Hmm unfortunately our preset connection only enables the /auth/webmasters.readonly scope. You would need to create a custom OAuth connection that enables the /auth/webmasters scope, i.e. write access. From there you could cycle through a list of websites and delete them all with a request like DELETE https://www.googleapis.com/webmasters/v3/sites/+++siteUrls!A1:A300+++

