Search API Connector Documentation


Import Google Search Console Data to Sheets


In this guide, we’ll walk through how to pull Google Search Console data directly into Google Sheets, using the API Connector add-on for Sheets.


Before You Begin

Click here to install the API Connector add-on from the Google Marketplace.

Part 1: Connect to the Google Search Console API

The easiest way to get started with the Google Search Console API is through API Connector’s built-in integration.

  1. In Sheets, open API Connector and create a new request (Extensions > API Connector > Open > Create request)
  2. Select Google Search Console from the drop-down list of applications
  3. Under Authorization, click Connect to Google Search Console
  4. You will be directed to google.com and asked to allow API Connector to view your Google Search Console data. Click Allow.
  5. You'll then be returned to your Google Sheet, and can verify that your Google Search Console connection is active.

Part 2: Pull Data from Google Search Console to Sheets

Now that we’re connected, let’s pull some data into Sheets.

  1. Under Endpoint, choose the "Get search analytics" endpoint.
  2. Under siteUrl, fill in your site's URL in encoded form, e.g. https%3A%2F%2Fmixedanalytics.com for https://mixedanalytics.com. Make sure to encode it or the API won't recognize it. If you're using a domain property, enter it like sc-domain:mixedanalytics.com. If you aren't sure of your exact URL, run a request to the /sites endpoint first, as that endpoint will return the correct URL.
  3. Now fill in a date range (required), and any optional parameters. Here I've selected query to retrieve a list of keywords for my site.
  4. Set a destination sheet, name your request, and hit Run.
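If you'd like to generate the encoded siteUrl value outside of Sheets, the percent-encoding described in step 2 can be reproduced with Python's standard library (the domain below is just the example from this guide):

```python
from urllib.parse import quote

def encode_site_url(site: str) -> str:
    """Percent-encode a URL-prefix property; domain properties
    (sc-domain:example.com) are passed through unchanged."""
    if site.startswith("sc-domain:"):
        return site
    return quote(site, safe="")

print(encode_site_url("https://mixedanalytics.com"))
# https%3A%2F%2Fmixedanalytics.com
```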

Part 3: Create a Custom API Request

Alternatively, instead of using API Connector’s built-in integration, you can run a custom request with any of the parameters shown in Google Search Console's API documentation. Here is an example setup:

  • Application: Custom
  • Request method: POST
  • Request URL: https://searchconsole.googleapis.com/webmasters/v3/sites/sc-domain:mixedanalytics.com/searchAnalytics/query
  • OAuth: Google Search Console
  • Headers:
    • Content-Type: application/json
  • Request body: {"dimensions":["QUERY"],"startDate":"2021-12-01","endDate":"2021-12-31","rowLimit":1000}
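The same custom request can be issued from any HTTP client. Here's a minimal Python sketch using only the standard library; the access token is a placeholder you'd replace with a real OAuth 2.0 token, and the request is built but deliberately not sent so the sketch runs without credentials:

```python
import json
from urllib.request import Request

ACCESS_TOKEN = "ya29.example"  # placeholder: supply a real OAuth 2.0 token

body = {
    "dimensions": ["QUERY"],
    "startDate": "2021-12-01",
    "endDate": "2021-12-31",
    "rowLimit": 1000,
}

req = Request(
    "https://searchconsole.googleapis.com/webmasters/v3/sites/"
    "sc-domain:mixedanalytics.com/searchAnalytics/query",
    data=json.dumps(body).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {ACCESS_TOKEN}",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send it; omitted here so the
# sketch stays runnable without a token.
```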

Part 4: Handle Pagination

By default, Google Search Console returns a maximum of 1,000 rows per request unless you use the rowLimit and startRow parameters as described in their documentation.

In API Connector you can loop through these pages automatically with the following settings:

  • Pagination type: offset-limit body
  • Offset body parameter: startRow
  • Limit body parameter: rowLimit
  • Limit value: 25000
  • Run until: choose when to stop fetching data
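Under the hood, offset-limit pagination just keeps advancing startRow by rowLimit until a batch comes back short. A minimal sketch of that stop condition, with a stand-in function in place of the real API call:

```python
def fetch_all(fetch_page, limit=25000):
    """Page through results until a batch comes back short of the
    limit, mirroring the 'run until' stop condition."""
    rows, offset = [], 0
    while True:
        batch = fetch_page(offset, limit)
        rows.extend(batch)
        if len(batch) < limit:
            return rows
        offset += limit

# stand-in for a real API call: pretend the site has 60,001 rows
fake = lambda offset, limit: list(range(offset, min(offset + limit, 60001)))
print(len(fetch_all(fake)))  # 60001
```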

Part 5: API Documentation

Official API documentation: https://developers.google.com/webmaster-tools/v1/api_reference_index

43 thoughts on “Import Google Search Console Data to Sheets”

  1. Hi there, just purchased and this looks great!

    With my first test I noticed that only the top 1000 keywords / rows popped into Sheets. Is it possible to get the complete data set and not just the first 1000 rows? (still learning)

    Thanks for your time.

    • Figured it out by checking the Google resource page listed and testing some different options... however a new wrinkle has popped up.

      We work with some large websites that often go well beyond 25,000 rows of data in a given day / month. Some sites have over 100,000 pages.

      Can you help me to understand how to run the following:

      All Queries (by Page) with a start and end date... but pulling ALL rows ... even if it goes beyond the 25,000 row limit.

      I am hoping a multi query type call is the answer.. but again.. I am a beginner and still learning the ropes 🙂

      Thank you for your time and patience. This is super cool.

      • That's awesome you figured it out, I just edited the article to make it more clear, too.

        Google limits each batch to 25,000 records, so to get more you need to pull multiple batches. Each batch has a start and end point defined by the "startRow" and "rowLimit" parameters. API Connector doesn't currently have a preset pagination option for this, so we'll use the "Multiple request bodies" function to achieve something similar. You'd set it up by pasting the following into the Request body input box:
        {"startDate":"2021-01-01","endDate":"2021-10-31","dimensions":["query"],"startRow":0,"rowLimit":25000}
        :::BREAK:::
        {"startDate":"2021-01-01","endDate":"2021-10-31","dimensions":["query"],"startRow":25000,"rowLimit":25000}
        :::BREAK:::
        {"startDate":"2021-01-01","endDate":"2021-10-31","dimensions":["query"],"startRow":50000,"rowLimit":25000}

        (and so on...)
        By doing this you can theoretically get all rows, but as you can see, you have to manually add in the request bodies; for now there's no "get all" option. I also think you might have trouble because Sheets slows down once you start pulling in so many records, and I'm not sure it can really handle multiple sets of 100,000+ records. But hopefully this helps you test what's possible. Just shoot me a message here or over at support and I'll be happy to help further.
        Update: You can now do this automatically with our offset-limit body pagination option.
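For reference, the repeated bodies above follow a simple pattern, so you can generate the :::BREAK:::-separated string programmatically rather than typing each batch by hand. A small sketch:

```python
import json

def multi_body(start_date, end_date, batches, limit=25000):
    """Build the :::BREAK:::-separated request bodies that the
    'Multiple request bodies' feature expects."""
    bodies = [
        json.dumps({
            "startDate": start_date,
            "endDate": end_date,
            "dimensions": ["query"],
            "startRow": i * limit,
            "rowLimit": limit,
        })
        for i in range(batches)
    ]
    return "\n:::BREAK:::\n".join(bodies)

out = multi_body("2021-01-01", "2021-10-31", 3)
print(out)
```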

  2. This worked great as a starting point! For very large sites I can pull a shorter time period. I tried 100k by Page, Query and it seemed to work ok.

    Thanks again.

  3. Great extension!
    I want to build a sheet that looks up how much Discover traffic, Search traffic, and News traffic a URL has gotten. I want to be able to paste a list of URLs into the sheet and get the data for each URL.
    I think I'm halfway there, but I don't know how to tell the API to look for the URL in a specific cell of my sheet?
    I guess I need to replace "expression":"jul" in the code below...

    {"startDate":"2021-12-01","endDate":"2021-12-31","dimensions":["page"],"type":"discover","dimensionFilterGroups":[{"filters":[{"dimension":"page","operator":"contains","expression":"jul"}]}],"rowLimit":3000}

      • I read through your post, but don't understand how to reference a URL or list of URLs in the sheet. I tried this, but it didn't work:

        {"startDate":"2021-12-01","endDate":"2021-12-31","dimensions":["page"],"type":"discover","dimensionFilterGroups":[{"filters":[{"dimension":"page","operator":"contains","expression":"+++QuerySheet!A2+++"}]}],"rowLimit":3000}

        What am I doing wrong?

      • That looks fine, are you referencing the correct location? (cell A2 in a tab called QuerySheet)
        Feel free to contact support if you'd like me to take a look at your sheet.

    • Google Search Console is for getting search performance statistics for your own website. If you're looking for Google search results for a keyword you should check out the Google Custom Search API instead. This article provides a tutorial on how to access it.

  4. Hey! I've tried the multiple request bodies as outlined in the conversation with Daryl above but I continue to be limited to 5000 records in return. Has Search console changed their record limits?

    • You should be able to get 25,000 records in one go. Can you add in a row limit of say 6000 to your request? If you still only get 5000 records then, it may be that your data doesn't contain more records than that.

  5. Thank you so much for this! This is so cool and fun to play around with. I have a question around aggregating queries per page, but limiting the number of rows returned per page based on click count..

    I figured out the aggregation per page part (thanks to your guide!) but I want to only return rows with clicks > or = 1 (so no queries with 0 clicks come in, even if they get impressions). Is there a way to do that?

    Here's what I've got so far:
    {"startDate":"2021-12-01","endDate":"2022-02-28","dimensions":["query","page"],"dimensionFilterGroups":[{"filters":[{"dimension":"page","operator":"contains","expression":"/path/page-example/"}]}],"rowLimit":3000}

    Where can I put the click value > 0 (if it's possible)? Thanks for this amazing guide!

    • Thank you, I'm glad you are having fun! 🙂 And that's such a good question. I thought there must be a way to do this since you can add a > 0 filter in the interface, but I just checked and as far as I can tell, the API only allows filtering on dimensions (query, page, device, or country). That's too bad, but you still have some options:
      1) simply filter out those "0" rows with a Google Sheets filter.
      2) use a JMESPath filter. Just copy/paste this expression into the JMESPath field to include only those records where clicks are greater than 0: rows[?clicks > '0']
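If you'd rather filter after fetching, option 2 is equivalent to a one-line filter over the response's rows array. A sketch with sample rows shaped like the API's response:

```python
# sample rows shaped like the searchAnalytics/query response
rows = [
    {"keys": ["keyword a"], "clicks": 12, "impressions": 300},
    {"keys": ["keyword b"], "clicks": 0, "impressions": 45},
    {"keys": ["keyword c"], "clicks": 3, "impressions": 80},
]

# keep only rows with at least one click
with_clicks = [r for r in rows if r["clicks"] > 0]
print([r["keys"][0] for r in with_clicks])  # ['keyword a', 'keyword c']
```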

    • Hmm unfortunately our preset connection only enables the /auth/webmasters.readonly scope. You would need to create a custom OAuth connection that enables the /auth/webmasters scope, i.e. write access. From there you could cycle through a list of websites and delete them all with a request like DELETE https://www.googleapis.com/webmasters/v3/sites/+++siteUrls!A1:A300+++

    • Sure, I believe you can set it up like this:
      Method: POST
      URL: https://content-searchconsole.googleapis.com/v1/urlInspection/index:inspect?alt=json
      OAuth: Google Search Console
      Headers: Accept : application/json
      Request body:{"inspectionUrl":"https://mixedanalytics.com/api-connector/","siteUrl":"sc-domain:mixedanalytics.com"}

      (the siteUrl will be http://www.example.com/ for a URL-prefix property, or sc-domain:example.com for a Domain property.)

      If you'd like to reference multiple URLs from your sheet, you can run a multi-query request with multiple request bodies.

    • Sure, to do that you'll need to create a custom request so you can add in filters to the request body. Here's an example: {"dimensions":["QUERY"],"startDate":"2022-01-01","endDate":"2022-02-28","rowLimit":1000,"dimensionFilterGroups":[{"filters":[{"dimension":"QUERY","expression":"brand1|brand2","operator":"EXCLUDING_REGEX"}]}]}
      This filter uses regex to exclude whatever you substitute in where it says brand1 and brand2. (If you prefer to use an operator other than a regex exclude, the other available options are EQUALS, NOT_EQUALS, CONTAINS, NOT_CONTAINS, INCLUDING_REGEX).
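To sanity-check an EXCLUDING_REGEX pattern before running the request, you can apply roughly the same filter locally. Note this sketch uses Python's re module, while the Search Console API uses RE2 syntax, so complex patterns may behave differently:

```python
import re

# sample rows shaped like the API response
rows = [
    {"keys": ["brand1 shoes"], "clicks": 10},
    {"keys": ["running shoes"], "clicks": 7},
    {"keys": ["brand2 sale"], "clicks": 2},
]

# mimic EXCLUDING_REGEX: drop rows whose query matches the pattern
exclude = re.compile("brand1|brand2")
kept = [r for r in rows if not exclude.search(r["keys"][0])]
print([r["keys"][0] for r in kept])  # ['running shoes']
```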

    • Yes, but the Mobile Friendly API seems to require an API key rather than OAuth (info). Get your key as described here, then plug it into a request like this:

      • Method: POST
      • Request URL:https://searchconsole.googleapis.com/v1/urlTestingTools/mobileFriendlyTest:run?key=yourAPIkey
      • Headers: Content-Type:application/json
      • Request body:
        {"url":"https://domain1.com"}:::BREAK:::
        {"url":"https://domain2.com"}:::BREAK:::
        {"url":"https://domain3.com"}
      • Substitute in your own URLs where you see domain1, domain2, etc.

    • Hi Arnaud, unfortunately Google doesn't provide a month dimension for this API so you have 2 options: 1) include the date dimension in your request, such that you retrieve metrics for each date. Then use a Sheets function or pivot table to aggregate the data by month yourself, or 2) set your startDate and endDate parameters to the start/end of the month, and run the query WITHOUT including the date dimension. That will give you total metrics for that month, and you can then change the dates and re-run the query to fetch the next month of data, and so on.
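For option 2, the month boundaries can be computed programmatically so you don't have to look up month lengths (including leap years). A small sketch:

```python
import calendar
from datetime import date

def month_range(year, month):
    """Return (startDate, endDate) strings covering one calendar month."""
    last_day = calendar.monthrange(year, month)[1]
    return (date(year, month, 1).isoformat(),
            date(year, month, last_day).isoformat())

print(month_range(2022, 2))  # ('2022-02-01', '2022-02-28')
```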

  6. Hi there! What permission level do you need to access the data? I'm currently a full user on our site and the tool says I don't have the right permission.

    • If you have access via the Search Console interface (at any level -- full, restricted, or owner), then you should have access via the API. What is the exact error message you see? I would make sure that you're authenticated to the right account, and also that you're using the right syntax (i.e. "sc-domain:" for a domain property, "https" otherwise).

      • Thanks for the reply!

        Here is the error:

        1): Completed with errors
        - We received an error from googleapis.com (403) show response
        { "error": { "code": 403, "message": "User does not have sufficient permission for site 'sc-domain:redacted.com'. See also: https://support.google.com/webmasters/answer/2451999.", "errors": [ { "message": "User does not have sufficient permission for site 'sc-domain:redacted.com'. See also: https://support.google.com/webmasters/answer/2451999.", "domain": "global", "reason": "forbidden" } ] } }

      • Hi Allie, thank you for sending on. Can you please try the following?
        1) disconnect and reconnect, making sure you're authenticating with the email address you use with Search Console. This is to make sure you're logging in with the right email address, in case you have a few.
        2) using our preset integration, select and run a request to the /sites endpoint. This endpoint doesn't require any parameters so avoids any potential syntax errors, and also will return the exact URLs that you do have access to.
        Let me know if that works or you still have trouble connecting after that.

  7. Thanks for the reply!

    I’m using the right email and the right website to pull the data but it still says I don’t have the right permission

    • You received that error even from the /sites endpoint? In that case, I'm not sure what the issue could be... I tested on my side with some different email addresses and it always worked. If you search Google for "User does not have sufficient permission for site", you can find many people with the same issue, but as far as I can tell the solutions presented are also just to make sure you have access to the website and have entered it using the right syntax. Please feel free to contact support with a screenshot of your setup, I can take a look and see if anything jumps out.

    • The preset Search Console integration currently only allows fetching data for one website at a time (we're planning to update it soon so you can add in multiple websites). However if you set up a custom request, you can list out multiple URLs, one for each site, such that API Connector cycles through that list and runs a request for each one (this is called a multi-query request).

  8. Hi, thanks for your work.
    Starting from a list of URLs, is it possible to get clicks for relative start and end dates based on the current date?

    • Hey Antonio, if I understood correctly, you're asking how to use dynamic dates. You can do this by creating relative date cells in your sheet, e.g. create a tab called Dates and enter the function =text(today()-7,"yyyy-mm-dd") into cell A1.
      Then in the startDate parameter, you can reference that cell by entering +++Dates!A1+++. Every time the request runs, the start date will always be today - 7 (or whatever value you use in your function). Please let me know if that answers your question!
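The Sheets formula above has a direct equivalent in most languages; for example, a Python sketch of the same rolling start date:

```python
from datetime import date, timedelta

def relative_date(days_back: int) -> str:
    """Return today minus days_back as yyyy-mm-dd, matching
    =text(today()-7,"yyyy-mm-dd") in Sheets when days_back=7."""
    return (date.today() - timedelta(days=days_back)).isoformat()

start_date = relative_date(7)  # a yyyy-mm-dd string seven days in the past
```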

      • Hi Ana, thanks for your reply, it partly answers my question. I also wanted to know: how can I obtain clicks and impressions from the GSC API starting from a list of URLs?
        Basically I want to check performance of a list of URLs before and after the refresh date of the contents. Thanks again

      • I don't think you can do that exactly, but you can set the dimensions parameter to "page". That will list all the pages (URLs) on your site along with clicks & impressions. You can then filter that list in Sheets or use VLOOKUP to pull in just the URLs you're interested in. Does that work for you?
