This post is part of the Guide on Python for SEO and the Guide on the Google Search Console API.
This tutorial is for SEOs who would like to use the Google Search Console API with Python.
I will show the steps to:
- Connect to the developer console,
- Get your API keys and authenticate to the API with OAuth 2.0,
- Make your first API call.
To learn more, view my complete guide on Python for SEO.
Table of Contents
1 Make Your First Google Search Console API Call with Python
2 Step 1: Get Your Google Search Console API Key
3 Step 2: Import Libraries
4 Step 3: Authenticate to the API
4.1 Connect Using a JSON API Key
4.2 Run OAuth 2.0
5 Step 4: Make Your API Call
6 Full Code
6.1 Alternative OAuth2 Connection: client_id and client_secret
7 Conclusion
7.1 Related posts:
Make Your First Google Search Console API call with Python
There are many things that you can do with the Search Console API like getting search traffic and extracting indexed pages.
In this post, we will make a very basic API call: get validated properties.
The steps to use Google Search Console API in Python are:
- Get Your Google Search Console API Key
- Import Libraries
- Authenticate to the API
- Make Your API Call
Step 1: Get Your Google Search Console API Key
You will need an API key to be able to connect to the Google Search Console API.
How to Get Google Search Console API Keys
What is an API Key?
An API key is like your username and password to access the API.
Follow this guide for the detailed steps to get your Google Search Console API key.
Otherwise, here are the simplified steps:
- Go to Google’s developer console and sign in;
- Go to “Dashboard” and click “Enable APIs and Services”;
- Search for “Google Search Console API” and enable the API;
- Go to the “Credentials” tab, click “Create credentials” and select “OAuth client ID”;
- Click “Configure consent screen” and give a name to your product;
- Choose “Other” as the application type and click “Create”;
- Copy the client ID and client secret, or continue and save to download the JSON file.
Download your API key first, and then you’ll be ready to connect to the Search Console API with Python.
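For reference, the downloaded key file (client_secrets.json) for an installed application looks roughly like this. All values below are placeholders; your real file will contain your own client ID and secret:

```json
{
  "installed": {
    "client_id": "XXXXXXXX.apps.googleusercontent.com",
    "client_secret": "XXXXXXXX",
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://oauth2.googleapis.com/token",
    "redirect_uris": ["http://localhost"]
  }
}
```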
Step 2: Import Libraries
To run the OAuth 2.0 authentication, you will need to install and import the libraries below.
Alternatively, you can clone the GitHub repository that I made and run:
$ pip install -r requirements.txt
import argparse
import httplib2
import requests
from collections import defaultdict
from dateutil import relativedelta
from googleapiclient.discovery import build
from oauth2client import client
from oauth2client import file
from oauth2client import tools
Step 3. Authenticate to the API
Now, we will log in to the API using OAuth 2.0. Two options are possible: using a JSON key file (recommended), or using the API client_id and client_secret directly. Choose your preferred solution.
Connect Using a JSON API Key
The authorize_creds() function will check the credentials file and define a .dat file to save the authorized credentials, so you don’t have to go through the login process each time.
def authorize_creds(creds, authorizedcreds='authorizedcreds.dat'):
    '''Authorize credentials using OAuth2.'''
    print('Authorizing Creds')
    # Scopes control the set of resources that the access token permits.
    SCOPES = ['https://www.googleapis.com/auth/webmasters.readonly']
    # Path to the client_secrets.json file
    CLIENT_SECRETS_PATH = creds
    # Create a parser to be able to open a browser for authorization
    parser = argparse.ArgumentParser(
        formatter_class=argparse.RawDescriptionHelpFormatter,
        parents=[tools.argparser])
    flags = parser.parse_args([])
    # Create an authorization flow from the client_secrets file.
    # Raises InvalidClientSecretsError for unknown types of flows.
    flow = client.flow_from_clientsecrets(
        CLIENT_SECRETS_PATH,
        scope=SCOPES,
        message=tools.message_if_missing(CLIENT_SECRETS_PATH))
    # If stored credentials exist, get them from the storage object;
    # credentials are written back to the 'authorizedcreds.dat' file.
    storage = file.Storage(authorizedcreds)
    credentials = storage.get()
    # If authorized credentials don't exist, open a browser to authenticate
    if credentials is None or credentials.invalid:
        credentials = tools.run_flow(flow, storage, flags)
    # Create an HTTP client object and sign each request
    # with the OAuth 2.0 access token
    http = httplib2.Http()
    http = credentials.authorize(http=http)
    # Construct a Resource to interact with the API
    webmasters_service = build('searchconsole', 'v1', http=http)
    print('Auth Successful')
    return webmasters_service
Run OAuth 2.0
To run the code, simply add your credentials and run authorize_creds(). Once this is done, you will be able to query the Google Search Console API using the webmasters_service variable.
if __name__ == '__main__':
    creds = 'client_secrets.json'
    webmasters_service = authorize_creds(creds)
The if __name__ == '__main__' line checks whether you are running the module directly or importing it. If you import it, authorize_creds() will not run automatically.
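As a minimal sketch of that pattern (a hypothetical module, not part of the tutorial’s repository):

```python
# greet.py - minimal illustration of the __name__ check (hypothetical file).
def main():
    return 'running as a script'

if __name__ == '__main__':
    # Reached only when the file is executed directly (python greet.py),
    # not when another module does `import greet`.
    print(main())
```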
Step 4: Make Your API Call
Now the glory!
Let’s see what websites we have validated in Google Search Console using the API.
site_list = webmasters_service.sites().list().execute()
verified_sites_urls = [s['siteUrl'] for s in site_list['siteEntry']
                       if s['permissionLevel'] != 'siteUnverifiedUser'
                       and s['siteUrl'][:4] == 'http']
for site_url in verified_sites_urls:
    print(site_url)
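To see what the filter does without calling the API, here is the same logic applied to a made-up response. The siteEntry shape matches what sites().list() returns, but the URLs and the filter_verified helper name are my own invention for illustration:

```python
# Illustrative shape of the sites().list() response (sample data, no API call).
sample_response = {
    'siteEntry': [
        {'siteUrl': 'https://www.example.com/', 'permissionLevel': 'siteOwner'},
        {'siteUrl': 'sc-domain:example.org', 'permissionLevel': 'siteFullUser'},
        {'siteUrl': 'https://unverified.example.net/', 'permissionLevel': 'siteUnverifiedUser'},
    ]
}

def filter_verified(site_list):
    """Keep URL-prefix properties the user has verified access to."""
    return [s['siteUrl'] for s in site_list.get('siteEntry', [])
            if s['permissionLevel'] != 'siteUnverifiedUser'
            and s['siteUrl'].startswith('http')]

print(filter_verified(sample_response))  # -> ['https://www.example.com/']
```

Note that domain properties (sc-domain:…) are excluded by the http check, just like in the tutorial code.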
Full Code
Here is the full code to let you make your first API call with Python.
oauth.py
#!/usr/bin/env python
# oauth.py
# Authorize credentials using OAuth2.
# @author: Jean-Christophe Chouinard.
# @role: Sr. SEO Specialist at SEEK.com.au
# @website: jcchouinard.com
# @LinkedIn: linkedin.com/in/jeanchristophechouinard/
# @Twitter: twitter.com/@ChouinardJC

import argparse
import httplib2
import requests
from collections import defaultdict
from dateutil import relativedelta
from googleapiclient.discovery import build
from oauth2client import client
from oauth2client import file
from oauth2client import tools

def authorize_creds(creds, authorizedcreds='authorizedcreds.dat'):
    '''Authorize credentials using OAuth2.'''
    print('Authorizing Creds')
    # Scopes control the set of resources that the access token permits.
    SCOPES = ['https://www.googleapis.com/auth/webmasters']
    # Path to the client_secrets.json file
    CLIENT_SECRETS_PATH = creds
    # Create a parser to be able to open a browser for authorization
    parser = argparse.ArgumentParser(
        formatter_class=argparse.RawDescriptionHelpFormatter,
        parents=[tools.argparser])
    flags = parser.parse_args([])
    # Create an authorization flow from the client_secrets file.
    # Raises InvalidClientSecretsError for unknown types of flows.
    flow = client.flow_from_clientsecrets(
        CLIENT_SECRETS_PATH,
        scope=SCOPES,
        message=tools.message_if_missing(CLIENT_SECRETS_PATH))
    # If stored credentials exist, get them from the storage object;
    # credentials are written back to the 'authorizedcreds.dat' file.
    storage = file.Storage(authorizedcreds)
    credentials = storage.get()
    # If authorized credentials don't exist, open a browser to authenticate
    if credentials is None or credentials.invalid:
        credentials = tools.run_flow(flow, storage, flags)
    # Create an HTTP client object and sign each request
    # with the OAuth 2.0 access token
    http = httplib2.Http()
    http = credentials.authorize(http=http)
    # Construct a Resource to interact with the API
    webmasters_service = build('searchconsole', 'v1', http=http)
    print('Auth Successful')
    return webmasters_service

# Function to execute your API request
def execute_request(service, property_uri, request):
    return service.searchanalytics().query(
        siteUrl=property_uri, body=request).execute()

if __name__ == '__main__':
    creds = 'client_secrets.json'
    webmasters_service = authorize_creds(creds)
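As a preview of what execute_request() is for: it passes a Search Analytics request body to searchanalytics().query(). A hypothetical helper (build_request is my own name, not part of the API) could assemble such a body; the field names follow the Search Analytics API:

```python
# Hypothetical helper: builds the body that execute_request() would pass
# to searchanalytics().query().
def build_request(start_date, end_date, dimensions=('query',), row_limit=1000):
    return {
        'startDate': start_date,      # 'YYYY-MM-DD'
        'endDate': end_date,          # 'YYYY-MM-DD'
        'dimensions': list(dimensions),
        'rowLimit': row_limit,
    }

request = build_request('2022-01-01', '2022-01-31', dimensions=('query', 'page'))
# response = execute_request(webmasters_service, 'https://www.example.com/', request)
```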
get_properties.py
#!/usr/bin/env python
from oauth import authorize_creds, execute_request

def get_property_list(webmasters_service):
    '''Get a list of validated properties from GSC.'''
    site_list = webmasters_service.sites().list().execute()
    # Filter for verified websites
    verified_sites_urls = [s['siteUrl'] for s in site_list['siteEntry']
                           if s['permissionLevel'] != 'siteUnverifiedUser'
                           and s['siteUrl'][:4] == 'http']
    return verified_sites_urls

if __name__ == '__main__':
    creds = 'client_secrets.json'
    webmasters_service = authorize_creds(creds)
    verified_sites_urls = get_property_list(webmasters_service)
Alternative OAuth2 Connection: client_id and client_secret
You might want to connect to the API by passing your client_id and client_secret directly instead of using the JSON credentials file. Here is the code. Make sure that you don’t forget to add your own client_id and client_secret. You can view the notebook.
import argparse
import httplib2
from googleapiclient.discovery import build
from oauth2client import client, file, tools

SCOPES = ['https://www.googleapis.com/auth/webmasters.readonly']

parser = argparse.ArgumentParser(
    formatter_class=argparse.RawDescriptionHelpFormatter,
    parents=[tools.argparser])
flags = parser.parse_args([])
flow = client.OAuth2WebServerFlow(
    client_id='XXXXXXXXXXXXXXXXXXXXXXXXXX.apps.googleusercontent.com',
    client_secret='XXXXXXXXXXXXXXXXXXXXX',
    scope=SCOPES)
storage = file.Storage('searchconsole.dat')
credentials = storage.get()
if credentials is None or credentials.invalid:
    credentials = tools.run_flow(flow, storage, flags)
# Create an httplib2.Http object and authorize it with our credentials
http = credentials.authorize(http=httplib2.Http())
webmasters_service = build('searchconsole', 'v1', http=http)
Conclusion
We are done. You have successfully connected to Google Search Console and called the API with Python. Next, we will learn how to get all your search traffic with the Google Search Console API.
Related posts:
- Backup Google Search Console Data Into MySQL With Python
- Find Keyword Cannibalization Using Google Search Console and Python
- Intro to GSC API with Python (Video)
- Authorise Requests to GSC API Using OAuth 2.0