Tracking keyword rankings is important for understanding how your website is performing in search engine results. Whether you’re an SEO specialist, a digital marketer, or a website owner, knowing where your site ranks for specific keywords can guide your strategy and help you make informed decisions.
While there are many tools available for checking keyword rankings, Python offers a flexible and powerful way to do this yourself. In this post, we’ll explore two methods for checking Google keyword rankings using Python: one that leverages the Google Search Console API, and another that involves web scraping.
Method 1: Checking Keyword Rankings with Google Search Console API
Google Search Console (GSC) is a free tool provided by Google that helps you monitor, maintain, and troubleshoot your site’s presence in Google Search results. The GSC API allows you to programmatically access data related to your site’s search performance, including keyword rankings.
Step 1: Set Up Google Search Console API
To get started with the Google Search Console API, you need to set up a project in the Google Cloud Console and obtain the necessary credentials.
- Create a Project on Google Cloud Console:
  - Go to the Google Cloud Console.
  - Create a new project.
- Enable the Search Console API:
  - In your Google Cloud project, navigate to APIs & Services > Library.
  - Search for “Search Console API” and enable it for your project.
- Create a Service Account:
  - Go to APIs & Services > Credentials.
  - Create a service account and generate a JSON key for it.
  - Download the key file (a JSON file) to your computer, then add the service account’s email address as a user for your property in Google Search Console; without that step, the API will return no data for your site.
Step 2: Install Required Python Packages
Next, you’ll need to install some Python packages that will allow you to interact with the Google Search Console API.
pip install google-auth google-auth-oauthlib google-auth-httplib2 google-api-python-client
These packages help with authentication and making API requests.
Step 3: Authenticate and Access the API
Now that you have your credentials and the necessary packages installed, you can authenticate and start accessing the API.
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Read-only access to Search Console data
SCOPES = ['https://www.googleapis.com/auth/webmasters.readonly']
SERVICE_ACCOUNT_FILE = 'path/to/your-service-account-file.json'

credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE, scopes=SCOPES)
service = build('searchconsole', 'v1', credentials=credentials)

# Replace with your site's URL as registered in Search Console
site_url = 'https://www.yoursite.com'

# Requesting data from the API
response = service.searchanalytics().query(
    siteUrl=site_url,
    body={
        'startDate': '2023-01-01',
        'endDate': '2023-01-31',
        'dimensions': ['query'],
        'rowLimit': 10
    }
).execute()

# Displaying the results ('rows' is absent when there is no data)
for row in response.get('rows', []):
    print(f"Keyword: {row['keys'][0]}, Position: {row['position']}")
In this script:
- SCOPES defines the permissions that the script is requesting (read-only access to Google Search Console data).
- SERVICE_ACCOUNT_FILE is the path to your downloaded JSON credentials.
- site_url is the URL of your website.
- The API query retrieves data for a specific date range and returns the top 10 keywords with their average positions.
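If you want the position for one specific keyword rather than the overall top queries, the Search Analytics API also accepts a dimensionFilterGroups clause. Here is a minimal sketch reusing the service and site_url objects from above; the keyword value is a placeholder:

# Query the average position for a single keyword
response = service.searchanalytics().query(
    siteUrl=site_url,
    body={
        'startDate': '2023-01-01',
        'endDate': '2023-01-31',
        'dimensions': ['query'],
        'dimensionFilterGroups': [{
            'filters': [{
                'dimension': 'query',
                'operator': 'equals',
                'expression': 'example keyword'  # placeholder keyword
            }]
        }],
        'rowLimit': 1
    }
).execute()

rows = response.get('rows', [])
if rows:
    print(f"Average position: {rows[0]['position']:.1f}")
else:
    print("No data for that keyword in this date range.")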
Step 4: Interpret the Data
The API response includes a wealth of data for each query (keyword), such as clicks, impressions, CTR (click-through rate), and average position. The position field indicates the average position of your site in Google Search results for that keyword.
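For example, you can read all four metrics off each row of the response object returned in Step 3; a minimal sketch:

# Each row carries clicks, impressions, ctr, and position for its query
for row in response.get('rows', []):
    print(f"{row['keys'][0]}: clicks={row['clicks']}, "
          f"impressions={row['impressions']}, "
          f"ctr={row['ctr']:.2%}, position={row['position']:.1f}")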
Method 2: Checking Keyword Rankings by Web Scraping (Use with Caution)
Web scraping involves extracting data from websites by parsing the HTML of web pages. While it’s possible to scrape Google search results to determine keyword rankings, it’s important to note that this is against Google’s terms of service. If you choose to proceed, do so with caution, as your IP address could be blocked by Google.
Step 1: Install Required Python Packages
You can use libraries like BeautifulSoup and requests to scrape Google’s search results.
pip install beautifulsoup4 requests
Step 2: Write a Web Scraping Script
Here’s a basic example of a web scraping script that checks the ranking of a keyword for a specific domain:
import requests
from bs4 import BeautifulSoup

def get_google_rank(keyword, domain):
    # Identify as a browser; Google serves different markup to unknown agents
    headers = {'User-Agent': 'Mozilla/5.0'}
    # Let requests URL-encode the query (handles spaces and special characters)
    response = requests.get('https://www.google.com/search',
                            params={'q': keyword}, headers=headers)
    soup = BeautifulSoup(response.text, 'html.parser')
    # .tF2Cxc and .yuRUbf are Google's current result classes and change often
    for idx, result in enumerate(soup.select('.tF2Cxc')):
        link = result.select_one('.yuRUbf a')
        if link and domain in link['href']:
            return idx + 1
    return None

keyword = "example keyword"
domain = "example.com"
rank = get_google_rank(keyword, domain)
print(f"Rank: {rank}")
In this script:
- requests.get sends a request to Google Search.
- BeautifulSoup parses the HTML of the response.
- The script checks each result to see if it contains the specified domain and returns the position (rank) of the first match.
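Google’s first results page typically holds only about ten organic results, so the function above returns None whenever your site ranks lower. One way around this is paging with the start URL parameter; here is a sketch built on the same approach (as before, Google’s markup and parameters can change without notice):

import time
import requests
from bs4 import BeautifulSoup

def get_google_rank_deep(keyword, domain, max_pages=3):
    # Check up to max_pages of results, ten results per page
    headers = {'User-Agent': 'Mozilla/5.0'}
    for page in range(max_pages):
        response = requests.get('https://www.google.com/search',
                                params={'q': keyword, 'start': page * 10},
                                headers=headers)
        soup = BeautifulSoup(response.text, 'html.parser')
        for idx, result in enumerate(soup.select('.tF2Cxc')):
            link = result.select_one('.yuRUbf a')
            if link and domain in link['href']:
                return page * 10 + idx + 1
        time.sleep(2)  # pause between pages to look less like a bot
    return None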
Step 3: Handle Risks and Limitations
Web scraping Google’s results can lead to issues such as:
- IP Blocking: Google may block your IP if it detects scraping activity.
- CAPTCHA Challenges: Google may present CAPTCHA challenges, which are difficult to bypass programmatically.
- Data Inaccuracy: Google customizes search results based on factors like location, search history, and device, making scraped data less reliable.
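None of these risks can be fully eliminated, but you can at least throttle your requests and fail gracefully when Google blocks you. A minimal sketch of both precautions; the “unusual traffic” marker string is an assumption based on Google’s typical block page, not a documented value:

import time
import random
import requests

def polite_get(url, headers):
    # Randomized delay makes request timing look less like a bot
    time.sleep(random.uniform(5, 10))
    response = requests.get(url, headers=headers)
    # HTTP 429 or the "unusual traffic" page usually signal a block (assumption)
    if response.status_code == 429 or 'unusual traffic' in response.text.lower():
        raise RuntimeError("Google appears to be rate-limiting this client")
    return response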
Conclusion
Checking Google keyword rankings using Python can be done in a few different ways, each with its pros and cons. The Google Search Console API is the most reliable and Google-approved method, offering accurate data without violating terms of service. Web scraping, while possible, carries significant risks and should only be used with caution.
If you’re looking for a programmatic way to monitor keyword rankings for your website, start with the Google Search Console API. It’s a robust tool that integrates seamlessly with Python and offers comprehensive data on your site’s search performance.