This project automates Google Maps business data extraction using Selenium WebDriver and stores the collected information directly in a Google Sheet.
It retrieves business names, phone numbers, addresses, Plus Codes, and websites for each search query, all handled automatically, with results saved to the sheet in real time.
- Scrapes business listings from Google Maps
- Extracts name, phone number, address, Plus Code, and website
- Saves results directly to a Google Sheet
- Reads city or business queries dynamically from another sheet
- Processes multiple queries in sequence, resuming automatically
- Tracks already-written entries to prevent duplicates
- Detects and skips unavailable or malformed listings
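Duplicate prevention can be as simple as keeping a set of keys already written. A minimal sketch of the idea (the key fields and function name are illustrative, not the project's actual code):

```python
def filter_new(listings, seen):
    """Yield only listings whose (name, address) key has not been written yet.

    `seen` is a set that persists across queries, so a business that shows up
    in several search results is only stored once.
    """
    for item in listings:
        key = (item["name"], item["address"])  # hypothetical dedup key
        if key not in seen:
            seen.add(key)
            yield item
```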
```
Google-Map-Scraper/
│
├── main.py              # Main scraper script
├── requirements.txt     # Python dependencies
├── README.md            # Project documentation
└── credentials.json     # Google API credentials (Service Account)
```
Before running the script, ensure you have:
- Python 3.8+
- A Google Cloud Service Account with the Google Sheets and Google Drive APIs enabled
- Google Chrome browser installed
- ChromeDriver (must match your Chrome version)
- Access to a Google Sheet named `Google Map Scraping (Python)` with two worksheets:
  - City → contains search queries
  - Data → where results will be stored
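When writing to the Data sheet, each scraped record has to be flattened into a row in a fixed column order. A small helper along these lines (the column order here is an assumption, not taken from the script):

```python
# Assumed column order of the Data sheet; adjust to match your sheet's header row.
COLUMNS = ["Name", "Phone", "Address", "Plus Code", "Website"]

def to_row(record):
    """Map a scraped record dict onto the Data sheet's columns; missing fields become ''."""
    return [record.get(col, "") for col in COLUMNS]
```

A row produced this way can be passed straight to gspread's `append_row`.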
```bash
git clone https://github.com/monojitbgit/google-map-scraper.git
cd google-map-scraper
pip install -r requirements.txt
```
1. Go to the Google Cloud Console.
2. Create or use an existing Service Account.
3. Enable the Google Sheets API and the Google Drive API.
4. Download the JSON key file.
5. Rename it to `credentials.json` and place it in your project folder.
Share your Google Sheet with the Service Account email (the `client_email` value inside `credentials.json`).
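The email to share with is the `client_email` field of the downloaded key file. A tiny helper to read it out (hypothetical, not part of the project):

```python
import json

def service_account_email(path="credentials.json"):
    """Return the Service Account email to share the Google Sheet with."""
    with open(path) as f:
        return json.load(f)["client_email"]
```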
Download ChromeDriver matching your installed Chrome version from:
https://chromedriver.chromium.org/downloads
Add it to your system PATH or keep it in the project folder.
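Chrome and ChromeDriver only need to agree on the major version (the number before the first dot). A quick sanity check, purely illustrative:

```python
def versions_compatible(chrome_version, driver_version):
    """True when Chrome and ChromeDriver share the same major version."""
    return chrome_version.split(".")[0] == driver_version.split(".")[0]
```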
Once everything is configured, simply run:
```bash
python main.py
```
The script will:
1. Authenticate with Google Sheets
2. Read queries from the City sheet (Column A)
3. Open Google Maps for each query
4. Extract all available business details
5. Write results to the Data sheet
6. Mark the query's status in the City sheet
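The flow above amounts to a loop over queries; a stripped-down skeleton with injected callbacks (all names here are illustrative, not the script's real API):

```python
def run(queries, scrape, write_row, mark_done):
    """Scrape each query, write every listing, then flag the query as processed."""
    for query in queries:
        for listing in scrape(query):  # hypothetical: yields dicts of business details
            write_row(listing)         # hypothetical: appends a row to the Data sheet
        mark_done(query)               # hypothetical: updates status in the City sheet
```

Injecting the Sheets and Selenium pieces as callables keeps the loop itself trivial to test without a browser or credentials.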
Listed in `requirements.txt`:

```
selenium
gspread
google-auth
beautifulsoup4
```
Install them all:

```bash
pip install -r requirements.txt
```
This project is licensed under the MIT License.
You’re free to use, modify, and distribute it with attribution.