A Python scraper for extracting property listings from kleinanzeigen.de (formerly eBay Kleinanzeigen) using the ScrapingAnt API.
- Scrapes apartments for rent/sale, houses, WG rooms, and commercial properties
- Supports 30+ German cities and federal states
- Parallel scraping for improved performance
- Extracts 22 property attributes including rent, size, location, amenities
- Exports data to CSV format
- Rate limiting and retry logic for reliability
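The retry-with-backoff behavior listed above can be sketched as follows. This is a minimal illustration, not the project's actual implementation; `fetch_with_retries` and the backoff constants are hypothetical names chosen for this example (`MAX_RETRIES` mirrors the documented config default).

```python
import random
import time

# Hypothetical sketch of retry logic with exponential backoff; the real
# function names and defaults live in the project's source and config.py.
MAX_RETRIES = 3      # mirrors the documented MAX_RETRIES default
BACKOFF_BASE = 0.1   # seconds; illustrative value

def fetch_with_retries(fetch, url, max_retries=MAX_RETRIES):
    """Call fetch(url), retrying with exponential backoff on failure."""
    for attempt in range(max_retries):
        try:
            return fetch(url)
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the last error
            # Exponential backoff with a little jitter to respect rate limits
            time.sleep(BACKOFF_BASE * 2 ** attempt + random.random() * 0.05)
```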
- Clone the repository:

```bash
git clone https://github.com/kami4ka/KleinanzeigenScraper.git
cd KleinanzeigenScraper
```

- Create a virtual environment and install dependencies:

```bash
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
pip install -r requirements.txt
```

```bash
# Scrape apartments for rent in Berlin
python main.py --category rent --location berlin

# Scrape houses for sale in Hamburg
python main.py --category house-buy --location hamburg --limit 100

# Scrape WG rooms in Munich with custom output
python main.py --category wg --location munich --output munich_wg.csv

# Enable verbose logging
python main.py --category rent --location berlin -v
```

| Option | Description |
|---|---|
| --category, -c | Property category (default: rent) |
| --location, -l | City/region to search in (optional) |
| --output, -o | Output CSV file path (default: properties.csv) |
| --limit | Maximum number of properties to scrape |
| --max-pages | Maximum number of pages to scrape |
| --max-workers, -w | Maximum parallel requests (default: 10) |
| --api-key, -k | ScrapingAnt API key (overrides config) |
| --verbose, -v | Enable verbose logging |
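The --max-workers option caps how many requests run in parallel. A minimal sketch of that pattern using Python's `ThreadPoolExecutor` (the `scrape_parallel` and `fetch_page` names are illustrative, not the project's API):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def scrape_parallel(fetch_page, urls, max_workers=10):
    """Fetch listing pages concurrently; max_workers mirrors --max-workers."""
    results = {}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # Submit every URL, then collect results as each future completes
        futures = {pool.submit(fetch_page, url): url for url in urls}
        for fut in as_completed(futures):
            results[futures[fut]] = fut.result()
    return results
```

Keeping the pool bounded keeps the scraper within the ScrapingAnt rate limits while still overlapping network waits.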
| Category | Description |
|---|---|
| rent, apartment-rent | Apartments for rent |
| house-rent | Houses for rent |
| wg, temporary | Shared apartments (WG) |
| buy, apartment-buy | Apartments for sale |
| house-buy | Houses for sale |
| land | Land/plots |
| commercial | Commercial properties |
| garage | Garages/parking |
| all | All real estate |
Major Cities: Berlin, Hamburg, Munich, Cologne, Frankfurt, Stuttgart, Dusseldorf, Bremen, Dresden, Hanover, Nuremberg, Leipzig
Federal States: Bayern, Brandenburg, Hessen, Niedersachsen, Nordrhein-Westfalen (NRW), Sachsen, Schleswig-Holstein, Thüringen
The scraper exports data to CSV with the following fields:
| Field | Description |
|---|---|
| url | Property listing URL |
| title | Property title/description |
| listing_id | Kleinanzeigen listing ID |
| price | Listed price in EUR |
| price_type | Price type (Kaltmiete, Warmmiete, VB) |
| warm_rent | Total rent including utilities |
| living_area | Living area in m² |
| rooms | Number of rooms |
| bedrooms | Number of bedrooms |
| bathrooms | Number of bathrooms |
| address | Full address |
| postal_code | German postal code (5 digits) |
| city | City name |
| district | District/neighborhood |
| property_type | Property type (Etagenwohnung, etc.) |
| floor | Floor level |
| available_from | Availability start date |
| amenities | Available amenities |
| provider_type | Provider type (Privat, Gewerblich) |
| provider_name | Provider name |
| posted_date | Listing post date |
| views | Number of views |
| description | Property description |
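Once exported, the CSV can be loaded with the standard library's `csv` module. The sample row below is invented for illustration; a real properties.csv would come from running the scraper and would contain all 22 fields.

```python
import csv
import io

# A one-row sample in the export format described above (values invented
# for illustration; only a subset of the documented fields is shown).
sample = io.StringIO(
    "url,title,price,living_area,rooms,city\n"
    "https://www.kleinanzeigen.de/s-anzeige/123,2-Zimmer-Wohnung,850,54,2,Berlin\n"
)

reader = csv.DictReader(sample)
rows = list(reader)
print(rows[0]["city"], rows[0]["price"])  # field names match the table above
```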
This scraper uses the ScrapingAnt API for web scraping. You can provide the API key via:
- Command line: --api-key YOUR_KEY
- Config file: Set SCRAPINGANT_API_KEY in config.py

Configuration options in config.py:
- SCRAPINGANT_API_KEY: Your ScrapingAnt API key
- DEFAULT_MAX_WORKERS: Parallel request limit (default: 10)
- DEFAULT_TIMEOUT: Request timeout in seconds (default: 60)
- MAX_RETRIES: Number of retry attempts (default: 3)
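A config.py with these options might look like the following sketch. The values are the documented defaults; the key is a placeholder to replace with your own.

```python
# config.py - sketch based on the documented options; the real file in
# the repository may define additional settings.
SCRAPINGANT_API_KEY = "YOUR_SCRAPINGANT_KEY"  # placeholder; set your own key
DEFAULT_MAX_WORKERS = 10   # parallel request limit
DEFAULT_TIMEOUT = 60       # request timeout in seconds
MAX_RETRIES = 3            # retry attempts per request
```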
MIT License