# Skiddle Scraper

A Python scraper for extracting event listings from Skiddle.com using the ScrapingAnt web scraping API.

## Features
- Scrapes event listings from Skiddle.com
- Extracts comprehensive event details:
  - Event ID
  - Event title
  - Date and time
  - Venue name
  - City/location
  - Age restriction (18+, 14+, etc.)
  - Event image URL
  - Direct event URL
- Supports multiple event categories (Clubs, Gigs, Festivals, etc.)
- Exports data to CSV and JSON formats
- Deduplicates events automatically
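Since every Skiddle listing carries a unique event ID, deduplication reduces to keeping the first occurrence of each ID. A minimal sketch of the idea (the function name and event shape are illustrative, not taken from the repo's code):

```python
def deduplicate_events(events):
    """Drop repeated events, keeping the first occurrence of each event_id."""
    seen = set()
    unique = []
    for event in events:
        if event["event_id"] not in seen:
            seen.add(event["event_id"])
            unique.append(event)
    return unique
```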
## Requirements

- Python 3.8+
- ScrapingAnt API key (Get free API key)
Note: The ScrapingAnt free plan has a concurrency limit of 1 thread. For higher throughput, consider upgrading to a paid plan.
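With a single-thread limit, page fetches have to run sequentially, and a small pause between calls keeps you comfortably inside rate limits. A hedged sketch of that pattern (the delay value and function names are arbitrary illustrations, not the project's actual settings):

```python
import time

def fetch_all(urls, fetch_page, delay_seconds=1.0):
    """Fetch URLs one at a time, pausing between requests.

    `fetch_page` is any callable that takes a URL and returns its content;
    the sequential loop respects a 1-thread concurrency limit.
    """
    results = []
    for url in urls:
        results.append(fetch_page(url))
        time.sleep(delay_seconds)  # throttle between requests
    return results
```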
## Installation

- Clone this repository:

  ```bash
  git clone https://github.com/kami4ka/skiddle-scraper.git
  cd skiddle-scraper
  ```

- Create and activate a virtual environment:

  ```bash
  python -m venv .venv
  source .venv/bin/activate  # On Windows: .venv\Scripts\activate
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- Set your ScrapingAnt API key:

  ```bash
  export SCRAPINGANT_API_KEY="your_api_key_here"
  ```

## Usage

Scrape events from Skiddle (default: 2 categories):

```bash
python main.py
```

```
python main.py [OPTIONS]
```
```
Options:
  --pages INT          Number of categories to scrape (default: 2)
  --category TEXT      Scrape a specific category only
  --all-categories     Scrape all available categories
  -o, --output PATH    Output CSV file path (default: output/skiddle_events.csv)
  --json               Also export to JSON format
  -v, --verbose        Enable verbose output
  --api-key TEXT       ScrapingAnt API key (or set SCRAPINGANT_API_KEY env var)
  --list-categories    List available categories and exit
```

## Examples

Scrape with verbose output:
```bash
python main.py --verbose
```

Scrape club events only:

```bash
python main.py --category clubs --verbose
```

Scrape multiple categories:

```bash
python main.py --pages 3 --verbose
```

Scrape all available categories:

```bash
python main.py --all-categories --verbose
```

Export to both CSV and JSON:

```bash
python main.py --json -o output/events.csv
```

List available categories:

```bash
python main.py --list-categories
```

## Available Categories

- `clubs` - Nightclub events
- `gigs` - Live music gigs
- `festivals` - Music festivals
- `whats-on` - All events
- `comedy` - Comedy shows
- `theatre` - Theatre performances
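Internally, each category slug has to resolve to a Skiddle listing URL. One plausible shape for that lookup (the URL paths below are assumptions for illustration only; the real mapping lives in `config.py` and may differ):

```python
# Hypothetical mapping of CLI category slugs to Skiddle listing pages.
# The exact URL paths are assumptions, not verified against config.py.
CATEGORY_URLS = {
    "clubs": "https://www.skiddle.com/clubs/",
    "gigs": "https://www.skiddle.com/gigs/",
    "festivals": "https://www.skiddle.com/festivals/",
    "whats-on": "https://www.skiddle.com/whats-on/",
    "comedy": "https://www.skiddle.com/comedy/",
    "theatre": "https://www.skiddle.com/theatre/",
}

def resolve_category(slug):
    """Return the listing URL for a slug, or raise with the valid choices."""
    try:
        return CATEGORY_URLS[slug]
    except KeyError:
        valid = ", ".join(sorted(CATEGORY_URLS))
        raise ValueError(f"Unknown category {slug!r}; choose one of: {valid}")
```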
## Output Fields

| Field | Description | Example |
|---|---|---|
| event_id | Unique event identifier | 41495100 |
| event_title | Event name | Hannah Laing presents doof Liverpool |
| date | Event date | Saturday 24th January |
| time | Event time | 4:00pm - 11:00pm |
| venue | Venue name | Exhibition Centre Liverpool |
| city | City/location | Liverpool |
| age_restriction | Minimum age | 18+ |
| event_url | Direct link to event | https://www.skiddle.com/whats-on/... |
| image_url | Event image URL | https://skiddle.imgix.net/... |
| category | Event category | clubs |
| scraped_at | Timestamp of scraping | 2026-01-02T10:30:00Z |
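The repo's `models.py` defines `Event` and `EventCollection` data classes; a field-for-field sketch of what `Event` plausibly looks like, based on the table above (field names mirror the CSV columns, but the actual class definition may differ):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Event:
    """One scraped Skiddle event; fields mirror the CSV columns above."""
    event_id: str
    event_title: str
    date: str
    time: str
    venue: str
    city: str
    age_restriction: str
    event_url: str
    image_url: str
    category: str
    # ISO-8601 UTC timestamp recorded when the event is scraped
    scraped_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    )
```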
Example CSV output:

```csv
event_id,event_title,date,time,venue,city,age_restriction,event_url,image_url,category,scraped_at
41495100,Hannah Laing presents doof Liverpool,Saturday 24th January,4:00pm - 11:00pm,Exhibition Centre Liverpool,Liverpool,18+,https://www.skiddle.com/whats-on/Liverpool/Exhibition-Centre-Liverpool/Hannah-Laing-presents-doof-Liverpool/41495100/,https://skiddle.imgix.net/...,clubs,2026-01-02T10:30:00Z
```

## Project Structure

```
SkiddleScraper/
├── config.py           # Configuration settings
├── models.py           # Event and EventCollection data classes
├── utils.py            # Utility functions
├── scraper.py          # Main scraper class
├── main.py             # CLI entry point
├── requirements.txt    # Python dependencies
├── .gitignore          # Git ignore patterns
├── output/             # Output directory for scraped data
│   └── .gitkeep
└── README.md           # This file
```
## How It Works

This scraper uses category pages as a form of pagination. Each category page (Clubs, Gigs, Festivals, and so on) contains a different set of events, so scraping multiple categories collects comprehensive event data across event types.
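The category-as-pagination idea boils down to a loop that fetches each category page, parses it, and pools the results, dropping events that appear in more than one category. A simplified sketch (`fetch_page` and `parse_events` stand in for the repo's actual scraper internals):

```python
def scrape_categories(categories, fetch_page, parse_events):
    """Collect events across category pages, deduplicating by event_id."""
    seen = set()
    all_events = []
    for category in categories:
        html = fetch_page(category)  # one listing page per category
        for event in parse_events(html):
            event["category"] = category
            # The same event can be listed under several categories;
            # keep only the first occurrence.
            if event["event_id"] not in seen:
                seen.add(event["event_id"])
                all_events.append(event)
    return all_events
```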
The site uses React with Material-UI components and renders content via JavaScript. ScrapingAnt's browser rendering capability ensures complete page loading before extraction.
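Fetching a rendered page through ScrapingAnt is a single HTTP GET against its API, with the key sent in a header and the target URL and browser flag as query parameters. A sketch of building that request against ScrapingAnt's v2 `general` endpoint (treat the parameter names as assumptions if your client or plan differs):

```python
SCRAPINGANT_ENDPOINT = "https://api.scrapingant.com/v2/general"

def build_scrapingant_request(target_url, api_key, render_js=True):
    """Return (endpoint, params, headers) for a ScrapingAnt page fetch."""
    params = {
        "url": target_url,
        "browser": "true" if render_js else "false",  # JS rendering on/off
    }
    headers = {"x-api-key": api_key}
    return SCRAPINGANT_ENDPOINT, params, headers

# With the `requests` package installed, the actual call would be roughly:
#   endpoint, params, headers = build_scrapingant_request(page_url, api_key)
#   html = requests.get(endpoint, params=params, headers=headers).text
```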
## License

MIT License
## Disclaimer

This scraper is for educational purposes only. Please respect Skiddle.com's terms of service and rate limits when using this tool. Always ensure your scraping activities comply with applicable laws and website policies.