A modern Django-based service built for the Ansible Automation Platform (AAP) ecosystem, featuring comprehensive task management, REST APIs, and automated background job processing.
- 🚀 Modern Django Architecture - Django 4.2+ with clean app-based structure
- 📊 Automated Task Management - Feature-flag controlled task groups with automatic routing
- ⚡ Smart Task Routing - Automatic submission to dispatcherd with no manual intervention
- 🔌 REST API - Versioned RESTful APIs with OpenAPI documentation
- 🔐 Authentication & Authorization - Django-Ansible-Base integration with RBAC
- 📈 Real-time Dashboard - Web-based task monitoring and management interface
- 🐳 Docker Ready - Multi-container deployment with PostgreSQL
- 🧪 Comprehensive Testing - Unit and integration tests with coverage reporting
- 📝 API Documentation - Interactive Swagger/OpenAPI documentation
- 🔧 Metrics Collection - Integrated metrics-utility for data collection
# Clone the repository
git clone <repository-url>
cd metrics-service
# Start all services
docker-compose up -d
# Create a superuser (optional)
docker-compose exec metrics-service python manage.py createsuperuser

Your service will be available at:
- Application: http://localhost:8000
- API Documentation: http://localhost:8000/api/docs/
- Admin Interface: http://localhost:8000/admin/
- Task Dashboard: http://localhost:8000/dashboard/
# Prerequisites: Python 3.11+, PostgreSQL 13+
# Create virtual environment
python -m venv venv
source venv/bin/activate # Windows: venv\Scripts\activate
# Install dependencies
pip install -e ".[dev]"
# Set up database (configure via environment variables if needed)
# See Configuration section below for environment variable options
python manage.py migrate
python manage.py metrics_service init-service-id
python manage.py metrics_service init-system-tasks
python manage.py createsuperuser
# Start complete service (Django + dispatcher + scheduler)
python manage.py metrics_service run

metrics-service/
├── apps/
│ ├── api/v1/ # REST API endpoints
│ ├── core/ # Core models and business logic
│ ├── dashboard/ # Web dashboard interface
│ └── tasks/ # Background task system
├── metrics_service/
│ ├── settings/ # Environment-specific settings
│ └── urls.py # URL configuration
├── tests/ # Test suite
├── config/ # Configuration files
└── docker-compose.yml # Container orchestration
Core Models (apps/core/models.py)
- User management with Django-Ansible-Base
- Organization and team hierarchy
- RBAC permissions and roles
Task System (apps/tasks/)
- Feature-flag controlled task groups (System, Anonymized Data, Metrics Collection)
- Automatic task routing with Django signals
- APScheduler integration for cron-based scheduling
- Dispatcherd background task execution
- Task execution tracking and monitoring
- Built-in task functions and metrics collection
API Layer (apps/api/v1/)
- RESTful endpoints with filtering and pagination
- OpenAPI/Swagger documentation
- Authentication and permission controls
Dashboard (apps/dashboard/)
- Real-time task monitoring
- Task creation and management interface
- Live status updates every 5 seconds
The API supports multiple authentication methods:
- Session authentication (for web interface)
- Token authentication
- OAuth2 tokens (for third-party integrations)
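For scripted access, token authentication is a single request header. A minimal stdlib sketch (the base URL and token value are placeholders; the exact header scheme depends on the DRF/Django-Ansible-Base configuration):

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # assumed local dev address

def auth_headers(token: str) -> dict:
    """Headers for DRF-style token authentication."""
    return {"Authorization": f"Token {token}", "Accept": "application/json"}

def list_tasks(token: str) -> dict:
    """GET /api/v1/tasks/ using a (hypothetical) token."""
    req = urllib.request.Request(f"{BASE_URL}/api/v1/tasks/", headers=auth_headers(token))
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```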
# List all tasks
GET /api/v1/tasks/
# Create a new task
POST /api/v1/tasks/
{
"name": "Data Cleanup",
"function_name": "cleanup_old_data",
"task_data": {"days_old": 30}
}
# Get running tasks
GET /api/v1/tasks/running/
# Retry a failed task
POST /api/v1/tasks/{id}/retry/
# Available task functions
GET /api/v1/tasks/available_functions/

System Tasks (always enabled):
- cleanup_old_data - Clean up old system data
- cleanup_old_tasks - Clean up completed/failed tasks
- send_notification_email - Send notification emails
- process_user_data - Process user data in background
- hello_world - Simple test task for dispatcherd integration
- sleep - Sleep for specified duration (testing)
- execute_db_task - Execute database-defined tasks with lifecycle management

Anonymized Data Collection Tasks (controlled by ANONYMIZED_DATA_COLLECTION, default: enabled):
- collect_anonymous_metrics - Collect anonymous system metrics
- collect_config_metrics - Collect configuration information

Metrics Collection Tasks (controlled by METRICS_COLLECTION_ENABLED, default: disabled):
- collect_job_host_summary - Collect job execution statistics
- collect_host_metrics - Collect host performance data
- collect_all_metrics - Run multiple collectors in sequence
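The create endpoint above can be driven from a script. A stdlib sketch mirroring the JSON body shown earlier (local dev address assumed, authentication omitted for brevity):

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # assumed local dev address

def build_task_payload(name: str, function_name: str, **task_data) -> dict:
    """Assemble the request body shown in the API example above."""
    return {"name": name, "function_name": function_name, "task_data": task_data}

def create_task(payload: dict) -> dict:
    """POST /api/v1/tasks/ (auth headers omitted in this sketch)."""
    req = urllib.request.Request(
        f"{BASE_URL}/api/v1/tasks/",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

payload = build_task_payload("Data Cleanup", "cleanup_old_data", days_old=30)
```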
The service includes an automated background task system with intelligent routing:
# Start complete service (Django + dispatcher + scheduler)
python manage.py metrics_service run
# Start with custom configuration
python manage.py metrics_service run --workers 4
# Individual components (for development)
python manage.py run_dispatcherd --workers 2
python manage.py metrics_service cron start

Tasks are automatically routed based on their properties:
- Immediate tasks → Direct to dispatcherd
- Scheduled tasks → APScheduler with DateTrigger
- Recurring tasks → APScheduler with CronTrigger
No manual intervention required - create a task and it's automatically processed!
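The routing rules can be pictured as a small decision function. The field names (`cron_expression`, `scheduled_at`) are assumptions for illustration; the real routing runs inside Django signal handlers:

```python
def route(task: dict) -> str:
    """Decide where a newly created task goes, per the rules above."""
    if task.get("cron_expression"):
        return "apscheduler:cron"   # recurring -> CronTrigger
    if task.get("scheduled_at"):
        return "apscheduler:date"   # scheduled -> DateTrigger
    return "dispatcherd"            # immediate -> direct submission
```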
Control task execution with environment variables:
# Enable/disable anonymized data collection (default: true)
METRICS_SERVICE_ANONYMIZED_DATA=true
# Enable/disable metrics collection (default: false)
METRICS_SERVICE_METRICS_COLLECTION=false

These environment variables control which task groups are active in the scheduler.
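Boolean flags like these are typically parsed tolerantly from the environment. An illustrative helper (the accepted truthy spellings are an assumption, not the service's actual parser):

```python
import os

def env_flag(name: str, default: bool) -> bool:
    """Read a boolean feature flag from the environment (illustrative)."""
    raw = os.environ.get(name)
    if raw is None:
        return default
    return raw.strip().lower() in {"1", "true", "yes", "on"}

anonymized_enabled = env_flag("METRICS_SERVICE_ANONYMIZED_DATA", default=True)
metrics_enabled = env_flag("METRICS_SERVICE_METRICS_COLLECTION", default=False)
```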
# Format code
black .
# Lint code
ruff check .
# Type checking
mypy .
# Sort imports
isort .

This project uses pre-commit hooks to ensure code quality and automatically sync requirements files:
# Install pre-commit hooks
pre-commit install
# Run hooks on all files
pre-commit run --all-files
# Run hooks manually
pre-commit run

The pre-commit configuration automatically:
- Syncs requirements files when pyproject.toml or uv.lock changes
- Ensures requirements files are always up-to-date before commits
# Run all tests
pytest
# Run with coverage
pytest --cov=apps --cov=metrics_service --cov-report=html
# Run specific test categories
pytest -m unit # Unit tests only
pytest -m integration # Integration tests only

# Create migrations
python manage.py makemigrations
# Apply migrations
python manage.py migrate
# Initialize DAB ServiceID (required after first migration)
python manage.py metrics_service init-service-id
# Initialize system tasks
python manage.py metrics_service init-system-tasks

Metrics Service uses Dynaconf for settings management, following the AAP Phase 1 standards.
Development Mode (default):
# Database
METRICS_SERVICE_DB_HOST=localhost
METRICS_SERVICE_DB_PORT=5432
METRICS_SERVICE_DB_USER=metrics_service
METRICS_SERVICE_DB_PASSWORD=metrics_service
METRICS_SERVICE_DB_NAME=metrics_service
# Django
METRICS_SERVICE_SECRET_KEY=your-secret-key
METRICS_SERVICE_DEBUG=false
METRICS_SERVICE_ALLOWED_HOSTS=localhost,yourdomain.com
# Task feature flags
METRICS_SERVICE_ANONYMIZED_DATA=true
METRICS_SERVICE_METRICS_COLLECTION=false

Note: Development mode works with default settings - just run the server:
python manage.py runserver

Production Mode:
# Set environment mode and required secrets
export METRICS_SERVICE_MODE=production
export METRICS_SERVICE_SECRET_KEY="your-secure-random-key"
export METRICS_SERVICE_ALLOWED_HOSTS="yourdomain.com,api.yourdomain.com"
# Override defaults as needed
export METRICS_SERVICE_DATABASES__default__HOST=prod-db.example.com
export METRICS_SERVICE_DATABASES__default__PASSWORD=secure-password
python manage.py runserver

Settings are loaded in order of precedence (lowest to highest):
1. metrics_service/settings/defaults.py - Base Django defaults
2. config/settings.yaml - Environment-specific configuration
3. /etc/ansible-automation-platform/settings.yaml - System-wide AAP settings
4. Environment variables with METRICS_SERVICE_ prefix - Highest priority
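The precedence order amounts to a layered merge where later layers win. A pure sketch of the idea (Dynaconf does this with more machinery - file loaders, casting, validators):

```python
def effective_settings(*layers: dict) -> dict:
    """Merge settings layers; later (higher-precedence) layers win."""
    merged: dict = {}
    for layer in layers:
        merged.update(layer)
    return merged

# Hypothetical layer contents, lowest to highest precedence
defaults = {"DEBUG": True, "LOG_LEVEL": "INFO", "DB_HOST": "localhost"}
yaml_config = {"DEBUG": False}
env_overrides = {"LOG_LEVEL": "DEBUG"}

settings = effective_settings(defaults, yaml_config, env_overrides)
```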
| Variable | Description | Required in Production |
|---|---|---|
| METRICS_SERVICE_MODE | Environment mode (development/production) | No (defaults to development) |
| METRICS_SERVICE_SECRET_KEY | Django secret key | Yes |
| METRICS_SERVICE_DEBUG | Enable debug mode | No |
| METRICS_SERVICE_LOG_LEVEL | Logging level (DEBUG/INFO/WARNING/ERROR) | No (defaults to INFO) |
| METRICS_SERVICE_DATABASES__default__HOST | Database host | No (has default) |
| METRICS_SERVICE_DATABASES__default__PASSWORD | Database password | No (has default) |
| METRICS_SERVICE_ALLOWED_HOSTS | Allowed hosts (comma-separated) | Yes (production) |
Note: Use double underscores (__) for nested settings:
# Nested database configuration
export METRICS_SERVICE_DATABASES__default__HOST=localhost
export METRICS_SERVICE_DATABASES__default__PORT=5432

Metrics Service uses a centralized logging system that integrates with Django's logging framework. All log levels are controlled by a single environment variable.
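The double-underscore convention maps one environment variable onto a nested settings path. A simplified sketch of the mechanism (Dynaconf's real parser also handles type casting and case rules):

```python
def apply_nested(settings: dict, env_key: str, value, prefix: str = "METRICS_SERVICE_") -> dict:
    """Write METRICS_SERVICE_A__b__C=value into settings["A"]["b"]["C"]."""
    path = env_key[len(prefix):].split("__")
    node = settings
    for part in path[:-1]:
        node = node.setdefault(part, {})
    node[path[-1]] = value
    return settings

config = {"DATABASES": {"default": {"HOST": "localhost"}}}
apply_nested(config, "METRICS_SERVICE_DATABASES__default__PORT", 5432)
```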
Setting Log Level:
# For development - see all debug messages
export METRICS_SERVICE_LOG_LEVEL=DEBUG
# For production - informational messages only
export METRICS_SERVICE_LOG_LEVEL=INFO
# For troubleshooting - warnings and errors
export METRICS_SERVICE_LOG_LEVEL=WARNING
# For critical issues only
export METRICS_SERVICE_LOG_LEVEL=ERROR

Quick Debug Mode:
# Run with debug logging temporarily
METRICS_SERVICE_LOG_LEVEL=DEBUG python manage.py runserver
# Or for the complete service
METRICS_SERVICE_LOG_LEVEL=DEBUG python manage.py metrics_service run

Log Output Format:
All logs use Django's configured format with timestamps, log levels, request IDs (when applicable), module names, and messages:
2025-01-18 10:15:23,456 INFO [abc123] apps.tasks.signals New task created: Cleanup (ID: 42)
2025-01-18 10:15:24,789 WARNING [] apps.core.utils Database connection slow: 2.3s
For comprehensive configuration documentation, validators, troubleshooting, and testing information, see metrics_service/settings/README.md.
# Build production image
docker build -t metrics-service .
# Run with production settings
docker run -p 8000:8000 \
-e METRICS_SERVICE_MODE=production \
-e METRICS_SERVICE_SECRET_KEY=your-secret-key \
-e METRICS_SERVICE_DATABASES__default__HOST=your-db-host \
-e METRICS_SERVICE_DATABASES__default__PASSWORD=your-db-password \
metrics-service

For Kubernetes deployment, see the manifests in the manifests/base/apps/metrics-service/ directory.
1. Fork the repository
2. Create a feature branch: git checkout -b feature/my-feature
3. Make your changes with tests
4. Run the test suite: pytest
5. Run code quality checks: ruff check . && black . && mypy .
6. Submit a pull request
- Code Style: Black formatting, 120 character line length
- Type Hints: Required for all new code
- Documentation: Docstrings for public APIs
- Testing: Test coverage for new features
- Commits: Conventional commit messages
This project is licensed under the Apache License - see the LICENSE file for details.
- Documentation: Check the CLAUDE.md file for detailed development guidance
- Issues: Report bugs and feature requests via GitHub issues
- API Documentation: Interactive docs available at /api/docs/ when running