See Getting Started for full setup instructions. Quick version:
```bash
git clone https://github.com/prebid/salesagent.git
cd salesagent
make setup  # Installs deps, starts Docker, verifies health
```

Conductor is a Mac app for running multiple development workspaces in parallel. Each workspace gets its own git worktree and isolated Docker environment.
Set these environment variables in your shell (add to ~/.bashrc or ~/.zshrc):
```bash
# Required for Admin UI access
export SUPER_ADMIN_EMAILS='your-email@example.com'
export SUPER_ADMIN_DOMAINS='example.com'  # Optional

# Required for AI features
export GEMINI_API_KEY='your-gemini-api-key'

# Required for Google OAuth
export GOOGLE_CLIENT_ID='your-client-id.apps.googleusercontent.com'
export GOOGLE_CLIENT_SECRET='your-client-secret'
```

The system uses a predefined pool of ports to avoid OAuth redirect URI updates:
1. Configure Google OAuth redirect URLs:

   ```bash
   python manage_conductor_ports.py oauth-urls
   ```

   Add these URLs to your Google OAuth app:

   - http://localhost:8002/auth/google/callback
   - http://localhost:8003/auth/google/callback
   - ... through port 8011

2. Reserve a port for your workspace:

   ```bash
   python manage_conductor_ports.py reserve --workspace my-feature
   ```

3. Release the port when done:

   ```bash
   python manage_conductor_ports.py release --workspace my-feature
   ```
Run the automated setup script from within your Conductor workspace:

```bash
./setup_conductor_workspace.sh
```

This script:

- Detects the Conductor workspace automatically
- Assigns unique ports based on workspace name
- Creates `.env` with proper configuration
- Creates `docker-compose.override.yml` for hot-reloading (including PYTHONPATH for package imports)
- Installs Git hooks for the workspace

Manual setup: If you're not using the setup script, ensure your `docker-compose.override.yml` includes the PYTHONPATH setting. See `docker-compose.override.example.yml` for the required configuration.
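A minimal override along these lines might look like the sketch below; the mount paths shown are assumptions, and `docker-compose.override.example.yml` remains the authoritative reference:

```yaml
# Sketch only - copy docker-compose.override.example.yml for the real settings.
services:
  adcp-server:
    environment:
      - PYTHONPATH=/app   # required so package imports resolve inside the container
    volumes:
      - .:/app            # mount the worktree for hot-reloading
```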
```
.conductor/
├── workspace-name/                  # Git worktree
│   ├── .env                         # Auto-generated config
│   ├── docker-compose.override.yml  # Dev overrides
│   └── (project files)
├── another-workspace/
│   └── ...
```
```bash
# List all workspaces and their ports
python manage_conductor_ports.py status

# Clean up a workspace
./cleanup_conductor_workspace.sh workspace-name

# View workspace logs
cd .conductor/workspace-name
docker-compose logs -f
```

```python
from src.adapters.base import AdServerAdapter
from src.core.schemas import *


class MyPlatformAdapter(AdServerAdapter):
    adapter_name = "myplatform"

    def __init__(self, config, principal, dry_run=False, creative_engine=None):
        super().__init__(config, principal, dry_run, creative_engine)
        self.advertiser_id = self.principal.get_adapter_id("myplatform")

    def create_media_buy(self, request, packages, start_time, end_time):
        ...  # Implementation

    def get_avails(self, request):
        ...  # Implementation

    def activate_media_buy(self, media_buy_id):
        ...  # Implementation
```

- `get_avails` - Check inventory availability
- `create_media_buy` - Create campaigns/orders
- `activate_media_buy` - Activate pending campaigns
- `pause_media_buy` - Pause active campaigns
- `get_media_buy_status` - Get campaign status
- `get_media_buy_performance` - Get performance metrics
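As a sketch of the method contract above, here is a hypothetical `pause_media_buy` implementation. The `_FakeClient` stand-in and the returned dict shape are assumptions for illustration, not the real base-class API:

```python
class _FakeClient:
    """Stand-in for a platform SDK client (assumption for illustration)."""

    def pause_order(self, order_id):
        return {"id": order_id, "state": "PAUSED"}


class MyPlatformAdapterSketch:
    """Simplified adapter showing dry-run handling for one lifecycle method."""

    def __init__(self, dry_run=False):
        self.dry_run = dry_run
        self.client = _FakeClient()

    def pause_media_buy(self, media_buy_id):
        # In dry-run mode, report the intended call instead of executing it
        if self.dry_run:
            return {"media_buy_id": media_buy_id, "status": "paused", "dry_run": True}
        result = self.client.pause_order(media_buy_id)
        return {"media_buy_id": media_buy_id, "status": result["state"].lower()}
```

The same dry-run branch applies to every mutating method, which is what makes `run_simulation.py --dry-run` possible.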
Adapters can provide custom configuration interfaces:
```python
def get_config_ui_endpoint(self) -> Optional[str]:
    return f"/adapters/{self.adapter_name}/config"

def register_ui_routes(self, app, db_session_factory):
    @app.route(self.get_config_ui_endpoint() + "/<tenant_id>/<product_id>")
    def config_ui(tenant_id, product_id):
        ...  # Render configuration UI

def validate_product_config(self, config: dict) -> tuple[bool, Optional[str]]:
    ...  # Validate adapter-specific configuration
```

The system supports two-tier targeting:
1. Overlay Dimensions - Available to principals
   - Geography, demographics, interests, devices
   - AEE signals, contextual targeting
2. Managed-Only Dimensions - Internal use only
   - Platform-specific optimizations
   - Reserved inventory segments
Each adapter translates AdCP targeting to platform-specific format:
```python
def _translate_targeting(self, overlay):
    platform_targeting = {}
    if "geo_countries" in overlay:
        platform_targeting["location"] = {
            "countries": overlay["geo_countries"]
        }
    if "signals" in overlay:
        platform_targeting["custom_targeting"] = {
            "keys": self._map_signals(overlay["signals"])
        }
    return platform_targeting
```

```bash
# All tests
uv run pytest

# Specific category
uv run pytest tests/unit/
uv run pytest tests/integration/

# With coverage
uv run pytest --cov=. --cov-report=html

# Inside Docker
docker-compose exec adcp-server pytest
```

- Unit Tests - Component isolation tests
- Integration Tests - Full workflow tests
- Adapter Tests - Platform-specific tests
- UI Tests - Admin interface tests
- Contract Validation Tests - MCP tool integration tests (prevents client failures)
- AdCP Compliance Tests - Protocol schema compliance tests
Ensure MCP tools work with minimal parameters to prevent client integration failures:
```bash
# Test contract validation
uv run pytest tests/integration/test_mcp_contract_validation.py -v

# Audit schema field requirements
uv run python scripts/audit_required_fields.py

# Test specific schema validation
uv run python -c "
from src.core.schemas import GetProductsRequest
req = GetProductsRequest(promoted_offering='test')
print(f'✅ Brief defaults to: {repr(req.brief)}')
"

# Run pre-commit validation checks
pre-commit run mcp-contract-validation --all-files
pre-commit run audit-required-fields --all-files
```

When to run contract validation tests:

- Before adding new MCP tools or modifying schemas
- When changing field requirements (required → optional or vice versa)
- If clients report validation errors like `'brief' is a required property`
- As part of the schema design review process
```bash
# Full lifecycle simulation
uv run python run_simulation.py

# Dry-run mode (logs API calls)
uv run python run_simulation.py --dry-run --adapter gam

# Custom scenarios
uv run python simulation_full.py http://localhost:8000 \
  --token "test_token" \
  --principal "test_principal"
```

1. Check the existing schema first:

   ```bash
   grep -r "Column(" src/core/database/models.py

   # Connect to PostgreSQL in Docker
   docker compose -f docker-compose.yml exec postgres psql -U adcp_user -d adcp -c "\d table_name"
   ```

2. Create a migration:

   ```bash
   uv run alembic revision -m "add_new_column"
   ```

3. Edit the migration file:

   ```python
   def upgrade():
       op.add_column('table_name',
                     sa.Column('new_column', sa.String(100)))

   def downgrade():
       op.drop_column('table_name', 'new_column')
   ```

4. Run the migration:

   ```bash
   # Inside Docker (recommended)
   docker compose exec adcp-server python scripts/ops/migrate.py

   # Or locally with uv
   uv run python scripts/ops/migrate.py
   ```

Best practices:

- Always use SQLAlchemy's `sa.table()` in migrations
- Use PostgreSQL-specific features (JSONType, etc.)
- Use scoped sessions for thread safety
- Use SQLAlchemy 2.0 patterns: `select()` + `scalars()`, not `query()`
Tools are exposed via FastMCP:
```python
@app.tool
async def get_products(
    context: Context,
    brief: Optional[str] = None
) -> GetProductsResponse:
    # Get auth from headers
    auth_token = context.http.headers.get("x-adcp-auth")

    # Resolve principal and tenant
    principal, tenant = await resolve_auth(auth_token)

    # Return products
    return GetProductsResponse(products=products)
```

- Define the schema in `schemas.py`
- Implement the tool in `main.py`
- Add tests in `test_main.py`
- Update documentation
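The schema step can be sketched as follows. The real project uses Pydantic models; this plain-dataclass version only illustrates the contract rule that client-omittable fields need defaults, and the tool and field names are made up:

```python
from dataclasses import dataclass, field
from typing import Optional


# Hypothetical request/response pair for a new tool (names are illustrative).
# Key contract rule: every field a client may omit gets a default, so a
# minimal-parameter call still validates (this is what the MCP contract
# validation hooks check).
@dataclass
class GetSignalsRequest:
    brief: Optional[str] = None  # optional: minimal calls pass no brief


@dataclass
class GetSignalsResponse:
    signals: list[str] = field(default_factory=list)


# A minimal-parameter call must construct cleanly:
req = GetSignalsRequest()
```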
- Always extend `base.html`
- Use Bootstrap classes (loaded in base)
- Avoid global CSS resets
- Test element visibility
```javascript
// Handle nulls
const value = (data.field || 'default');

// Check elements exist
const element = document.getElementById('id');
if (element) {
    // Safe to use
}

// API calls with error handling
try {
    const response = await fetch('/api/endpoint', {
        credentials: 'same-origin'
    });
    if (!response.ok) throw new Error('Failed');
    const data = await response.json();
} catch (error) {
    console.error('API error:', error);
}
```

- MCP uses `x-adcp-auth` header tokens
- Admin UI uses Google OAuth
- Principals have unique tokens per advertiser
- Super admins configured via environment
All operations are logged to the database:

```python
from audit_logger import AuditLogger

logger = AuditLogger(db_session)
logger.log(
    operation="create_media_buy",
    principal_id=principal.principal_id,
    tenant_id=tenant_id,
    success=True,
    details={"media_buy_id": result.media_buy_id}
)
```

1. Create a feature branch:

   ```bash
   git checkout -b feature/my-feature
   ```

2. Make changes and test:

   ```bash
   # Run tests
   uv run pytest

   # Check formatting
   black --check .
   ruff check .
   ```

3. Commit with a descriptive message:

   ```bash
   git add .
   git commit -m "feat: add new targeting dimension"
   ```

4. Push and create a PR:

   ```bash
   git push origin feature/my-feature
   ```
```
type(scope): description

Detailed explanation if needed

Fixes #123
```

Types:

- `feat`: New feature
- `fix`: Bug fix
- `docs`: Documentation
- `style`: Formatting
- `refactor`: Code restructuring
- `test`: Test changes
- `chore`: Maintenance
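For example, a message following this convention (scope and subject are illustrative):

```
feat(targeting): add device-type overlay dimension

Maps AdCP device_types to platform device targeting keys.

Fixes #123
```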
Install Git hooks for code quality:

```bash
./setup_hooks.sh
```

Critical hooks run on every commit:

- Black (code formatting)
- Ruff (linting)
- AdCP contract tests (protocol compliance)
- MCP contract validation (prevents client integration failures)
- Required fields audit (catches over-strict validation)
- Schema-database alignment (prevents AttributeError bugs)

Additional quality gates:

- Unit tests (optional - use `pre-commit run pytest-unit`)
- Migration testing (manual - use `pre-commit run test-migrations`)
- Smoke tests (manual - use `pre-commit run smoke-tests`)

Contract Validation Prevention:

The system automatically prevents validation errors like `'brief' is a required property` through:

- Pre-commit hooks that test minimal parameter calls
- Automated field requirement auditing
- Clear error messages with specific fixes
- Follow PEP 8
- Use type hints for all functions
- Maximum line length: 100 characters
- Use descriptive variable names
- Document all public methods
```python
# Imports order
import standard_library
import third_party
from local_app import modules


# Class structure
class MyClass:
    """Class description."""

    def __init__(self):
        """Initialize."""
        pass

    def public_method(self) -> str:
        """Public method description."""
        return self._private_method()

    def _private_method(self) -> str:
        """Private method description."""
        return "result"
```

```bash
# View container logs
docker-compose logs -f adcp-server

# Execute commands in container
docker-compose exec adcp-server bash

# Check container health
docker ps

# Rebuild containers
docker-compose build --no-cache
docker-compose up -d
```

```python
# Add breakpoint
import pdb; pdb.set_trace()

# Or use IPython
import IPython; IPython.embed()
```

```bash
# Run with debugger
python -m pdb script.py

# Use MCP Inspector
npm install -g @modelcontextprotocol/inspector
npx inspector http://localhost:8000/mcp/

# View server logs
docker compose logs -f adcp-server

# Test with curl
curl -H "x-adcp-auth: your_token" \
  http://localhost:8000/mcp/tools/get_products

# Connect to PostgreSQL
docker-compose exec postgres psql -U adcp_user -d adcp
```

```sql
-- Common queries
SELECT * FROM tenants;
SELECT * FROM principals WHERE tenant_id = 'tenant_123';
SELECT * FROM media_buys ORDER BY created_at DESC LIMIT 10;

-- Check migrations
SELECT * FROM alembic_version;
```

```python
try:
    result = perform_operation()
except ValidationError as e:
    return {"error": str(e)}, 400
except Exception as e:
    logger.error(f"Unexpected error: {e}")
    return {"error": "Internal error"}, 500
finally:
    db_session.remove()
```

```python
from sqlalchemy.orm import scoped_session

db_session = scoped_session(SessionLocal)

try:
    db_session.remove()  # Start fresh
    # Do work
    db_session.commit()
except Exception:
    db_session.rollback()
    raise
finally:
    db_session.remove()
```