Introduction
The Surveillance API enables automated monitoring of multiple regulatory data sources, including FDA, EMA, DIGEMID, and VigiAccess. Use it to configure scheduled scraping, query results with advanced filters, and generate automated email reports.
Base Path: /api/v1/surveillance
Key Features
Multi-Scope Architecture
Separate national and external surveillance with independent configurations
Automated Scraping
CRON-based scheduling with configurable data sources
Advanced Filtering
Query by severity, medication, date range, and full-text search
Multi-Format Reports
Generate HTML, PDF, CSV, and XLSX reports via email
Surveillance Scopes
VIGIA separates surveillance into two independent scopes:

| Scope | Description | Use Case |
|---|---|---|
| national | Domestic regulatory monitoring | DIGEMID alerts, local health authorities |
| external | International monitoring | FDA, EMA, VigiAccess, MHRA, Health Canada |
Each scope maintains its own:
- Data sources configuration
- Scraping schedule (CRON expression)
- Surveillance results
- Report recipients
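As a sketch of what a per-scope configuration could look like, the snippet below builds a payload for the external scope. The field names (`sources`, `schedule`, `report_recipients`) are illustrative assumptions, not the actual schema; check `backend/app/schemas/surveillance.py` for the real field names.

```python
# Hypothetical per-scope configuration payload; field names are illustrative.
external_config = {
    "scope": "external",
    "sources": [
        {"name": "FDA", "enabled": True},
        {"name": "EMA", "enabled": True},
        {"name": "VigiAccess", "enabled": False},
    ],
    "schedule": "0 6 * * 1",  # CRON: every Monday at 06:00 UTC
    "report_recipients": ["pv-team@example.com"],
}

# Only enabled sources are scraped when a run is triggered.
enabled = [s["name"] for s in external_config["sources"] if s["enabled"]]
print(enabled)  # ['FDA', 'EMA']
```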
Authentication
All surveillance endpoints require authentication and an authorized role.
Core Endpoints
Configuration
| Method | Endpoint | Description |
|---|---|---|
| GET | /surveillance/sources | Get configured data sources for a scope |
| POST | /surveillance/sources | Update data sources (replaces the entire list) |
| GET | /surveillance/schedule | Get the scraping schedule for a scope |
| POST | /surveillance/schedule | Configure the CRON schedule |
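A minimal sketch of constructing (without sending) the schedule-configuration request, assuming a JSON body with `scope` and `cron` keys and a Bearer token; both assumptions should be verified against the actual schema before use:

```python
import json
import urllib.request

def build_schedule_request(base_url: str, token: str,
                           scope: str, cron: str) -> urllib.request.Request:
    """Build (but do not send) a POST /surveillance/schedule request.

    The payload keys {"scope", "cron"} are assumptions about the schema.
    """
    if len(cron.split()) != 5:
        raise ValueError("expected a 5-field CRON expression")
    body = json.dumps({"scope": scope, "cron": cron}).encode()
    return urllib.request.Request(
        f"{base_url}/api/v1/surveillance/schedule",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_schedule_request("https://vigia.example.com", "TOKEN",
                             "national", "0 7 * * *")
print(req.get_method(), req.full_url)
```

Sending it is then a matter of `urllib.request.urlopen(req)` against a live deployment.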
Execution
| Method | Endpoint | Description |
|---|---|---|
| POST | /surveillance/run | Trigger immediate scraping (all enabled sources) |
| POST | /surveillance/run-and-send | Scrape and email a report |
Querying
| Method | Endpoint | Description |
|---|---|---|
| GET | /surveillance/results | Query surveillance items with filters |
| POST | /surveillance/send-reports | Generate and email filtered reports |
Data Model
SurveillanceItem
Each scraped item contains the fields defined in the SurveillanceItem schema (see backend/app/schemas/surveillance.py), including a severity classification.
Severity Levels
| Severity | Criteria | Examples |
|---|---|---|
| Alta | Death, hospitalization, product recall | Drug contamination, device malfunction causing injury |
| Media | Serious ADR, label change, black box warning | New hepatotoxicity warning, dosage restrictions |
| Baja | Minor ADR, information update | Updated administration guidelines |
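The three severity levels are ordered, which makes client-side triage straightforward. A hypothetical helper (the `needs_escalation` function and its `severity` key are illustrative, not part of the API):

```python
# Illustrative client-side triage over the documented severity levels.
SEVERITY_RANK = {"Alta": 3, "Media": 2, "Baja": 1}

def needs_escalation(item: dict, threshold: str = "Alta") -> bool:
    """Return True if an item's severity meets or exceeds the threshold."""
    return SEVERITY_RANK[item["severity"]] >= SEVERITY_RANK[threshold]

print(needs_escalation({"severity": "Alta"}))                      # True
print(needs_escalation({"severity": "Media"}))                     # False
print(needs_escalation({"severity": "Media"}, threshold="Baja"))   # True
```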
Quick Start
1. Configure Data Sources
2. Set Up Schedule
3. Query Results
4. Send Reports
Common Use Cases
Weekly High-Severity Monitoring
Drug-Specific Surveillance
On-Demand Scraping
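The weekly high-severity monitoring case above can be sketched as a `send-reports` payload covering the trailing seven days. The key names (`format`, `recipients`, etc.) are assumptions layered on the documented filter parameters:

```python
from datetime import date, timedelta

# Hypothetical POST /surveillance/send-reports body for weekly
# high-severity monitoring; key names are assumptions.
today = date(2024, 3, 31)
payload = {
    "scope": "external",
    "severity": "Alta",
    "date_from": (today - timedelta(days=7)).isoformat(),
    "date_to": today.isoformat(),
    "format": "xlsx",
    "recipients": ["pv-team@example.com"],
}
print(payload["date_from"], payload["date_to"])  # 2024-03-24 2024-03-31
```

Scheduling this weekly (e.g. via the CRON schedule endpoint) keeps stakeholders informed without manual queries.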
Filtering & Search
The /surveillance/results endpoint supports multiple filters:
| Parameter | Type | Description | Example |
|---|---|---|---|
| scope | string | national or external | scope=external |
| query | string | Full-text search (title, medication, event) | query=hepatotoxicity |
| severity | string | Filter by severity level | severity=Alta |
| medicamento | string | Medication name (partial match) | medicamento=Paracetamol |
| date_from | string | Start date (ISO 8601) | date_from=2024-01-01 |
| date_to | string | End date (ISO 8601) | date_to=2024-03-31 |
| limit | integer | Results per page (1-200) | limit=50 |
| offset | integer | Pagination offset | offset=100 |
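Since all filters are plain query parameters, a results URL can be assembled with the standard library; a minimal sketch combining several of the filters above:

```python
from urllib.parse import urlencode

# Combine documented filter parameters into a GET /surveillance/results URL.
filters = {
    "scope": "external",
    "severity": "Alta",
    "medicamento": "Paracetamol",
    "date_from": "2024-01-01",
    "date_to": "2024-03-31",
    "limit": 50,
    "offset": 100,
}
url = "/api/v1/surveillance/results?" + urlencode(filters)
print(url)
```

`urlencode` also percent-escapes values, so free-text `query` strings with spaces are safe.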
Report Formats
Generate reports in multiple formats:

| Format | Description | Use Case |
|---|---|---|
| HTML | Inline table embedded in the email body | Quick review in an email client |
| PDF | Formatted document | Archiving, regulatory submissions |
| CSV | Comma-separated values | Data analysis, Excel import |
| XLSX | Excel spreadsheet | Advanced filtering, pivot tables |
Timezone Handling
Dates in API responses use ISO 8601 format with the UTC timezone. When setting next_run_iso, you can provide any timezone; the value will be converted to UTC.
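For example, converting a local timestamp to the UTC ISO 8601 form the API returns (a fixed UTC-5 offset is used here purely for illustration):

```python
from datetime import datetime, timezone, timedelta

# A fixed UTC-5 offset (e.g. Lima) avoids depending on the tz database.
utc_minus_5 = timezone(timedelta(hours=-5))
local = datetime(2024, 3, 31, 7, 0, tzinfo=utc_minus_5)  # 07:00 local

utc = local.astimezone(timezone.utc)
print(utc.isoformat())  # 2024-03-31T12:00:00+00:00
```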
Error Handling
Common Error Codes
| Status | Error | Cause |
|---|---|---|
| 400 | Bad Request | Invalid filters, missing required fields |
| 401 | Unauthorized | Missing or invalid JWT token |
| 403 | Forbidden | User lacks required role |
| 502 | Bad Gateway | Email service unavailable |
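A client might map these codes to recovery actions; one possible (hypothetical) classification, treating 502 as transient since it indicates an email-service hiccup:

```python
# Illustrative client-side handling of the documented error codes.
RETRYABLE = {502}            # email service outages may be transient
AUTH_ERRORS = {401, 403}     # token or role problems

def classify(status: int) -> str:
    """Map an HTTP status to a suggested client action."""
    if status in RETRYABLE:
        return "retry"
    if status in AUTH_ERRORS:
        return "reauthenticate"
    if status == 400:
        return "fix-request"
    return "ok" if status < 400 else "fail"

print(classify(502))  # retry
print(classify(403))  # reauthenticate
```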
Rate Limits
To prevent abuse, manual scraping (/surveillance/run) is limited to:
- National scope: once per hour
- External scope: once every 2 hours
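Clients can enforce these limits locally before calling the API; a minimal sketch (the guard function is illustrative, not part of the API):

```python
from datetime import datetime, timedelta

# Minimum intervals between manual /surveillance/run calls, per the docs.
MIN_INTERVAL = {"national": timedelta(hours=1), "external": timedelta(hours=2)}

def may_trigger(scope: str, last_run: datetime, now: datetime) -> bool:
    """Client-side guard: True when enough time has passed since last_run."""
    return now - last_run >= MIN_INTERVAL[scope]

t0 = datetime(2024, 3, 31, 10, 0)
print(may_trigger("external", t0, t0 + timedelta(hours=1)))  # False
print(may_trigger("national", t0, t0 + timedelta(hours=1)))  # True
```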
Implementation References
| Component | File Location |
|---|---|
| Router | backend/app/routers/surveillance.py:29 |
| Schemas | backend/app/schemas/surveillance.py:1-87 |
| Models | backend/app/models/surveillance.py |
| CRUD Operations | backend/app/crud/surveillance.py |
| Scraper Service | backend/app/services/scraper.py |
| Report Generation | backend/app/services/reports.py |
| Filter Builder | backend/app/services/report_filters.py:27 |
Related APIs
Schedule Management
Configure CRON-based scraping
Global Data Sources
Query FDA, EMA, DIGEMID directly
ICSR Integration
Create cases from surveillance signals
Next Steps
- Configure Sources: Set up national and external data sources
- Test Scraping: Run manual scrape to verify source accessibility
- Set Schedule: Configure automated scraping frequency
- Query Results: Test filtering and pagination
- Setup Reports: Configure weekly email reports for stakeholders