Overview
The `JobService` class implements the Facade design pattern to provide a simplified interface for the entire job analysis workflow. It orchestrates interactions between the scraper, text cleaner, AI analyzer, and exporters.
JobService
Class Definition
Constructor
Scraper implementation to use for data extraction, injected via the constructor (dependency injection). Typically an instance of `LinkedInScraper`.

The constructor:
- Stores the injected scraper instance
- Creates an `AIAnalyzer` instance for GPT integration
- Sets the output directory to `"datos_extraidos"`
- Creates the output directory if it doesn't exist
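A minimal sketch of what the constructor does, following the steps above. The parameter name `scraper` and the `AIAnalyzer` stub are assumptions for illustration, not the project's actual signatures:

```python
import os

class AIAnalyzer:
    """Stub shown for completeness; the real class wraps GPT calls."""
    pass

class JobService:
    def __init__(self, scraper):
        # Store the injected scraper (dependency injection)
        self.scraper = scraper
        # Create the AI analyzer used for GPT integration
        self.analyzer = AIAnalyzer()
        # Output directory for exported files; created if missing
        self.output_dir = "datos_extraidos"
        os.makedirs(self.output_dir, exist_ok=True)
```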
Methods
procesar_busqueda()
Executes the complete extraction and cleaning workflow for a job search.

Parameter: search term for job postings (e.g., "Python Developer", "Data Scientist")

Returns: a dictionary containing the processed results.

On success:
- `exito` (bool): `True`
- `habilidades` (List[str]): Cleaned skills list
- `titulo_oferta` (str): Job title
- `datos_completos` (Dict): Complete structured data including:
  - `termino_busqueda` (str): Original search term
  - `titulo_oferta` (str): Job title
  - `url` (str): Job posting URL
  - `habilidades` (List[str]): Cleaned skills
  - `fecha_extraccion` (str): Timestamp in format "YYYY-MM-DD HH:MM:SS"

On failure:
- `exito` (bool): `False`
- `mensaje` (str): Error description
Example Usage:
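A sketch of the call pattern, using a stand-in service that mimics the documented return contract (the real `JobService` comes from this project; the stand-in, its canned values, and the URL below are illustrative assumptions):

```python
class FakeJobService:
    """Stand-in mimicking the documented return contract of procesar_busqueda()."""
    def procesar_busqueda(self, termino):
        return {
            "exito": True,
            "habilidades": ["Python", "SQL"],
            "titulo_oferta": "Python Developer",
            "datos_completos": {
                "termino_busqueda": termino,
                "titulo_oferta": "Python Developer",
                "url": "https://example.com/job/123",
                "habilidades": ["Python", "SQL"],
                "fecha_extraccion": "2024-01-01 12:00:00",
            },
        }

service = FakeJobService()
resultado = service.procesar_busqueda("Python Developer")

# Always check the success flag before reading the other fields
if resultado["exito"]:
    print(resultado["titulo_oferta"])
    print(resultado["habilidades"])
else:
    print(resultado["mensaje"])
```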
generar_resumen_ia()
Generates an AI-powered summary of the job posting.

Parameters:
- Job posting title
- List of cleaned skills; should be the output from `procesar_busqueda()`

Returns: a Markdown-formatted summary generated by GPT, or an error message if AI analysis fails.

Summary generation is delegated to the `AIAnalyzer` class.
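A sketch of the call and the error-string check, again using a stand-in; the parameter names and the fake summary format are assumptions:

```python
class FakeJobService:
    """Stand-in illustrating the assumed signature of generar_resumen_ia()."""
    def generar_resumen_ia(self, titulo, habilidades):
        # The real method delegates to AIAnalyzer / GPT; this returns canned Markdown
        return f"## {titulo}\n\nKey skills: {', '.join(habilidades)}"

service = FakeJobService()
resumen = service.generar_resumen_ia("Python Developer", ["Python", "SQL"])

# The summary may be an error string; check for warning/error prefixes first
if resumen.startswith(("⚠️", "❌")):
    print("AI analysis failed:", resumen)
else:
    print(resumen)  # Markdown-formatted summary
```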
guardar_datos()
Exports job data to JSON and/or Excel formats.

Parameters:
- Complete job data dictionary (typically `datos_completos` from `procesar_busqueda()`)
- Export format selection:
  - `"1"`: JSON only
  - `"2"`: Excel only
  - `"3"`: Both JSON and Excel

Returns: a list of file paths where data was successfully saved. Contains one or two paths depending on the format selection.
Behavior:
- Generates a timestamped filename: `linkedin_YYYYMMDD_HHMMSS`
- Saves to the `datos_extraidos/` directory
- Uses `ExporterFactory` to obtain the appropriate exporters
- Returns all created file paths
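A standalone sketch of the export logic described above, with the Excel exporter elided (the real service obtains exporters from `ExporterFactory`; the function signature here is an assumption):

```python
import json
import os
from datetime import datetime

def guardar_datos(datos, formato, output_dir="datos_extraidos"):
    """Illustrative version of the documented export behavior."""
    os.makedirs(output_dir, exist_ok=True)
    # Timestamped base name: linkedin_YYYYMMDD_HHMMSS
    base = "linkedin_" + datetime.now().strftime("%Y%m%d_%H%M%S")
    rutas = []
    if formato in ("1", "3"):  # JSON only, or both
        ruta = os.path.join(output_dir, base + ".json")
        with open(ruta, "w", encoding="utf-8") as f:
            json.dump(datos, f, ensure_ascii=False, indent=2)
        rutas.append(ruta)
    if formato in ("2", "3"):  # Excel only, or both
        # The real service would ask ExporterFactory for an Excel exporter;
        # here we only record the path that would be written.
        rutas.append(os.path.join(output_dir, base + ".xlsx"))
    return rutas
```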
Complete Workflow Example
Here's a complete example showing the typical usage of `JobService`:
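The end-to-end flow can be sketched as follows. The real project supplies `LinkedInScraper` and `JobService`; stand-ins are defined here so the sketch is self-contained, and every method signature beyond the documented names is an assumption:

```python
class FakeScraper:
    """Stand-in for LinkedInScraper; 'extraer' is a hypothetical method name."""
    def extraer(self, termino):
        return {"titulo": "Python Developer", "habilidades_crudas": [" python", "SQL "]}

class FakeJobService:
    """Stand-in for JobService, mimicking the documented interface."""
    def __init__(self, scraper):
        self.scraper = scraper  # dependency injection

    def procesar_busqueda(self, termino):
        crudo = self.scraper.extraer(termino)
        habilidades = [h.strip() for h in crudo["habilidades_crudas"]]  # cleaning step
        return {
            "exito": True,
            "titulo_oferta": crudo["titulo"],
            "habilidades": habilidades,
            "datos_completos": {
                "termino_busqueda": termino,
                "titulo_oferta": crudo["titulo"],
                "habilidades": habilidades,
            },
        }

# 1. Inject the scraper
service = FakeJobService(FakeScraper())

# 2. Run the extraction + cleaning workflow
resultado = service.procesar_busqueda("Python Developer")

# 3. Check the success flag before using the data
if resultado["exito"]:
    print(resultado["titulo_oferta"], resultado["habilidades"])
```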
Design Patterns
The `JobService` class demonstrates several software design patterns:
Facade Pattern
The service provides a simplified interface to a complex subsystem of scrapers, cleaners, analyzers, and exporters. Clients interact with a single `JobService` instance instead of managing multiple components.
Dependency Injection
The scraper is injected via the constructor rather than hardcoded. This allows:
- Easy testing with mock scrapers
- Swapping LinkedIn scraper for other job sites
- Better separation of concerns
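For example, injection makes the service trivial to unit-test with a mock scraper that never touches the network (a sketch; the scraper interface shown is an assumption):

```python
class MockScraper:
    """Test double that returns canned data instead of hitting LinkedIn."""
    def extraer(self, termino):  # hypothetical interface method
        return {"titulo": "Test Job"}

class Service:
    """Minimal stand-in showing why constructor injection helps testing."""
    def __init__(self, scraper):
        self.scraper = scraper

    def procesar_busqueda(self, termino):
        datos = self.scraper.extraer(termino)
        return {"exito": True, "titulo_oferta": datos["titulo"]}

# The mock is injected exactly like the real scraper would be
resultado = Service(MockScraper()).procesar_busqueda("anything")
assert resultado["titulo_oferta"] == "Test Job"
```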
Single Responsibility
The service orchestrates the workflow but delegates the actual work:
- Scraping → `ScraperStrategy`
- Cleaning → `TextCleaner`
- AI Analysis → `AIAnalyzer`
- Exporting → `ExporterFactory`
Dependencies
Best Practices
Use Dependency Injection
Always inject the scraper in the constructor rather than instantiating it inside the service. This improves testability and flexibility.
Check Success Flag
Always check the `exito` flag in the returned dictionary before accessing other fields to avoid `KeyError` exceptions.
Handle AI Errors
The AI summary may return error strings. Check for warning/error prefixes (⚠️, ❌) before displaying to users.
Timestamp Filenames
The service automatically generates timestamped filenames. This prevents overwriting previous extractions.