Fetch time-series electricity market data with automatic chunking and multi-geography support
Indicators are time-series datasets published by ESIOS, covering electricity prices, demand, generation, and more. The IndicatorsManager provides methods to list, search, and fetch historical data.
```python
from esios import ESIOSClient

with ESIOSClient() as client:
    # Get an indicator handle
    handle = client.indicators.get(600)

    # Fetch historical data
    df = handle.historical("2025-01-01", "2025-01-07")
    print(df)
```
The client automatically enriches geography metadata from API responses:
```python
handle = client.indicators.get(600)

# First fetch: learns geo_id → geo_name mappings from API response
df = handle.historical("2025-01-01", "2025-01-07")

# Mappings are persisted to cache and reused
print(handle.geos)  # Now includes all discovered geographies
```
From src/esios/managers/indicators.py:49-74:
```python
def _enrich_geo_map(self, values: list[dict]) -> None:
    """Learn geo_id → geo_name mappings from API response values.

    The indicator metadata may not list all geos (e.g. 600 omits
    Países Bajos). This enriches the metadata from actual data and
    persists new mappings to the global geos registry and
    per-indicator meta.json.
    """
    known_ids = {g["geo_id"] for g in self.geos}
    new_geos: dict[str, str] = {}
    for v in values:
        gid = v.get("geo_id")
        gname = v.get("geo_name")
        if gid is not None and gname:
            if gid not in known_ids:
                self.metadata.setdefault("geos", []).append(
                    {"geo_id": gid, "geo_name": gname}
                )
                known_ids.add(gid)
                new_geos[str(gid)] = gname

    # Persist to global geos registry and per-indicator meta
    cache = self._cache
    if cache.config.enabled and new_geos:
        cache.merge_geos(new_geos)
        self._persist_meta()
```
Geography mappings are shared across all indicators in a global geos.json registry.
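The registry itself is a plain JSON mapping of `geo_id` to `geo_name`. A minimal sketch of the merge behavior behind the `merge_geos` call, assuming a hypothetical standalone `merge_geos` helper and registry path (not the library's actual code):

```python
import json
from pathlib import Path


def merge_geos(registry_path: Path, new_geos: dict[str, str]) -> dict[str, str]:
    """Merge newly discovered geo_id → geo_name mappings into a JSON registry.

    Existing entries are kept; new ones are added, so every indicator
    that consults the registry sees mappings learned by any other.
    """
    registry: dict[str, str] = {}
    if registry_path.exists():
        registry = json.loads(registry_path.read_text(encoding="utf-8"))
    registry.update(new_geos)
    registry_path.write_text(
        json.dumps(registry, ensure_ascii=False, indent=2), encoding="utf-8"
    )
    return registry
```

Once one indicator discovers a geography, subsequent lookups from any indicator resolve it without another API round trip.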
Indicator data is automatically cached locally. See the Caching page for details on:
- Cache directory structure
- Gap detection and partial fetches
- TTL settings for recent data
- Cache maintenance
```python
with ESIOSClient(cache=True, cache_recent_ttl=48) as client:
    handle = client.indicators.get(600)

    # First call: fetches from API, writes to cache
    df = handle.historical("2025-01-01", "2025-01-07")

    # Second call: reads from cache
    df = handle.historical("2025-01-01", "2025-01-07")
```
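Gap detection means a repeated request only fetches the date sub-ranges the cache does not yet cover. A minimal sketch of the idea (illustrative only, not the library's implementation; `find_gaps` is a hypothetical helper):

```python
from datetime import date, timedelta


def find_gaps(
    requested: tuple[date, date],
    cached: list[tuple[date, date]],
) -> list[tuple[date, date]]:
    """Return the sub-ranges of `requested` not covered by `cached` ranges.

    Ranges are inclusive (start, end) date pairs. Only the returned
    gaps would need to be fetched from the API; the rest is served
    from cache.
    """
    start, end = requested
    gaps: list[tuple[date, date]] = []
    cursor = start
    for c_start, c_end in sorted(cached):
        if c_start > cursor:
            # Uncovered stretch before this cached range
            gaps.append((cursor, min(c_start - timedelta(days=1), end)))
        cursor = max(cursor, c_end + timedelta(days=1))
        if cursor > end:
            break
    if cursor <= end:
        # Uncovered tail after the last cached range
        gaps.append((cursor, end))
    return gaps


# Cached: Jan 1-3 and Jan 6-7; requesting Jan 1-7 leaves only Jan 4-5 to fetch
gaps = find_gaps(
    (date(2025, 1, 1), date(2025, 1, 7)),
    [(date(2025, 1, 1), date(2025, 1, 3)), (date(2025, 1, 6), date(2025, 1, 7))],
)
print(gaps)  # [(date(2025, 1, 4), date(2025, 1, 5))]
```

The `cache_recent_ttl` setting layers on top of this: cached ranges near the present are re-fetched once their TTL expires, since recent values may still be revised.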