Custom Cache Implementation
The dldr/cache module provides built-in caching to avoid refetching data that has already been loaded. You can use the default Map-based cache or provide your own custom cache implementation.
Basic Caching
Create a cache instance
Create a Map to store cached values. If you don't provide a cache, a default Map will be created automatically:
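A minimal sketch of this pattern (the `withCache` helper below is hypothetical, not dldr's internal implementation):

```javascript
// Hypothetical sketch of the default-cache pattern — not dldr's internals.
function withCache(loader, cache = new Map()) {
  return (key) => {
    if (cache.has(key)) return cache.get(key); // cache hit
    const value = loader(key);                 // cache miss: fetch
    cache.set(key, value);
    return value;
  };
}

// Pass your own Map...
const shared = new Map();
const loadUser = withCache((id) => ({ id }), shared);

// ...or omit it, and a fresh Map is created automatically.
const loadPost = withCache((id) => ({ id }));
```

Passing your own Map lets multiple loaders share one cache, or lets you inspect and clear entries yourself.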
Use the cached loader
The first call fetches from your data source; subsequent calls return cached values:
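The behavior can be sketched with a counter that tracks trips to the data source (`cachedLoad` is a hypothetical wrapper; dldr/cache's actual call signature may differ):

```javascript
// Hypothetical wrapper illustrating cache-hit behavior.
let fetches = 0;
const getUser = (id) => {
  fetches++; // count trips to the data source
  return { id, name: `user-${id}` };
};

const cache = new Map();
function cachedLoad(loader, key) {
  if (cache.has(key)) return cache.get(key); // subsequent calls: cached
  const value = loader(key);                 // first call: fetch
  cache.set(key, value);
  return value;
}

const first = cachedLoad(getUser, '1');  // fetches the user
const second = cachedLoad(getUser, '1'); // returns the cached value
```

After both calls, `fetches` is still 1 and both results are the same object.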
Custom Cache Requirements
Any object that implements the Map interface (get, set, has, delete) can be used as a cache.
LRU Cache Implementation
For production applications, an LRU (Least Recently Used) cache prevents unbounded memory growth. We recommend using tmp-cache, a Map-compatible LRU implementation.
Cache Implementations
- LRU Cache
- TTL Cache
- Redis Cache
- Layered Cache
Best for production use with memory constraints:
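The idea behind an LRU cache like tmp-cache can be sketched with a self-contained class that exposes the Map methods a cache typically needs (the class below is illustrative, not tmp-cache's source):

```javascript
// Minimal LRU sketch implementing Map-style get/set/has/delete.
// Illustrative only — use a library such as tmp-cache in production.
class LRUCache {
  constructor(max) {
    this.max = max;
    this.map = new Map(); // Map preserves insertion order
  }
  has(key) { return this.map.has(key); }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    this.map.delete(key); // move key to the most-recent position
    this.map.set(key, value);
    return value;
  }
  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.max) {
      // evict the least recently used entry (first in insertion order)
      this.map.delete(this.map.keys().next().value);
    }
    return this;
  }
  delete(key) { return this.map.delete(key); }
}
```

With `max: 2`, setting a third key evicts whichever of the first two was touched least recently.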
Cache Management
Manual Cache Control
Since dldr doesn't handle cache mutations, you control cache invalidation yourself.
Priming the Cache
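A priming sketch, assuming Map keys like `'user:1'` (the key format is illustrative; use whatever keys your loader receives, and note that some loaders cache promises rather than raw values):

```javascript
// Priming sketch: seed the cache before any loads happen.
// The 'user:1' key format is hypothetical.
const cache = new Map();
cache.set('user:1', { id: 1, name: 'Ada' });
// A later load for 'user:1' now hits the cache instead of the loader.
```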
Pre-populate the cache with known data.
Cache Warming
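A warming sketch, where `fetchUser` and the list of hot IDs are hypothetical stand-ins for your own data access:

```javascript
// Warming sketch: populate the cache for known-hot keys at startup.
// fetchUser and hotIds are hypothetical.
async function warmCache(cache, fetchUser, hotIds) {
  await Promise.all(
    hotIds.map(async (id) => {
      cache.set(`user:${id}`, await fetchUser(id));
    })
  );
}
```

Call this once during application boot, before serving traffic.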
Warm up the cache on application start.
Best Practices
Choose the right cache size
Size your cache based on memory constraints and access patterns. Monitor cache hit rates to optimize size.
Invalidate on mutations
Always invalidate cache entries when the underlying data changes:
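A minimal sketch of this rule, where `db.update` and the `'user:'` key format are hypothetical:

```javascript
// Invalidation sketch: delete the cached entry on every write path.
// db.update and the 'user:' key format are hypothetical.
async function updateUser(cache, db, id, changes) {
  await db.update(id, changes); // write to the source of truth first
  cache.delete(`user:${id}`);   // the next load refetches fresh data
}
```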
Use TTL for stale data
Implement time-based expiration for data that changes frequently but doesn’t require immediate invalidation.
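One way to do this is a Map-compatible cache that stamps each entry with a write time (the class below is a hypothetical sketch; the injectable clock exists only to make the behavior easy to test):

```javascript
// Minimal Map-compatible TTL cache sketch (hypothetical, not a published API).
// `now` is injectable so expiry can be tested deterministically.
class TTLCache {
  constructor(maxAge, now = Date.now) {
    this.maxAge = maxAge;
    this.now = now;
    this.store = new Map();
  }
  has(key) {
    const entry = this.store.get(key);
    if (!entry) return false;
    if (this.now() - entry.at > this.maxAge) {
      this.store.delete(key); // expired — drop it lazily
      return false;
    }
    return true;
  }
  get(key) {
    return this.has(key) ? this.store.get(key).value : undefined;
  }
  set(key, value) {
    this.store.set(key, { value, at: this.now() });
    return this;
  }
  delete(key) { return this.store.delete(key); }
}
```

Entries expire lazily: a stale entry is removed the next time it is looked up, so no background timer is required.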
Monitor memory usage
Use LRU caches in production to prevent unbounded memory growth. Monitor cache size and eviction rates.
Next Steps
- Learn about batching in GraphQL resolvers
- Optimize database queries
- Read the Cache API reference