The Cache module provides a high-performance, effect-based caching solution with automatic time-to-live (TTL) management, capacity limits, and customizable lookup functions.
## Overview
A `Cache<Key, A, E, R>` is a mutable key-value store that:
- Automatically populates missing entries using a lookup function
- Enforces capacity limits with automatic eviction
- Supports time-to-live (TTL) for cache entries
- Integrates seamlessly with Effect for safe, composable caching
- Handles concurrent access without race conditions
## Creating Caches
### Basic Cache
### Cache with TTL
### Cache with Error Handling
## Advanced Cache Creation
### Dynamic TTL with `makeWith`
### TTL Based on Value
## Using Caches
### Get Values
### Invalidate Entries
### Refresh Values
### Set Values Manually
## Capacity and Eviction
## Inspecting Cache State
### Get Size
### Get Keys
## Advanced Patterns
### Cache with Complex Keys
### Cache as a Service
## Best Practices
- Choose appropriate capacity: Set capacity based on memory constraints
- Use TTL for frequently changing data: pick a TTL that matches how often the underlying data changes
- Handle lookup errors: Consider error caching strategy (short TTL for errors)
- Monitor cache performance: Track hit/miss ratios and adjust capacity
- Use structural equality for complex keys: build keys with `Data.Class` or similar so structurally equal keys map to the same entry
- Wrap in services for reusability: Expose caches through service layers
## Performance Considerations
- Cache operations are lock-free and safe under concurrent fiber access
- TTL is evaluated on access, not via background processes
- Eviction happens synchronously when capacity is exceeded
- Consider using separate caches for different data types to optimize capacity usage