Custom Cache Implementation

The dldr/cache module provides built-in caching to avoid refetching data that has already been loaded. You can use the default Map-based cache or provide your own custom cache implementation.

Basic Caching

1. Import from dldr/cache

Use the cache-enabled version of load:
import { load } from 'dldr/cache';
import { getPosts } from './db';
2. Create a cache instance

Create a Map to store cached values:
const cache = new Map();
const loadPost = load.bind(null, getPosts, cache);
If you don’t provide a cache, a default Map will be created automatically:
// This works too - cache is created internally
const loadPost = load.bind(null, getPosts);
3. Use the cached loader

The first call fetches from your data source, subsequent calls return cached values:
const posts = await Promise.all([
  loadPost('123'),
  loadPost('123'), // Cached - no second database call
  loadPost('456'),
]);

// getPosts is called ONCE with ['123', '456']
// Even though '123' was requested twice
4. Cached across calls

The cache persists between separate operations:
// Later in your code
const post = await loadPost('123');

// getPosts is NOT called - result is cached

Custom Cache Requirements

Any object that implements the Map interface can be used as a cache:
interface MapLike<K, V> {
  get(key: K): V | undefined;
  set(key: K, value: V): void;
  has(key: K): boolean;
}
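As an illustrative sketch (the InstrumentedCache name and its counters are our own, not part of dldr), here is a MapLike cache that also counts hits and misses, which helps when measuring hit rates to size a cache. It implements delete and clear as well, so entries can be invalidated manually:

```typescript
// A MapLike cache that tracks hits and misses via has(),
// which is how a cache-aware loader checks for existing entries.
class InstrumentedCache<K, V> {
  hits = 0;
  misses = 0;
  private store = new Map<K, V>();

  get(key: K): V | undefined {
    return this.store.get(key);
  }

  set(key: K, value: V): void {
    this.store.set(key, value);
  }

  has(key: K): boolean {
    const hit = this.store.has(key);
    if (hit) this.hits++;
    else this.misses++;
    return hit;
  }

  delete(key: K): boolean {
    return this.store.delete(key);
  }

  clear(): void {
    this.store.clear();
  }
}
```

An instance can be passed anywhere a Map is accepted, e.g. `load.bind(null, getUsers, new InstrumentedCache())`.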

LRU Cache Implementation

For production applications, an LRU (Least Recently Used) cache prevents unbounded memory growth. We recommend using tmp-cache:
1. Install tmp-cache

npm install tmp-cache
2. Create an LRU cache

import LRU from 'tmp-cache';
import { load } from 'dldr/cache';
import { getUsers } from './db';

// Create an LRU cache with max 100 entries
const cache = new LRU(100);

const loadUser = load.bind(null, getUsers, cache);
3. Use as normal

The LRU cache automatically evicts old entries:
// The first 100 unique users fill the cache
for (let i = 0; i < 100; i++) {
  await loadUser(`user-${i}`);
}

// The 101st unique key evicts the least recently used entry
await loadUser('user-100');

Cache Implementations

An LRU cache is the best choice for production use with memory constraints:
import LRU from 'tmp-cache';
import { load } from 'dldr/cache';
import { getUsers } from './db';

// Limit cache to 1000 entries
const userCache = new LRU(1000);
const loadUser = load.bind(null, getUsers, userCache);

// Automatically evicts old entries when limit is reached
const user = await loadUser('user-1');

Cache Management

Manual Cache Control

Since dldr doesn’t handle cache mutations, you control cache invalidation:
import { load } from 'dldr/cache';
import { getUsers } from './db';

const cache = new Map();
const loadUser = load.bind(null, getUsers, cache);

// Clear a specific entry
const invalidateUser = (id: string) => {
  cache.delete(id);
};

// Clear entire cache
const clearCache = () => {
  cache.clear();
};

// Usage
await loadUser('user-1'); // Fetches from database
await loadUser('user-1'); // Returns cached value

invalidateUser('user-1');
await loadUser('user-1'); // Fetches from database again

Priming the Cache

Pre-populate the cache with known data:
import { load } from 'dldr/cache';
import { getUsers } from './db';

const cache = new Map();

// Prime the cache; wrapping values in Promise.resolve matches
// the promise values that dldr/cache stores
cache.set('user-1', Promise.resolve({ id: 'user-1', name: 'John' }));
cache.set('user-2', Promise.resolve({ id: 'user-2', name: 'Jane' }));

const loadUser = load.bind(null, getUsers, cache);

// These will use cached values
await loadUser('user-1'); // No database call

Cache Warming

Warm up the cache on application start:
import { load } from 'dldr/cache';
import { getUsers } from './db';
const cache = new Map();
const loadUser = load.bind(null, getUsers, cache);

// Warm up cache with frequently accessed users
async function warmCache() {
  const popularUserIds = ['user-1', 'user-2', 'user-3'];
  await Promise.all(popularUserIds.map(id => loadUser(id)));
}

// Call on startup
await warmCache();

Best Practices

Size your cache based on memory constraints and access patterns. Monitor cache hit rates to optimize size.
Always invalidate cache entries when the underlying data changes:
async function updateUser(id: string, data: any) {
  await db.users.update(id, data);
  cache.delete(id); // Invalidate
}
Implement time-based expiration for data that changes frequently but doesn’t require immediate invalidation.
Use LRU caches in production to prevent unbounded memory growth. Monitor cache size and eviction rates.
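For time-based expiration, a small MapLike wrapper is enough. The sketch below is illustrative (TTLCache is our own name, not a dldr or tmp-cache API) and evicts entries lazily on read:

```typescript
// A minimal TTL cache implementing the MapLike interface.
// Entries older than ttlMs are treated as absent and evicted lazily.
class TTLCache<K, V> {
  private store = new Map<K, { value: V; expires: number }>();

  constructor(private ttlMs: number) {}

  get(key: K): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expires) {
      this.store.delete(key); // lazily evict the expired entry
      return undefined;
    }
    return entry.value;
  }

  set(key: K, value: V): void {
    this.store.set(key, { value, expires: Date.now() + this.ttlMs });
  }

  has(key: K): boolean {
    return this.get(key) !== undefined;
  }

  delete(key: K): boolean {
    return this.store.delete(key);
  }
}
```

For example, `load.bind(null, getUsers, new TTLCache(60_000))` caches each user for at most a minute; tmp-cache offers a similar maxAge option if you prefer a library.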
