The `makeCacheableSignalKeyStore` function wraps a `SignalKeyStore` with an in-memory cache to reduce database queries and improve performance.
## Function Signature

```typescript
function makeCacheableSignalKeyStore(
  store: SignalKeyStore,
  logger?: ILogger,
  _cache?: CacheStore
): SignalKeyStore
```
## Parameters

- `store` — the underlying key store to add caching to. Can be any implementation of `SignalKeyStore`.
- `logger` — optional logger to trace cache operations. Logs events like:
  - Cache hits/misses
  - Number of items loaded from the store
  - Cache update operations
- `_cache` — optional custom cache implementation. If not provided, a default `NodeCache` is used with:
  - TTL: 5 minutes (300 seconds)
  - `useClones: false` (stores references, not copies)
  - `deleteOnExpire: true` (automatically removes expired entries)
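For illustration, the default cache's behavior can be mimicked with a small hand-rolled TTL cache. This is a sketch only — the real default is a `NodeCache` instance; the `TtlCache` name and the injectable clock are assumptions added here so the behavior is easy to verify:

```typescript
// Sketch of a cache with the default settings above: 300 s TTL,
// stores references (not clones), drops expired entries when read.
// TtlCache is a hypothetical name, not part of the Baileys API.
class TtlCache<V> {
  private entries = new Map<string, { value: V; expiresAt: number }>()

  constructor(
    private ttlMs = 300_000, // 5 minutes, matching the default TTL
    private now: () => number = Date.now // injectable clock for testing
  ) {}

  get(key: string): V | undefined {
    const entry = this.entries.get(key)
    if (!entry) return undefined
    if (this.now() >= entry.expiresAt) {
      // deleteOnExpire: remove the stale entry instead of returning it
      this.entries.delete(key)
      return undefined
    }
    return entry.value // useClones: false → the original reference
  }

  set(key: string, value: V): void {
    this.entries.set(key, { value, expiresAt: this.now() + this.ttlMs })
  }

  del(key: string): void {
    this.entries.delete(key)
  }

  flushAll(): void {
    this.entries.clear()
  }
}
```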
## Returns

A cached wrapper around the original store, exposing the same `SignalKeyStore` interface.
## Methods

Retrieves keys, checking the cache first before querying the store:

```typescript
async get<T extends keyof SignalDataTypeMap>(
  type: T,
  ids: string[]
): Promise<{ [id: string]: SignalDataTypeMap[T] }>
```

Writes to both the cache and the underlying store:

```typescript
async set(data: SignalDataSet): Promise<void>
```

Clears both the cache and the underlying store:

```typescript
async clear?(): Promise<void>
```
## Usage Example

```typescript
import {
  makeCacheableSignalKeyStore,
  useMultiFileAuthState
} from '@whiskeysockets/baileys'

// Create base auth state
const { state } = await useMultiFileAuthState('./auth_info')

// Wrap the key store with caching
const cachedState = {
  ...state,
  keys: makeCacheableSignalKeyStore(
    state.keys,
    logger, // your logger instance
    customCache // optional custom cache
  )
}

// Use cached state with socket
const sock = makeWASocket({ auth: cachedState })
```
## How It Works

### Cache Keys

Cache keys are generated from a combination of type and ID:

```typescript
function getUniqueId(type: string, id: string) {
  return `${type}.${id}`
}

// Examples:
// "pre-key.1"
// "session.<jid>"
// "app-state-sync-key.abc123"
```
### Get Operation Flow

- Check the cache for each requested ID
- Collect cache misses in an `idsToFetch` array
- Query the store only for the missing IDs
- Update the cache with the fetched items
- Return the combined results (cached + fetched)
```typescript
async get(type, ids) {
  return cacheMutex.runExclusive(async () => {
    const data = {}
    const idsToFetch = []

    // Check cache first
    for (const id of ids) {
      const item = await cache.get(getUniqueId(type, id))
      if (typeof item !== 'undefined') {
        data[id] = item
      } else {
        idsToFetch.push(id)
      }
    }

    // Fetch missing from store
    if (idsToFetch.length) {
      logger?.trace({ items: idsToFetch.length }, 'loading from store')
      const fetched = await store.get(type, idsToFetch)

      // Cache fetched items
      for (const id of idsToFetch) {
        const item = fetched[id]
        if (item) {
          data[id] = item
          await cache.set(getUniqueId(type, id), item)
        }
      }
    }

    return data
  })
}
```
### Set Operation Flow

- Update the cache with all provided data
- Write to the store (pass-through to the underlying implementation)
- Log the operation if a logger is provided
```typescript
async set(data) {
  return cacheMutex.runExclusive(async () => {
    let keys = 0

    // Update cache
    for (const type in data) {
      for (const id in data[type]) {
        await cache.set(getUniqueId(type, id), data[type][id])
        keys += 1
      }
    }
    logger?.trace({ keys }, 'updated cache')

    // Write to store
    await store.set(data)
  })
}
```
## Thread Safety

All cache operations are protected by a mutex to prevent race conditions:

```typescript
const cacheMutex = new Mutex()

// All operations run exclusively
await cacheMutex.runExclusive(async () => {
  // Cache operations here
})
```
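The mutex pattern above can be sketched as a minimal promise queue: each `runExclusive` call waits for the previous one to settle before running. This `Mutex` class is an assumption for illustration — the real implementation presumably comes from a mutex library rather than being hand-rolled:

```typescript
// Minimal promise-queue mutex (sketch): callers are chained on a single
// promise, so no two callbacks ever run concurrently.
class Mutex {
  private tail: Promise<unknown> = Promise.resolve()

  runExclusive<T>(fn: () => Promise<T>): Promise<T> {
    const result = this.tail.then(() => fn())
    // Keep the chain alive even if fn rejects
    this.tail = result.catch(() => undefined)
    return result
  }
}
```

Because each callback is chained on the previous one, a slow `get` cannot interleave with a concurrent `set`, which is what keeps the cache and the store consistent with each other.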
## Performance Benefits

- Frequently accessed keys (like sessions) are served from memory, significantly reducing database load.
- In-memory cache access is orders of magnitude faster than disk/database access.
- The default 5-minute TTL ensures the cache doesn't grow unbounded while keeping hot data available.
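The load reduction can be seen in a self-contained sketch: a fake store that counts its queries behind a simple `Map` cache, using the same miss-collection pattern as the real `get`. All names here (`fakeStore`, `cachedGet`) are illustrative, not part of the Baileys API:

```typescript
// A counting fake store: every call to get() increments `queries`.
const fakeStore = {
  queries: 0,
  async get(type: string, ids: string[]) {
    this.queries += 1
    const out: { [id: string]: string } = {}
    for (const id of ids) out[id] = `value-of-${id}`
    return out
  }
}

const cache = new Map<string, string>()

// Simplified version of the cached get: serve hits from the Map,
// query the store only for misses, then cache what was fetched.
async function cachedGet(type: string, ids: string[]) {
  const data: { [id: string]: string } = {}
  const idsToFetch: string[] = []
  for (const id of ids) {
    const hit = cache.get(`${type}.${id}`)
    if (hit !== undefined) data[id] = hit
    else idsToFetch.push(id)
  }
  if (idsToFetch.length) {
    const fetched = await fakeStore.get(type, idsToFetch)
    for (const id of idsToFetch) {
      data[id] = fetched[id]
      cache.set(`${type}.${id}`, fetched[id])
    }
  }
  return data
}
```

Reading the same IDs twice hits the store only once; the second read is served entirely from memory.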
## Custom Cache Implementation

You can provide a custom cache that implements the `CacheStore` interface:

```typescript
interface CacheStore {
  get<T>(key: string): Promise<T> | T | undefined
  set<T>(key: string, value: T): Promise<void> | void | number | boolean
  del(key: string): void | Promise<void> | number | boolean
  flushAll(): void | Promise<void>
}
```
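As a minimal reference point, the smallest possible implementation is a plain `Map` with no expiry. This is a sketch (the interface is repeated so the snippet stands alone); a production cache should expire entries, as the `NodeCache` default does:

```typescript
interface CacheStore {
  get<T>(key: string): Promise<T> | T | undefined
  set<T>(key: string, value: T): Promise<void> | void | number | boolean
  del(key: string): void | Promise<void> | number | boolean
  flushAll(): void | Promise<void>
}

// Map-backed CacheStore (sketch): entries live until deleted or flushed.
const backing = new Map<string, unknown>()

const mapCache: CacheStore = {
  get: <T>(key: string) => backing.get(key) as T | undefined,
  set: (key, value) => { backing.set(key, value) },
  del: (key) => { backing.delete(key) },
  flushAll: () => { backing.clear() }
}
```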
### Example: Redis Cache

```typescript
import { createClient } from 'redis'

const redisClient = await createClient().connect()

const redisCache: CacheStore = {
  async get(key) {
    const value = await redisClient.get(key)
    return value ? JSON.parse(value) : undefined
  },
  async set(key, value) {
    await redisClient.setEx(key, 300, JSON.stringify(value))
  },
  async del(key) {
    await redisClient.del(key)
  },
  async flushAll() {
    await redisClient.flushAll()
  }
}

const cachedStore = makeCacheableSignalKeyStore(
  baseStore,
  logger,
  redisCache
)
```