
Memory Cache Driver

The memory cache driver stores data in memory. All cached data is lost when the application restarts.

When to Use

  • Local development
  • Caching data for the duration of a process
  • Fastest possible cache (no disk or network I/O)

Best For

  • Temporary data
  • Caching API responses, user sessions, or computed values

Limitations

  • Data is lost on restart
  • Not shared between processes or servers
  • Memory growth: without maxSize, the cache can grow indefinitely (set maxSize or use the LRU Memory Cache Driver for automatic eviction)

Configuration

src/config/cache.ts
import { env } from "@mongez/dotenv";
import { CacheConfigurations, MemoryCacheDriver, CACHE_FOR } from "@warlock.js/cache";

const cacheConfigurations: CacheConfigurations = {
  drivers: {
    memory: MemoryCacheDriver,
  },
  default: env("CACHE_DRIVER", "memory"),
  options: {
    memory: {
      globalPrefix: "dev-app",
      ttl: CACHE_FOR.ONE_HOUR,
      maxSize: 1000, // Optional: max 1000 items with LRU eviction
    },
  },
};

export default cacheConfigurations;

Options

Option         Type                Default      Description
globalPrefix   string | Function   undefined    Global prefix for all cache keys
ttl            number              Infinity     Default TTL in seconds
maxSize        number              undefined    Maximum number of items; when exceeded, the least recently used items are evicted

Global Prefix

{
  globalPrefix: "myapp", // Static prefix
  // OR
  globalPrefix: () => `app-${environment()}`, // Dynamic prefix
}

TTL Configuration

{
  ttl: Infinity, // Never expire (default)
  // OR
  ttl: 3600, // 1 hour
  // OR
  ttl: 60 * 60 * 24, // 24 hours
}
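
Conceptually, TTL expiry is just a timestamp comparison against the stored time. The sketch below illustrates the idea with hypothetical names (`TimedEntry`, `isExpired`); it is not the driver's actual implementation. Times are in seconds to match the ttl option, and `Infinity` naturally models "never expire":

```typescript
// Illustrative model of TTL expiry; not the library's real internals.
interface TimedEntry<V> {
  value: V;
  storedAt: number; // seconds since epoch
  ttl: number;      // seconds; Infinity means "never expire"
}

// An entry is expired once more than `ttl` seconds have passed since it was stored.
function isExpired<V>(entry: TimedEntry<V>, nowSeconds: number): boolean {
  return nowSeconds - entry.storedAt > entry.ttl;
}

const entry: TimedEntry<string> = { value: "hello", storedAt: 1_000, ttl: 3600 };
const freshCheck = isExpired(entry, 1_000 + 1800); // within one hour: still fresh
const staleCheck = isExpired(entry, 1_000 + 7200); // past one hour: expired

const foreverEntry: TimedEntry<string> = { value: "hi", storedAt: 0, ttl: Infinity };
const neverExpires = isExpired(foreverEntry, 10 ** 12); // Infinity is never exceeded
```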

Memory Size Limit (maxSize)

The maxSize option limits the number of items in the cache. When the limit is reached, the least recently used (LRU) items are automatically evicted.

{
  maxSize: 1000, // Cache holds at most 1000 items
  // When the 1001st item is added, the least recently used item is removed
}

When to use maxSize:

  • You want to prevent unbounded memory growth
  • You have predictable cache size requirements
  • You want automatic eviction without using the dedicated LRU driver

How it works:

  • Uses LRU (Least Recently Used) eviction algorithm
  • Tracks access order automatically
  • Evicts oldest items when limit is reached
  • Works alongside TTL expiration (both can remove items)
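
The eviction behavior described above can be sketched with a plain Map, whose insertion-order iteration makes LRU tracking straightforward. This is an illustrative model only (the class name `LruSketch` is made up for this example), not the library's actual code:

```typescript
// Minimal LRU sketch: a Map iterates keys in insertion order, so the
// first key is always the least recently used one.
class LruSketch<K, V> {
  private items = new Map<K, V>();

  constructor(private maxSize: number) {}

  get(key: K): V | undefined {
    if (!this.items.has(key)) return undefined;
    const value = this.items.get(key)!;
    // Re-insert to mark this key as most recently used
    this.items.delete(key);
    this.items.set(key, value);
    return value;
  }

  set(key: K, value: V): void {
    if (this.items.has(key)) this.items.delete(key);
    this.items.set(key, value);
    if (this.items.size > this.maxSize) {
      // Evict the least recently used entry (first in insertion order)
      const oldest = this.items.keys().next().value as K;
      this.items.delete(oldest);
    }
  }

  has(key: K): boolean {
    return this.items.has(key);
  }
}

const lru = new LruSketch<string, number>(2);
lru.set("a", 1);
lru.set("b", 2);
lru.get("a"); // touch "a", so "b" is now the least recently used
lru.set("c", 3); // exceeds maxSize, so "b" is evicted
```

Re-inserting a key on every read keeps recently used entries at the back of the Map, so the front entry is always the eviction candidate.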

Example:

import { CacheConfigurations, MemoryCacheDriver } from "@warlock.js/cache";

const cacheConfigurations: CacheConfigurations = {
  drivers: {
    memory: MemoryCacheDriver,
  },
  options: {
    memory: {
      maxSize: 500, // Max 500 items
      ttl: 3600, // Items expire after 1 hour
      // Items can be removed by expiration OR by eviction when the cache is full
    },
  },
};

export default cacheConfigurations;
tip

For dedicated LRU caching with more control, consider using the LRU Memory Cache Driver instead.

Example Usage

Storing and Retrieving Data

import { cache, CACHE_FOR } from "@warlock.js/cache";

await cache.set("user.1", { name: "Alice" }, CACHE_FOR.ONE_HOUR);
const user = await cache.get("user.1");
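
A common pattern on top of get and set is "get or compute": return the cached value if present, otherwise compute it once and store it. The sketch below uses a hypothetical `MiniCache` stand-in (synchronous, no TTL) so it stays self-contained; with the real driver you would await cache.get and cache.set instead:

```typescript
// Stand-in for the real driver, for illustration only.
class MiniCache {
  private store = new Map<string, unknown>();

  get(key: string): unknown {
    // A real helper would distinguish a cached null from a miss
    return this.store.has(key) ? this.store.get(key) : null;
  }

  set(key: string, value: unknown): void {
    this.store.set(key, value);
  }
}

const mini = new MiniCache();
let computeCalls = 0;

// Return the cached value if present; otherwise compute, store, and return it.
function remember<T>(key: string, compute: () => T): T {
  const hit = mini.get(key);
  if (hit !== null) return hit as T; // cache hit: skip recomputation
  const value = compute();
  mini.set(key, value);
  return value;
}

const first = remember("user.1", () => {
  computeCalls++;
  return { name: "Alice" };
});
const second = remember("user.1", () => {
  computeCalls++;
  return { name: "Alice" };
});
// The compute function ran only once; both calls return the same cached object.
```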

Memory Management

The memory driver uses an in-memory data structure to store cache entries. Key points:

  • Size limits: Use maxSize option to limit cache size with automatic LRU eviction
  • Eviction: Items are removed when they expire, manually deleted, or evicted when maxSize limit is reached
  • Fast access: O(1) average time complexity for get/set operations
  • LRU tracking: When maxSize is set, access order is tracked for efficient eviction
  • Process isolation: Each Node.js process has its own cache instance

Troubleshooting

  • Data disappears after restart: This is expected. Use File or Redis drivers for persistence.
  • Cache not shared between processes: Use Redis for distributed cache.
  • Memory usage growing: Set maxSize option or use LRU Memory Cache Driver for automatic eviction.
  • Items being evicted unexpectedly: Check if maxSize is set and if cache is reaching the limit.
note

Please note that the Memory Cache Driver implements all methods in the Cache Driver Interface, so you can use it directly as a cache driver.