Caching: Improving Performance

Implementing caching strategies using Redis or other caching providers to improve application performance.


Advanced Caching in NestJS

Advanced Caching Configurations

Caching is a crucial technique for improving the performance and scalability of NestJS applications. Beyond basic caching, advanced configurations offer greater control and efficiency in managing cached data. This section explores techniques such as setting different Time-To-Live (TTL) values for different types of data, using Redis Pub/Sub for efficient cache invalidation, and implementing multi-layered caching strategies. These strategies can significantly reduce database load, improve response times, and enhance the overall user experience.
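
Before diving into the advanced patterns, the cache manager needs to be registered in the application. A minimal sketch, assuming @nestjs/cache-manager with its default in-memory store (a Redis store can be supplied via the store option; the exact adapter package depends on your cache-manager version):

  // app.module.ts - registering the cache manager globally
  import { Module } from '@nestjs/common';
  import { CacheModule } from '@nestjs/cache-manager';

  @Module({
    imports: [
      CacheModule.register({
        isGlobal: true, // makes CACHE_MANAGER injectable in any module
        ttl: 300,       // default TTL applied when no per-entry TTL is given
      }),
    ],
  })
  export class AppModule {}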

Setting Different TTLs for Different Data

One-size-fits-all TTLs are often inefficient. Frequently updated data should have shorter TTLs, while rarely changing data can benefit from longer TTLs. This keeps volatile data fresh without causing unnecessary cache misses for data that hardly ever changes.
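
For simple GET endpoints, NestJS also supports this declaratively: the built-in CacheInterceptor caches responses, and the @CacheTTL() decorator overrides the TTL per route. A minimal sketch (TTL units depend on the cache-manager version in use):

  import { Controller, Get, UseInterceptors } from '@nestjs/common';
  import { CacheInterceptor, CacheTTL } from '@nestjs/cache-manager';

  @Controller('products')
  @UseInterceptors(CacheInterceptor) // caches GET responses from this controller
  export class ProductsController {
    @Get()
    @CacheTTL(3600) // product listings change rarely, so use a longer TTL
    findAll() {
      // ... return the product list
    }
  }

For data cached inside services, the TTL is set programmatically instead, as in the conceptual example below.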

Example (Conceptual):

Imagine caching user profiles and product details. User profiles might update frequently (e.g., address changes), so a TTL of 5 minutes might be appropriate. Product details, however, may only change a few times a day, justifying a TTL of several hours.

(Conceptual - using a theoretical caching service wrapper)

  // Assuming a custom caching service that wraps the NestJS cache manager
  import { Injectable, Inject } from '@nestjs/common';
  import { CACHE_MANAGER } from '@nestjs/cache-manager';
  import { Cache } from 'cache-manager';

  @Injectable()
  export class CustomCacheService {
    constructor(@Inject(CACHE_MANAGER) private cacheManager: Cache) {}

    async get<T>(key: string): Promise<T | undefined> {
      return await this.cacheManager.get<T>(key);
    }

    async set(key: string, value: any, ttl?: number): Promise<void> {
      if (ttl) {
        // Pass the TTL through to the underlying store
        await this.cacheManager.set(key, value, ttl);
      } else {
        // Fall back to the cache module's default TTL
        await this.cacheManager.set(key, value);
      }
    }
  }

  // In your controller or service (customCacheService, userService and
  // productService are assumed to be injected)
  async getUserProfile(userId: string) {
    const cacheKey = `user:${userId}`;
    let userProfile = await this.customCacheService.get(cacheKey);

    if (!userProfile) {
      userProfile = await this.userService.getUserProfile(userId);
      await this.customCacheService.set(cacheKey, userProfile, 300); // 5 minutes (see note on TTL units below)
    }

    return userProfile;
  }

  async getProductDetails(productId: string) {
    const cacheKey = `product:${productId}`;
    let productDetails = await this.customCacheService.get(cacheKey);

    if (!productDetails) {
      productDetails = await this.productService.getProductDetails(productId);
      await this.customCacheService.set(cacheKey, productDetails, 3600); // 1 hour (see note on TTL units below)
    }

    return productDetails;
  }

Explanation:

  • This example uses a CustomCacheService to abstract the caching logic. It wraps the CACHE_MANAGER provided by NestJS, which would typically be backed by Redis or Memcached.
  • The getUserProfile and getProductDetails functions demonstrate setting different TTLs (the ttl argument) for different types of data. User profiles have a shorter TTL (5 minutes), while product details have a longer TTL (1 hour).
  • Note: The exact syntax and units for TTLs depend on the cache-manager version and the underlying store: cache-manager v4 accepts an options object such as { ttl: 300 } with the TTL in seconds, while v5 accepts a plain number in milliseconds. The examples show the general principle.

Using Redis Pub/Sub for Cache Invalidation

When data is updated in the database, it's crucial to invalidate the corresponding cache entries. Polling mechanisms are inefficient. Redis Pub/Sub provides a real-time, event-driven approach to cache invalidation. When data changes, a message is published to a Redis channel, and all interested services (cache layers) subscribe to that channel and invalidate their relevant cache entries.

Example (Conceptual):

(Conceptual - showing Redis Pub/Sub integration)

  // Assume a Redis client (e.g. node-redis v4) is already configured and injected

  // Publisher (e.g., in your data update service)
  async updateProduct(productId: string, productData: any) {
    // Update product in the database
    await this.productRepository.update(productId, productData);

    // Publish the product ID to the 'productUpdates' channel
    await this.redisClient.publish('productUpdates', productId);
  }


  // Subscriber (e.g., in your caching service)
  async onModuleInit() {
    // A connection in subscriber mode cannot issue regular commands,
    // so create a separate connection for subscribing
    const subscriber = this.redisClient.duplicate();
    await subscriber.connect(); // node-redis v4; ioredis connects automatically

    // node-redis v4 passes the message (here, the product ID) to the listener;
    // with ioredis, listen for the 'message' event instead
    await subscriber.subscribe('productUpdates', async (productId) => {
      // Invalidate the cache for the updated product
      await this.cacheManager.del(`product:${productId}`);
      console.log(`Cache invalidated for product: ${productId}`);
    });
  }

Explanation:

  • The updateProduct function updates the product in the database and then publishes a message containing the product ID to the productUpdates channel.
  • The onModuleInit method in a service (likely your caching service) subscribes to the productUpdates channel.
  • When a message is received, the subscriber extracts the product ID and invalidates the corresponding cache entry using this.cacheManager.del().
  • You need a Redis client properly set up in your NestJS application and injected as a dependency; a minimal provider sketch follows below.
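
A minimal sketch of such a provider, assuming the node-redis v4 client (the REDIS_CLIENT token is a hypothetical name chosen for this example):

  // redis.provider.ts - exposing a Redis client as an injectable provider
  import { Provider } from '@nestjs/common';
  import { createClient } from 'redis';

  export const REDIS_CLIENT = 'REDIS_CLIENT';

  export const redisProvider: Provider = {
    provide: REDIS_CLIENT,
    useFactory: async () => {
      const client = createClient({ url: 'redis://localhost:6379' });
      await client.connect();
      return client;
    },
  };

  // In the consuming service:
  // constructor(@Inject(REDIS_CLIENT) private readonly redisClient: RedisClientType) {}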

Implementing a Multi-Layered Cache

A multi-layered cache combines different caching technologies to leverage their strengths. A common approach is to use an in-memory cache (e.g., using cache-manager with a memory store) for frequently accessed data and a distributed cache (e.g., Redis, Memcached) for larger datasets or data shared across multiple application instances.

Example (Conceptual):

(Conceptual - showing an in-memory layer in front of Redis; assumes two cache instances are registered under the hypothetical MEMORY_CACHE and REDIS_CACHE provider tokens, as sketched at the end of this section)

  import { Injectable, Inject } from '@nestjs/common';
  import { Cache } from 'cache-manager';

  @Injectable()
  export class ProductService {
    constructor(
      // Hypothetical provider tokens: one memory-backed cache, one Redis-backed cache
      @Inject('MEMORY_CACHE') private memoryCache: Cache,
      @Inject('REDIS_CACHE') private redisCache: Cache,
    ) {}

    async getProductDetails(productId: string): Promise<any> {
      const cacheKey = `product:${productId}`;

      // 1. Check the in-memory cache (fastest)
      let productDetails = await this.memoryCache.get(cacheKey);

      if (!productDetails) {
        // 2. Check the Redis cache (slower than in-memory, but shared across instances)
        productDetails = await this.redisCache.get(cacheKey);

        if (!productDetails) {
          // 3. Fetch from the database (slowest; productRepository assumed injected)
          productDetails = await this.productRepository.findById(productId);

          // 4. Store in both layers (TTL units depend on the cache-manager version)
          await this.redisCache.set(cacheKey, productDetails, 3600); // longer TTL for Redis
          await this.memoryCache.set(cacheKey, productDetails, 60);  // shorter TTL for memory
        } else {
          // Promote the Redis hit into the in-memory layer
          await this.memoryCache.set(cacheKey, productDetails, 60);
        }
      }

      return productDetails;
    }
  }

Explanation:

  • The getProductDetails function first checks the in-memory cache. If the data is found there, it's returned immediately.
  • If the data is not in the in-memory cache, it checks the Redis cache.
  • If the data is not in either cache, it's fetched from the database.
  • Finally, the data is stored in both the Redis cache (with a longer TTL) and the in-memory cache (with a shorter TTL) for future requests.
  • In this scenario, two cache instances are registered as separate providers: an in-memory store for the first layer and a Redis (or Memcached) store for the second; a registration sketch follows below. Alternatively, cache-manager's multiCaching helper can wrap several stores behind the single CACHE_MANAGER token and check them in order automatically.
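
A minimal sketch of registering the two layers (the MEMORY_CACHE and REDIS_CACHE tokens are hypothetical names, and the Redis-backed factory depends on which cache-manager Redis adapter you use):

  // cache.providers.ts - one in-memory cache and one distributed cache
  import { Provider } from '@nestjs/common';
  import { caching } from 'cache-manager';

  export const MEMORY_CACHE = 'MEMORY_CACHE';
  export const REDIS_CACHE = 'REDIS_CACHE';

  export const cacheProviders: Provider[] = [
    {
      provide: MEMORY_CACHE,
      // Small, short-lived first layer (cache-manager v5: TTL in milliseconds)
      useFactory: () => caching('memory', { max: 1000, ttl: 60 * 1000 }),
    },
    {
      provide: REDIS_CACHE,
      // In a real setup, build a Redis-backed cache here with the adapter that
      // matches your cache-manager version (e.g. cache-manager-redis-yet);
      // a memory store stands in for it in this sketch.
      useFactory: () => caching('memory', { ttl: 60 * 60 * 1000 }),
    },
  ];

These providers can then be listed in a module's providers and exports arrays and injected by token, as in the ProductService above.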