
🧠 Caching in .NET Core

📌 What is Caching?

Caching is a technique for storing frequently accessed data so it can be served quickly, reducing latency and the load on databases and external services. ASP.NET Core provides several caching options out of the box; the most common are in-memory caching, distributed caching, and response caching.

🛠️ Types of Caching in .NET Core

  • In-Memory Caching: The simplest form of caching; data is stored directly in the web server's memory, which makes it ideal for single-server setups.
    • Pros: It is extremely fast because it avoids network overhead.
    • Cons: The cache is lost if the application or server restarts. It is also not shared across multiple server instances in a load-balanced environment, which can cause data inconsistency issues unless "sticky sessions" are used.
    • Best for: Small to mid-scale applications and data that is cheap to re-fetch.
  • Distributed Caching: Stores cached data in an external service, such as Redis or a SQL Server database, that can be shared across multiple web servers (see the Redis sketch after this list).
    • Pros: The cache is consistent across all application instances, and data persists even if an individual server or the application restarts.
    • Cons: It is slower than in-memory caching due to the network trip required to access the cache service. It also requires external infrastructure, which increases complexity and cost.
    • Best for: Scalable, multi-instance, cloud-based applications.
  • Response Caching: A server- and client-side mechanism driven by HTTP cache headers (Cache-Control, Vary) that caches full HTTP responses to reduce processing time for repeated requests. It can be configured with middleware or attributes (see the response-caching sketch after this list).
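
For comparison, here is a minimal sketch of the distributed approach using IDistributedCache backed by Redis. It assumes the Microsoft.Extensions.Caching.StackExchangeRedis package and a local Redis instance at localhost:6379; the key and product values mirror the in-memory example below.

// Program.cs – register a Redis-backed IDistributedCache
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost:6379";   // assumed connection string
    options.InstanceName = "CachingExample:";   // optional key prefix
});

// In a controller or service with an injected IDistributedCache _cache:
// IDistributedCache stores strings/bytes, so complex values are serialized (System.Text.Json here).
var cachedJson = await _cache.GetStringAsync("all_products");
if (cachedJson is null)
{
    var products = new List<string> { "Laptop", "Mouse", "Keyboard" };

    await _cache.SetStringAsync(
        "all_products",
        JsonSerializer.Serialize(products),
        new DistributedCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5)
        });
}
else
{
    var products = JsonSerializer.Deserialize<List<string>>(cachedJson);
}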
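
Response caching can be enabled with the built-in middleware plus the [ResponseCache] attribute. A minimal sketch, with illustrative duration and query-key values:

// Program.cs
builder.Services.AddResponseCaching();
// ...
app.UseResponseCaching();   // add after UseCors (if any) and before mapping endpoints

// Controller action: emits Cache-Control headers and lets the middleware
// serve the cached response for 60 seconds, varied by the "category" query string.
[HttpGet]
[ResponseCache(Duration = 60, Location = ResponseCacheLocation.Any, VaryByQueryKeys = new[] { "category" })]
public IActionResult GetCatalog() => Ok(new[] { "Laptop", "Mouse", "Keyboard" });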

📋 Example: In-Memory Caching

This example shows how to use IMemoryCache to cache a list of products in a web API.

using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.Logging;
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

namespace CachingExample.Controllers;

[Route("api/[controller]")]
[ApiController]
public class ProductsController : ControllerBase
{
    private readonly IMemoryCache _cache;
    private readonly ILogger<ProductsController> _logger;

    public ProductsController(IMemoryCache cache, ILogger<ProductsController> logger)
    {
        _cache = cache;
        _logger = logger;
    }

    [HttpGet]
    public async Task<ActionResult<IEnumerable<string>>> Get()
    {
        const string cacheKey = "all_products";

        // Check whether the data is already in the cache
        if (!_cache.TryGetValue(cacheKey, out List<string>? products))
        {
            _logger.LogInformation("Cache miss. Fetching products from source.");

            // Simulate fetching data from a slow data source
            await Task.Delay(2000);
            products = new List<string> { "Laptop", "Mouse", "Keyboard" };

            // Configure cache options
            var cacheOptions = new MemoryCacheEntryOptions()
                .SetAbsoluteExpiration(TimeSpan.FromMinutes(5))  // entry expires 5 minutes after creation
                .SetSlidingExpiration(TimeSpan.FromMinutes(2));  // or after 2 minutes without access, whichever comes first

            // Store the data in the cache
            _cache.Set(cacheKey, products, cacheOptions);
        }
        else
        {
            _logger.LogInformation("Cache hit. Serving products from cache.");
        }

        return Ok(products);
    }
}
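
The controller above expects IMemoryCache from dependency injection, which AddMemoryCache() provides. A minimal Program.cs for this example, assuming the standard minimal-hosting template, might look like:

// Program.cs
var builder = WebApplication.CreateBuilder(args);

// Register the in-memory cache and API controllers
builder.Services.AddMemoryCache();
builder.Services.AddControllers();

var app = builder.Build();

app.MapControllers();
app.Run();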

Run and test the API

  1. Run the application.
  2. Make your first GET request to api/products. The console will log a "cache miss," and you will experience a slight delay.
  3. Immediately make another GET request. The response will be near-instant, and the console will log a "cache hit."

✅ When to Use Caching

  • Data is read frequently but changes infrequently.
  • Performance is critical and latency must be minimized.
  • External API calls or database queries are expensive.
  • Content is static or semi-static (e.g., product catalogs, configuration settings).

🚫 When Not to Use Caching

  • Data changes frequently and must be real-time accurate.
  • Security-sensitive data (e.g., user credentials).
  • Memory-constrained environments where caching could cause memory pressure.
  • Highly personalized content per user session.

🌟 Advantages

  • Improves application performance and scalability.
  • Reduces load on databases and external services.
  • Enhances user experience with faster response times.

⚠️ Disadvantages

  • Stale data if cache is not invalidated properly.
  • Increased memory usage.
  • Complexity in cache management and expiration policies.

🧯 Precautions

  • Always set expiration policies (absolute or sliding), and consider a cache size limit for memory-bound workloads (see the sketch after this list).
  • Use cache keys consistently and avoid collisions.
  • Monitor cache hit/miss ratios to optimize usage.
  • Secure cached data if it contains sensitive information.
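
To guard against memory pressure, IMemoryCache can be given a size limit, in which case every entry must declare its size. A minimal sketch; the limit of 100 "units" and the per-entry size of 1 are arbitrary assumptions (units are application-defined):

// Program.cs – cap the cache at 100 size units
builder.Services.AddMemoryCache(options => options.SizeLimit = 100);

// Once a size limit is set, adding an entry without a declared size throws,
// so each entry must call SetSize.
var cacheOptions = new MemoryCacheEntryOptions()
    .SetSize(1)                                      // this entry counts as 1 unit
    .SetAbsoluteExpiration(TimeSpan.FromMinutes(5));

_cache.Set("all_products", products, cacheOptions);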

🧠 Best Practices

  • Use IMemoryCache for simple scenarios and IDistributedCache for scalable setups.
  • Use the cache-aside pattern: check the cache first, then fall back to the source (see the GetOrCreateAsync sketch after this list).
  • Group related cache entries for easier invalidation.
  • Use logging and metrics to track cache performance.
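
The cache-aside flow from the example above can also be written more compactly with the GetOrCreateAsync extension method on IMemoryCache; the factory delegate only runs on a cache miss. A minimal sketch of the same products lookup:

var products = await _cache.GetOrCreateAsync("all_products", async entry =>
{
    // Expiration is configured on the cache entry itself
    entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
    entry.SlidingExpiration = TimeSpan.FromMinutes(2);

    await Task.Delay(2000);                          // simulate a slow data source
    return new List<string> { "Laptop", "Mouse", "Keyboard" };
});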