IConfiguration vs IOptions NET
Synchronous and Asynchronous in .NET Core
Model Binding and Validation in ASP.NET Core
ControllerBase vs Controller in ASP.NET Core
ConfigureServices and Configure methods
IHostedService interface in .NET Core
ASP.NET Core request processing
| Caching in .NET Core | Swagger-OpenAPI-Documentation | |
🚦 Rate Limiting in .NET Core
Rate limiting is a technique for controlling how many requests a client can make to an API within a specified time window. By preventing resource overuse, blocking abuse, and ensuring fair usage across clients, it is an essential strategy for building resilient, secure, and scalable ASP.NET Core applications.
Since ASP.NET Core 7, built-in rate limiting middleware has been available in the Microsoft.AspNetCore.RateLimiting namespace, offering a standardized and straightforward way to apply rate limiting.
The ASP.NET Core rate limiting middleware supports several popular algorithms, each with its own characteristics.
Fixed Window
How it works: Divides time into fixed, non-overlapping intervals (windows) and allows a set number of requests within each window.
Behavior: When a new window begins, the request limit is reset. This can cause "burstiness" at the start of a new window if multiple clients try to access the resource at once.
Best for: Simple, straightforward use cases where burst traffic at window boundaries is acceptable.
Sliding Window
How it works: Similar to the fixed window but more advanced: it divides the time window into segments and tracks the request count per segment.
Behavior: The window then "slides" forward, providing smoother request distribution and preventing burstiness.
Best for: General-purpose APIs that need reliable traffic shaping.
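As a sketch, a sliding-window policy could be registered like this (the policy name "sliding" and the numbers are illustrative, not from the original):

```csharp
using Microsoft.AspNetCore.RateLimiting;
using System.Threading.RateLimiting;

builder.Services.AddRateLimiter(options =>
{
    // Allow 10 requests per 30-second window, tracked across 3 segments
    // of 10 seconds each, so the count "slides" forward instead of
    // resetting all at once at a window boundary.
    options.AddSlidingWindowLimiter(policyName: "sliding", limiterOptions =>
    {
        limiterOptions.PermitLimit = 10;
        limiterOptions.Window = TimeSpan.FromSeconds(30);
        limiterOptions.SegmentsPerWindow = 3;
        limiterOptions.QueueLimit = 0;
    });
});
```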
Token Bucket
How it works: Uses a "bucket" that holds tokens, where each token grants one request. Tokens are added to the bucket at a constant rate.
Behavior: Each request consumes a token. If the bucket is empty, the request is rejected or queued. Handles bursts by using accumulated tokens.
Best for: APIs needing flexibility to handle traffic bursts while maintaining a consistent average rate.
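A token-bucket policy might be configured as follows (the policy name "token" and the specific numbers are placeholder assumptions):

```csharp
using Microsoft.AspNetCore.RateLimiting;
using System.Threading.RateLimiting;

builder.Services.AddRateLimiter(options =>
{
    // The bucket holds at most 20 tokens; 5 tokens are added every
    // 10 seconds. A burst can spend up to 20 accumulated tokens at once,
    // but the sustained average stays at 5 requests per 10 seconds.
    options.AddTokenBucketLimiter(policyName: "token", limiterOptions =>
    {
        limiterOptions.TokenLimit = 20;
        limiterOptions.TokensPerPeriod = 5;
        limiterOptions.ReplenishmentPeriod = TimeSpan.FromSeconds(10);
        limiterOptions.QueueLimit = 2;
        limiterOptions.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
    });
});
```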
Concurrency Limiter
How it works: Limits the number of requests executing at the same time instead of counting requests per time window.
Behavior: Acquires a permit when a request starts and releases it when the request finishes.
Best for: Protecting limited resources like database connection pools.
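A concurrency policy could look like this sketch (the policy name "concurrency" and the limits are illustrative):

```csharp
using Microsoft.AspNetCore.RateLimiting;
using System.Threading.RateLimiting;

builder.Services.AddRateLimiter(options =>
{
    // At most 8 requests execute simultaneously; up to 4 more may wait.
    // There is no time window - a permit is freed as soon as a request
    // completes, so throughput tracks how fast the resource can respond.
    options.AddConcurrencyLimiter(policyName: "concurrency", limiterOptions =>
    {
        limiterOptions.PermitLimit = 8;
        limiterOptions.QueueLimit = 4;
        limiterOptions.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
    });
});
```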
Add the rate limiter service in Program.cs, configure a policy, and enable the middleware.
```csharp
using Microsoft.AspNetCore.RateLimiting;
using System.Threading.RateLimiting;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddRateLimiter(options =>
{
    options.RejectionStatusCode = StatusCodes.Status429TooManyRequests;
    options.AddFixedWindowLimiter(policyName: "fixed", limiterOptions =>
    {
        limiterOptions.PermitLimit = 5;
        limiterOptions.Window = TimeSpan.FromSeconds(10);
        limiterOptions.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        limiterOptions.QueueLimit = 2;
    });
});

var app = builder.Build();
app.UseRateLimiter(); // the middleware must be enabled for policies to apply
```
```csharp
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.RateLimiting;

[ApiController]
[Route("api/products")]
public class ProductsController : ControllerBase
{
    [EnableRateLimiting("fixed")]
    [HttpGet]
    public IEnumerable<string> GetProducts()
    {
        return new[] { "Product A", "Product B", "Product C" };
    }

    [DisableRateLimiting]
    [HttpGet("no-limit")]
    public string GetNoLimit()
    {
        return "This endpoint has no rate limit.";
    }
}
```
Accessing /api/products is subject to the fixed-window limit: once the 5-request limit is reached, up to 2 extra requests are queued and any beyond that receive 429 Too Many Requests.
For user or IP-based limits, use a partitioned rate limiter with a partition key during configuration.
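One way to sketch an IP-based partition is a global partitioned limiter; keying by RemoteIpAddress and the specific limits here are illustrative assumptions:

```csharp
using System.Threading.RateLimiting;

builder.Services.AddRateLimiter(options =>
{
    // Partition by client IP: each address gets its own fixed-window counter.
    options.GlobalLimiter = PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
        RateLimitPartition.GetFixedWindowLimiter(
            partitionKey: httpContext.Connection.RemoteIpAddress?.ToString() ?? "unknown",
            factory: _ => new FixedWindowRateLimiterOptions
            {
                PermitLimit = 5,
                Window = TimeSpan.FromSeconds(10)
            }));
});
```

The same pattern works with a user ID from claims as the partition key for authenticated traffic.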