Concurrency Protection: Cache, Degradation, and Rate Limiting

Three measures protect a system under high concurrency: caching, degradation, and rate limiting.

Cache

Caching improves throughput by serving hot data from memory instead of hitting the database on every request. Three failure modes undermine that protection: penetration, avalanche, and breakdown.

Cache Penetration

Cache penetration is an attack (or access) pattern in which requests query keys that exist in neither the cache nor the database. Every such request misses the cache and falls through to the database, so the cache provides no protection at all.

Solutions:

  1. Put a Bloom filter in front of the cache; if the filter says a key cannot exist, reject the request immediately without touching the database.
  2. Cache empty results with a short TTL so repeated lookups for the same missing key stop reaching the database (see the sketch after this list).
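
A minimal sketch of the empty-result approach, assuming a local Redis reached through redis-py; the key names, TTLs, and `query_db` stub are illustrative, not a real schema:

```python
import redis

r = redis.Redis()            # assumes Redis on localhost:6379

EMPTY = b"__NULL__"          # sentinel meaning "known to be absent"
EMPTY_TTL = 60               # short TTL so data created later can appear

def query_db(user_id: str):
    """Stand-in for the real database lookup (hypothetical)."""
    return None

def get_user(user_id: str):
    key = f"user:{user_id}"
    cached = r.get(key)
    if cached is not None:
        # Hit: either real data or the cached "empty" marker.
        return None if cached == EMPTY else cached
    row = query_db(user_id)
    if row is None:
        # Cache the miss so repeated lookups for this id are absorbed.
        r.set(key, EMPTY, ex=EMPTY_TTL)
        return None
    r.set(key, row, ex=3600)
    return row
```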

Cache Avalanche

Cache avalanche occurs when a large batch of cached items expires at the same moment; the flood of simultaneous misses all hits the database at once and overloads it.

Solution: randomize expiration times so keys written together do not all expire together.
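
A small sketch of TTL jitter; the base TTL and jitter window are illustrative:

```python
import random

BASE_TTL = 3600        # nominal expiration, seconds
JITTER = 600           # spread expirations over +/- 10 minutes

def ttl_with_jitter() -> int:
    # Keys cached in the same batch now expire at different times,
    # so their misses are spread out instead of arriving at once.
    return BASE_TTL + random.randint(-JITTER, JITTER)

# usage: r.set(key, value, ex=ttl_with_jitter())
```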

Cache Breakdown

Cache breakdown occurs when a single hot key expires and the burst of concurrent misses for that one key overloads the database.

On a miss, don’t query the database immediately. First try to acquire a mutex (SETNX in Redis); only the request that wins the lock fetches the data and refreshes the cache, while the others wait briefly and retry the cache read.
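
A sketch of that mutex pattern with redis-py, assuming the same local Redis; `rebuild` stands in for the real database fetch, and using SET with NX and EX gives the lock an expiry so a crashed rebuilder cannot hold it forever:

```python
import time
import redis

r = redis.Redis()

def get_hot_key(key: str, rebuild, lock_ttl: int = 10):
    """Read `key`; on a miss, let exactly one caller rebuild it."""
    while True:
        value = r.get(key)
        if value is not None:
            return value
        # SET ... NX EX is the atomic SETNX-plus-expiry form.
        if r.set(f"lock:{key}", "1", nx=True, ex=lock_ttl):
            try:
                value = rebuild()              # fetch from the database
                r.set(key, value, ex=3600)
                return value
            finally:
                r.delete(f"lock:{key}")
        time.sleep(0.05)  # lost the race: retry the cache, not the DB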

Degradation

When the system is under pressure, shed non-critical features so that core flows keep working: requests to degraded features receive a cheap fallback response instead of exercising the full path.
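
A toy degradation switch; the flag, function names, and fallback values are hypothetical, and in practice the flag would come from a config center rather than a module global:

```python
DEGRADED = False  # flipped by operations when the system is under stress

def call_recommendation_service(user_id: str) -> list:
    """Stand-in for a real RPC to a non-critical service (hypothetical)."""
    return []

def product_recommendations(user_id: str) -> list:
    if DEGRADED:
        # Serve a static fallback instead of calling the service,
        # keeping capacity free for the core purchase flow.
        return ["default-item-1", "default-item-2"]
    return call_recommendation_service(user_id)
```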

Rate Limiting

When a limit is reached, reject the request outright or degrade the response. The simplest limits are capacity bounds the system already has, such as database connection pools and thread pools. Typical enforcement points (a small semaphore sketch follows this list):

  • Gateway (nginx) limits concurrent requests
  • RPC call rates
  • Message queue consumption rates
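
As a concrete version of the pool idea, here is a sketch that caps in-flight requests with a bounded semaphore; the limit and error message are illustrative:

```python
import threading

MAX_CONCURRENT = 100                       # illustrative pool size
slots = threading.BoundedSemaphore(MAX_CONCURRENT)

def handle_request(work):
    # Non-blocking acquire rejects excess load instead of queueing it,
    # which is the usual choice at a protection layer.
    if not slots.acquire(blocking=False):
        raise RuntimeError("concurrency limit reached, rejecting request")
    try:
        return work()
    finally:
        slots.release()
```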

Token Bucket

The token bucket algorithm uses a fixed-capacity bucket filled with tokens at a constant rate.

When the bucket is full, newly generated tokens are discarded. When a request arrives and no token is available, it is dropped or made to wait. Because the bucket can accumulate up to its capacity in tokens, short bursts above the steady rate are allowed.
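
A self-contained token bucket sketch; the rate and capacity are illustrative, and refilling is done lazily on each check rather than by a background timer:

```python
import time

class TokenBucket:
    def __init__(self, rate: float, capacity: float):
        self.rate = rate              # tokens added per second
        self.capacity = capacity      # bucket size; bounds burst length
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self, cost: float = 1.0) -> bool:
        now = time.monotonic()
        # Refill for the elapsed time; tokens above capacity are discarded.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False                  # no token: drop or make caller wait

bucket = TokenBucket(rate=100, capacity=200)   # 100 req/s, bursts to 200
```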

Leaky Bucket

The leaky bucket algorithm is used for traffic shaping and rate control.

Requests are released from the bucket at a fixed rate regardless of how bursty the arrivals are, smoothing the outflow; arrivals that would overflow the bucket's capacity are rejected.
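
A sketch of the leaky bucket in its "meter" form, where a water level drains at the fixed rate and arrivals that would overflow the capacity are rejected; rate and capacity are illustrative:

```python
import time

class LeakyBucket:
    def __init__(self, rate: float, capacity: float):
        self.rate = rate              # fixed outflow, requests per second
        self.capacity = capacity      # how much can queue before rejecting
        self.water = 0.0
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Leak at the fixed rate for the elapsed time.
        self.water = max(0.0, self.water - (now - self.last) * self.rate)
        self.last = now
        if self.water + 1 > self.capacity:
            return False              # bucket full: reject the request
        self.water += 1
        return True
```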

Counter-based Rate Limiting

Counters limit a total, such as concurrent database connections, thread pool size, or the number of participants in a flash sale.

For per-second call limits, store a counter with an expiration time in a cache: increment it on each call and reject once it exceeds the threshold within the window.
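
A fixed-window counter sketch on Redis; the key name, limit, and window are illustrative. Note that INCR followed by EXPIRE is two commands, so production code often wraps the pair in a Lua script to make it atomic:

```python
import redis

r = redis.Redis()

def allow_call(api: str, limit: int = 100, window: int = 1) -> bool:
    """Fixed-window limit: at most `limit` calls per `window` seconds."""
    key = f"ratelimit:{api}"
    count = r.incr(key)
    if count == 1:
        # First call of the window starts its expiration clock.
        r.expire(key, window)
    return count <= limit
```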

Distributed Rate Limiting

On a cluster, the limiter's state must be shared or partitioned: either keep the counters in a central store such as Redis, or have the nginx load balancer route by consistent hashing so all requests for a given key land on the same node and can be limited locally.
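
A minimal consistent-hash ring illustrating the routing half of that idea; the node names and replica count are made up:

```python
import bisect
import hashlib

class HashRing:
    """Maps each client key to one limiter node, so that node alone
    holds the counters for that key."""

    def __init__(self, nodes, replicas: int = 100):
        # Virtual replicas spread each node around the ring evenly.
        self.ring = sorted((self._hash(f"{n}#{i}"), n)
                           for n in nodes for i in range(replicas))
        self.hashes = [h for h, _ in self.ring]

    @staticmethod
    def _hash(s: str) -> int:
        return int(hashlib.md5(s.encode()).hexdigest(), 16)

    def node_for(self, client_key: str) -> str:
        # First ring position at or after the key's hash, wrapping around.
        idx = bisect.bisect(self.hashes, self._hash(client_key)) % len(self.hashes)
        return self.ring[idx][1]

ring = HashRing(["limiter-a", "limiter-b", "limiter-c"])
print(ring.node_for("user:42"))  # the same user always maps to one node
```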