Redis cache penetration, cache breakdown, cache avalanche


Caching ideas

Redis is often used as a cache in front of the main database: the primary store is MySQL, Oracle, etc., where the data ultimately lands, while Redis serves as a cache for efficient reads. When a data query request comes in, the internal logic is as follows:

(Figure: cache read flow.)
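A minimal cache-aside read sketch of this flow, using the redis-py client; the `query_main_db()` helper is a hypothetical stand-in for the real MySQL/Oracle query, and the connection settings and TTL are illustrative assumptions:

```python
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def query_main_db(key):
    """Hypothetical placeholder for the real MySQL/Oracle query."""
    ...

def get_data(key, ttl=300):
    value = r.get(key)              # 1. try the Redis cache first
    if value is not None:
        return value                # cache hit: return immediately
    value = query_main_db(key)      # 2. cache miss: read the main database
    if value is not None:
        r.set(key, value, ex=ttl)   # 3. write the result back with an expiry
    return value
```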

This caching architecture improves the system's data-access efficiency, but it also introduces some problems that need attention.

Cache problems

1. Cache penetration

Cache penetration means the queried data exists in neither the Redis cache nor the database, for example querying an id of -1 when ids start from 0. Such a request queries Redis and then the database, and both return empty. A large number of these malicious requests puts heavy pressure on the system.

For this cache penetration problem, there are the following solutions:

  • 1. Cache the null result in Redis as well, so repeated queries for the same missing key no longer reach the database (see the sketch after this list).
  • 2. Set up request validation or an IP whitelist to block malicious attacks.
  • 3. Use a Bloom filter to check whether the key can exist at all before querying the cache and the database.
  • 4. Call the police... if you really are facing a hacker attack, you can contact the cyber police.
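A sketch of solution 1 (caching the null value), reusing the hypothetical `r` client and `query_main_db()` helper from the earlier example; the sentinel string and TTL values are illustrative assumptions:

```python
NULL_PLACEHOLDER = "__NULL__"   # illustrative sentinel meaning "known to be absent"

def get_with_null_cache(key, ttl=300, null_ttl=60):
    value = r.get(key)
    if value == NULL_PLACEHOLDER:
        return None                        # already confirmed missing, skip the DB
    if value is not None:
        return value
    value = query_main_db(key)
    if value is None:
        # cache the miss briefly so repeated malicious queries stop at Redis
        r.set(key, NULL_PLACEHOLDER, ex=null_ttl)
        return None
    r.set(key, value, ex=ttl)
    return value
```

A short TTL on the placeholder keeps a key that is later inserted legitimately from being reported as missing for too long.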

2. Cache breakdown

Cache breakdown refers to queried data that is not in the Redis cache but does exist in the database. Typically the key has just expired while it is still being accessed under high concurrency.

For this cache breakdown problem, there are the following solutions:

  • 1. Preload hot data that is about to be requested into Redis in advance.
  • 2. Adjust expiration times dynamically: for data that will stay in demand for a long time, extend the TTL to several days or never let it expire.
  • 3. Use a lock: the first request that misses the cache reads the database, while other identical requests wait briefly; once the first request has cached the data in Redis, the others continue and read it from the cache (see the sketch after this list).
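A sketch of solution 3, building the lock from Redis SET with NX and EX, again reusing the hypothetical `r` client and `query_main_db()` helper from above; the lock-key format, token, and wait interval are illustrative assumptions:

```python
import time
import uuid

def get_with_mutex(key, ttl=300, lock_ttl=10):
    value = r.get(key)
    if value is not None:
        return value                             # cache hit
    lock_key = f"lock:{key}"
    token = str(uuid.uuid4())
    # SET NX EX doubles as a lock with a timeout: only one request rebuilds the cache
    if r.set(lock_key, token, nx=True, ex=lock_ttl):
        try:
            value = query_main_db(key)
            if value is not None:
                r.set(key, value, ex=ttl)        # repopulate the cache
            return value
        finally:
            if r.get(lock_key) == token:         # release only our own lock
                r.delete(lock_key)
    # all other requests wait briefly and retry the cache instead of hitting the DB
    time.sleep(0.05)
    return get_with_mutex(key, ttl, lock_ttl)
```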

3. Cache avalanche

Cache breakdown occurs when a single key expires. When many keys expire at the same time, all queries go straight to the database layer; the DB may not withstand such pressure and the system crashes. This is a cache avalanche.

For this cache avalanche problem, there are the following solutions:

  • 1. Build a multi-level cache architecture, such as an nginx cache + Redis cache + other caches.
  • 2. Spread out cache expiration times so that a large number of keys do not expire at the same moment (see the sketch after this list).
  • 3. Set up scheduled tasks that refresh essential cached data before it expires.
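A sketch of solution 2, adding random jitter to expiration times; the base TTL and jitter range are illustrative assumptions, and `r` is the hypothetical redis-py client from the earlier examples:

```python
import random

def set_with_jitter(key, value, base_ttl=3600, jitter=600):
    # spread expirations over [base_ttl, base_ttl + jitter) seconds so that
    # keys loaded together do not all expire at the same moment
    r.set(key, value, ex=base_ttl + random.randint(0, jitter - 1))
```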