I've been using the cacheables cache library over the past two days and found it quite good. After reading through its source code, I'd like to share a summary with you.

One, introduction

As its name suggests, cacheables is used for in-memory caching. Its code is only about 200 lines (excluding comments). The official introduction is as follows:

A simple in-memory cache that supports different caching strategies, written in TypeScript with an elegant syntax.

Its features:

  • Elegant syntax: wrap existing API calls and save on API requests;
  • Fully typed results: no type casting required;
  • Supports different caching strategies;
  • Integrated logging: inspect the timing of API calls;
  • A helper function for building cache keys;
  • Works in both browsers and Node.js;
  • No dependencies;
  • Extensively tested;
  • Tiny: 1.43 KB after gzip.

When we need to cache asynchronous tasks such as requests in our business code to avoid sending duplicate requests, we can use cacheables.

Two, hands-on experience

Getting started with cacheables is very simple; take a look at the following comparison:

// without caching
fetch("https://some-url.com/api");

// with caching
cache.cacheable(() => fetch("https://some-url.com/api"), "key");

Next, let's look at the request-caching usage example provided by the official site:

1. Install the dependency

npm install cacheables
# or
pnpm add cacheables

2. Usage example

import { Cacheables } from "cacheables";
const apiUrl = "http://localhost:3000/";

// Create a new cache instance  ①
const cache = new Cacheables({
  logTiming: true,
  log: true,
});

// Simulate an asynchronous task
const wait = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

// Wrap an existing API call, fetch(apiUrl), and assign it the key "weather"
// The example below uses the 'max-age' cache policy, which invalidates the cache after a period of time
// The method returns a Promise, just like fetch(apiUrl), and its result can be cached.
const getWeatherData = () =>
  // ②
  cache.cacheable(() => fetch(apiUrl), "weather", {
    cachePolicy: "max-age",
    maxAge: 5000,
  });

const start = async () => {
  // Fetch fresh data and add it to the cache
  const weatherData = await getWeatherData();

  // Wait 3 seconds before continuing
  await wait(3000);

  // maxAge is 5 seconds, so the cache has not expired yet; the cached data is returned
  const cachedWeatherData = await getWeatherData();

  // Wait another 3 seconds
  await wait(3000);

  // More than 5 seconds have passed since the data was cached, so it has expired; the newly requested data will be cached again
  const freshWeatherData = await getWeatherData();
};

start();

In the example code above, we implement a cached request service. If the cache has not outlived maxAge, the request is not re-sent; instead, the result is read from the cache and returned.

3. API introduction

The official documentation introduces many APIs; refer to the docs for details. The most commonly used one, cache.cacheable(), wraps a method so that its result is cached.
All APIs are as follows:

  • new Cacheables(options?): Cacheables
  • cache.cacheable(resource, key, options?): Promise<T>
  • cache.delete(key: string): void
  • cache.clear(): void
  • cache.keys(): string[]
  • cache.isCached(key: string): boolean
  • Cacheables.key(...args: (string | number)[]): string

The original article also includes a figure that summarizes these APIs.
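For reference, here is a minimal sketch that exercises several of the APIs listed above. It is only an illustration: the URL and the key parts passed to Cacheables.key are made up.

import { Cacheables } from "cacheables";

const cache = new Cacheables({ log: true });

// Build a cache key from several parts (see Cacheables.key above)
const key = Cacheables.key("weather", "london");

// Wrap an existing call; repeated calls within the policy window are served from the cache
const getCity = () =>
  cache.cacheable(() => fetch("https://example.com/api/london"), key, {
    cachePolicy: "max-age",
    maxAge: 10000,
  });

const demo = async () => {
  await getCity();
  console.log(cache.keys());        // the keys currently stored
  console.log(cache.isCached(key)); // true while the entry is still valid
  cache.delete(key);                // drop a single entry
  cache.clear();                    // drop everything
};

demo();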

Three, source code analysis

After cloning the cacheables project, you can see that the main logic lives in index.ts. With blank lines and comments removed, it is only about 200 lines of code and fairly easy to read.
Next, we'll follow the official example and use it as the main thread for reading the source code.

1. Create a cache instance

In step ① of the example, we first call new Cacheables(). The Cacheables class in the source code looks like the following; redundant code has been removed so we can first see the methods and capabilities the class provides:

export class Cacheables {
  constructor(options?: CacheOptions) {
    this.enabled = options?.enabled ?? true;
    this.log = options?.log ?? false;
    this.logTiming = options?.logTiming ?? false;
  }
  // Create a key from the provided arguments
  static key(): string {}

  // Delete a single cache entry
  delete(): void {}

  // Clear all cache entries
  clear(): void {}

  // Return whether the cache object for the given key exists and is still valid (i.e. not timed out)
  isCached(key: string): boolean {}

  // Return all cache keys
  keys(): string[] {}

  // Wrap a method call so that its result is cached
  async cacheable<T>(): Promise<T> {}
}

This gives a very intuitive view of the role of a Cacheables instance and the methods it supports. The original article illustrates it with a UML class diagram.

When we instantiate it in step ①, the Cacheables constructor stores the options we pass in. The options interface is defined as follows:

const cache = new Cacheables({
  logTiming: true,
  log: true,
});

export type CacheOptions = {
  // Enable/disable the cache
  enabled?: boolean;
  // Enable/disable cache-hit logging
  log?: boolean;
  // Enable/disable timing logs
  logTiming?: boolean;
};

From these options, we can see that this Cacheables instance has both cache logging and timing enabled.

2. Wrapping a method for caching

In step ②, we wrap the request in the cache.cacheable method, using max-age as the caching policy with a cache lifetime of 5000 milliseconds:

const getWeatherData = () =>
  cache.cacheable(() => fetch(apiUrl), "weather", {
    cachePolicy: "max-age",
    maxAge: 5000,
  });

The cacheable method of the Cacheables class is defined as follows (log-related code removed):

// Execute the cacheable wrapper
async cacheable<T>(
  resource: () => Promise<T>,  // a function that returns a Promise
  key: string,  // the cache key
  options?: CacheableOptions, // caching policy options
): Promise<T> {
  const shouldCache = this.enabled
  // If caching is not enabled, call the passed-in function directly and return its result
  if (!shouldCache) {
    return resource()
  }
  // ... logging code omitted
  const result = await this.#cacheable(resource, key, options) // the core call
  // ... logging code omitted
  return result
}

The cacheable method receives three parameters:

  • resource : the function to wrap, which returns a Promise, e.g. () => fetch() ;
  • key : the key used for caching;
  • options : configuration options for the caching policy.

It returns the result of the private method this.#cacheable, which is implemented as follows:

// Handle caching, e.g. storing the cache object
async #cacheable<T>(
  resource: () => Promise<T>,
  key: string,
  options?: CacheableOptions,
): Promise<T> {
  // First look up the cache object by key
  let cacheable = this.#cacheables[key] as Cacheable<T> | undefined
  // If no cache object exists under this key, instantiate a new one via Cacheable
  // and store it under the key
  if (!cacheable) {
    cacheable = new Cacheable()
    this.#cacheables[key] = cacheable
  }
  // Invoke the corresponding caching policy
  return await cacheable.touch(resource, options)
}

The private method this.#cacheable receives the same parameters as cacheable and returns the result of calling cacheable.touch.
If no cache object exists under the key, a new one is instantiated from the Cacheable class. The original article shows its UML class diagram.
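Since that diagram is not reproduced here, the rough shape of the Cacheable class, assembled from the snippets that follow, is sketched below (an outline for orientation, not the library's exact code):

// Rough outline of the per-key Cacheable object (a sketch, not the library's exact code)
class CacheableOutline<T> {
  #initialized = false;             // set to true after the first successful fetch
  #value: T | undefined;            // the cached result
  #promise: Promise<T> | undefined; // the in-flight request (guard variable)
  #lastFetch: number | undefined;   // timestamp of the last fetch

  // touch() receives the wrapped function and the policy options, then dispatches to one
  // private handler per policy: #handleCacheOnly, #handleNetworkOnly, #handleSwr, #handleMaxAge, ...
  async touch(
    resource: () => Promise<T>,
    options?: { cachePolicy?: string; maxAge?: number },
  ): Promise<T> {
    // Simplified body: always fetch (the real class picks a handler based on options.cachePolicy)
    this.#lastFetch = Date.now();
    this.#promise = resource();
    this.#value = await this.#promise;
    this.#initialized = true;
    this.#promise = undefined;
    return this.#value;
  }
}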

3. Dealing with caching strategies

The previous step ends with a call to cacheable.touch, which is defined as follows:

// Dispatch to the configured caching policy
async touch(
  resource: () => Promise<T>,
  options?: CacheableOptions,
): Promise<T> {
  if (!this.#initialized) {
    return this.#handlePreInit(resource, options)
  }
  if (!options) {
    return this.#handleCacheOnly()
  }
  // Pick the handler that matches the cachePolicy configured in the options
  switch (options.cachePolicy) {
    case 'cache-only':
      return this.#handleCacheOnly()
    case 'network-only':
      return this.#handleNetworkOnly(resource)
    case 'stale-while-revalidate':
      return this.#handleSwr(resource)
    case 'max-age': // the policy used in this example
      return this.#handleMaxAge(resource, options.maxAge)
    case 'network-only-non-concurrent':
      return this.#handleNetworkOnlyNonConcurrent(resource)
  }
}

The touch method receives two parameters, resource and options, passed through from the private #cacheable method.
This example uses the max-age caching policy, so let's look at the corresponding #handleMaxAge private method (the other handlers are similar):

// Handler for the max-age caching policy
#handleMaxAge(resource: () => Promise<T>, maxAge: number) {
  // #lastFetch is the time of the last fetch; it is recorded each time a fetch happens
  // If the current time is later than #lastFetch + maxAge, the wrapped function is called (non-concurrently)
  if (!this.#lastFetch || Date.now() > this.#lastFetch + maxAge) {
    return this.#fetchNonConcurrent(resource)
  }
  return this.#value // While the cache is still valid, return the previously cached result directly
}

When we execute getWeatherData() for the third time, 6 seconds have passed, which exceeds maxAge (5 seconds), so the cache is treated as expired and the request is sent again.

Next, look at #fetchNonConcurrent, which is used to send non-concurrent requests:

// Send a non-concurrent request
async #fetchNonConcurrent(resource: () => Promise<T>): Promise<T> {
  // In the non-concurrent case, if a request is already in flight, wait for it and return its result
  if (this.#isFetching(this.#promise)) {
    await this.#promise
    return this.#value
  }
  // Otherwise, execute the passed-in function directly
  return this.#fetch(resource)
}

The #fetchNonConcurrent private method accepts only the resource parameter, the function to be wrapped.

It first checks whether a request is currently in flight; if so, it awaits the existing this.#promise and returns the cached value. Otherwise it passes resource to #fetch for execution.

The #fetch private method is defined as follows:

// Execute the request
async #fetch(resource: () => Promise<T>): Promise<T> {
  this.#lastFetch = Date.now()
  this.#promise = resource() // set the guard variable, indicating a task is currently running
  this.#value = await this.#promise
  if (!this.#initialized) this.#initialized = true
  this.#promise = undefined  // execution finished, clear the guard variable
  return this.#value
}

The #fetch private method receives the function to be wrapped, assigns the guard variable at the start of execution, and clears it once the task completes.

This is also a pattern we often use in day-to-day business development: before sending a request, set a variable to indicate that a task is in flight so that duplicate requests are not sent; once the request finishes, clear the variable and let subsequent tasks continue. A standalone sketch of this pattern follows.
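As a minimal illustration of the idea, independent of cacheables (the function name and URL below are made up):

// Minimal request-deduplication helper built around a guard variable (hypothetical, not part of cacheables)
function dedupe<T>(task: () => Promise<T>): () => Promise<T> {
  let inFlight: Promise<T> | undefined; // guard variable: set while a task is running

  return async () => {
    // If a task is already running, reuse its promise instead of starting a new one
    if (inFlight) return inFlight;

    inFlight = task();
    try {
      return await inFlight;
    } finally {
      inFlight = undefined; // clear the guard once the task settles
    }
  };
}

// Usage: concurrent callers share a single underlying request
const loadUser = dedupe(() => fetch("https://example.com/api/user").then((r) => r.json()));
Promise.all([loadUser(), loadUser()]); // only one network request is sent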

With that, the walkthrough is complete; this is roughly how cacheables executes. Next, let's summarize a general cache-library design scheme that is easy to understand and extend.

Four, a general cache library design scheme

Cacheables supports five caching policies, of which only max-age was introduced above: cache-only, network-only, network-only-non-concurrent, max-age, and stale-while-revalidate (see the switch statement in touch).

Here is a rough summary of a general cache-library design scheme:

The cache library is instantiated with an options parameter; the options.key supplied by the user serves as the cache key, and the CachePolicyHandler object is consulted to obtain the caching policy (Cache Policy) the user specified.
The options.resource supplied by the user is then treated as the actual method to execute and is passed to the CachePolicyHandler to run.

As shown in the figure from the original article, we need to define the cache library's operations (such as methods for reading and writing the cache) as well as a handler for each caching policy.

A Logger can also be integrated to make the library easier to use and develop against. This article does not go into those details; the core point is the scheme itself, and a minimal sketch of it follows.
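The sketch below makes the scheme concrete under stated assumptions: all names (CachePolicy, CachePolicyHandler, CacheEntry, SimpleCache) are illustrative and not taken from cacheables, and only three policies are shown.

// Illustrative skeleton of the scheme described above (all names are made up)
type CachePolicy = "cache-only" | "network-only" | "max-age";

interface CacheEntry {
  value?: unknown;    // last cached result
  lastFetch?: number; // timestamp of the last fetch
}

// One handler per policy: given the cached entry and the wrapped resource,
// decide whether to return the cached value or hit the network
type CachePolicyHandler = (
  entry: CacheEntry,
  resource: () => Promise<unknown>,
  options: { maxAge?: number },
) => Promise<unknown>;

const policyHandlers: Record<CachePolicy, CachePolicyHandler> = {
  "cache-only": async (entry) => entry.value,
  "network-only": async (_entry, resource) => resource(),
  "max-age": async (entry, resource, { maxAge = 0 }) => {
    const expired = !entry.lastFetch || Date.now() > entry.lastFetch + maxAge;
    if (!expired) return entry.value;
    entry.value = await resource();
    entry.lastFetch = Date.now();
    return entry.value;
  },
};

class SimpleCache {
  #entries = new Map<string, CacheEntry>();

  // The user passes key, resource, and the desired policy via options, as described in the scheme above
  async cacheable<T>(options: {
    key: string;
    resource: () => Promise<T>;
    cachePolicy: CachePolicy;
    maxAge?: number;
  }): Promise<T> {
    const entry = this.#entries.get(options.key) ?? {};
    this.#entries.set(options.key, entry);
    const handler = policyHandlers[options.cachePolicy]; // look up the user-specified policy handler
    return handler(entry, options.resource, options) as Promise<T>;
  }
}

With this structure, supporting a new caching policy only means adding one more entry to policyHandlers, and a Logger can be injected in the same way.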

Five, summary

This article walked through the source code of the cacheables cache library. The logic is not complicated; the main work lies in supporting the various caching policies and their corresponding handling logic. At the end we summarized a general cache-library design scheme; if you are interested, try implementing it yourself, since the palest ink is better than the best memory.
The way of thinking matters most, and it can be applied in many scenarios; practice and summarize more in your actual business work.

Six, a few more thoughts

1. Thinking about reading source code

Everyone reads source code and discusses it, but how should you actually read it?
My personal suggestions:

  1. First decide which part of the source code you want to learn (e.g. Vue 2's reactivity principle, Vue 3's Ref, etc.);
  2. Write a simple demo for that part;
  3. Step through the demo with breakpoints to get a general understanding;
  4. Then open the source code and read it in detail, since the source often contains comments and examples.

If you just want to learn a certain library, read its README.md first, focusing on the introduction, features, usage, and examples. Grasp its characteristics and examples, then read the source code with those in mind.
I believe your thinking will be much clearer after reading it this way.

2. Thinking about interface-oriented programming

This library is written in TypeScript. Through each interface definition, we can clearly see the role of every class, method, and property; this is worth learning from.
When you receive a requirement, working like this will often improve your efficiency a lot:

  1. Functional analysis : analyze the entire requirement, understand the features and details to implement, and organize them with tools such as XMind, to avoid missing things, frequent rework, and a messy code structure.
  2. Functional design : once the requirement is sorted out, design each part, e.g. extract common utility methods.
  3. Functional implementation : if the first two steps are done well, I believe the implementation itself will not be difficult~

3. Thinking about possible optimizations of this library

The library's code is concentrated mainly in index.ts and reads well for now, but as the amount of code grows, I'm afraid the reading experience will get worse.
So my suggestions are:

  1. Split the code: move independent pieces of logic into separate files for maintenance. For example, each caching policy could live in its own file, developed through a unified convention (such as a Plugin mechanism), and then imported and re-exported from a single entry file.
  2. Make the Logger customizable so that users can substitute other logging tools for the built-in one, which is more decoupled. A plug-in architecture design can serve as a reference, making the library more flexible and extensible (a sketch follows below).
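For the second point, here is a hedged sketch of what a user-customizable Logger could look like. The interface and class names are made up for illustration and are not cacheables' actual API:

// Hypothetical pluggable logger: the cache depends only on this interface,
// so users can swap in console, pino, winston, or any other logging tool
interface CacheLogger {
  info(message: string): void;
  timing(label: string, ms: number): void;
}

const consoleLogger: CacheLogger = {
  info: (message) => console.log(`[cache] ${message}`),
  timing: (label, ms) => console.log(`[cache] ${label} took ${ms}ms`),
};

class CacheWithLogger {
  // The logger is injected rather than hard-coded, keeping the cache decoupled from any logging implementation
  constructor(private readonly logger: CacheLogger = consoleLogger) {}

  async cacheable<T>(resource: () => Promise<T>, key: string): Promise<T> {
    const start = Date.now();
    const result = await resource(); // caching logic omitted; the focus here is the injected logger
    this.logger.info(`fetched: ${key}`);
    this.logger.timing(key, Date.now() - start);
    return result;
  }
}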
