
Hi everyone, I'm Kasong.

The React source code uses a variety of algorithms and data structures to implement different modules (for example, the scheduler uses a min heap).

Today I'm going to talk about the LRU algorithm, which is related to data caching. The content covers four aspects:

  • An introduction to a React feature
  • The relationship between this feature and LRU
  • How the LRU algorithm works
  • The LRU implementation in React

You could say this goes from introduction all the way to implementation, so there is a lot of content. I recommend bookmarking it and digesting it slowly.

The starting point of everything: Suspense

React 16.6 introduced Suspense and React.lazy for splitting component code.

For the following code:

import A from './A';
import B from './B';

function App() {
  return (
    <div>
      <A/>
      <B/>
    </div>
  )
}

After bundling, the bundler generates:

  • chunk.js (including the A, B, and App component code)

If the B component is not needed for above-the-fold (first-screen) rendering, its code can be split out. Just make the following modification:

// before
import B from './B';
// after
const B = React.lazy(() => import('./B'));

After bundling, the bundler generates:

  • chunk.js (including the A and App component code)
  • b.js (including the B component code)

In this way, the code for B is requested via JSONP when the first screen renders, and B is rendered after the request returns.

To display a placeholder before the request for B returns, you need to use Suspense:

// before (remaining code omitted)
return (
  <div>
    <A/>
    <B/>
  </div>
)
// after (remaining code omitted)
return (
  <div>
    <A/>
    <Suspense fallback={<div>loading...</div>}>
      <B/>
    </Suspense>
  </div>
)

Before B's code returns, <div>loading...</div> will be rendered as a placeholder for B.

As you can see, the role of Suspense is:

Before the asynchronous content returns, display a placeholder (the fallback prop); after it returns, display the content

Observe the JSX structure returned by the component after adding Suspense, and you will find a very powerful detail:

return (
  <div>
    <A/>
    <Suspense fallback={<div>loading...</div>}>
      <B/>
    </Suspense>
  </div>
)

From this JSX alone, you cannot tell at all that component B is rendered asynchronously!

The difference between synchronous and asynchronous is:

  • Synchronous: start -> result
  • Asynchronous: start -> intermediate state -> result

Suspense converges the intermediate-state logic of the child components it wraps into itself (the fallback prop), so the child components do not need to distinguish between synchronous and asynchronous.

So, can Suspense's capability be extended from React.lazy (asynchronously requesting component code) to all asynchronous operations?

The answer is yes.

The power of resource

The React repository is a monorepo containing multiple packages (such as react and react-dom), among which is a caching library that works together with Suspense: react-cache. Let's look at what it can do.

Suppose we have a method fetchUser requests user data:

const fetchUser = (id) => {
  return fetch(`xxx/user/${id}`).then(
    res => res.json()
  )
};

After being wrapped by react-cache's createResource, it becomes a resource:

import {unstable_createResource as createResource} from 'react-cache';

const userResource = createResource(fetchUser);

Combining resource with Suspense, we can write asynchronous data-request logic in a synchronous style:

function User({ userID }) {
  const data = userResource.read(userID);
  
  return (
    <div>
      <p>name: {data.name}</p>
      <p>age: {data.age}</p>
    </div>
  )
}

As you can see, userResource.read is written in a completely synchronous style, even though fetchUser is asynchronous.

The logic behind it is:

  1. The first call to userResource.read(userID) creates a promise (the return value of fetchUser)
  2. The promise is thrown
  3. React internally catches the promise, and the closest ancestor Suspense component above the User component renders its fallback
  4. When the promise resolves, the User component re-renders
  5. This time, calling userResource.read(userID) returns the resolved value (the data requested by fetchUser), which is used to continue rendering

As steps 1 and 5 show, for a single request, userResource.read may be called twice:

  • The first call sends the request and returns a promise
  • The second call returns the requested data
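The two-call behavior above can be sketched with a minimal createResource. This is an illustrative sketch, not react-cache's actual implementation; the record shape and status strings are assumptions:

```javascript
// Minimal sketch of the throw-a-promise mechanism behind resource.read.
function createResource(fetchFn) {
  const cache = new Map(); // key -> record

  return {
    read(key) {
      if (!cache.has(key)) {
        const record = { status: 'pending', value: null };
        // Kick off the request; the handlers mutate the record in place.
        record.value = fetchFn(key).then(
          (data) => {
            record.status = 'resolved';
            record.value = data;
          },
          (error) => {
            record.status = 'rejected';
            record.value = error;
          }
        );
        cache.set(key, record);
      }
      const record = cache.get(key);
      if (record.status === 'pending') {
        throw record.value; // step 2: throw the promise; Suspense shows fallback
      }
      if (record.status === 'rejected') {
        throw record.value; // surface the request error
      }
      return record.value; // step 5: the resolved data
    },
  };
}
```

The first call throws the pending promise; after it resolves, the second call returns the data synchronously.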

So userResource needs to cache the promise internally, with userID as the cache key:

const data = userResource.read(userID);

Since userID is a prop of the User component, when the User component receives different userIDs, userResource needs to cache the promise corresponding to each userID.

If we switch between 100 userIDs, 100 promises will be cached. Clearly we need a cache-eviction algorithm, otherwise the cache will keep growing until it overflows.

The cache-eviction algorithm used by react-cache is the LRU algorithm.

LRU principle

The premise of the LRU (Least Recently Used) algorithm is:

If data has been accessed recently, it is more likely to be accessed again in the future

Therefore, the more recently data has been used, the higher its weight. When data needs to be evicted, the least recently used (lowest-weight) data is always evicted first.
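Before looking at react-cache's linked-list implementation, the LRU idea itself can be sketched with a JavaScript Map, which preserves insertion order. The LRUCache class and its method names here are assumptions for illustration, not part of react-cache:

```javascript
// Minimal LRU cache sketch: re-inserting a key moves it to the
// "most recently used" end of the Map; eviction removes the oldest key.
class LRUCache {
  constructor(limit) {
    this.limit = limit;
    this.map = new Map();
  }

  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    // Re-insert to mark this key as most recently used.
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }

  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.limit) {
      // Map iterates in insertion order, so the first key is least recently used.
      const oldest = this.map.keys().next().value;
      this.map.delete(oldest);
    }
  }
}
```

react-cache uses a linked list instead of this Map trick, but the eviction policy is the same.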

Implementation of LRU in react-cache

react-cache consists of two parts:

  • Data access
  • LRU algorithm implementation

Data access

Each resource created by createResource has a corresponding map, where:

  • the map's key is the key passed in when resource.read(key) is executed
  • the map's value is the promise returned after resource.read(key) executes

In our userResource example, a map is created when createResource executes:

const userResource = createResource(fetchUser);

When userResource.read executes for the first time, an item with userID as key and the promise as value (referred to as an entry) is added to the map:

const data = userResource.read(userID);

To retrieve an entry, two things are needed:

  • the key corresponding to the entry
  • the resource the entry belongs to

LRU algorithm implementation

react-cache uses a doubly linked circular list to implement the LRU algorithm, with three operations: insert, update, and delete.

Insert operation

When userResource.read(userID) executes for the first time, we get entry0 (abbreviated as n0), which forms a circular linked list with itself:

At this point, first (which points to the entry with the highest weight) points to n0.

After the userID prop changes, executing userResource.read(userID) produces entry1 (abbreviated as n1):

At this point, n0 and n1 form a circular linked list, and first points to n1.

If you insert n2 , it looks like this:

As you can see, whenever a new entry is added, first always points to it, reflecting the LRU idea that the newest entry always has the highest weight.
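The insert operation can be sketched as follows. The previous/next/first names mirror the description above, but the function itself is an illustrative assumption rather than react-cache's exact code:

```javascript
// Sketch of inserting an entry at the head of a doubly linked circular list.
let first = null;

function insert(value) {
  const entry = { value, previous: null, next: null };
  if (first === null) {
    // A single node links to itself, forming a one-element ring.
    entry.previous = entry;
    entry.next = entry;
  } else {
    // Splice the new entry in between the current last node and first.
    const last = first.previous;
    entry.next = first;
    entry.previous = last;
    last.next = entry;
    first.previous = entry;
  }
  first = entry; // the new entry always gets the highest weight
  return entry;
}
```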

Update operation

Whenever an entry is accessed, because it has just been used, its weight is updated to the highest.

For the following list of n0, n1, and n2, n2 has the highest weight (first points to it):

When n1 is accessed again, that is, when the following is called:

userResource.read(userID); // the userID corresponding to n1

n1 is then given the highest weight:
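A sketch of the update operation: the accessed entry is unlinked from its current position and re-linked at the head. makeRing and access are hypothetical helper names for illustration; react-cache's actual code differs:

```javascript
// Build a circular list from values, newest at the head (mirrors the diagrams).
function makeRing(values) {
  let first = null;
  for (const value of values) {
    const entry = { value, previous: null, next: null };
    if (first === null) {
      entry.previous = entry.next = entry;
    } else {
      const last = first.previous;
      entry.next = first;
      entry.previous = last;
      last.next = entry;
      first.previous = entry;
    }
    first = entry;
  }
  return { first };
}

// Accessing an entry promotes it to the head, giving it the highest weight.
function access(list, entry) {
  if (list.first !== entry) {
    // Unlink the entry from its current position.
    entry.previous.next = entry.next;
    entry.next.previous = entry.previous;
    // Re-link it just before the current head, then promote it to head.
    const last = list.first.previous;
    entry.next = list.first;
    entry.previous = last;
    last.next = entry;
    list.first.previous = entry;
    list.first = entry;
  }
}
```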

Delete operation

When the number of cached entries exceeds the configured limit, react-cache evicts the entries with lower weights.

For the following list of n0, n1, and n2, n2 has the highest weight (first points to it):

If the maximum cache size is 1 (that is, only one entry is cached), react-cache iteratively cleans up first.previous until only 1 entry remains.

That is, n0 is cleaned up first:

Then n1 is cleaned up:

After each cleanup, the corresponding entry is also deleted from the map.
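The eviction loop can be sketched like this; the { first, size, map } state shape and the cleanUp name are illustrative assumptions, not react-cache's actual internals:

```javascript
// Evict lowest-weight entries (at first.previous) until the limit is met,
// keeping the key -> entry map in sync with the circular list.
function cleanUp(state, limit) {
  while (state.size > limit && state.first !== null) {
    const last = state.first.previous; // least recently used entry
    if (last === state.first) {
      state.first = null; // only one entry left
    } else {
      // Unlink `last` from the circular list.
      last.previous.next = state.first;
      state.first.previous = last.previous;
    }
    state.map.delete(last.key); // also remove it from the map
    state.size--;
  }
}
```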

For complete LRU implementation, see react-cache LRU

Summary

Beyond React.lazy and react-cache, any asynchronous process can be converged into Suspense as long as you use your imagination, such as React Server Components and streaming SSR.

As React 18's underlying layer stabilizes at the end of the year, this synchronous style of development is expected to gradually become mainstream.

No matter how React changes, the underlying layer will always be these basic algorithms and data structures.

It's so plain and boring...

