Background of the project
After years of iteration, the front end of our system (a ToB single-page web application) has accumulated hundreds of thousands of lines of business code and 30+ routing modules; the overall code volume and complexity are high.
The whole project is based on Vue + TypeScript, and since the earliest version of the project was initialized with vue-cli, the build tool is naturally Webpack.
As the project grows, both a single cold start in the development stage (via `npm run serve`) and a production build (`npm run build`) take longer and longer.
Therefore, build optimization is something that needs to be done continuously as the project grows. In the early days, when the project is small, the effect of build optimization may not be obvious; but as the project grows and the build time creeps up, reducing the build time as much as possible becomes more and more important:
- Large projects are usually developed collaboratively by many people. The reduction in cold-start time for a single developer, multiplied by the number of people and days, adds up to a considerable amount of time saved over the years, which greatly improves development efficiency and the development experience.
- Faster release builds for large projects better guarantee the accuracy and timeliness of operations such as releases and rollbacks.
This article describes in detail how we optimized the overall packaging and build efficiency of our WMS FE project as it grew.
Bottleneck Analysis
To be more specific, our project was originally based on vue-cli 4, which in turn was based on webpack 4 at the time. Unless otherwise specified, the configurations below are based on webpack 4.
Before solving a problem, we need to analyze it. To optimize build speed, we must first analyze where Webpack spends its time when building and compiling our project.
Here, we use the SMP plugin to measure the time spent in each module. `speed-measure-webpack-plugin` is a plugin that measures Webpack packaging time. It can analyze not only the total packaging time but also the time spent in each loader at each stage, and it can write the data to a file for permanent storage.
// Install
npm install --save-dev speed-measure-webpack-plugin
// Usage
const SpeedMeasurePlugin = require("speed-measure-webpack-plugin");
const smp = new SpeedMeasurePlugin();
// wrap the whole webpack config (rather than pushing smp as a plugin)
module.exports = smp.wrap(config);
Time-consuming construction in development phase
For `npm run serve`, i.e. the development stage, without any cache, a single cold start of the entire project took an astonishing 4 minutes.
<img width="335" alt="image" src="https://user-images.githubusercontent.com/8554143/172987426-76757d80-7a9f-4a05-9aa0-d2ad0a5c3941.png">
Time-consuming production builds
For `npm run build`, i.e. the build for the actual online production environment, the overall time was:
<img width="332" alt="image" src="https://user-images.githubusercontent.com/8554143/172807307-d6b98ebd-c3dc-41d1-bc0f-1ef5ce2ebf68.png">
Optimizing build efficiency is therefore imperative. First of all, we need to be clear that optimization goes in two directions:
- Optimization of the development stage (`npm run serve`)
In the development stage, our core goal is to improve build speed as much as possible while keeping all of the project's functionality, to ensure development efficiency. Production-only features such as code obfuscation/minification and image compression are therefore not enabled, while hot module replacement is needed.
- Optimization of the production stage (`npm run build`)
In the production packaging stage, build speed still matters, but features that are dispensable during development, such as code minification and image compression, must be enabled. The goal of the production build is to keep the final bundle as small as possible and the required features as complete as possible, while maintaining a reasonably fast build.
The purpose of the two is different, so some build optimizations may only work in one of them.
Based on some of the above analysis, this paper will explore the optimization of construction efficiency from the following aspects:
- Some common traditional optimization methods based on Webpack
- Build in modules
- Vite-based build tool switching
- Build efficiency optimization based on the esbuild plugin
Why so slow?
So why does the build become slower and slower as the project grows?
It is not difficult to see from the two screenshots above that for a single-page application like ours, most of the build time is spent in the various loaders that compile JavaScript and CSS files.
This article will not describe Webpack's build internals in detail. Roughly speaking, Webpack spends most of its build time recursively traversing the module graph: starting from each entry file, it compiles the module, finds its dependencies, and recurses into each of them in turn. Every module goes through a String -> AST -> String transformation, with various loaders processing the source or executing JavaScript scripts along the way. Combined with NodeJS's single-threaded nature and the efficiency limits of the language itself, this is why Webpack has long been criticized for slow builds.
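To make the recursive nature of this process concrete, here is a toy sketch (not Webpack's real implementation) of how a bundler walks a dependency graph from an entry file. The `modules` map is invented for illustration; a real bundler would parse each file into an AST to find its `require`/`import` statements:

```javascript
// Toy illustration of how a bundler collects dependencies recursively.
// The `modules` map stands in for files on disk.
const modules = {
  'entry.js': ['a.js', 'b.js'],
  'a.js': ['c.js'],
  'b.js': ['c.js', 'd.js'],
  'c.js': [],
  'd.js': [],
};

function collectDeps(entry, graph, seen = new Set()) {
  if (seen.has(entry)) return seen; // each module is compiled only once
  seen.add(entry);
  for (const dep of graph[entry] || []) {
    collectDeps(dep, graph, seen); // recurse into dependencies
  }
  return seen;
}

const reached = collectDeps('entry.js', modules);
console.log([...reached]); // every module reachable from the entry
```

Every module reachable from the entry is visited exactly once; as the project grows, so does this traversal, which is where the build time goes.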
Therefore, based on the above Webpack construction process and some of the problems mentioned, the overall optimization direction becomes:
- cache
- Multi-process
- Addressing (module resolution) optimization
- Extraction and splitting
- Build tool replacement
Traditional optimization method based on Webpack
As mentioned above, most of the build time is spent recursively running various loaders over JavaScript and CSS, limited by NodeJS's single-threaded nature and the efficiency of the language itself.
If Webpack itself is not replaced, the execution efficiency of the language (NodeJS) cannot be improved, so we can only work on other angles.
Therefore, in the early days, what we did were some relatively conventional optimization methods. Here are the core ones:
- cache
- Multi-process
- addressing optimization
cache optimization
In fact, vue-cli 4 has some cache operations built in. For example, in the loader process shown in the figure above, `cache-loader` is already used, so we do not need to add it to the project again.
- `cache-loader`: add cache-loader before loaders with a high performance overhead to cache their results to disk
Is there any other cache we can use? We added `HardSourceWebpackPlugin`.
HardSourceWebpackPlugin
- `HardSourceWebpackPlugin`: provides an intermediate cache for modules. The default cache path is `node_modules/.cache/hard-source`. After configuring `HardSourceWebpackPlugin`, the first build time does not change much, but from the second start onwards the build is significantly faster.
First install the dependencies:
npm install hard-source-webpack-plugin -D
Modify vue.config.js
configuration file:
const HardSourceWebpackPlugin = require('hard-source-webpack-plugin');
module.exports = {
...
configureWebpack: (config) => {
// ...
config.plugins.push(new HardSourceWebpackPlugin());
},
...
}
With `HardSourceWebpackPlugin` configured, the first build time, as expected, did not change much, but the second build dropped from an average of about 4 minutes to about 20s, a dramatic improvement. Of course, this varies from project to project, but overall we have found across different projects that it greatly improves the efficiency of second and subsequent compilations during development.
Set cacheDirectory and DLL of babel-loader
In addition, our attempts at caching include:
- Set the cacheDirectory of babel-loader
- DLL
But the overall effect was not large; a brief summary follows.
Enabling babel-loader's cacheDirectory option makes the loader cache its results in the specified directory; subsequent Webpack builds try to read the cache to avoid the potentially expensive Babel recompilation on every run. For the concrete steps, see Webpack - babel-loader.
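As a sketch, enabling the cache is a one-line option on the loader (the rule name and shape here follow vue-cli's webpack-chain conventions and are illustrative):

```javascript
// vue.config.js — illustrative sketch of enabling babel-loader's cache
module.exports = {
  chainWebpack: (config) => {
    config.module
      .rule('js')
      .use('babel-loader')
      .tap((options) => ({
        ...options,
        // true uses node_modules/.cache/babel-loader by default;
        // a string sets a custom cache directory
        cacheDirectory: true,
      }));
  },
};
```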
So what is a DLL?
A DLL file is a dynamic link library, and a dynamic link library can contain functions and data that are called by other modules.
Why use DLL?
The reason is that a dynamic link library containing many reused modules only needs to be compiled once; in subsequent builds the modules it contains are not recompiled, and the code in the library is used directly.
Since most of the dynamic link libraries contain common third-party modules, such as Vue, React, and React-dom, as long as the versions of these modules are not upgraded, the dynamic link libraries do not need to be recompiled.
Configuring a DLL is very cumbersome and had little effect in the end. We used autodll-webpack-plugin in the process; try it yourself if you are interested. It is worth mentioning that vue-cli has removed this feature.
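For reference, a minimal manual DllPlugin setup looks roughly like this (entry names and paths are illustrative). A separate config builds the vendor bundle once, and the main build then references it via DllReferencePlugin:

```javascript
// webpack.dll.config.js — built once, e.g. `webpack --config webpack.dll.config.js`
const path = require('path');
const webpack = require('webpack');

module.exports = {
  entry: {
    // third-party modules that rarely change
    vendor: ['vue', 'vue-router'],
  },
  output: {
    path: path.resolve(__dirname, 'dll'),
    filename: '[name].dll.js',
    library: '[name]_library',
  },
  plugins: [
    new webpack.DllPlugin({
      name: '[name]_library',
      path: path.resolve(__dirname, 'dll/[name].manifest.json'),
    }),
  ],
};

// In the main webpack config, reference the prebuilt bundle:
// new webpack.DllReferencePlugin({
//   manifest: require('./dll/vendor.manifest.json'),
// })
```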
Multi-process
Because NodeJS is single-threaded, when multiple tasks exist at the same time they can only be queued and executed serially.
Most CPUs today are multi-core, so we can use tools that exploit multi-core concurrency and have multiple processes handle tasks at the same time.
As you can see from the figure above, Vue CLI 4 actually has `thread-loader` built in.
- `thread-loader`: put thread-loader before other loaders; the loaders placed after it will then run in a separate worker pool. The advantage is that tasks that would otherwise run serially are executed in parallel.
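If you need to add it manually (outside Vue CLI's defaults), the configuration is roughly as follows; the `warmup` call is optional and the worker count is illustrative:

```javascript
// webpack config sketch: run babel-loader in a worker pool
const threadLoader = require('thread-loader');

// optional: boot workers up front so the first build doesn't pay the startup cost
threadLoader.warmup({}, ['babel-loader']);

module.exports = {
  module: {
    rules: [
      {
        test: /\.js$/,
        use: [
          {
            loader: 'thread-loader',
            options: { workers: 4 }, // illustrative; defaults based on CPU count
          },
          'babel-loader', // loaders after thread-loader run in the pool
        ],
      },
    ],
  },
};
```

Note that each worker is a separate Node process with communication overhead, so this only pays off when the wrapped loaders are genuinely expensive.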
So, besides `thread-loader`, what other options are there?
HappyPack
HappyPack is similar to `thread-loader`. It uses multiple processes to package files, splitting the work into sub-processes that run in parallel and send their results back to the main process, achieving parallel packaging. Not all loaders are supported by HappyPack; vue-loader, for example, is not.
Supported loaders can be checked in the Loader Compatibility List. Note that communication between the child processes and the main process has overhead: add HappyPack when your loaders are genuinely slow, otherwise compilation may actually get slower.
Of course, since HappyPack's author has gradually lost interest in the project and it is barely maintained, `thread-loader` is the recommended choice from webpack 4 onwards. We therefore draw no practical conclusions about HappyPack here.
The last HappyPack update was 3 years ago
addressing optimization
For addressing (module resolution) optimization, the overall improvement is not that great.
Its core lies in setting the loader's `exclude` and `include` options reasonably.
- Configure the loader's exclude option to tell it to ignore certain directories
- Configure the loader's include option to make it process only the specified directories; the fewer files a loader touches, the faster it runs
This is certainly a useful optimization, but for some large projects, such optimizations won't be particularly noticeable on the overall build time.
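A minimal sketch (paths are illustrative):

```javascript
// webpack config sketch: narrow the set of files each loader touches
const path = require('path');

module.exports = {
  module: {
    rules: [
      {
        test: /\.js$/,
        // only transpile our own source...
        include: path.resolve(__dirname, 'src'),
        // ...and never the installed dependencies
        exclude: /node_modules/,
        use: ['babel-loader'],
      },
    ],
  },
};
```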
Build in modules
After the general optimizations above were done, the overall effect was still not obvious, so we began to think in other directions.
Let's take a look at the overall process of Webpack build:
The picture above shows the general Webpack build process; briefly:
- entry-option: read the webpack configuration and call new Compiler(config) to prepare for compilation
- run: start compiling
- make: analyze dependencies from the entry point and build dependent modules
- before-resolve: resolve the module's location
- build-module: start building the module
- normal-module-loader: Generate AST tree
- program: Traverse the AST tree and collect dependencies when encountering a require statement
- seal: build complete and start optimization
- emit: output dist directory
As the project grows, most of the time is consumed in step 7: recursively traversing the AST and resolving each require, until the entire project has been traversed.
Interestingly, a single development session most likely touches only a small module of the whole project.
Therefore, if we can skip the modules we don't need while collecting dependencies, that is, build only the necessary modules, the overall build time can be greatly reduced.
That's what we're going to do -- build in modules .
What does that mean? Suppose our project has 6 large routing modules: A, B, C, D, E, and F. When a new requirement only touches module A, then when starting the project in the development stage we can skip modules B, C, D, E, and F and build only module A:
Assuming that the average construction time of each module is 3s, the overall cold-start construction time of 18s can be reduced to 3s .
The principle of sub-module construction and packaging
Webpack compiles and packages statically: when collecting dependencies, it analyzes the require statements in the code (import is compiled to require by Babel) and then recursively collects dependencies for the build.
What we have to do is lightly transform our existing code, adding some configuration so that Webpack skips the modules we don't need when it traverses the routing modules to collect dependencies at startup.
To be more specific, let's assume that our routing code is roughly as follows:
import Vue from 'vue';
import VueRouter, { Route } from 'vue-router';
// 1. Define route components.
// Simplified here; in a real project these are large routing modules imported from other files
const moduleA = { template: '<div>AAAA</div>' }
const moduleB = { template: '<div>BBBB</div>' }
const moduleC = { template: '<div>CCCC</div>' }
const moduleD = { template: '<div>DDDD</div>' }
const moduleE = { template: '<div>EEEE</div>' }
const moduleF = { template: '<div>FFFF</div>' }
// 2. Define some routes
// Each route needs to map to a component.
// Nested routes are discussed later.
const routesConfig = [
{ path: '/A', component: moduleA },
{ path: '/B', component: moduleB },
{ path: '/C', component: moduleC },
{ path: '/D', component: moduleD },
{ path: '/E', component: moduleE },
{ path: '/F', component: moduleF }
]
const router = new VueRouter({
mode: 'history',
routes: routesConfig,
});
// Make the router take effect ...
new Vue({ router }).$mount('#app');
What we need to do is run a pre-script on the command line each time we start the project to collect the modules that need to be started this time, and generate the required `routesConfig` on demand.
We tried:
- IgnorePlugin plugin
- webpack-virtual-modules with require.context
- NormalModuleReplacementPlugin plugin for file replacement
In the end, we chose the `NormalModuleReplacementPlugin` file-replacement approach, because it is the least intrusive to the project: it only needs a pre-script and a Webpack configuration change, without touching any routing code. In summary, the two advantages of this scheme are:
- No need to change the upper-level code
- By generating a temporary routing file, the original routing file is replaced without any impact on the project
Use NormalModuleReplacementPlugin to generate a new routing configuration file
Using the `NormalModuleReplacementPlugin` plugin, we can generate a new routing configuration file from the selection during the compilation phase, without modifying the original routing configuration file. The advantage is that it is not intrusive to the source code.
The role of the NormalModuleReplacementPlugin plugin is to replace the content of the target source file with our own content.
We simply modify the Webpack configuration: in the development environment, use this plugin to replace the original `config.ts` file with another file. The code is as follows:
// vue.config.js
if (process.env.NODE_ENV === 'development') {
config.plugins.push(new webpack.NormalModuleReplacementPlugin(
/src\/router\/config.ts/,
'../../dev.routerConfig.ts'
)
)
}
The code above replaces the real `config.ts` with a custom-generated `dev.routerConfig.ts` file. How is the content of `dev.routerConfig.ts` generated? With the help of inquirer and the EJS template engine: an interactive command-line prompt lets the developer select the required modules, and a new `dev.routerConfig.ts` is generated dynamically from the selection. The code is shown below.
Modify the startup script so that our pre-script runs before `vue-cli-service serve`:
{
// ...
"scripts": {
- "dev": "vue-cli-service serve",
+ "dev": "node ./script/dev-server.js && vue-cli-service serve",
},
// ...
}
What `dev-server.js` needs to do is implement an interactive command line through `inquirer`, let the user select the list of modules to start this time, and generate a new `dev.routerConfig.ts` file through `ejs`.
// dev-server.js
const ejs = require('ejs');
const fs = require('fs');
const inquirer = require('inquirer');
const path = require('path');

// All modules available in the actual business
const moduleConfig = [
  'moduleA',
  'moduleB',
  'moduleC',
  // ...
];

// The selected modules; home is always included
const chooseModules = [
  'home'
];

// Map a module name like "moduleA" to its route file path, e.g. ".../modules/module-a"
function deelRouteName(name) {
  const index = name.search(/[A-Z]/g);
  const preRoute = '' + path.resolve(__dirname, '../src/router/modules/') + '/';
  if (![0, -1].includes(index)) {
    return preRoute + (name.slice(0, index) + '-' + name.slice(index)).toLowerCase();
  }
  return preRoute + name.toLowerCase();
}

// Render the EJS template and write the generated routing config
const getContenTemplate = async () => {
  const html = await ejs.renderFile(path.resolve(__dirname, 'router.config.template.ejs'), { chooseModules, deelRouteName }, { async: true });
  fs.writeFileSync(path.resolve(__dirname, '../dev.routerConfig.ts'), html);
};

function runDEV() {
  getContenTemplate();
}

function promptModule() {
  inquirer.prompt({
    type: 'checkbox',
    name: 'modules',
    message: 'Select the modules to start: arrow keys to move, space to select (multiple allowed), enter to run. Note: pressing enter with nothing selected compiles everything, which is slow.',
    pageSize: 15,
    choices: moduleConfig.map((item) => {
      return {
        name: item,
        value: item,
      };
    })
  }).then((answers) => {
    if (answers.modules.length === 0) {
      chooseModules.push(...moduleConfig);
    } else {
      chooseModules.push(...answers.modules);
    }
    runDEV();
  });
}

function init() {
  // Modules can also be passed as CLI arguments, e.g. `node dev-server.js moduleA`
  let entryDir = process.argv.slice(2);
  entryDir = [...new Set(entryDir)];
  if (entryDir && entryDir.length > 0) {
    for (const item of entryDir) {
      if (moduleConfig.includes(item)) {
        chooseModules.push(item);
      }
    }
    console.log('output: ', chooseModules);
    runDEV();
  } else {
    promptModule();
  }
}

init();
A simple illustration of the template code:
// Template sketch: router.config.template.ejs
import { RouteConfig } from 'vue-router';
<% chooseModules.forEach(function(item){%>
import <%=item %> from '<%=deelRouteName(item) %>';
<% }) %>
let routesConfig: Array<RouteConfig> = [];
/* eslint-disable */
routesConfig = [
<% chooseModules.forEach(function(item){%>
<%=item %>,
<% }) %>
]
export default routesConfig;
The core of `dev-server.js` is to start an inquirer interactive command line and let the user select the modules to build, similar to this:
`router.config.template.ejs` is the EJS template file, and `chooseModules` is the array of modules the user selected in the terminal. From this list we generate a new `routesConfig` file.
In this way, we achieved per-module builds and on-demand dependency collection. Taking our project as an example, with about 20 modules and hundreds of thousands of lines of code:
Build scope | Time consumed |
---|---|
Cold start, full build of 20 modules | 4.5 min |
Cold start, build of 1 module only | 18s |
Second build of 1 module with cache | 4.5s |
The actual effect is roughly as follows: instead of starting all modules, we only start the modules selected for the current development session:
In this way, when a development session only involves a fixed module, the cold-start time of the project drops from the original 4 min+ to about 18s, and a second build of a module with a warm cache takes only 4.5s — a considerable improvement.
Limited by the performance of the language Webpack is built on, pursuing faster builds inevitably means looking at other build tools. Here, we focus on Vite and esbuild.
Development-time build optimization with Vite
Vite is a development server based on native browser ES modules. It lets the browser resolve imports, compiling and returning files on demand on the server side, completely skipping the bundling step, so the server is ready almost instantly. It supports Vue files as well as hot updates, and hot update speed does not slow down as the number of modules increases.
Of course, due to the limitations of Vite's own characteristics, it is currently only suitable for replacing Webpack in the development phase.
We all know that Vite is very fast, where is it mainly fast?
- Faster cold start of projects
- Hot update faster
So what makes it so fast?
The difference between Webpack and Vite cold start
Let's first look at how Webpack and Vite differ during builds. The figure below shows Webpack's recursive dependency-collection process:
As mentioned above, when Webpack starts, it begins at the entry file, invokes all configured loaders to compile each module, then finds the modules it depends on, and recurses until every file the entry depends on has been processed.
This process is very time-consuming. Now look at Vite:
Vite improves development server startup time by initially separating the modules in an application into dependencies and source. The core of its speed lies in two points:
Pre-bundle dependencies with esbuild (written in Go): Vite pre-bundles dependencies using esbuild. esbuild is written in Go and pre-bundles dependencies 10-100x faster than bundlers written in JavaScript. What does dependency pre-bundling mainly do?
- During development, Vite's dev server treats all code as native ES modules, so Vite must first convert dependencies published as CommonJS or UMD to ESM
- Vite converts ESM dependencies that have many internal modules into a single module to improve subsequent page-load performance. Without this, each dependency package may contain multiple other dependencies, and each imported dependency would require another request; the more requests, the more time is spent
- Compile and return on demand: Vite serves source code as native ESM. This effectively lets the browser take over part of the bundler's work: Vite only transforms source files when the browser requests them, serving them on demand. Code is imported dynamically according to context, i.e. processed only when actually used on the current screen.
The difference between Webpack and Vite hot update
Another great advantage of using Vite is that its hot update is also very fast.
Let's first take a look at Webpack's hot update mechanism:
Some noun explanations:
- `Webpack-compiler`: Webpack's compiler, which compiles JavaScript into a bundle (the final output file)
- `HMR Server`: outputs the hot-updated files to the HMR Runtime
- `Bundle Server`: serves files to the browser, which is why we can access our local site via localhost
- `HMR Runtime`: when hot update is enabled, it is injected into the browser's bundle.js during packaging, so that bundle.js can establish a connection with the server, usually via WebSocket; on receiving an update instruction, it applies the file's changes
- `bundle.js`: the build output file
The general principle of Webpack hot update: files are compiled by the Webpack compiler and passed to the HMR Server; the HMR Server knows which resource (module) changed and notifies the HMR Runtime, which updates our code. The browser is updated without a refresh.
The main time cost of Webpack's hot update mechanism is that it rebuilds with the modified file as the entry, so all the dependencies involved are also reloaded.
And Vite claims that the speed of hot update will not slow down as the number of modules increases . Where is its main optimization point?
Vite implements hot update similarly to Webpack: it creates a WebSocket between browser and server, watches for file changes, and sends messages to the client, which applies different operations for different file types.
Vite watches the file system through `chokidar` and only needs to reload the changed modules, precisely invalidating the chain between the changed module and its nearest HMR boundary. HMR update speed therefore does not degrade as the application grows, whereas Webpack has to go through a bundled rebuild. In HMR scenarios, Vite performs better than Webpack.
Different messages trigger different events, giving real-time hot module replacement on the browser side. Updates are fine-grained per event (currently for Vue and JS files; for Vue files this covers changes to template, script, and style), so only the necessary files are updated rather than everything. The events are:
- connected: WebSocket connection is successful
- vue-reload: Vue component reload (when the content in the script is modified)
- vue-rerender: Vue components re-render (when the content in the template is modified)
- style-update: style update
- style-remove: style removal
- js-update: js file update
- full-reload: fallback mechanism, web page refresh
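As a rough sketch of the client side, the runtime is essentially a WebSocket message dispatcher. The handler names and shape below are invented for illustration and are not Vite's actual implementation:

```javascript
// Toy sketch of an HMR client: dispatch server messages to update handlers.
// Handler bodies are stubbed; a real client would swap modules in place.
function createHmrClient(handlers) {
  return function handleMessage(rawMessage) {
    const msg = JSON.parse(rawMessage);
    const handler = handlers[msg.type];
    if (handler) return handler(msg);
    // unknown event: fall back to a full page reload
    return handlers['full-reload'](msg);
  };
}

// Example wiring: record which action each event triggers
const actions = [];
const handle = createHmrClient({
  'vue-rerender': (m) => actions.push(`rerender ${m.path}`),
  'style-update': (m) => actions.push(`restyle ${m.path}`),
  'full-reload': () => actions.push('reload page'),
});

handle(JSON.stringify({ type: 'vue-rerender', path: '/src/App.vue' }));
handle(JSON.stringify({ type: 'style-update', path: '/src/app.css' }));
handle(JSON.stringify({ type: 'unknown-event' }));
console.log(actions);
```

The key design point is that each message names exactly one changed module, so the work done per edit stays constant regardless of application size.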
This article will not go deeper into Vite's principles. If you are interested, see the official documentation: Vite Official Documentation - Why Vite.
Migrating to Vite means replacing Webpack in the development stage. Below are some of the problems we encountered during the switch.
For a Vue 2 project based on vue-cli 4, the migration roughly requires:
- Install Vite
- Configure index.html (Vite parses the <script type="module" src="..."> tag to point to the source code)
- configure vite.config.js
- Add the startup command `"vite": "vite"` to `scripts` in `package.json`
When you run `npm run vite`, Vite automatically resolves the file named `vite.config.js` in the project root and reads its configuration. The configuration of `vite.config.js` is relatively simple overall:
- Vite provides built-in support for .scss, .sass, .less, and .stylus files
- Natural support for TS, out of the box
- Support for a Vue 2 project varies; different projects may hit different problems, which you can debug one by one from the error messages. For example, some official plugins are needed for `.tsx` / `.jsx` compatibility
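A minimal `vite.config.js` for a Vue 2 project might look like the sketch below; we use the community vite-plugin-vue2 here as an illustration, and the alias keeps vue-cli's `@` -> `src` convention:

```javascript
// vite.config.js — minimal sketch for a Vue 2 project
const path = require('path');
const { createVuePlugin } = require('vite-plugin-vue2');

module.exports = {
  plugins: [
    createVuePlugin({ jsx: true }), // .vue SFC support, with JSX enabled
  ],
  resolve: {
    alias: {
      '@': path.resolve(__dirname, 'src'), // keep vue-cli's @ -> src mapping
    },
  },
  server: {
    port: 8080,
  },
};
```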
Of course, for the source code of the project, some modifications may be required. Here are some minor problems we encountered:
- A compilation problem caused by decorators in tsx; we patched `@vitejs/plugin-vue-jsx` to support JSX under Vue 2
- Since Vite only supports ESM syntax, module imports in the code had to be changed from `require` to `import`
- The Sass preprocessor cannot correctly parse `/deep/` in styles; use `::v-deep` instead
- Other minor issues, such as compatibility of Webpack environment variables and of SVG icons
For the places where the source code had to be modified, our approach was to adapt to Vite while ensuring the changes would not affect the original Webpack build, so that we could switch back to Webpack at critical moments or in later iterations.
After solving the problems above, we successfully migrated the development-time build from Webpack to Vite, and the effect is amazing: a full module build takes only 2.6s.
So far, the development-stage build time has been optimized from the original 4.5 min to 2.6s:
Build scope | Time consumed |
---|---|
Webpack cold start, full build of 20 modules | 4.5 min |
Webpack cold start, build of 1 module only | 18s |
Webpack second build of 1 module with cache | 4.5s |
Vite cold start | 2.6s |
Optimize production builds
We have now basically completed build optimization for the development stage. The next step is to optimize the production build.
Our production release is a full CI/CD flow based on GitLab and Jenkins.
Before optimizing, take a look at the time taken to publish our entire project online:
As you can see, the production build takes a long time: the build itself averages about 9 minutes, and the overall release takes about 15 minutes. The process is too long and inefficient, which seriously affects testing and rollback.
OK, let's see what the entire build process needs to do:
Among them, there is a large room for optimization in the Build base and Build Region stages.
The Build base phase involves environment preparation, image pulling, and dependency installation; there is not much room for the front end to act here, so this part was optimized together with the SRE team, mainly by adding caching, attaching a file system, and baking dependencies into the container image.
Our optimization mainly focuses on the Build Region stage, that is, reducing the time of `npm run build`.
The time analysis of `npm run build` was posted at the beginning of the article; here it is again:
<img width="332" alt="image" src="https://user-images.githubusercontent.com/8554143/172807307-d6b98ebd-c3dc-41d1-bc0f-1ef5ce2ebf68.png">
In general, code compilation time is positively correlated with code size .
Based on past optimization experience, static code checking can take a lot of time, so we focused on `eslint-loader`.
In the production build phase, eslint output is of little value, so we considered removing it from the build and moving the check to an earlier step.
At the same time, we learned that time-consuming loaders such as babel-loader and ts-loader can be replaced with the esbuild-loader plugin.
Therefore, our overall optimization direction is:
- Rewrite the packaging script and introduce the esbuild plugin
- Optimize the architecture logic and reduce unnecessary checks in the build phase
Process comparison before and after optimization:
Optimize the architecture logic and reduce unnecessary checks in the build phase
As mentioned above, this is easy to understand: in the production build stage, ESLint output provides little value, so we remove it from the build and move the check to an earlier step. There are two ways to do this: run the check at `git commit` time using `lint-staged` and a git hook, or add a dedicated pipeline job for static checks when a merge request is created in CI.
We do both. Here is the GitLab CI configuration:
```yaml
# .gitlab-ci.yml
stages:
  - eslint

eslint-job:
  image: node:14.13.0
  stage: eslint
  script:
    - npm run lint
    - echo 'eslint success'
  retry: 1
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event" && $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "test"'
```
The `.gitlab-ci.yml` file specifies exactly when the lint command runs as a pre-step: here, on merge requests targeting the `test` branch.
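For the `git commit` path, a typical setup uses `husky` and `lint-staged` in `package.json`. This is a sketch in the husky v4 configuration style; the exact configuration in our repository may differ:

```json
{
  "husky": {
    "hooks": {
      "pre-commit": "lint-staged"
    }
  },
  "lint-staged": {
    "*.{js,ts,vue}": "eslint --fix"
  }
}
```

With this in place, only the staged files are linted on each commit, so the check stays fast even in a large codebase.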
Rewrite the packaging script and introduce the esbuild plugin
Here, we mainly use esbuild-loader. We already mentioned esbuild above: Vite uses it for dependency pre-bundling. esbuild-loader wraps esbuild's capabilities into a Webpack loader to compile JavaScript, TypeScript, CSS and other resources, and also provides a faster minification scheme.
It is also very easy to integrate. Our project is based on Vue CLI, so the main change is in `vue.config.js`:
```js
// vue.config.js
const { ESBuildMinifyPlugin } = require('esbuild-loader');

module.exports = {
  // ...
  chainWebpack: (config) => {
    // Compile JS files with esbuild
    const rule = config.module.rule('js');
    // Remove the default babel-loader
    rule.uses.clear();
    // Add esbuild-loader
    rule
      .use('esbuild-loader')
      .loader('esbuild-loader')
      .options({
        // 'ts' is required when using TS or Vue class decorators,
        // otherwise esbuild throws: ERROR: Unexpected "@"
        loader: 'ts',
        target: 'es2015',
        tsconfigRaw: require('./tsconfig.json')
      });
    // Remove the underlying terser and use ESBuildMinifyPlugin instead
    config.optimization.minimizers.delete('terser');
    // Minify CSS with esbuild as well
    config.optimization
      .minimizer('esbuild')
      .use(ESBuildMinifyPlugin, [{ minify: true, css: true }]);
  }
};
```
After removing ESLint and adding esbuild-loader, a single local build dropped to about 90 seconds:
| Stage | Time |
| --- | --- |
| Before optimization | 200s |
| Remove ESLint, add esbuild-loader | 90s |
Looking at the online Jenkins build time, there is also a very obvious improvement:
The evolution and follow-up planning of front-end engineering
Overall, after the above optimizations, the packaging and build efficiency of the entire project has improved greatly, but this is not yet optimal.
Take a look at the production build time of a sibling team's project:
With a similar project size, their production build (`npm run build`) takes just over 2 minutes, for two reasons:
- Their project is React + TSX, while the project optimized here is Vue, which requires an extra layer of file processing through `vue-loader`;
- Their project uses a micro-frontend architecture and is split properly: the main application only loads base-related code, and sub-applications are built separately, which greatly reduces the code each build has to process. This is the main reason.
There are still many directions we can try in the future. Some of the things we are already working on:
- Split the project along micro-frontend lines, extracting relatively independent modules for independent deployment
- Improve the Build Base stage of the Jenkins build, for example by moving parts of the build process earlier, combining a CDN for fast rollback, and pre-installing dependencies into the Docker image to cut the `npm install` time inside the container on every build
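Pre-installing dependencies into the image could look like the following multi-stage Dockerfile. This is a sketch with assumed paths and tags, not our actual Jenkins setup:

```dockerfile
# Dependency layer: rebuilt only when package.json / package-lock.json change,
# so most builds reuse the cached node_modules instead of running npm install.
FROM node:14.13.0 AS deps
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm ci

# Build layer: copies the source on top of the pre-installed dependencies.
FROM deps AS build
COPY . .
RUN npm run build
```

Because Docker caches layers, the `npm ci` step is skipped entirely when the lockfile is unchanged, which removes the per-build dependency installation cost.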
At the same time, front-end technology changes rapidly, and the variety of build tools can be dizzying. The front end has moved from its earliest primitive days toward engineering, and on to today's pan-front-end engineering with its standards, specifications, and efficiency tools. Build optimization may always be a work in progress. There is no universal best practice here, only the best practice for our project, and we need to keep exploring and experimenting.
Finally
This concludes this article, I hope it helps you :)
If you have any questions or suggestions, feel free to reach out. My writing and knowledge are limited, so if there are any inaccuracies in the article, please let me know.