
Preface

esbuild is a new-generation JavaScript bundler.

Its author is Evan Wallace, the CTO of Figma.

esbuild is known for its speed: it takes only 2% to 3% of webpack's build time.

esbuild's main goals are to open up a new era of build-tool performance and to create an easy-to-use modern bundler.

Its main functions:

  • Extreme speed without needing a cache
  • ES6 and CommonJS modules
  • Tree shaking of ES6 modules
  • An API for JavaScript and Go
  • TypeScript and JSX syntax
  • Source maps
  • Minification
  • Plugins

Many tools now have it built in, such as ones we are familiar with:

  • vite
  • snowpack

Backed by esbuild's excellent performance, vite is even more powerful and ready to take off.

Today we will explore: why esbuild is so fast.

Today's main content:

  • several sets of performance data comparison
  • why esbuild is so fast
  • esbuild upcoming roadmap
  • esbuild in vite
  • Why does the production environment still need to be packaged?
  • Why is Vite not packaged with esbuild?
  • summary


First look at a set of comparisons:

Bundle 10 copies of the three.js library for production to compare the bundling speed of different tools under their default configurations.

webpack 5 comes in last, taking 55.25 seconds.

esbuild takes only 0.37 seconds.

The difference is huge.

There are more comparisons:

https://twitter.com/evanwallace/status/1314121407903617025

webpack 5 must be hurting: it still can't beat webpack 4?

...

Why is esbuild so fast?

There are several reasons.

(To ensure accuracy, the following content is translated from the esbuild official website.)

1. It is written in the Go language and compiles to native code.

Most bundlers are written in JavaScript, but command-line applications are a worst-case performance scenario for a JIT-compiled language.

Every time you run a bundler, the JavaScript VM sees the bundler's code cold, without any optimization hints.

While esbuild is busy parsing your JavaScript, node is still busy parsing the bundler's own JavaScript.

By the time node has finished parsing the bundler's code, esbuild may have already exited, while your bundler hasn't even started bundling yet.

In addition, Go is designed from the core for parallelism, while JavaScript is not.

Go can share memory between threads, while JavaScript has to serialize data to pass it between threads.

Both Go and JavaScript have parallel garbage collectors, but Go's heap is shared between all threads, while JavaScript has a separate heap per JavaScript thread.

According to tests, this seems to cut the parallelism of JavaScript worker threads by half, probably because half of the CPU cores are busy collecting garbage for the other half.

2. Extensive use of parallel operations.

The algorithms inside esbuild have been carefully designed to make full use of CPU resources.

It is roughly divided into three stages:

  1. parsing
  2. linking
  3. code generation

Parsing and code generation make up most of the work and are fully parallelizable (linking is an inherently serial task for the most part).

Since all threads share memory, work can easily be shared when bundling different entry points that import the same JavaScript libraries.

Most modern computers have multi-cores, so parallelism is a huge win.

3. Everything is written from scratch; no third-party dependencies are used.

Writing everything yourself, instead of using third-party libraries, can bring many performance advantages.

You can keep performance in mind from the beginning, you can ensure that everything uses a consistent data structure to avoid costly conversions, and you can make extensive architectural changes when necessary. The disadvantage is of course a lot of work.

For example, many bundlers use the official TypeScript compiler as their parser.

But it was built to serve the goals of the TypeScript compiler team, and performance is not their top priority.

4. Efficient use of memory.

Ideally, a compiler runs in O(n) time in the length of the input.

If you are dealing with large amounts of data, the memory access speed may severely affect performance.

The fewer passes over the data (and the fewer intermediate representations the data must be converted into), the faster the compiler will be.

For example, esbuild touches the entire JavaScript AST only 3 times:

  1. Lexing, parsing, scope setup, and symbol declaration
  2. Symbol binding and syntax lowering, e.g. converting JSX/TS to JS and ESNext to ES5
  3. Identifier minification, whitespace minification, and code generation

Reuse of AST data is maximized while that data is still hot in the CPU cache.

Other bundlers perform these steps in separate passes instead of interleaving them.

They may also convert between data representations to glue multiple libraries together (for example: string → TS → JS → string, then string → JS → older JS → string, then string → JS → minified JS → string).

This will take up more memory and will slow down the speed.

Another benefit of Go is that it can store content compactly in memory, allowing it to use less memory and hold more content in the CPU cache.

Object fields are packed tightly together in memory; for example, several boolean flags occupy only one byte each.

Go also has value semantics: one object can be embedded directly inside another, so the embedding comes "for free" without an additional allocation.

JavaScript lacks these features and has other drawbacks as well, such as JIT overhead (e.g. hidden class slots) and inefficient representations (e.g. non-integer numbers are heap-allocated behind pointers).

Each of the above factors can improve compilation speed to a certain extent.

When they work together, the result is orders of magnitude faster than the other bundlers in common use today.

The above is fairly dense, so here is a brief summary some readers have made:

  • It is written in Go, which compiles to native code and executes very fast. Roughly speaking, where JS operates in milliseconds, Go operates in nanoseconds.
  • Parsing, generating the final bundle, and generating source maps are all fully parallelized.
  • It avoids expensive data transformations; everything is done in as few passes as possible.
  • The code was written with compilation speed as a first principle, avoiding unnecessary memory allocations wherever possible.

For reference only.

Upcoming roadmap

The following features are already in progress, and are the first priority:

  1. Code splitting (#16, docs)
  2. CSS content type (#20, docs)
  3. Plugin API (#111)

The following features have potential, but they are not yet certain:

  1. HTML content type (#31)
  2. Lowering to ES5 (#297)
  3. Bundling top-level await (#253)

Those who are interested can keep paying attention.

The use of esbuild in vite

esbuild is used extensively in vite. Here are two examples.

  1. optimizer

https://github.com/vitejs/vite/blob/main/packages/vite/src/node/optimizer/index.ts#L262

import { build, BuildOptions as EsbuildBuildOptions } from 'esbuild'

// ...
const result = await build({
  entryPoints: Object.keys(flatIdDeps),
  bundle: true,
  format: 'esm',
  external: config.optimizeDeps?.exclude,
  logLevel: 'error',
  splitting: true,
  sourcemap: true,
  outdir: cacheDir,
  treeShaking: 'ignore-annotations',
  metafile: true,
  define,
  plugins: [
    ...plugins,
    esbuildDepPlugin(flatIdDeps, flatIdToExports, config)
  ],
  ...esbuildOptions
})

const meta = result.metafile!

// the paths in `meta.outputs` are relative to `process.cwd()`
const cacheDirOutputPath = path.relative(process.cwd(), cacheDir)

for (const id in deps) {
  const entry = deps[id]
  data.optimized[id] = {
    file: normalizePath(path.resolve(cacheDir, flattenId(id) + '.js')),
    src: entry,
    needsInterop: needsInterop(
      id,
      idToExports[id],
      meta.outputs,
      cacheDirOutputPath
    )
  }
}

writeFile(dataPath, JSON.stringify(data, null, 2))

  2. Process .ts files

https://github.com/vitejs/vite/commit/59035546db7ff4b7020242ba994a5395aac92802

Why does the production environment still need to be packaged?

Although native ESM is now widely supported, shipping unbundled ESM in production is still inefficient (even with HTTP/2), because nested imports cause additional network round trips.

To get the best loading performance in production, it is still better to bundle the code with tree-shaking, lazy loading, and chunk splitting (for better caching).

It is also not easy to ensure consistent behavior between the development server and the production build while producing optimal output.

To solve this problem, Vite ships with a pre-configured build command that bakes in these optimizations out of the box.

Why doesn't Vite bundle with esbuild?

Although esbuild is blazing fast and already an excellent tool for building libraries, some important features needed for building applications are still under active development, particularly code splitting and CSS handling.

For now, Rollup is more mature and flexible for application bundling.

Nevertheless, once these features stabilize, using esbuild as the production bundler is not ruled out.

Summary

esbuild has brought a new dawn to build efficiency, and the number of ESM packages is also growing rapidly:

https://twitter.com/skypackjs/status/1113838647487287296

I hope the ESM ecosystem matures as soon as possible, to the benefit of the whole front end.

--

That's all for today. I hope it gives you some inspiration.

If there are any errors in the article, please correct me. Thank you.

Reference link

  1. https://esbuild.github.io/getting-started/
  2. https://esbuild.github.io/faq/
  3. https://twitter.com/skypackjs/status/1113838647487287296

皮小蛋