The error in the title appears when merging uploaded file chunks (shards) with Node.js.

Code analysis

m.merge = (opts) => {
  return new Promise((resolve, reject) => {
    const { filename, target } = opts;
    try {
      let len = 0;
      // Read every chunk of this file into memory at once
      const bufferList = fs.readdirSync(`${STATIC_TEMPORARY}/${filename}`).map((hash, index) => {
        const buffer = fs.readFileSync(`${STATIC_TEMPORARY}/${filename}/${index}`);
        len += buffer.length;
        return buffer;
      });
      // Merge the chunks into one large Buffer and write it out
      const buffer = Buffer.concat(bufferList, len);
      const ws = fs.createWriteStream(`${target}/${filename}`);
      ws.write(buffer);
      ws.close();
      resolve({ success: true, msg: 'Section merge completed' });
    } catch (error) {
      console.error(error);
      reject({ success: false, msg: error });
    }
  });
};

const buffer = Buffer.concat(bufferList, len);

The problem was traced to the line above. Checking the server, the Node.js process was using 192.1 MB of memory.

At this point the cause is clear: the file chunks together are too large to hold in memory as one piece, so the allocation exhausts memory and fails because there is no free space left.
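
To make the failure mode concrete, here is a minimal sketch (my own illustration, not the article's code): Buffer.concat has to allocate one contiguous buffer for the combined size, and when that single allocation cannot be satisfied, V8 reports RangeError: Array buffer allocation failed. The chunk size and count below are made up for illustration.

 // Hypothetical illustration: one contiguous allocation for the combined size.
 // Whether this actually throws depends on the machine's available memory
 // and the Node.js version, but the failing call is the single big concat.
 const chunkSize = 100 * 1024 * 1024; // 100 MB per chunk (made-up value)
 const chunks = [];
 for (let i = 0; i < 40; i++) {
   chunks.push(Buffer.allocUnsafe(chunkSize)); // ~4 GB of chunks in total
 }
 // Needs one contiguous ~4 GB backing store; on a constrained server this
 // can fail with "RangeError: Array buffer allocation failed".
 const merged = Buffer.concat(chunks);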

Optimization

The approach above concatenates every file chunk into a single Buffer and then writes it to the stream, which exhausts memory in the process.
Instead, we can concatenate and write in several smaller batches; the code is modified as follows:

m.merge = (opts) => {
  return new Promise((resolve, reject) => {
    const { filename, target } = opts;

    try {
      // Optimization: concatenate and write in batches instead of all at once
      const ws = fs.createWriteStream(`${target}/${filename}`);
      const bufferList = fs.readdirSync(`${STATIC_TEMPORARY}/${filename}`);
      let len = 0;
      let list = [];
      bufferList.forEach((hash, index) => {
        const b = fs.readFileSync(`${STATIC_TEMPORARY}/${filename}/${index}`);
        len += b.length;
        list.push(b);
        if (len > 10485760) { // flush once the batch exceeds 10 MB
          const buffer = Buffer.concat(list, len);
          ws.write(buffer);
          len = 0;
          list = [];
        }
      });
      // Flush the last batch that never reached the 10 MB threshold
      if (list.length > 0) {
        ws.write(Buffer.concat(list, len));
      }
      ws.close();
      resolve({ success: true, msg: 'Section merge completed' });
    } catch (error) {
      console.error(error);
      reject({ success: false, msg: error });
    }
  });
};

With this change, calling the interface again no longer triggers the error in the title, and the files merge successfully. But looking at the memory usage of the Node process, it still sits at around 192.1 MB.
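
To see where that memory actually sits, process.memoryUsage() is a quick check (a small sketch of my own, not part of the original fix):

 // rss is what the OS reports for the whole process; heapUsed is what V8 is
 // actively using; external covers Buffer backing stores outside the heap.
 // After a big merge, rss often stays elevated even though heapUsed drops,
 // because freed memory is not necessarily returned to the OS right away.
 const mb = (n) => `${(n / 1024 / 1024).toFixed(1)} MB`;
 const { rss, heapTotal, heapUsed, external } = process.memoryUsage();
 console.log({ rss: mb(rss), heapTotal: mb(heapTotal), heapUsed: mb(heapUsed), external: mb(external) });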

Analyzing it again, the code above is still somewhat verbose; try the following way of writing it instead:

m.merge = (opts) => {
  return new Promise((resolve, reject) => {
    const { filename, target } = opts;
    try {
      // Optimization: write each chunk straight to the stream, no concat at all
      const ws = fs.createWriteStream(`${target}/${filename}`);
      const bufferList = fs.readdirSync(`${STATIC_TEMPORARY}/${filename}`);
      bufferList.forEach((hash, index) => {
        const b = fs.readFileSync(`${STATIC_TEMPORARY}/${filename}/${index}`);
        ws.write(b);
      });
      ws.close();
      resolve({ success: true, msg: 'Section merge completed' });
    } catch (error) {
      console.error(error);
      reject({ success: false, msg: error });
    }
  });
};

Each chunk read from disk is written directly to ws. Because the upload slices are only about 1 MB each, the buffer from each read can be reclaimed by the Node.js garbage collector once it has been written, which in theory is a relatively safe operation.
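
Going one step further, the chunks do not have to be read synchronously into memory at all. Below is a sketch of a fully stream-based merge (my own variant, not the article's final code; it assumes the same STATIC_TEMPORARY, filename and target as above, with chunks named 0, 1, 2, ...): each chunk is piped into the open write stream, so backpressure is handled by the streams themselves.

 const fs = require('fs');

 // Copy one chunk into the already-open write stream, keeping it open.
 // (Error handling on ws is omitted for brevity.)
 function appendChunk(chunkPath, ws) {
   return new Promise((resolve, reject) => {
     const rs = fs.createReadStream(chunkPath);
     rs.on('error', reject);
     rs.on('end', resolve);
     rs.pipe(ws, { end: false }); // end: false keeps ws open between chunks
   });
 }

 async function mergeChunks(filename, target) {
   const dir = `${STATIC_TEMPORARY}/${filename}`;
   const chunkCount = fs.readdirSync(dir).length;
   const ws = fs.createWriteStream(`${target}/${filename}`);
   for (let index = 0; index < chunkCount; index++) {
     await appendChunk(`${dir}/${index}`, ws);
   }
   ws.end(); // close the target file only after the last chunk
 }

The end: false option on pipe keeps the target stream open between chunks, and ws.end() is only called once every chunk has been copied.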

After the modification, testing shows that memory usage only rises to about 200 MB (along with high CPU usage) while a file is being uploaded and merged, and it settles back down once the merge is complete.

Summary

A RangeError: Array buffer allocation failed error taught me the importance of using memory sensibly. The problem is now solved, and our large-file multipart upload feature runs much more stably.

If you run into the same problem, I hope this article helps; please give it a like ❤️❤️, thank you. If you have a better approach, feel free to discuss it in the comments.

This article was first published on my IICCOM personal blog as "RangeError: Array buffer allocation failed".

