Introduction
While reading How I built a wind map with WebGL, I noticed it used a framebuffer, so I looked into the relevant material and tried it out myself.
framebuffer object
WebGL can use a rendering result as a texture; the mechanism that makes this possible is the framebuffer object.
By default, WebGL's final drawing result is stored in the color buffer. A framebuffer object can be used in place of the color buffer; objects drawn into it are not displayed directly on the canvas, which is why this technique is also known as offscreen drawing.
Example
To verify this, the example draws an image into the framebuffer, then draws that result again as a texture for display.
Compared with the basic image-drawing example, the main changes are:
- data
- framebuffer object
- draw
data
Drawing into the framebuffer works the same way as normal drawing, except that the result is not displayed, so the framebuffer needs its own drawing-area size, vertex coordinates, and texture coordinates.
offscreenWidth: 200, // width of the offscreen drawing area
offscreenHeight: 150, // height of the offscreen drawing area
// some code omitted
// vertex and texture coordinates for drawing into the framebuffer
this.offScreenBuffer = this.initBuffersForFramebuffer(gl);
// some code omitted
initBuffersForFramebuffer: function (gl) {
const vertices = new Float32Array([
0.5, 0.5, -0.5, 0.5, -0.5, -0.5, 0.5, -0.5,
]); // rectangle
const indices = new Uint16Array([0, 1, 2, 0, 2, 3]);
const texCoords = new Float32Array([
1.0, 1.0, // top right
0.0, 1.0, // top left
0.0, 0.0, // bottom left
1.0, 0.0, // bottom right
]);
const obj = {};
obj.verticesBuffer = this.createBuffer(gl, gl.ARRAY_BUFFER, vertices);
obj.indexBuffer = this.createBuffer(gl, gl.ELEMENT_ARRAY_BUFFER, indices);
obj.texCoordsBuffer = this.createBuffer(gl, gl.ARRAY_BUFFER, texCoords);
return obj;
},
createBuffer: function (gl, type, data) {
const buffer = gl.createBuffer();
gl.bindBuffer(type, buffer);
gl.bufferData(type, data, gl.STATIC_DRAW);
gl.bindBuffer(type, null);
return buffer;
}
// some code omitted
New vertex and fragment shaders could be defined for the offscreen pass, but for convenience a single set is shared here.
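As a rough sketch of what such a shared shader pair might look like (the attribute and uniform names aPosition, aTexCoord, and uSampler are illustrative assumptions, not taken from the example's source), one pair that passes vertices through and samples a single texture is enough for both passes:

```javascript
// A minimal shader pair that could be shared by both passes.
// Attribute/uniform names here are illustrative; the example's
// actual names may differ.
const VERTEX_SHADER_SOURCE = `
  attribute vec4 aPosition;
  attribute vec2 aTexCoord;
  varying vec2 vTexCoord;
  void main() {
    gl_Position = aPosition;
    vTexCoord = aTexCoord;
  }
`;

const FRAGMENT_SHADER_SOURCE = `
  precision mediump float;
  varying vec2 vTexCoord;
  uniform sampler2D uSampler;
  void main() {
    // Sample whichever texture is bound: the source image in the
    // offscreen pass, the framebuffer's texture in the screen pass.
    gl_FragColor = texture2D(uSampler, vTexCoord);
  }
`;
```

Sharing works because both passes do the same thing: draw a textured rectangle; only the bound texture and the render target change.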
framebuffer object
To draw in the framebuffer, you need to create the corresponding framebuffer object.
// framebuffer object
this.framebufferObj = this.createFramebufferObject(gl);
// some code omitted
createFramebufferObject: function (gl) {
let framebuffer = gl.createFramebuffer();
let texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texImage2D(
gl.TEXTURE_2D,
0,
gl.RGBA,
this.offscreenWidth,
this.offscreenHeight,
0,
gl.RGBA,
gl.UNSIGNED_BYTE,
null
);
// flip the image's Y axis
gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
// horizontal (s) texture wrapping
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
// vertical (t) texture wrapping
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
// texture magnification filter
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
// texture minification filter
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
framebuffer.texture = texture; // keep a reference to the texture object
// attach the texture to the framebuffer
gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);
gl.framebufferTexture2D(
gl.FRAMEBUFFER,
gl.COLOR_ATTACHMENT0,
gl.TEXTURE_2D,
texture,
0
);
// check that the configuration is correct
var e = gl.checkFramebufferStatus(gl.FRAMEBUFFER);
if (gl.FRAMEBUFFER_COMPLETE !== e) {
console.log("Frame buffer object is incomplete: " + e.toString());
return;
}
gl.bindFramebuffer(gl.FRAMEBUFFER, null);
gl.bindTexture(gl.TEXTURE_2D, null);
return framebuffer;
}
// some code omitted
- The createFramebuffer function creates a framebuffer object; the corresponding deletion function is deleteFramebuffer.
- After creation, a texture object must be attached to the framebuffer's color attachment. The texture created in the example has a few characteristics: 1. its width and height match those of the drawing area; 2. the last argument to texImage2D is null, which reserves a blank region of memory for the texture; 3. the texture object is stored on the framebuffer object, which is what the line framebuffer.texture = texture does.
- The bindFramebuffer function binds the framebuffer to the target; framebufferTexture2D then attaches the previously created texture object to the framebuffer's color attachment gl.COLOR_ATTACHMENT0.
- checkFramebufferStatus checks that the framebuffer object is configured correctly.
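When checkFramebufferStatus reports an incomplete framebuffer, the raw number in the log is not very readable. As a small convenience (this helper is my own addition, not part of the example), the status can be mapped to its constant name; the numeric values below are the ones WebGL 1 defines:

```javascript
// Map a checkFramebufferStatus result to a readable name.
// The numeric keys are the constant values defined by WebGL 1.
const FRAMEBUFFER_STATUS_NAMES = {
  0x8cd5: 'FRAMEBUFFER_COMPLETE',
  0x8cd6: 'FRAMEBUFFER_INCOMPLETE_ATTACHMENT',
  0x8cd7: 'FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT',
  0x8cd9: 'FRAMEBUFFER_INCOMPLETE_DIMENSIONS',
  0x8cdd: 'FRAMEBUFFER_UNSUPPORTED',
};

function framebufferStatusName(status) {
  return FRAMEBUFFER_STATUS_NAMES[status] || 'UNKNOWN (' + status + ')';
}
```

The check in the example could then log framebufferStatusName(e) instead of e.toString().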
draw
The main difference when drawing is that there is a switching process:
// some code omitted
draw: function () {
const gl = this.gl;
const frameBuffer = this.framebufferObj;
this.canvasObj.clear();
const program = this.shaderProgram;
gl.useProgram(program.program);
// this makes the framebuffer the drawing target
gl.bindFramebuffer(gl.FRAMEBUFFER, frameBuffer);
gl.viewport(0, 0, this.offscreenWidth, this.offscreenHeight);
this.drawOffFrame(program, this.imgTexture);
// unbind the framebuffer; the drawing target becomes the color buffer again
gl.bindFramebuffer(gl.FRAMEBUFFER, null);
gl.viewport(0, 0, gl.canvas.width, gl.canvas.height);
this.drawScreen(program, frameBuffer.texture);
},
// some code omitted
- First, bindFramebuffer makes the framebuffer the drawing target, and the matching viewport must be set.
- After the framebuffer pass is drawn, the framebuffer is unbound, so the default color buffer becomes the target again. The matching viewport must be set here too, and, notably, the texture of the framebuffer object is used for this draw; that is how the offscreen rendering result reaches the screen.
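The bind / set-viewport / draw / unbind sequence can be wrapped in a small helper so no step is forgotten. This is only a sketch of the switching pattern described above; the helper name and callback shape are my own:

```javascript
// Run a draw callback with the given framebuffer bound as the
// render target (pass null to target the default color buffer),
// setting the matching viewport first.
function withRenderTarget(gl, framebuffer, width, height, drawFn) {
  gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);
  gl.viewport(0, 0, width, height);
  drawFn();
  // Restore the default target so later draws go to the canvas.
  gl.bindFramebuffer(gl.FRAMEBUFFER, null);
}
```

The example's draw function would then reduce to two calls: one targeting the framebuffer at the offscreen size, and one targeting null at the canvas size.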
Observations and thoughts
The examples I found online felt overly complicated; below are some observations and thoughts from the process of trying to simplify them.
Is framebuffer.texture a built-in attribute or a manual addition?
The framebuffer-creation logic contains the line framebuffer.texture = texture, so does a framebuffer object itself come with a texture attribute?
Printing the object shows that the attribute is absent right after creation, so it is presumably added manually.
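The check above can be packaged as a tiny function (the function itself is my own illustration; gl is any WebGL rendering context):

```javascript
// Check whether `texture` is a built-in property of a framebuffer
// object, or only exists after we attach it ourselves.
function hasBuiltInTexture(gl) {
  const fb = gl.createFramebuffer();
  const before = 'texture' in fb; // false: not built in
  fb.texture = gl.createTexture(); // attach it manually
  const after = 'texture' in fb; // true: added by our own code
  return { before, after };
}
```

Running it against a real context logs { before: false, after: true }, confirming that the attribute is an artificial addition.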
Why does framebuffer.texture have content?
When the framebuffer object is initialized, the attached texture is blank, yet the final result shows that the texture has content after the framebuffer pass. Why does the framebuffer.texture attribute end up with content?
In the drawing logic, the statements related to textures are:
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.uniform1i(program.uSampler, 0);
gl.drawElements(gl.TRIANGLES, 6, gl.UNSIGNED_SHORT, 0);
Presumably gl.drawElements stores its result in the framebuffer's color attachment. That attachment is the blank texture object created during initialization, and framebuffer.texture points to the same texture object, which is why it ends up with content.
Why doesn't the final display fill the entire canvas?
In the final, visible draw, the vertices cover the entire canvas and the texture coordinates cover the entire texture, so why isn't the whole canvas covered?
The texture used in the final draw comes from the framebuffer's rendering result, and the vertices used in the framebuffer pass cover only half of the buffer area. If the entire framebuffer result is treated as a texture and scaled onto the visible area in the final draw, the image cannot fill the canvas; this is the expected, correct result.
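The scaling can be checked with a little arithmetic. Clip space runs from -1 to 1 on each axis, and the framebuffer quad spans [-0.5, 0.5], so it covers half of the offscreen area in each dimension. A quick sketch (the helper name is mine):

```javascript
// Fraction of a viewport covered by a quad, given its clip-space
// extent on one axis. Clip space runs from -1 to 1 (length 2).
function coveredFraction(minClip, maxClip) {
  return (maxClip - minClip) / 2;
}

// The example's offscreen quad spans [-0.5, 0.5] on both axes,
// so it covers half the width and half the height of the
// 200 x 150 offscreen area: a 100 x 75 region.
const fraction = coveredFraction(-0.5, 0.5); // 0.5
const coveredWidth = 200 * fraction; // 100
const coveredHeight = 150 * fraction; // 75
```

Since the final pass stretches the whole 200 x 150 texture over the canvas, the image inside it still occupies only half of each dimension.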
There is also an example that fills the entire canvas; the only change needed is to adjust the framebuffer vertices to cover the entire buffer area.