Introduction
In JavaScript WebGL, setting colors alone has a limited effect, so sooner or later you will want to use images. That means working with textures in WebGL, which is more involved than expected.
Using images
Textures can be used to add detail to modeled objects, and they appear on all kinds of objects in 3D games. Compared with drawing a plain rectangle, the main changes are:
- Data
- Vertex shader
- Fragment shader
- Buffer texture coordinate data
- Load and create textures
- Draw
Data
First prepare an image. To map the texture onto the rectangle, each vertex of the rectangle needs to be given a corresponding position in the texture.
2D texture coordinates lie on the x and y axes and range from 0 to 1. They start at (0, 0), the lower left corner of the image, and end at (1, 1), the upper right corner. So the corresponding texture coordinates are:
const texCoords = [
  1.0, 1.0, // top right
  0.0, 1.0, // top left
  0.0, 0.0, // bottom left
  1.0, 0.0, // bottom right
];
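The rectangle's vertex positions and index data are not repeated here. As a minimal sketch of what these texture coordinates are assumed to pair with (the names and values below are illustrative, not from the original example):

// Hypothetical rectangle data, listed in the same order as the texture coordinates:
// top right, top left, bottom left, bottom right
const vertexPositions = [
   0.5,  0.5, 0.0, // top right
  -0.5,  0.5, 0.0, // top left
  -0.5, -0.5, 0.0, // bottom left
   0.5, -0.5, 0.0, // bottom right
];
// Two triangles covering the rectangle, used later by gl.drawElements
const indices = [0, 1, 2, 0, 2, 3];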
Vertex shader
The texture coordinates need to be buffered and passed to the shaders. The attribute aVertexTextureCoord is added to the vertex shader, and its value is passed on to the fragment shader.
const source = `
  attribute vec3 aVertexPos;
  attribute vec2 aVertexTextureCoord;
  varying highp vec2 vTextureCoord;
  void main(void){
    gl_Position = vec4(aVertexPos, 1);
    vTextureCoord = aVertexTextureCoord;
  }
`;
Fragment shader
The fragment shader receives the texture coordinates and declares the texture sampler uSampler. Note that this is a uniform, a global variable that can be accessed at any stage, and it has no value yet. The built-in function texture2D samples it to produce the final color.
const fragmentSource = `
  varying highp vec2 vTextureCoord;
  uniform sampler2D uSampler;
  void main(void){
    gl_FragColor = texture2D(uSampler, vTextureCoord);
  }
`;
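This article assumes that shaderProgram has already been created from these two sources. As a rough sketch (the helper name is an assumption, not part of the original code), compiling and linking could look like this:

// Minimal sketch: compile both shaders and link them into a program
function createShaderProgram(gl, vertexSource, fragmentSource) {
  const vertexShader = gl.createShader(gl.VERTEX_SHADER);
  gl.shaderSource(vertexShader, vertexSource);
  gl.compileShader(vertexShader);

  const fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);
  gl.shaderSource(fragmentShader, fragmentSource);
  gl.compileShader(fragmentShader);

  const program = gl.createProgram();
  gl.attachShader(program, vertexShader);
  gl.attachShader(program, fragmentShader);
  gl.linkProgram(program);
  gl.useProgram(program);
  return program;
}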
Buffer texture coordinate data
Texture coordinate data also needs to go into the buffer.
/**
 * Buffer the texture coordinate data and enable the attribute
 * @param {*} gl WebGL context
 * @param {*} shaderProgram shader program
 * @param {*} data texture coordinate data
 */
function setTextureBuffers(gl, shaderProgram, data) {
  // Create an empty buffer object
  const buffer = gl.createBuffer();
  // Bind it to the target
  gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
  // WebGL cannot use plain JavaScript arrays directly; convert to a typed array
  const dataFormat = new Float32Array(data);
  // Initialize the buffer's data store
  gl.bufferData(gl.ARRAY_BUFFER, dataFormat, gl.STATIC_DRAW);
  // Get the attribute location
  const texCoord = gl.getAttribLocation(
    shaderProgram,
    "aVertexTextureCoord"
  );
  // Describe how to read the data: 2 components per texture coordinate
  gl.vertexAttribPointer(texCoord, 2, gl.FLOAT, false, 0, 0);
  // Enable the vertex attribute (attributes are disabled by default)
  gl.enableVertexAttribArray(texCoord);
}
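Assuming the texCoords array from the data section, the call would simply be (illustrative):

setTextureBuffers(gl, shaderProgram, texCoords);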
Load and create textures
The image must be fully loaded before it can be used. Once the image data is available, create a texture object from it.
function loadImage(gl) {
  var img = new Image();
  img.onload = (e) => {
    createTexture(gl, e.target);
  };
  img.src = "./1.jpg";
}
function createTexture(gl, source) {
  const texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);
  // Flip the image's Y axis
  gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
  // Horizontal (s) texture wrapping
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  // Vertical (t) texture wrapping
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  // Texture magnification filter
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
  // Texture minification filter
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
  // Upload the image data to the texture object
  gl.texImage2D(
    gl.TEXTURE_2D,
    0,
    gl.RGBA,
    gl.RGBA,
    gl.UNSIGNED_BYTE,
    source
  );
  // Activate the texture unit
  gl.activeTexture(gl.TEXTURE0);
}
createTexture creates a texture object, which bindTexture then binds to the corresponding target. Since this is a two-dimensional image, the first parameter is gl.TEXTURE_2D, meaning a two-dimensional texture; the second parameter is the texture object, and passing null unbinds it. Subsequent operations on the texture can only be performed after it has been bound.
The pixelStorei call flips the image's Y coordinate, because the image coordinate system and the texture coordinate system point in different directions.
The texParameteri calls set various texture parameters. One point worth noting: if you want to use images of arbitrary sizes, you need to set the horizontal and vertical wrapping modes as above; otherwise only images whose dimensions are powers of two can be displayed.
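As an illustration (this branch is not part of the original code), a common pattern is to choose the parameters based on whether the image dimensions are powers of two:

// Hypothetical helper, not part of the original example
function isPowerOf2(value) {
  return (value & (value - 1)) === 0;
}
// After gl.texImage2D(...) has uploaded the image:
if (isPowerOf2(source.width) && isPowerOf2(source.height)) {
  // Power-of-two image: mipmaps and any wrapping mode are allowed
  gl.generateMipmap(gl.TEXTURE_2D);
} else {
  // Non-power-of-two image: wrapping must be CLAMP_TO_EDGE and
  // the minification filter must not use mipmaps
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
}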
The texImage2D call assigns the texture source to the texture object. Here the image's pixel data is uploaded, so the image appears when the texture is drawn.
The activeTexture method activates the specified texture unit. Texture units range from 0 to gl.MAX_COMBINED_TEXTURE_IMAGE_UNITS - 1; only one is used here, gl.TEXTURE0. The first texture unit is active by default, so this line of code could actually be removed.
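For illustration only (the variable names below are hypothetical), using a second texture would mean activating the next unit before binding and pointing its sampler uniform at index 1:

// Hypothetical second texture on texture unit 1
gl.activeTexture(gl.TEXTURE1);
gl.bindTexture(gl.TEXTURE_2D, anotherTexture);
gl.uniform1i(anotherSamplerUniform, 1);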
Draw
For the uniform declared in the fragment shader, use the uniform1i method to specify its value when drawing. The second argument is the texture unit index, where 0 means the first texture unit.
/**
 * Draw
 * @param {*} gl WebGL context
 * @param {*} shaderProgram shader program
 */
function draw(gl, shaderProgram) {
  // Get the texture sampler uniform
  const samplerUniform = gl.getUniformLocation(shaderProgram, "uSampler");
  // Associate the uniform with texture unit 0
  gl.uniform1i(samplerUniform, 0);
  gl.drawElements(gl.TRIANGLES, 6, gl.UNSIGNED_SHORT, 0);
}
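Putting the pieces together, the assumed overall flow is roughly as follows (a sketch that omits the rectangle's position and index buffer setup; createShaderProgram is the assumed helper sketched earlier):

const gl = document.querySelector("canvas").getContext("webgl");
const shaderProgram = createShaderProgram(gl, source, fragmentSource);
setTextureBuffers(gl, shaderProgram, texCoords);

const img = new Image();
img.onload = () => {
  createTexture(gl, img); // upload the image once it is loaded
  draw(gl, shaderProgram); // draw only after the texture exists
};
img.src = "./1.jpg";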
Effect
Here is the example; the effect is as follows:
Comparing with the original image, you can see that the result is distorted and does not adapt to the image's proportions.