global glow
Global glow (Bloom), also known as floodlight, is essentially an outer-glow effect applied to specific bright areas of the screen.
Outer glow is everywhere in games. Typical examples are chandeliers in indoor scenes, the screens of electronic devices, outdoor street lights at night, car headlights, and so on. What these scenes have in common is that they convey strong visual cues of brightness and atmosphere. In real life these glows are caused by light scattering in the atmosphere or inside our eyes, but once such objects are rendered to a screen, the amount of light that can actually reach the eye is limited. We therefore simulate the effect artificially to make the scene look more realistic.
The image above compares the scene with and without glow; with it, the ceiling light genuinely feels bright. Floodlight can therefore greatly improve the lighting in a scene.
So how do we simulate this effect artificially? The answer is digital image processing; let's analyze it.
RTT and postprocessing
Regardless of the rendering engine, the most common and most effective way to achieve glow is post-processing. In short, the result of rendering the main scene is not displayed on screen directly but saved to a texture; this process is called RTT (render to texture). With that texture in hand, we render a second scene containing only a Plane (think of it as a large flat container) and pass the texture into the Plane's material as a map. During this second render, digital image processing can be applied to achieve special effects, which in essence act on the first render of the main scene. That is the essence of PostProcessing: digital image processing of the current rendering result.
The following figure more clearly describes the post-processing of the global glow:
ThreeJS officially provides the UnrealBloom post-processor (UnrealBloomPass) to achieve the global glow effect. Let's walk through its source code to roughly analyze how glow is implemented.
Render the main scene
First, create a plane to hold the results of the subsequent image processing. FullScreenQuad is a flat container encapsulated by ThreeJS for holding the texture of a rendering result.
this.fsQuad = new FullScreenQuad( null );
Render the main scene to a texture
this.fsQuad.material = this.basic;
this.basic.map = readBuffer.texture;
renderer.setRenderTarget( null );
renderer.clear();
this.fsQuad.render( renderer );
Save the renderTarget: its texture serves both as the raw image for the final blend and as the input to thresholding.
Thresholding - extract bright colors
After the main scene is rendered to a texture, the first step is thresholding. In classic image processing, thresholding means that for each pixel, if its grayscale is above a certain value it is set to 1, and if below, to 0. Here thresholding is specialized: if a texel's luminance is below the threshold, its color is set to (0,0,0); if above, the original color is preserved. The result is a texture containing only the "glow" color information, ready for the next stage.
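To make the keep-or-zero rule concrete, here is a small plain-JavaScript sketch (not the ThreeJS shader itself) applying it to a single RGB pixel. It uses the common Rec. 601 luma weights; the function names are illustrative, and note that the real UnrealBloom shader fades smoothly near the threshold rather than cutting hard.

```javascript
// Keep-or-zero thresholding for one RGB pixel (components in [0, 1]).
// Rec. 601 luma weights; function names are illustrative.
function luminance([r, g, b]) {
  return 0.299 * r + 0.587 * g + 0.114 * b;
}

function thresholdPixel(rgb, threshold) {
  // Below the threshold the pixel contributes nothing to the bloom;
  // above it, the original color is kept unchanged.
  return luminance(rgb) < threshold ? [0, 0, 0] : rgb;
}
```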
So how is the threshold chosen? The choice determines which pixels will glow. There are generally two approaches: a global threshold and a local threshold. Tuning a global threshold is largely trial and error, though it can be guided by the image histogram. A local threshold must be combined with a local filter, which is more complicated and will not be detailed here.
ThreeJS uses LuminosityHighPassShader to handle thresholding:
// 1. Extract Bright Areas
this.highPassUniforms[ 'tDiffuse' ].value = readBuffer.texture;
this.highPassUniforms[ 'luminosityThreshold' ].value = this.threshold;
this.fsQuad.material = this.materialHighPassFilter;
renderer.setRenderTarget( this.renderTargetBright );
renderer.clear();
this.fsQuad.render( renderer );
Blur - Gaussian Blur
Now that we have the thresholded (and downsampled) texture renderTargetBright, we can proceed to blurring. Anyone who has used photo-editing software is familiar with blur. The most widely used blur algorithm is Gaussian blur. Intuitively, Gaussian blur replaces each pixel with an average of its surrounding pixels. In the figure below, 2 is the middle point, and the surrounding points are all 1.
<img src="https://img.alicdn.com/imgextra/i1/O1CN014VEwhT1Gjriw6Ka2R_!!6000000000659-2-tps-395-330.png" alt="img" style="zoom: 50%;" /><img src="https://img.alicdn.com/imgextra/i4/O1CN01xomOdi202mnMblmtf_!!6000000006792-2-tps-394-347.png" alt="img" style="zoom:50%;" />
The "middle point" takes the average of its "surrounding points" and becomes 1. Numerically this is smoothing; graphically it produces a blur, with the middle point losing detail. The result of a Gaussian blur depends on the blur radius and the weight assignment. The blur radius determines how many neighboring points are included, e.g. a 3×3 or 5×5 window; the larger the radius, the stronger the blur. The weight assignment is the weight each point gets when computing the average. The simple average used above is not very reasonable: images are continuous, so nearer points are more closely related than farther ones. A weighted average is therefore better, with larger weights for nearer points and smaller weights for farther ones, and the natural weighting curve is the normal distribution. The Gaussian kernel is simply the normal distribution extended to two dimensions. The detailed computation is not repeated here; most libraries have it packaged for us.
The implementation must also consider performance. Sampling a 32×32 area means 1024 texture samples per pixel. A neat property of the Gaussian equation, however, is that the two-dimensional kernel can be decomposed into two smaller one-dimensional ones: one holding the horizontal weights and the other the vertical weights. We first blur the entire texture horizontally, then blur the altered texture vertically. The result is identical, but at a huge saving: only 32+32 = 64 samples per pixel instead of 1024. This is the two-pass (separable) Gaussian blur.
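The separability claim can be checked numerically: blurring a tiny grayscale grid with a full 2D kernel and with two 1D passes gives the same result up to floating-point error. This is a self-contained sketch with a 3-tap [0.25, 0.5, 0.25] kernel and clamp-to-edge borders; all names are illustrative.

```javascript
// Verify that a 2D separable blur equals a horizontal pass followed by
// a vertical pass. 3-tap kernel, clamped at the borders.
const k = [0.25, 0.5, 0.25];

function blur1D(img, horizontal) {
  const h = img.length, w = img[0].length;
  return img.map((row, y) => row.map((_, x) => {
    let sum = 0;
    for (let i = -1; i <= 1; i++) {
      const sx = horizontal ? Math.min(w - 1, Math.max(0, x + i)) : x;
      const sy = horizontal ? y : Math.min(h - 1, Math.max(0, y + i));
      sum += k[i + 1] * img[sy][sx];
    }
    return sum;
  }));
}

function blur2D(img) {
  const h = img.length, w = img[0].length;
  return img.map((row, y) => row.map((_, x) => {
    let sum = 0;
    for (let j = -1; j <= 1; j++) for (let i = -1; i <= 1; i++) {
      const sx = Math.min(w - 1, Math.max(0, x + i));
      const sy = Math.min(h - 1, Math.max(0, y + j));
      // 2D weight is the product of the two 1D weights.
      sum += k[i + 1] * k[j + 1] * img[sy][sx];
    }
    return sum;
  }));
}
```

For a 3×3 image the full kernel costs 9 samples per pixel versus 3+3 for the two passes; at 32×32 the gap is 1024 versus 64, exactly the saving described above.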
// 2. Blur All the mips progressively
let inputRenderTarget = this.renderTargetBright;
for ( let i = 0; i < this.nMips; i ++ ) {
this.fsQuad.material = this.separableBlurMaterials[ i ];
this.separableBlurMaterials[ i ].uniforms[ 'colorTexture' ].value = inputRenderTarget.texture;
this.separableBlurMaterials[ i ].uniforms[ 'direction' ].value = UnrealBloomPass.BlurDirectionX;
renderer.setRenderTarget( this.renderTargetsHorizontal[ i ] );
renderer.clear();
this.fsQuad.render( renderer );
this.separableBlurMaterials[ i ].uniforms[ 'colorTexture' ].value = this.renderTargetsHorizontal[ i ].texture;
this.separableBlurMaterials[ i ].uniforms[ 'direction' ].value = UnrealBloomPass.BlurDirectionY;
renderer.setRenderTarget( this.renderTargetsVertical[ i ] );
renderer.clear();
this.fsQuad.render( renderer );
inputRenderTarget = this.renderTargetsVertical[ i ];
}
ThreeJS uses a relatively simple Gaussian blur filter: each pass takes only a handful of samples in each direction, the kernel radius grows from 3 to 11 across the mip levels, and the blur is strengthened by blurring repeatedly over progressively larger radii.
// Gaussian Blur Materials
this.separableBlurMaterials = [];
const kernelSizeArray = [ 3, 5, 7, 9, 11 ];
resx = Math.round( this.resolution.x / 2 );
resy = Math.round( this.resolution.y / 2 );
for ( let i = 0; i < this.nMips; i ++ ) {
this.separableBlurMaterials.push( this.getSeperableBlurMaterial( kernelSizeArray[ i ] ) );
this.separableBlurMaterials[ i ].uniforms[ 'texSize' ].value = new Vector2( resx, resy );
resx = Math.round( resx / 2 );
resy = Math.round( resy / 2 );
}
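The gaussianPdf weighting used by the shader can be reproduced on the CPU. This sketch computes the normalized weights for one half of a separable kernel, mirroring how the fragment shader below accumulates weightSum (0.39894 ≈ 1/√(2π); the function names are illustrative).

```javascript
// CPU version of the shader's gaussianPdf and its weight normalization.
// 0.39894 ≈ 1 / sqrt(2 * PI).
function gaussianPdf(x, sigma) {
  return 0.39894 * Math.exp(-0.5 * x * x / (sigma * sigma)) / sigma;
}

// Weights for taps 0..radius-1. Taps 1.. are counted twice in the
// normalization because the shader samples symmetrically at ±offset.
function normalizedWeights(radius, sigma) {
  const w = [gaussianPdf(0, sigma)];
  let weightSum = w[0];
  for (let i = 1; i < radius; i++) {
    const wi = gaussianPdf(i, sigma);
    w.push(wi);
    weightSum += 2 * wi;
  }
  return w.map(x => x / weightSum);
}
```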
Among them, getSeperableBlurMaterial builds the shader that implements the Gaussian blur; its fragment shader is as follows:
#include <common>
varying vec2 vUv;
uniform sampler2D colorTexture;
uniform vec2 texSize;
uniform vec2 direction;
float gaussianPdf(in float x, in float sigma) {
return 0.39894 * exp( -0.5 * x * x/( sigma * sigma))/sigma;
}
void main() {
vec2 invSize = 1.0 / texSize;
float fSigma = float(SIGMA);
float weightSum = gaussianPdf(0.0, fSigma);
vec3 diffuseSum = texture2D( colorTexture, vUv).rgb * weightSum;
for( int i = 1; i < KERNEL_RADIUS; i ++ ) {
float x = float(i);
float w = gaussianPdf(x, fSigma);
vec2 uvOffset = direction * invSize * x;
vec3 sample1 = texture2D( colorTexture, vUv + uvOffset).rgb;
vec3 sample2 = texture2D( colorTexture, vUv - uvOffset).rgb;
diffuseSum += (sample1 + sample2) * w;
weightSum += 2.0 * w;
}
gl_FragColor = vec4(diffuseSum/weightSum, 1.0);
}
Because bloom quality is directly tied to blur quality, a better blur yields a better bloom. Some implementations improve it by combining blur filters with kernels of different sizes, or by selectively weighting a mix of several Gaussians.
Also, inside the loop, resx/resy are halved on each iteration, so each render target has a quarter of the previous one's pixels. This is downsampling: it reduces texture resolution and post-processing cost, and lets a small blur window cover a larger screen area. Downsampling does introduce aliasing; the remedy depends on the specific project and is not covered here.
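The downsampling chain can be sketched as follows: starting at half resolution, each mip halves both dimensions, exactly as the constructor loop above does (the function name is illustrative).

```javascript
// Resolutions of the nMips blur targets, halving each dimension per mip,
// mirroring the UnrealBloomPass constructor loop shown above.
function mipResolutions(width, height, nMips) {
  const mips = [];
  let resx = Math.round(width / 2);
  let resy = Math.round(height / 2);
  for (let i = 0; i < nMips; i++) {
    mips.push([resx, resy]);
    resx = Math.round(resx / 2);
    resy = Math.round(resy / 2);
  }
  return mips;
}
```

For a 1024×768 buffer the five mips are 512×384 down to 32×24, so all five blur passes together touch far fewer pixels than a single full-resolution pass.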
Blending
Now that you have the original render texture and the blurred glow texture, it's time to do the final blending:
// Composite All the mips
this.fsQuad.material = this.compositeMaterial;
renderer.setRenderTarget( this.renderTargetsHorizontal[ 0 ] );
renderer.clear();
this.fsQuad.render( renderer );
compositeMaterial is the material that finally blends all the textures; it is implemented as follows:
// ...
float lerpBloomFactor(const in float factor) {
float mirrorFactor = 1.2 - factor;
return mix(factor, mirrorFactor, bloomRadius);
}
void main() {
gl_FragColor = bloomStrength * ( lerpBloomFactor(bloomFactors[0]) * vec4(bloomTintColors[0], 1.0) * texture2D(blurTexture1, vUv) +
lerpBloomFactor(bloomFactors[1]) * vec4(bloomTintColors[1], 1.0) * texture2D(blurTexture2, vUv) +
lerpBloomFactor(bloomFactors[2]) * vec4(bloomTintColors[2], 1.0) * texture2D(blurTexture3, vUv) +
lerpBloomFactor(bloomFactors[3]) * vec4(bloomTintColors[3], 1.0) * texture2D(blurTexture4, vUv) +
lerpBloomFactor(bloomFactors[4]) * vec4(bloomTintColors[4], 1.0) * texture2D(blurTexture5, vUv) );
}
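The per-mip weighting can be reproduced on the CPU: lerpBloomFactor interpolates each factor toward its "mirror" value 1.2 − factor as bloomRadius grows, so a larger radius shifts weight toward the blurrier mips. A minimal sketch (the default bloomFactors in the UnrealBloomPass source are, to my knowledge, [1.0, 0.8, 0.6, 0.4, 0.2], but treat that as an assumption):

```javascript
// CPU version of the composite shader's per-mip weight.
// mix(a, b, t) = a * (1 - t) + b * t, as in GLSL.
const mix = (a, b, t) => a * (1 - t) + b * t;

function lerpBloomFactor(factor, bloomRadius) {
  const mirrorFactor = 1.2 - factor;
  return mix(factor, mirrorFactor, bloomRadius);
}
```

With bloomRadius = 0 each mip keeps its own factor; with bloomRadius = 1 the sharpest mip (factor 1.0) drops to 0.2 while the blurriest (factor 0.2) rises to 1.0.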
The following is a demo; you can adjust its four parameters to see the different effects.
<img src="https://img.alicdn.com/imgextra/i3/O1CN010mbAuP1bEoo1s5gsZ_!!6000000003434-2-tps-1866-1382.png" style="zoom: 33%;" />
partial glow
The effect above looks pretty good, right? But in practice we hit a problem: sometimes we want only certain objects to glow, yet the thresholding step uses a global threshold, so other bright objects that should not glow end up glowing too.
ThreeJS also provides a demo to solve this problem.
<img src="https://img.alicdn.com/imgextra/i2/O1CN015QGmmv25MXxosgWdv_!!6000000007512-2-tps-1558-1232.png" style="zoom:33%;" />
The main idea is:
- Create a glow layer and add glow objects on this layer to distinguish between glow objects and non-glow objects
const BLOOM_LAYER = 1;
const bloomLayer = new THREE.Layers();
bloomLayer.set(BLOOM_LAYER);
In Three, every object has 32 layers available, numbered 0 to 31, and all objects sit on layer 0 by default. BLOOM_LAYER can be set to any other layer.
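Under the hood a Layers object is just a 32-bit mask, which is why layer numbers run from 0 to 31. A minimal plain-JS sketch of the idea (mirroring the behavior of THREE.Layers as I understand it; names illustrative):

```javascript
// Minimal re-implementation of the THREE.Layers bitmask idea.
class Layers {
  constructor() { this.mask = 1; }               // layer 0 by default
  set(layer) { this.mask = (1 << layer) >>> 0; } // replace the mask
  enable(layer) { this.mask |= 1 << layer; }     // add a layer
  test(other) { return (this.mask & other.mask) !== 0; }
}
```

bloomLayer.test(obj.layers) in the article is exactly this bitwise intersection test: it is true only if the object was explicitly enabled on the bloom layer.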
- Prepare two post-processors (EffectComposer): a bloomComposer to generate the glow effect, and a finalComposer to render the whole scene normally
const renderPass = new THREE.RenderPass(scene, camera);
// bloomComposer produces the glow, but does not render to the screen
const bloomComposer = new THREE.EffectComposer(renderer);
bloomComposer.renderToScreen = false; // do not render to screen
bloomComposer.addPass(renderPass);
// finalComposer is the composer that actually renders to the screen
const finalComposer = new THREE.EffectComposer(renderer);
finalComposer.addPass(renderPass);
- Turn the materials of all objects other than glow objects black (ensuring their information is discarded during thresholding)
const materials = {};
function darkenNonBloomed( obj ) {
if ( obj.isMesh && bloomLayer.test( obj.layers ) === false ) {
materials[ obj.uuid ] = obj.material;
obj.material = darkMaterial;
}
}
- Implement glow with UnrealBloomPass in bloomComposer without rendering to screen
const bloomPass = new UnrealBloomPass(
new THREE.Vector2(renderer.domElement.offsetWidth, renderer.domElement.offsetHeight), 1, 1, 0.1,
);
bloomComposer.addPass(bloomPass);
- Then restore the object converted to black material to the original material
const darkMaterial = new THREE.MeshBasicMaterial( { color: "black" } );
function restoreMaterial( obj ) {
if ( materials[ obj.uuid ] ) {
obj.material = materials[ obj.uuid ];
delete materials[ obj.uuid ];
}
}
- Render with finalComposer, which combines two inputs: the bloomComposer result and the normal render result.
const shaderPass = new ShaderPass(
new THREE.ShaderMaterial({
uniforms: {
baseTexture: { value: null },
bloomTexture: { value: bloomComposer.renderTarget2.texture },
},
vertexShader: vs,
fragmentShader: fs,
defines: {},
}),
'baseTexture',
); // create a custom shader pass, detailed below
shaderPass.needsSwap = true;
finalComposer.addPass(shaderPass);
The role of shaderPass is to blend the two textures, baseTexture and bloomTexture:
// vertex shader
varying vec2 vUv;
void main() {
vUv = uv;
gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
// fragment shader
uniform sampler2D baseTexture;
uniform sampler2D bloomTexture;
varying vec2 vUv;
void main() {
gl_FragColor = ( texture2D( baseTexture, vUv ) + vec4( 1.0 ) * texture2D( bloomTexture, vUv ) );
}
- The final render loop function call:
function animate(time) {
// ...
// Selective glow:
// 1. Use darkenNonBloomed to turn every non-glow object's material black
scene.traverse(darkenNonBloomed);
// 2. Produce the glow with bloomComposer
bloomComposer.render();
// 3. Restore the blackened objects to their original materials
scene.traverse(restoreMaterial);
// 4. Do the final render with finalComposer
finalComposer.render();
requestAnimationFrame(animate);
}
The problem with the glow
One problem solved, a new one arises.
black material treatment
During use we found another problem: after adding TransformControl (the component used to drag and move objects), a strange artifact appeared. We reproduce it in a minimal scene: the blue box on the left has no glow, the yellow box on the right does. The left image shows the normal render, but after adding TransformControl and dragging the object, the artifact on the right appears.
It looks as if TransformControl's material rendering is corrupted. When implementing the selective glow, we distinguish glow from non-glow objects and first render the non-glow objects with a black substitute material, which is a MeshBasicMaterial. That seems fine at first, but what if the scene contains other material types? TransformControl is implemented with LineBasicMaterial, so during the non-glow pass a MeshBasicMaterial gets applied to Line objects, which breaks their rendering. In short: an object that uses material A is rendered with a black material B during darkenNonBloomed instead of a black material A, and the mismatch inevitably produces rendering errors.
Therefore, darkenNonBloomed should inspect the material's prototype, then create and cache a corresponding black material of the same type:
const materials = {};
const darkMaterials = {};
export const darkenNonBloomed = (obj) => {
const material = obj.material;
if (material && bloomLayer.test(obj.layers) === false) {
materials[obj.uuid] = material;
if (!darkMaterials[material.type]) {
const Proto = Object.getPrototypeOf(material).constructor;
darkMaterials[material.type] = new Proto({ color: 0x000000 });
}
obj.material = darkMaterials[material.type];
}
};
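The cache-per-constructor trick works for any class hierarchy. Stripped of ThreeJS, the idea reduces to looking up the object's concrete constructor and lazily creating one black instance per type. A self-contained sketch with stand-in classes (all names illustrative):

```javascript
// Lazily build one "dark" replacement per concrete material class.
class Material { constructor({ color } = {}) { this.color = color; } }
class MeshBasicMaterial extends Material {}
class LineBasicMaterial extends Material {}

const darkMaterials = {};
function darkFor(material) {
  const type = material.constructor.name;
  if (!darkMaterials[type]) {
    // Same-type constructor, so Line stays Line and Mesh stays Mesh.
    const Proto = Object.getPrototypeOf(material).constructor;
    darkMaterials[type] = new Proto({ color: 0x000000 });
  }
  return darkMaterials[type];
}
```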
After this change, TransformControl renders normally, and adding other object types such as Line or Points to the scene is no longer affected.
<img src="https://img.alicdn.com/imgextra/i3/O1CN01zx2hYW1NMgzjbrknz_!!6000000001556-2-tps-670-466.png" style="zoom: 50%;" />
Transparency fails
Sometimes we want the page container's background color to show through, so we set the renderer's clear alpha to 0.
renderer.setClearAlpha(0);
But after adding UnrealBloomPass, we found that the background transparency no longer took effect.
<img src="https://img.alicdn.com/imgextra/i4/O1CN013kgx8y1wmLjcPOXpQ_!!6000000006350-2-tps-994-866.png" alt="image-20211103175318581" style="zoom:50%;" />
Reading the source, we can see that UnrealBloomPass overwrites the alpha channel (see the Gaussian blur code in the first section):
gl_FragColor = vec4(diffuseSum/weightSum, 1.0);
The final fragment color is vec4(diffuseSum/weightSum, 1.0), so alpha is always 1. The only fix is to modify the source so that alpha is blurred like the color channels:
// diffuseSum/weightSum are declared above the loop as before; alphaSum
// must be added there too (weighted alpha of the center tap):
float alphaSum = texture2D( colorTexture, vUv ).a * weightSum;
for( int i = 1; i < KERNEL_RADIUS; i ++ ) {
float x = float(i);
float w = gaussianPdf(x, fSigma);
vec2 uvOffset = direction * invSize * x;
vec4 sample1 = texture2D( colorTexture, vUv + uvOffset);
vec4 sample2 = texture2D( colorTexture, vUv - uvOffset);
diffuseSum += (sample1.rgb + sample2.rgb) * w;
alphaSum += (sample1.a + sample2.a) * w;
weightSum += 2.0 * w;
}
gl_FragColor = vec4(diffuseSum/weightSum, alphaSum/weightSum);
The alphaSum line accumulates the alpha of the two symmetric samples with the same Gaussian weight, so alpha ends up averaged exactly like the color channels.
performance
In actual use we found that glow post-processing still has a significant performance cost. The post-processing itself requires a great deal of image-processing computation, repeated several times (ThreeJS performs five Gaussian blur passes), and the selective glow adds the extra work of separating glow objects from non-glow objects, so overall performance is inevitably reduced.
So far we have only judged performance intuitively, e.g. the FPS drops noticeably once glow is added, without profiling in depth. Since glow is an important rendering effect we cannot avoid, the performance problem deserves deeper study. The current idea is to abandon ThreeJS's UnrealBloom and implement the effect with a custom shader following the principles in the first section, allowing fine-grained control per scene (the key step being thresholding, which could be done with a local threshold). This has not been implemented yet; consider it a placeholder for a future article.
inner glow
Edge lighting is a very common effect in 3D scenes, used to highlight an object. It comes in two flavors: outer glow, where the light diffuses outward from the edge and falls off, and inner glow, where it diffuses inward from the edge and falls off. The glow effect discussed so far is an outer glow; so how do we achieve an inner glow?
The key difference between inner and outer glow lies in the words "edge" and "interior". Because the light appears inside the model, the effect can be implemented in the model's own material without post-processing, which is an obvious performance advantage. Of course, what matters most is whether the effect itself fits the use case.
Phenomenon
In real life, the most common example of this edge phenomenon is calm, deep water. Standing by a lake, the water right below our feet looks transparent with little reflection, but the water in the distance looks opaque and we see only reflection. In other words, the smaller the angle between the line of sight and the observed surface, the stronger the reflection.
<img src="https://img.alicdn.com/imgextra/i2/O1CN01Pf0Mwf1T5BY5JytmS_!!6000000002330-2-tps-603-1011.png" alt="img" style="zoom: 33%;" />
Fresnel reflection
In optics this phenomenon is called "Fresnel reflection". It arises from the reflection and refraction of light passing from one medium into another. For the vast majority of media, reflection is weakest at normal incidence and strongest, approaching total reflection with no transmission, when the light arrives perpendicular to the normal, i.e. grazing the surface.
We can simulate this phenomenon very cheaply in the computer: given the normal of a vertex on the model and the view direction of the current camera, the light intensity can be computed with very little work, yielding the glowing-edge effect. A picture makes it clearer:
<img src="https://pic1.zhimg.com/80/v2-ae113a0ea06b474d2dc49724518b2290_1440w.jpg" alt="img" style="zoom:50%;" />
When the surface normal is parallel to the screen, that is, when the camera views the surface nearly edge-on, the reflected light is strongest.
A commonly used Fresnel approximation is Schlick's: $F_{schlick}(v,n) = F_0 + (1-F_0)(1-v \cdot n)^5$
Where $F_0$ is the reflection coefficient, which is used to control the reflection intensity, $v$ is the viewing angle direction, and $n$ is the normal direction.
Another empirical equation: $F_{Empirical}(v,n) = \max(0, \min(1, \mathrm{bias} + \mathrm{scale} \times (1-v\cdot n)^{\mathrm{power}}))$
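Both approximations are easy to evaluate on the CPU. A sketch in plain JavaScript (Schlick with the usual fifth power; the F0 value 0.04 below is a typical dielectric reflectance used only for illustration):

```javascript
// Schlick and empirical Fresnel approximations; cosTheta = dot(v, n).
function fresnelSchlick(f0, cosTheta) {
  return f0 + (1 - f0) * Math.pow(1 - cosTheta, 5);
}

function fresnelEmpirical(cosTheta, bias, scale, power) {
  // Clamped to [0, 1], as in the max/min of the formula above.
  return Math.max(0, Math.min(1, bias + scale * Math.pow(1 - cosTheta, power)));
}
```

At cosTheta = 1 (looking straight down the normal) Schlick returns F0; at cosTheta = 0 (grazing) it returns 1, matching the lake observation above.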
First compute the view direction and normal in the vertex shader:
uniform vec3 view_vector; // view direction
varying vec3 vNormal; // normal
varying vec3 vPositionNormal;
void main() {
vNormal = normalize( normalMatrix * normal ); // transform into view space
vPositionNormal = normalize(normalMatrix * view_vector);
gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
The fragment shader applies the above formula to calculate the transparency change:
uniform vec3 glowColor;
uniform float b;
uniform float p;
uniform float s;
varying vec3 vNormal;
varying vec3 vPositionNormal;
void main() {
float a = pow(b + s * abs(dot(vNormal, vPositionNormal)), p );
gl_FragColor = vec4( glowColor, a );
}
During rendering, if the camera's perspective changes, it is updated in real time:
function render() {
const newView = camera.position.clone().sub(controls.target);
customMaterial.uniforms.view_vector.value = newView;
renderer.render( scene, camera );
requestAnimationFrame(render);
}
Rendering result:
<img src="https://img.alicdn.com/imgextra/i1/O1CN01k2lhtB1BsxVXqjND4_!!6000000000002-2-tps-942-698.png" style="zoom:33%;" />
finer control
Plain effects are definitely not enough, we also wanted this finer control over the glow effect. For example, the range, direction, and speed of light intensity of reflection. Of course, these parameters have been reflected in the formula: bias
value determines the position of the brightest value of the color, power
determines the speed and direction of light intensity change, and scale
controls the direction and range of light emission.
We can adjust the parameters in the demo to see the effect.
limit
Fresnel reflection computes the final lighting intensity from the angle between the normal and the view direction. For flat-faced models such as cubes and prisms, all vertices of a face share the same normal, so the desired gradient cannot appear. If the effect is required anyway, consider tessellating and smoothing the normals, or modifying vertex normals with a normal map.
References
- http://paradise.dtysky.moe/effect/global-bloom
- http://paradise.dtysky.moe/effect/rim-light-fresnel
- https://learnopengl-cn.github.io/05%20Advanced%20Lighting/07%20Bloom/
- https://zhuanlan.zhihu.com/p/38548428
- https://github.com/mbalex99/threejs-unrealbloompass-transparent-background-example
Author: ES2049 / timeless
The article may be freely reproduced, but please keep the original link. You are warmly welcome to join ES2049 Studio; please send your resume to caijun.hcj@alibaba-inc.com.