For an immersive game, global illumination in the scene is indispensable.
The subtle shifts of light and shadow produced by Dynamic Diffuse Global Illumination (DDGI) form a delicate, expressive visual language: every color in the scene gains a richer interpretation, the image conveys more layers of information, and the picture as a whole gains its finishing touch.
Direct Light Rendering vs Dynamic Diffuse GI
Such a delicate lighting language brings considerable technical challenges. Lighting responses vary widely across surface materials, and diffuse reflection scatters lighting information evenly in every direction. Light intensity, dynamic lighting changes, and changing surface materials are all variables that put platform performance and computing power to the test.
To tackle the complex "symptoms" that global illumination must overcome, the HMS Core graphics engine service provides a real-time Dynamic Diffuse Global Illumination (DDGI) technology that targets mobile devices, can be extended to all platforms, and requires no pre-baking. Built on the light probe pipeline, it introduces improved algorithms in the probe update and shading stages to reduce the computational load of the original pipeline. It delivers global illumination with multiple bounces of indirect light, improves rendering realism, and meets the real-time, interactive requirements of mobile devices.
Better yet, you can achieve immersive dynamic diffuse global illumination in just a few steps!
Demo example
Development Guide
Step Instructions
1. Initialization phase: Set up the Vulkan runtime environment and initialize the DDGIAPI class.
2. Preparation stage:
Create two Textures for saving DDGI rendering results, and pass the Texture information to DDGI.
Prepare the Mesh, Material, Light, Camera, resolution and other information required by the DDGI plug-in and pass it to DDGI.
Set DDGI parameters.
3. Rendering stage:
If the scene's mesh transformation matrices, light, or camera information changes, synchronize the changes to the DDGI side.
Call the Render() function; the DDGI rendering result is saved in the Textures created in the preparation stage.
Incorporate the DDGI result into the shading calculation. (A condensed call-order sketch follows this list.)
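Each of these stages is covered in detail under Development steps below. The outline that follows simply condenses the call order from those snippets; it reuses the sample's member variables and helper functions, so treat it as orientation rather than compilable code, and note that the function name IntegrationOutline is hypothetical.
// Condensed call order, pieced together from the snippets in the Development steps below.
// Outline only: it reuses the sample's members and helpers and is not compilable on its own.
void DDGIExample::IntegrationOutline() // hypothetical helper, for illustration
{
// 1. Initialization: Vulkan environment + DDGIAPI instance.
SetupDDGIDeviceInfo();
m_ddgiRender = std::make_unique<DDGIAPI>();
m_ddgiRender->InitDDGI(m_ddgiDeviceInfo);
// 2. Preparation: output Textures, scene data, and probe parameters.
CreateDDGITexture();
m_ddgiRender->SetResolution(width / m_downScale, height / m_downScale);
PrepareDDGIOutputTex(m_irradianceTex, &m_ddgiIrradianceTex);
PrepareDDGIOutputTex(m_normalDepthTex, &m_ddgiNormalDepthTex);
m_ddgiRender->SetAdditionalTexHandler(m_ddgiIrradianceTex, AttachmentTextureType::DDGI_IRRADIANCE);
m_ddgiRender->SetAdditionalTexHandler(m_ddgiNormalDepthTex, AttachmentTextureType::DDGI_NORMAL_DEPTH);
SetupDDGILights();
SetupDDGICamera();
PrepareDDGIMeshes();
SetupDDGIParameters();
m_ddgiRender->SetMeshs(m_ddgiMeshes);
m_ddgiRender->UpdateDirectionalLight(m_ddgiDirLight);
m_ddgiRender->UpdateCamera(m_ddgiCamera);
m_ddgiRender->UpdateDDGIProbes(m_ddgiSettings);
m_ddgiRender->Prepare();
// 3. Rendering stage (per frame): sync scene changes, render DDGI, then shade with the results.
m_ddgiRender->UpdateDirectionalLight(m_ddgiDirLight);
m_ddgiRender->UpdateCamera(m_ddgiCamera);
m_ddgiRender->DDGIRender(); // results land in the two Textures and are sampled in the step 9 shader
}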
Art Restrictions
1. For a scene where DDGI is to be enabled, set the DDGI origin parameter to the center of the scene, and choose the probe step size and probe counts so that the DDGI volume covers the entire scene.
2. For DDGI to produce correct occlusion, avoid walls without thickness; if a wall is too thin relative to the probe density, light leaking will occur. The planes that make up a wall should preferably be single-sided, that is, a wall should consist of two single-sided planes.
3. Since this is a mobile DDGI solution, the following suggestions help balance performance and power consumption: ① Limit the amount of geometry passed to the SDK (50,000 vertices or fewer is recommended), for example by passing only the main structures in the scene that contribute indirect light; ② Use an appropriate probe density and count, preferably no more than 10 × 10 × 10. Tune these suggestions against the final visual result. (A small coverage-check sketch follows this list.)
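Below is a minimal, hypothetical helper (not part of the DDGI SDK) that sketches how you might verify restrictions 1 and 3 before handing parameters to the plug-in. It assumes the probe grid is centered on origin, as the first restriction implies, and the SceneAabb type and field names are invented for illustration.
// Hypothetical sanity check for DDGI volume coverage -- not part of the DDGI SDK.
#include <glm/glm.hpp>
struct SceneAabb { // assumed scene bounding box, for illustration only
glm::vec3 minPos;
glm::vec3 maxPos;
};
bool CheckDDGIVolumeCoverage(const SceneAabb& scene,
const glm::vec3& origin,
const glm::vec3& probeStep,
const glm::ivec3& probeCount)
{
// Suggested upper bound from the art restrictions: no more than 10 x 10 x 10 probes.
if (probeCount.x * probeCount.y * probeCount.z > 10 * 10 * 10) {
return false;
}
// Half-extent of the probe grid along each axis, assuming the grid is centered on origin.
glm::vec3 halfExtent = probeStep * (glm::vec3(probeCount) - glm::vec3(1.0f)) * 0.5f;
glm::vec3 volumeMin = origin - halfExtent;
glm::vec3 volumeMax = origin + halfExtent;
// The volume should enclose the whole scene so no geometry falls outside the probe grid.
return glm::all(glm::lessThanEqual(volumeMin, scene.minPos)) &&
glm::all(glm::greaterThanEqual(volumeMax, scene.maxPos));
}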
Development steps
1. Download the plug-in SDK package and unzip it to obtain the DDGI SDK files, which include one header file and two .so files. For the download address of the .so libraries used on the Android platform, see: Dynamic Diffuse Global Illumination Plug-in.
2. The plug-in supports the Android platform and is built with CMake. The following is a partial CMakeLists.txt snippet, for reference only:
cmake_minimum_required(VERSION 3.4.1 FATAL_ERROR)
set(NAME DDGIExample)
project(${NAME})
set(PROJ_ROOT ${CMAKE_CURRENT_SOURCE_DIR})
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++14 -O2 -DNDEBUG -DVK_USE_PLATFORM_ANDROID_KHR")
file(GLOB EXAMPLE_SRC "${PROJ_ROOT}/src/*.cpp") # Include the developer's main program code.
include_directories(${PROJ_ROOT}/include) # Include headers; the DDGIAPI.h header can be placed in this directory.
# Import librtcore.so and libddgi.so.
ADD_LIBRARY(rtcore SHARED IMPORTED)
SET_TARGET_PROPERTIES(rtcore
PROPERTIES IMPORTED_LOCATION
${CMAKE_SOURCE_DIR}/src/main/libs/librtcore.so)
ADD_LIBRARY(ddgi SHARED IMPORTED)
SET_TARGET_PROPERTIES(ddgi
PROPERTIES IMPORTED_LOCATION
${CMAKE_SOURCE_DIR}/src/main/libs/libddgi.so)
add_library(native-lib SHARED ${EXAMPLE_SRC})
target_link_libraries(
native-lib
...
ddgi # Link the ddgi library.
rtcore
android
log
z
...
)
3. Set up the Vulkan environment and initialize the DDGIAPI class.
// Set the Vulkan environment information required by the DDGI SDK,
// including logicalDevice, queue, and queueFamilyIndex.
void DDGIExample::SetupDDGIDeviceInfo()
{
m_ddgiDeviceInfo.physicalDevice = physicalDevice;
m_ddgiDeviceInfo.logicalDevice = device;
m_ddgiDeviceInfo.queue = queue;
m_ddgiDeviceInfo.queueFamilyIndex = vulkanDevice->queueFamilyIndices.graphics;
}
void DDGIExample::PrepareDDGI()
{
// Set the Vulkan environment information.
SetupDDGIDeviceInfo();
// Call the DDGI initialization function.
m_ddgiRender->InitDDGI(m_ddgiDeviceInfo);
...
}
void DDGIExample::Prepare()
{
...
// Create the DDGIAPI object (m_ddgiRender is a class member).
m_ddgiRender = std::make_unique<DDGIAPI>();
...
PrepareDDGI();
...
}
4. Create two Textures: one to store the diffuse global illumination result and one to store the normal-depth map from the camera's perspective. To improve rendering performance, the Textures support downscaling. The lower the resolution, the better the rendering performance, but aliasing artifacts in the result, such as jagged edges, may become more noticeable.
// Create the Textures used to store the rendering results.
void DDGIExample::CreateDDGITexture()
{
VkImageUsageFlags usage = VK_IMAGE_USAGE_COLOR_ATTACHMENT_BIT | VK_IMAGE_USAGE_SAMPLED_BIT;
int ddgiTexWidth = width / m_shadingPara.ddgiDownSizeScale; // Texture width.
int ddgiTexHeight = height / m_shadingPara.ddgiDownSizeScale; // Texture height.
glm::ivec2 size(ddgiTexWidth, ddgiTexHeight);
// Create the Texture that stores the irradiance result.
m_irradianceTex.CreateAttachment(vulkanDevice,
ddgiTexWidth,
ddgiTexHeight,
VK_FORMAT_R16G16B16A16_SFLOAT,
usage,
VK_IMAGE_LAYOUT_SHADER_READ_ONLY_OPTIMAL,
m_defaultSampler);
// Create the Texture that stores the normal and depth results.
m_normalDepthTex.CreateAttachment(vulkanDevice,
ddgiTexWidth,
ddgiTexHeight,
VK_FORMAT_R16G16B16A16_SFLOAT,
usage,
VK_IMAGE_LAYOUT_SHADER_READ_ONLY_OPTIMAL,
m_defaultSampler);
}
// Fill in the DDGIVulkanImage information.
void DDGIExample::PrepareDDGIOutputTex(const vks::Texture& tex, DDGIVulkanImage *texture) const
{
texture->image = tex.image;
texture->format = tex.format;
texture->type = VK_IMAGE_TYPE_2D;
texture->extent.width = tex.width;
texture->extent.height = tex.height;
texture->extent.depth = 1;
texture->usage = tex.usage;
texture->layout = tex.imageLayout;
texture->layers = 1;
texture->mipCount = 1;
texture->samples = VK_SAMPLE_COUNT_1_BIT;
texture->tiling = VK_IMAGE_TILING_OPTIMAL;
}
void DDGIExample::PrepareDDGI()
{
...
// Set the Texture resolution.
m_ddgiRender->SetResolution(width / m_downScale, height / m_downScale);
// Set the DDGIVulkanImage information used to store the rendering results.
PrepareDDGIOutputTex(m_irradianceTex, &m_ddgiIrradianceTex);
PrepareDDGIOutputTex(m_normalDepthTex, &m_ddgiNormalDepthTex);
m_ddgiRender->SetAdditionalTexHandler(m_ddgiIrradianceTex, AttachmentTextureType::DDGI_IRRADIANCE);
m_ddgiRender->SetAdditionalTexHandler(m_ddgiNormalDepthTex, AttachmentTextureType::DDGI_NORMAL_DEPTH);
...
}
void DDGIExample::Prepare()
{
...
CreateDDGITexture();
...
PrepareDDGI();
...
}
5. Prepare meshes, materials, light sources, and camera data required for DDGI rendering.
// Mesh struct; submeshes are supported.
struct DDGIMesh {
std::string meshName;
std::vector<DDGIVertex> meshVertex;
std::vector<uint32_t> meshIndice;
std::vector<DDGIMaterial> materials;
std::vector<uint32_t> subMeshStartIndexes;
...
};
// Directional light struct; currently only one directional light is supported.
struct DDGIDirectionalLight {
CoordSystem coordSystem = CoordSystem::RIGHT_HANDED;
int lightId;
DDGI::Mat4f localToWorld;
DDGI::Vec4f color;
DDGI::Vec4f dirAndIntensity;
};
// Main camera struct.
struct DDGICamera {
DDGI::Vec4f pos;
DDGI::Vec4f rotation;
DDGI::Mat4f viewMat;
DDGI::Mat4f perspectiveMat;
};
// Set the DDGI light source information.
void DDGIExample::SetupDDGILights()
{
m_ddgiDirLight.color = VecInterface(m_dirLight.color);
m_ddgiDirLight.dirAndIntensity = VecInterface(m_dirLight.dirAndPower);
m_ddgiDirLight.localToWorld = MatInterface(inverse(m_dirLight.worldToLocal));
m_ddgiDirLight.lightId = 0;
}
// Set the DDGI camera information.
void DDGIExample::SetupDDGICamera()
{
m_ddgiCamera.pos = VecInterface(m_camera.viewPos);
m_ddgiCamera.rotation = VecInterface(m_camera.rotation, 1.0);
m_ddgiCamera.viewMat = MatInterface(m_camera.matrices.view);
glm::mat4 yFlip = glm::mat4(1.0f);
yFlip[1][1] = -1;
m_ddgiCamera.perspectiveMat = MatInterface(m_camera.matrices.perspective * yFlip);
}
// Prepare the mesh information required by DDGI.
// A glTF scene is used as an example.
void DDGIExample::PrepareDDGIMeshes()
{
for (const auto& node : m_models.scene.linearNodes) {
DDGIMesh tmpMesh;
tmpMesh.meshName = node->name;
if (node->mesh) {
tmpMesh.meshName = node->mesh->name; // Mesh name.
tmpMesh.localToWorld = MatInterface(node->getMatrix()); // Mesh transformation matrix.
// Bone/skinning matrices of the mesh.
if (node->skin) {
tmpMesh.hasAnimation = true;
for (auto& matrix : node->skin->inverseBindMatrices) {
tmpMesh.boneTransforms.emplace_back(MatInterface(matrix));
}
}
// Material and vertex information of the mesh.
for (vkglTF::Primitive *primitive : node->mesh->primitives) {
...
}
}
m_ddgiMeshes.emplace(std::make_pair(node->index, tmpMesh));
}
}
void DDGIExample::PrepareDDGI()
{
...
// Convert the data into the format required by DDGI.
SetupDDGILights();
SetupDDGICamera();
PrepareDDGIMeshes();
...
// Pass the data to DDGI.
m_ddgiRender->SetMeshs(m_ddgiMeshes);
m_ddgiRender->UpdateDirectionalLight(m_ddgiDirLight);
m_ddgiRender->UpdateCamera(m_ddgiCamera);
...
}
6. Set parameters such as the position and quantity of DDGI probes.
// Set the DDGI algorithm parameters.
void DDGIExample::SetupDDGIParameters()
{
m_ddgiSettings.origin = VecInterface(3.5f, 1.5f, 4.25f, 0.f);
m_ddgiSettings.probeStep = VecInterface(1.3f, 0.55f, 1.5f, 0.f);
...
}
void DDGIExample::PrepareDDGI()
{
...
SetupDDGIParameters();
...
// Pass the data to DDGI.
m_ddgiRender->UpdateDDGIProbes(m_ddgiSettings);
...
}
7. Call the Prepare() function of DDGI to parse the previously passed data.
void DDGIExample::PrepareDDGI()
{
...
m_ddgiRender->Prepare();
}
8. Call the DDGI rendering function DDGIRender() to update the scene's indirect lighting and cache it in the two DDGI Textures set in step 4.
Note:
In the current version, the rendering results are a diffuse indirect-light map and a normal-depth map from the camera's perspective. The developer then uses a bilateral filtering algorithm, guided by the normal-depth map, to upsample the diffuse indirect-light result to screen size and obtain the final diffuse global illumination.
If DDGIRender() is not called in a frame, the Textures retain the rendering result of the previous frame.
#define RENDER_EVERY_NUM_FRAME 2
void DDGIExample::Draw()
{
...
// Call DDGIRender() once every two frames.
if (m_ddgiON && m_frameCnt % RENDER_EVERY_NUM_FRAME == 0) {
m_ddgiRender->UpdateDirectionalLight(m_ddgiDirLight); // Update the light source information.
m_ddgiRender->UpdateCamera(m_ddgiCamera); // Update the camera information.
m_ddgiRender->DDGIRender(); // Execute DDGI rendering once; the result is stored in the Textures created in step 4.
}
...
}
void DDGIExample::Render()
{
if (!prepared) {
return;
}
SetupDDGICamera();
if (!paused || m_camera.updated) {
UpdateUniformBuffers();
}
Draw();
m_frameCnt++;
}
9. Superimpose the DDGI indirect lighting result in the final shading pass, as follows:
// Final shading shader.
// Compute the DDGI value for the screen-space coordinate by upsampling.
vec3 Bilateral(ivec2 uv, vec3 normal)
{
...
}
void main()
{
...
vec3 result = vec3(0.0);
result += DirectLighting();
result += IndirectLighting();
vec3 DDGIIrradiances = vec3(0.0);
ivec2 texUV = ivec2(gl_FragCoord.xy);
texUV.y = shadingPara.ddgiTexHeight - texUV.y;
if (shadingPara.ddgiDownSizeScale == 1) { // Full resolution (no downscaling).
DDGIIrradiances = texelFetch(irradianceTex, texUV, 0).xyz;
} else { // Downscaled resolution.
ivec2 inDirectUV = ivec2(vec2(texUV) / vec2(shadingPara.ddgiDownSizeScale));
DDGIIrradiances = Bilateral(inDirectUV, N);
}
result += DDGILighting();
...
Image = vec4(result_t, 1.0);
}
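The body of Bilateral() is omitted above. As a reference for the idea described in the step 8 note, here is a CPU-side C++ sketch of joint bilateral upsampling weights driven by normal and depth similarity. It is not the sample's shader; the GBufferSample type, field names, and weight formulas are illustrative assumptions.
// CPU-side illustration of joint bilateral upsampling -- not the shader used by the sample.
#include <glm/glm.hpp>
#include <vector>
#include <cmath>
#include <algorithm>
struct GBufferSample { // assumed layout, for illustration only
glm::vec3 irradiance; // low-resolution DDGI irradiance
glm::vec3 normal;     // normal from the normal-depth texture
float depth;          // depth from the normal-depth texture
};
// Upsample one full-resolution pixel from its 2x2 low-resolution neighborhood.
// The caller guarantees that lowUV + 1 stays inside the low-resolution image.
glm::vec3 BilateralUpsample(const std::vector<GBufferSample>& lowRes, int lowWidth,
glm::ivec2 lowUV, const glm::vec3& fullResNormal, float fullResDepth)
{
glm::vec3 sum(0.0f);
float weightSum = 0.0f;
for (int dy = 0; dy <= 1; ++dy) {
for (int dx = 0; dx <= 1; ++dx) {
const GBufferSample& s = lowRes[(lowUV.y + dy) * lowWidth + (lowUV.x + dx)];
// Edge-stopping weights: samples with similar normals and depths keep their
// influence, while samples across geometric edges are suppressed.
float normalWeight = std::pow(std::max(glm::dot(s.normal, fullResNormal), 0.0f), 8.0f);
float depthWeight = 1.0f / (1.0f + std::abs(s.depth - fullResDepth));
float w = normalWeight * depthWeight + 1e-4f; // small bias avoids a zero weight sum
sum += s.irradiance * w;
weightSum += w;
}
}
return sum / weightSum;
}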
Learn more details >>
Visit the official website of Huawei Developer Alliance
Get development guidance documents
Huawei Mobile Services open source repositories: GitHub, Gitee
Follow us to get the latest HMS Core technical information first-hand.