Introduction: Why is the heat map jittering?
Deck.GL is Uber's open-source geographic data rendering framework. While using Deck.GL to draw a heat map, we found that as the map was zoomed in, the map layer visibly jittered and the heat aggregation results were also wrong. The demo below shows the phenomenon: the yellow layer is the heat map layer and the black dots are the original data points. Clearly, as the map keeps zooming in, the heat map points no longer correspond to the original data points and keep shaking.
Yet the official demo does not exhibit this phenomenon, so what is the problem?
If there is one difference between the two demos, it is the data. The test demo uses indoor-map-level data, while the official demo uses city-level data. Indoor-map coordinates only start to differ at the fifth or sixth decimal place, whereas city-level coordinates differ at the first or second decimal place, so loss of data precision is very likely the cause.
Following this conjecture, there are indeed many articles online describing problems caused by WebGL precision loss; it appears to be a common issue. So how is it solved? Let's start from the data and see how different geographic rendering frameworks address this problem.
Background
Web Mercator projection
Because the Earth is round, displaying a map on a plane requires a projection. The Mercator projection, also called the "conformal cylindrical projection", preserves angles, which keeps shapes locally undistorted and keeps directions and relative positions correct. The details of the projection are beyond the scope of this article; interested readers can explore them on their own. The Web Mercator projection further simplifies the Earth's ellipsoid into a perfect sphere, and the coordinate conversion formula is as follows:
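With longitude $\lambda$ and latitude $\varphi$ in radians and sphere radius $R$, the standard conversion is:

$$x = R\,\lambda, \qquad y = R\,\ln\tan\!\left(\frac{\pi}{4} + \frac{\varphi}{2}\right)$$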
The formula shows that converting latitude to the Y coordinate is non-linear. The computation relies on trigonometric and logarithmic operations, and it must be performed for every coordinate on every rendered frame, which clearly incurs significant computational overhead.
Loss of precision
A geographic rendering framework must convert the lng/lat coordinates of each feature into screen pixel coordinates. As the map zooms in, the translation component of the lng/lat-to-pixel transformation matrix grows larger and larger, that is, the pixel coordinate values keep growing. JavaScript numbers are 64-bit double-precision floats, but GLSL shader values are only 32-bit single-precision floats, so passing data from JS to the shader inevitably loses precision. If this loss is not handled, then past a certain zoom level the features visibly jitter as the map moves.
The `Math.fround` method rounds a value from 64-bit to the nearest 32-bit float:

```js
Math.fround(-122.4000588); // -122.40006256103516
```

Suppose the coordinate is `[-122.4000588, 37.7900699]`. Converted to 32-bit floats it becomes `[-122.40006256103516, 37.790069580078125]`, and the two points are about 0.3325 meters apart in the real world.
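We can roughly verify that figure with an equirectangular approximation. The helper below is our own illustration (the meters-per-degree constant is an approximate spherical value), not part of any framework:

```js
// Approximate real-world distance between a coordinate and its float32 rounding.
function float32ErrorMeters(lng, lat) {
  const dLng = Math.fround(lng) - lng;
  const dLat = Math.fround(lat) - lat;
  const metersPerDegLat = 111320; // rough spherical approximation
  const metersPerDegLng = metersPerDegLat * Math.cos((lat * Math.PI) / 180);
  return Math.hypot(dLng * metersPerDegLng, dLat * metersPerDegLat);
}

console.log(float32ErrorMeters(-122.4000588, 37.7900699)); // ≈ 0.33
```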
So how do different geographic rendering frameworks deal with both the heavy computation and the loss of precision?
Mapbox's approach
Mapbox uses a tile coordinate system. Geographic features are usually static, so the map can be cut into tiles in advance, each tile containing the actual geographic features it covers. Then, every time the camera changes, only the tiles in the viewport need to be rendered. The tile coordinate system itself is easy to understand; the figure below shows it at zoom level 2.
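As a concrete reference, the standard "slippy map" formulas convert a lng/lat pair into tile indices at a given zoom. This helper is a generic illustration, not Mapbox's actual code:

```js
// lng/lat -> tile indices at a given zoom (standard slippy-map formulas)
function lngLatToTile(lng, lat, zoom) {
  const n = Math.pow(2, zoom); // number of tiles per axis
  const x = Math.floor(((lng + 180) / 360) * n);
  const latRad = (lat * Math.PI) / 180;
  const y = Math.floor(
    ((1 - Math.log(Math.tan(latRad) + 1 / Math.cos(latRad)) / Math.PI) / 2) * n
  );
  return { x, y };
}

lngLatToTile(-122.4, 37.79, 2); // { x: 0, y: 1 }
```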
All of Mapbox's processing happens in the plane coordinate system, so feature lng/lat coordinates are projected to plane coordinates via the Mercator projection, and during every render the offset matrix of each tile relative to the center point is recalculated in real time to transform the data into the tile coordinate system:
```js
// Convert a pixel value into tile units; EXTENT is the tile's internal
// coordinate range (8192 in Mapbox GL).
function pixelsToTileUnits(tile: {tileID: OverscaledTileID, tileSize: number},
                           pixelValue: number, z: number): number {
  return pixelValue * (EXTENT / (tile.tileSize * Math.pow(2, z - tile.tileID.overscaledZ)));
}

const translation = [
  inViewportPixelUnitsUnits ? translate[0] : pixelsToTileUnits(tile, translate[0], this.transform.zoom),
  inViewportPixelUnitsUnits ? translate[1] : pixelsToTileUnits(tile, translate[1], this.transform.zoom),
  0
];

const translatedMatrix = new Float32Array(16);
mat4.translate(translatedMatrix, matrix, translation);
```
After obtaining plane coordinates, Mapbox simplifies some features to reduce the data volume, clips features against tile boundaries, determines which tiles fall within the current viewport, and finally renders the contained features tile by tile. For the detailed process, see "Rendering big geodata on the fly with GeoJSON-VT" in the references.
In the rendering step, the feature coordinates passed into the shader for each tile are neither lng/lat nor absolute Mercator coordinates, but coordinates relative to the current tile:
```js
// Mercator coordinates -> tile-relative coordinates in [0, 8192]
// z2 = 2^z, (tx, ty) is the tile origin, extent is 8192
function transformPoint(x, y, extent, z2, tx, ty) {
  return [
    Math.round(extent * (x * z2 - tx)),
    Math.round(extent * (y * z2 - ty))
  ];
}
```
Because tile-relative coordinates are used, GLSL's 32-bit precision is sufficient and the precision problem disappears. The trade-off is that every time the camera projection matrix changes, the projection matrix of each tile must also be recalculated.
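A simplified sketch of that per-tile matrix setup might look like the following; the names and structure are illustrative rather than Mapbox's exact code:

```js
import { mat4 } from 'gl-matrix';

const EXTENT = 8192; // tile-internal coordinate range

// Rebuild a tile's projection matrix from the camera matrix each frame.
function calculateTileMatrix(projMatrix, tileID, worldSize) {
  const tileScale = worldSize / Math.pow(2, tileID.z); // tile size in world units
  const posMatrix = mat4.create();
  // Move to the tile's origin, then map [0, EXTENT] onto the tile's extent.
  mat4.translate(posMatrix, projMatrix, [tileID.x * tileScale, tileID.y * tileScale, 0]);
  mat4.scale(posMatrix, posMatrix, [tileScale / EXTENT, tileScale / EXTENT, 1]);
  return posMatrix;
}
```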
Beyond roughly zoom level 18 (indoor maps commonly go to level 22), the tile grid becomes very fine and tile cutting takes a long time. For the sake of user experience, when Mapbox faces a set of tiles to render, it sorts them by distance from the screen center and renders the central tiles first:
```js
// Screen-center coordinate
const centerCoord = MercatorCoordinate.fromLngLat(this.center);
const centerPoint = new Point(numTiles * centerCoord.x - 0.5, numTiles * centerCoord.y - 0.5);
// Sort the covering tiles by distance from the screen center
tiles.sort((a, b) => centerPoint.dist(a.canonical) - centerPoint.dist(b.canonical));
```
In a nutshell, Mapbox performs the Mercator projection on the CPU and passes tile-relative coordinates into the shader, avoiding GLSL precision loss, while feature simplification and tiling greatly reduce the data volume and keep the CPU workload under control.
Deck.GL's approach
Deck.GL is positioned to handle millions of frequently changing data points, so running the Mercator projection on the CPU would seriously hurt performance. Instead, Deck.GL passes lng/lat coordinates directly to the GPU and converts them in the vertex shader. This inevitably loses precision when JS hands data to GLSL. The errors are imperceptible while the visible map area is large, but at high zoom levels they cause visible offsets, the "jitter" phenomenon, and the errors grow as the zoom level increases.
Deck.GL has measured the Y-axis pixel error at different zoom levels and latitudes.
Splitting data into high and low parts
To solve this problem, Deck.GL v3 introduced emulated 64-bit double-precision floats in GLSL. Each number is split into a high part and a low part, and both parts participate in the GPU computation:

```js
highPart = Math.fround(x);
lowPart  = x - highPart;
```
64-bit arithmetic is then emulated by cascades of 32-bit floating-point operations, at an obviously huge GPU cost: a 64-bit division maps to 11 32-bit operations, and a 64-bit matrix operation (mat4 × vec4) requires 1952 32-bit operations.
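The basic building block of such emulation is an error-free "two-sum" that captures the rounding error of a 32-bit addition. Here is a sketch in JS of the classic Knuth two-sum, with `Math.fround` standing in for GLSL float arithmetic; it illustrates the technique rather than Deck.GL's literal shader code:

```js
// Add two float32 values and return [sum, error], such that
// sum + error exactly equals a + b. Every intermediate result is
// forced to 32 bits with Math.fround, mimicking GLSL floats.
function twoSum(a, b) {
  const s = Math.fround(a + b);
  const v = Math.fround(s - a);
  const e = Math.fround(
    Math.fround(a - Math.fround(s - v)) + Math.fround(b - v)
  );
  return [s, e];
}
```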
This scheme does fix the jitter caused by precision loss, but the emulated 64-bit matrix math seriously hurts shader compilation and parsing performance and increases the data bandwidth from CPU to GPU. Some low-end graphics drivers are simply incompatible, and even compatible ones may take several seconds to compile the shader, freezing the display.
Offset coordinate system
"Offset Coords" solution with the center of the screen as the origin of the dynamic coordinates after the v6.2 version to solve this problem. .
The basic idea of the offset coordinate system is that subtracting two nearby coordinates cancels their identical high-order bits, and the small difference can be stored in 32 bits with precision to spare. A fixed reference point is needed for the subtraction, and the viewport center is chosen. The offset computation itself must happen in the shader: the lng/lat of the viewport center can change every frame, and repeating the offset matrix calculation on the CPU each frame would not be affordable.
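A quick JS illustration of why the difference survives 32-bit rounding while the absolute coordinates do not (the numbers are arbitrary):

```js
const center = -122.4000588;
const point  = -122.4000601; // about 1.3e-6 degrees away

// Rounding first: both values snap to the same float32 grid point
// (the grid spacing is ~7.6e-6 degrees at this magnitude),
// so the offset is lost entirely.
console.log(Math.fround(point) - Math.fround(center)); // 0

// Differencing first: the small offset keeps nearly full precision.
console.log(Math.fround(point - center)); // ≈ -1.3e-6
```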
The following shader code chooses, based on the current coordinate system, whether to process normal lng/lat values or lng/lat differences.
```glsl
// deck.gl/shaders/project.glsl
vec4 project_position(vec4 position) {
  // Handle lng/lat offsets
  if (project_uCoordinateSystem == COORDINATE_SYSTEM_LNGLAT_AUTO_OFFSET) {
    // Offset from the viewport center keeps only the low-order bits
    // of the lng/lat values, so 32 bits are enough
    float X = position.x - project_coordinate_origin.x;
    float Y = position.y - project_coordinate_origin.y;
    return project_offset_(vec4(X, Y, position.z, position.w));
  } else {
    // Normal path (abridged): lng/lat -> world coordinates
    return vec4(
      project_mercator(position.xy) * WORLD_SCALE * u_project_scale,
      project_scale(position.z),
      position.w
    );
  }
}
```
So how is the calculation mode chosen? Deck.GL sets a zoom-level threshold and switches between the normal and offset coordinate systems. When the zoom level exceeds 12, the viewport center in the lng/lat coordinate system is computed:
```js
const LNGLAT_AUTO_OFFSET_ZOOM_THRESHOLD = 12;

if (coordinateZoom < LNGLAT_AUTO_OFFSET_ZOOM_THRESHOLD) {
  // Normal lng/lat path (omitted)
} else {
  // Use offset coordinates: pass in the viewport center in lng/lat
  const lng = Math.fround(viewport.longitude);
  const lat = Math.fround(viewport.latitude);
  shaderCoordinateOrigin = [lng, lat];
}
```
In the vertex shader, the final pixel-space position is therefore the sum of two parts: the matrix transform of the offset in world coordinates, plus the projection of the viewport center:
```glsl
// Convert offset or normal lng/lat values to world coordinates
vec4 project_pos = project_position(vec4(a_pos, 0.0, 1.0));
gl_Position = u_mvp_matrix * project_pos + u_viewport_center_projection;
```
The projection of the viewport center can be recomputed on the CPU every frame, while the offset matrix math runs in the shader. Each frame therefore only needs to update a handful of uniforms, which costs almost nothing on either the CPU or the GPU.
The test results are shown in the figure below. The new hybrid coordinate system (yellow) matches the precision of the 64-bit mode (red) despite using only 32 bits, while the original 32-bit mode (blue) jitters at the same zoom level.
Calculating the offset: Taylor expansion
One detail of the calculation above deserves attention: how do we estimate the difference in Mercator-projected coordinates from the difference in world (lng/lat) coordinates? In the offset coordinate system, α is the dynamic screen-center coordinate, the difference between any other point and the center is (x − α), and f is the conversion function from world coordinates to projected coordinates. A Taylor expansion estimates f at any point from the value f(α) and the derivatives of f at α; the more terms kept, the smaller the approximation error.
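To second order:

$$f(x) \approx f(\alpha) + f'(\alpha)\,(x - \alpha) + \frac{f''(\alpha)}{2}\,(x - \alpha)^2$$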
Recall the Web Mercator projection formula, this time in world (pixel) coordinates:
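With world size $W = 512 \cdot 2^{zoom}$ (one common scale convention; implementations differ in constants) and $\lambda$, $\varphi$ in radians:

$$x(\lambda) = \frac{W}{2\pi}\,(\lambda + \pi), \qquad y(\varphi) = \frac{W}{2\pi}\left(\pi - \ln\tan\!\left(\frac{\pi}{4} + \frac{\varphi}{2}\right)\right)$$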
Since x is linear in longitude, a linear estimate is exact:
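$$\Delta x = K_{11}\,\Delta\lambda, \qquad K_{11} = \frac{\partial x}{\partial \lambda} = \frac{W}{2\pi}$$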
The y direction is non-linear, so a second-order Taylor expansion is used:
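$$\Delta y \approx K_{21}\,\Delta\varphi + K_{22}\,\Delta\varphi^2, \qquad K_{21} = f'(\alpha), \quad K_{22} = \frac{f''(\alpha)}{2}$$

Here $f$ is the $y(\varphi)$ function above and $\alpha$ is the viewport-center latitude.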
Vector operations in GLSL implement the conversion above compactly: `u_pixels_per_degree` corresponds to $[K_{11}, K_{21}]$, `u_pixels_per_degree2` corresponds to $[0, K_{22}]$, and the values of $K_{11}$, $K_{21}$ and $K_{22}$ are obtained from the derivatives of f.
```glsl
// offset: [delta lng, delta lat]
vec4 project_offset(vec4 offset) {
  float dy = offset.y;
  dy = clamp(dy, -1., 1.);
  // First-order scale plus the second-order correction in y
  vec3 pixels_per_unit = u_pixels_per_degree + u_pixels_per_degree2 * dy;
  return vec4(offset.xyz * pixels_per_unit, offset.w);
}
```
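The CPU side only has to refresh these uniforms once per frame. The helper below sketches how they might be derived by following the derivative formulas above; it is our own illustration with an assumed 512-pixel tile size, not Deck.GL's exact source:

```js
// Per-degree pixel scales at the viewport center, for the offset shader.
function getPixelScales(latitude, zoom, tileSize = 512) {
  const worldSize = tileSize * Math.pow(2, zoom);
  const pixelsPerDegreeX = worldSize / 360;
  const latRad = (latitude * Math.PI) / 180;
  const latCos = Math.cos(latRad);
  // First derivative of Mercator y with respect to latitude (per degree)
  const pixelsPerDegreeY = pixelsPerDegreeX / latCos;
  // Half the second derivative, for the quadratic correction term
  const pixelsPerDegreeY2 =
    (pixelsPerDegreeX * (Math.PI / 180) * Math.tan(latRad)) / latCos / 2;
  return {
    u_pixels_per_degree: [pixelsPerDegreeX, pixelsPerDegreeY],
    u_pixels_per_degree2: [0, pixelsPerDegreeY2]
  };
}
```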
Summary
So back to the problem at the beginning of this article: reading the heat map source code reveals the cause. Deck.GL's heat map module passed coordinates through without any conversion, so the coordinates reaching the shader had already lost precision. As the previous sections showed, Deck.GL used different precision strategies in different versions, so the strategy migration was probably not covered by tests, leaving the heat map module broken. I filed an issue and it was resolved quickly.
After switching to the new version of the heat map, the problem was solved.
To sum up, there are several ways to solve the jitter of geographic data rendered with WebGL at high zoom levels:
- Tile-relative coordinates effectively solve the precision problem, but as zoom increases, tile cutting takes longer and longer, and fixing the precision of non-tiled data requires first mapping it onto the corresponding tiles.
- Splitting data into high and low parts, turning one Float64Array into two Float32Arrays, also solves the precision problem, at the cost of a significant increase in memory overhead and GPU computation.
- In an offset coordinate system, subtracting two nearby coordinates cancels the high-order bits, and the difference can be stored in 32 bits with precision to spare.
Besides data jitter, WebGL precision loss also causes z-fighting and Z-buffer precision problems, which this article will not cover; interested readers can find related material online.
References
- How (sometimes) assuming the Earth is "flat" helps speed up rendering in deck.gl
- Rendering big geodata on the fly with GeoJSON-VT
Author: ES2049 | timeless