
On the afternoon of July 17th, at the Beijing stop of the front-end salon tour, Agora cross-platform development engineer Lu Xuhui delivered a talk titled "Flutter2 Rendering Principles and How to Implement Video Rendering". This article is a compilation of that speech.

This talk covers three parts:

  1. Flutter2 overview.
  2. Implementation of the Flutter2 video rendering plug-in.
  3. Flutter2 rendering principles (source code).

Preface

In fact, Flutter1's domestic market share was not high. Many developers know that Flutter's upper-layer language is Google's Dart (a language that once tried to replace JavaScript and failed), and Dart itself has not won over many developers, which limited Flutter's acceptance. Many domestic companies still choose React Native or stick with native development. But with the arrival of Flutter2 (full platform support), and Ali's Beihai framework (a cross-platform framework built on the Flutter Engine's rendering capabilities that exposes JavaScript as the upper layer), I believe Flutter2's future is promising. Considering that many readers may be front-end developers, in the third part I will approach things from the Web perspective, and you will see a lot of both familiar and unfamiliar content. Whether or not you are a Flutter developer, or even know Flutter, does not matter; what matters is Flutter's design ideas, which I hope will be helpful to everyone.

Flutter2 overview

Flutter2 is the latest version of Flutter, released by Google in March 2021. Based on Dart 2.12, it supports null safety, which you can compare to TypeScript's "?": the compiler asks you to verify any value that may be null, so many null-pointer problems can be caught during development. More importantly, it provides stable support for the Web, and support for the desktop has also been merged in.
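As a minimal illustration of Dart's null safety (a generic sketch, not code from the talk):

```dart
void main() {
  int? maybe;        // nullable: may hold null
  int definite = 0;  // non-nullable: `definite = null;` is a compile error

  // The compiler requires a null check (or the `?.` / `??` operators)
  // before a nullable value can be used where a non-nullable one is expected.
  if (maybe != null) {
    definite = maybe; // `maybe` is promoted to `int` inside this branch
  }
  print(definite);
}
```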

Let's take a look at the overall architecture of Flutter2:

[image]

The Web part of Flutter2 includes the Framework layer and the Browser layer. The Framework layer covers rendering, drawing, gesture handling, etc. The Browser layer covers CSS, HTML, Canvas, WebGL, etc. (after all, it still runs in the browser), and finally WebAssembly, which is used to drive the Skia rendering engine written in C/C++. We will introduce this in detail in the third part.

In addition to the shared Framework layer, the Native part also includes the Engine layer and the Embedder layer. The Engine layer mainly includes the Dart virtual machine, Isolate initialization, layer composition, GPU rendering, platform channels, text layout, etc., while the Embedder layer mainly adapts to the characteristics of each platform.

At first glance the difference between Web and Native is quite big, but in fact the Web also has a Dart-based Engine layer, called web_ui, which mainly handles composition and rendering on the Web.

[image]

Next, take a brief look at Flutter2's platform differences, as shown in the figure above. Currently Flutter2 supports 6 mainstream platforms: Web, Android, iOS, Windows, macOS and Linux. Compared with other cross-platform frameworks such as React Native and Electron (representatives of mobile and desktop, respectively), Flutter2 has richer platform support. Although React Native also has desktop support contributed by Microsoft and Web support via Expo, it is still not unified enough.

For build tools and package management, Flutter2 uses the standard approach of each platform. The Web is still based on JavaScript, thanks to dart2js, which compiles Dart into JavaScript; Android is based on the Gradle system; iOS and macOS use CocoaPods to introduce Flutter into the project; Windows and Linux are mainly based on CMake.

Regarding some features of Flutter: PlatformView provides the ability to bridge native controls, such as displaying an Element on the Web or a custom View on Android and iOS. However, PlatformView is currently not supported on the desktop; this is not technically impossible, it simply has not been developed yet. ExternalTexture is an external texture that lets users render their own graphics data, and dart:ffi gives Flutter the ability to call C and C++ directly. Both of the latter are supported everywhere except the Web.

Implementation of Flutter2 video rendering plug-in

1. Implementation process of the video rendering plug-in

Next, I will share Agora's practice with the video rendering plug-in, mainly for the Web and desktop.

[image]

As described in the platform differences above, the Web does not support ExternalTexture, and the desktop does not support PlatformView. So on the Web we use PlatformView to implement video rendering. The basic process is to use ui.platformViewRegistry to register a PlatformView factory that returns a DivElement. After the DivElement is created, package:js is used to implement calls between Dart and JavaScript.
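A minimal sketch of this registration on the Web could look like the following (the view type name is made up for illustration; this is not the actual plug-in source):

```dart
import 'dart:html' as html;
import 'dart:ui' as ui;

import 'package:flutter/widgets.dart';

void registerVideoView() {
  // Register a factory that returns a DivElement per view instance.
  // ignore: undefined_prefixed_name
  ui.platformViewRegistry.registerViewFactory(
    'agora-video-view', // hypothetical view type name
    (int viewId) => html.DivElement()..id = 'agora-video-$viewId',
  );
}

// Embed the registered view in the widget tree.
Widget buildVideoView() =>
    const HtmlElementView(viewType: 'agora-video-view');
```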

Agora has a dedicated Web audio/video SDK, so we did not do much at the Dart layer; instead we wrote a JS-layer wrapper. This wrapper library dispatches the SDK to drive WebRTC, creates a VideoElement, and finally appends it to the previously created DivElement to realize video rendering.

[image]

Next, let's look at the desktop solution. Because the desktop does not support PlatformView, we can only use the ExternalTexture approach for custom video rendering. Through a MethodChannel we call a custom createTextureRender function in the Native layer, which dispatches the FlutterTextureRegistry to create a FlutterTexture and at the same time throws the textureId back to the Dart layer, where it is bound to a Texture widget. The video data of the Native SDK is converted into image format at the AgoraRtcWrapper layer, and then we use the MarkTextureFrameAvailable function of the FlutterTextureRegistry to notify the FlutterTexture to fetch the image data from the callback.
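On the Dart side, that flow can be sketched roughly like this (channel and method names are illustrative, not the actual SDK API):

```dart
import 'package:flutter/services.dart';
import 'package:flutter/widgets.dart';

// Hypothetical channel name; the real plug-in defines its own.
const MethodChannel _channel = MethodChannel('agora_rtc_engine/texture');

Future<Widget> buildRemoteVideo(int uid) async {
  // Ask the Native layer to create a FlutterTexture and hand back its id.
  final int textureId =
      await _channel.invokeMethod('createTextureRender', {'uid': uid});
  // Bind the id to a Texture widget; the Engine copies frames in
  // whenever Native calls MarkTextureFrameAvailable.
  return Texture(textureId: textureId);
}
```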

2. Pitfalls encountered during development

We also encountered some problems in the course of plug-in development. Here is a brief share:

[image]

As far as the desktop is concerned, macOS plug-ins use Objective-C header files, Windows uses C++ header files, and Linux uses C header files. This part is not completely unified, and even some APIs differ, so there is quite a bit of friction in desktop development; after all, it is not completely stable yet.

To cite some specific cases, as shown in the figure above, the first three are all problems encountered on the Web.

1. ui.platformViewRegistry reports an analyzer error on the Web because it is not defined in the Framework layer's ui.dart but in web_ui/ui.dart. It does not affect execution, so you can suppress it with an ignore annotation.

2. We use package:js to, for example, construct a JavaScript object, declaring it with the @JS annotation. If no external constructor is declared, it runs normally in Debug mode but reports an error in Profile and Release mode.
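A sketch of such a binding (the class name is made up; the point is the external constructor):

```dart
@JS()
library agora_web_bindings;

import 'package:js/js.dart';

@JS('VideoWrapper') // binds to a hypothetical global JS class VideoWrapper
class VideoWrapper {
  // Without this external constructor declaration, Debug mode happens
  // to work, but Profile/Release builds (optimized dart2js) fail.
  external VideoWrapper();

  external void setView(Object element);
}
```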

3. dart:io is mainly used for platform-specific calls, such as platform checks, and is unavailable on the Web. We can use if (dart.library.html) in the import statement to point to a custom Dart file that defines empty implementations of the related APIs, or use kIsWeb to simply skip those APIs on the Web.
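A conditional import looks like this (the file names are illustrative; both files must expose the same public API):

```dart
// Picks the Web stub when compiled for the browser, otherwise the
// dart:io-backed implementation.
import 'platform_info_io.dart'
    if (dart.library.html) 'platform_info_web.dart';

import 'package:flutter/foundation.dart' show kIsWeb;

void logPlatform() {
  // Alternatively, guard dart:io-backed calls at runtime with kIsWeb.
  if (kIsWeb) return;
  // ... safe to touch dart:io APIs here ...
}
```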

4. On Windows, EncodableValue is used to communicate between Dart and C++. It is based on C++17's std::variant, which you can understand as type1 | type2 | type3 in TypeScript. When dealing with int32 and int64, the Framework layer simply checks whether a value exceeds the maximum of int32 and, if so, stores it as int64. Developers who have used the Agora SDK may know that our user ID type is uint32, and part of the uint32 range is greater than int32's maximum but less than int64's, so if you naively use std::get, you may get an error whether you specify int32_t or int64_t. Fortunately, EncodableValue provides a LongValue function that does the check internally and uniformly returns int64.

Next is the key point of this topic: Flutter2 rendering principles. Many principles of the Flutter engine are shared across platforms, but on the Web they are implemented in Dart, while on Native they are mainly implemented in C and C++.

Flutter2 rendering principles

1. Flutter Framework

Before we officially start, let's briefly review. As mentioned earlier, the Flutter framework is divided into a Framework part and an Engine part, and the rendering process is completed by these two parts together. But unlike other frameworks, where the lower layer simply hands processed results to the upper layer, the Flutter Engine provides a set of Builders for the Framework to use, so many steps are completed by back-and-forth scheduling between the two parts.

[image]

Let's look at Flutter's entire rendering pipeline. UserInput handles user input and Animation handles animation, but these two parts are not today's focus. Build turns Widgets into RenderObjects that the Flutter framework can work with; Layout determines component position and size; Paint transforms the render objects into Layers, which Composition then merges; finally, Rasterize rasterizes them for GPU rendering.

Flutter is based on a tree structure when dealing with UI. From the figure below we can see three tree structures, namely Widget Tree, Element Tree and Render Tree.

[image]

We start with Widgets and create a Container, which contains a Row (a Flex layout container), and the Row contains an Image and a Text. The Container internally contains a ColoredBox, which can serve as a background or border; the Image contains a RawImage, and the Text contains a RichText. Only ColoredBox, Row, RawImage and RichText are converted to RenderObjectElements, and these eventually generate the corresponding RenderObjects.
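In code, the widget tree described above is roughly (a generic sketch, not taken from the talk):

```dart
import 'package:flutter/material.dart';

// Container inserts a ColoredBox, Image wraps a RawImage, and Text
// wraps a RichText -- matching the three trees in the figure.
Widget buildExample() {
  return Container(
    color: Colors.white,
    child: Row(
      children: [
        Image.network('https://example.com/pic.png'),
        const Text('hello'),
      ],
    ),
  );
}
```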

So what is a RenderObject? It is the object that actually needs to be rendered. Its attach function hands the rendering process over to the PipelineOwner, and the three mark functions in the figure below are mainly used to flag whether layout, compositing, or painting is needed.

[image]

Now look at the main functions of PipelineOwner, which manages the rendering pipeline. When Flutter initializes, it registers a frame callback (Flutter manages its own frames), and that callback triggers the three functions flushLayout, flushCompositingBits and flushPaint, which correspond to the three mark functions of RenderObject mentioned above.
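Simplified from the framework source, the per-frame order looks like this (a sketch of RendererBinding.drawFrame; semantics flushing omitted):

```dart
// Sketch: pipelineOwner and renderView are the framework's singletons.
void drawFrame() {
  pipelineOwner.flushLayout();          // re-layout marked RenderObjects
  pipelineOwner.flushCompositingBits(); // update compositing flags
  pipelineOwner.flushPaint();           // repaint marked RenderObjects
  renderView.compositeFrame();          // build the Scene, hand it to the Engine
}
```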

[image]

There are 3 arrays in PipelineOwner; the previously marked RenderObjects are stored in these arrays so they can be traversed quickly during the final flush. After the PipelineOwner finishes, it calls the compositeFrame function of RenderView, which we will explain later.

Let's first focus on the flushPaint function. flushPaint calls RenderObject's paint function, an abstract function that RenderObject does not implement itself; it is implemented by the subclasses that inherit it.

[image]

You can see that the first parameter of the paint function is a PaintingContext. Looking at some of its APIs, their return values are all Layers; pushClipRect and the other functions below each return a different Layer subclass. So one of the duties of the paint function is to convert the RenderObject into Layers and add them to its ContainerLayer member. By the way, the LayerHandle here is a reference count used to handle automatic release.

[image]

Another responsibility of the paint function is to record the Canvas drawing instructions through a PictureRecorder for each RenderObject that needs to be drawn.
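The recording mechanism can be exercised standalone via dart:ui, which is the same machinery paint() uses internally:

```dart
import 'dart:ui' as ui;

// Record drawing commands into a Picture instead of drawing directly.
ui.Picture recordCircle() {
  final recorder = ui.PictureRecorder();
  final canvas = ui.Canvas(recorder);
  canvas.drawCircle(const ui.Offset(50, 50), 20, ui.Paint());
  return recorder.endRecording(); // holds the stored command list
}
```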

[image]

Canvas is mainly used to draw objects that need to be drawn, such as the aforementioned RichText, RawImage, etc. In addition, it can also perform operations such as transform and clipPath.

[image]

In the Canvas factory constructor here, useCanvasKit is checked and a different Canvas is constructed accordingly. Why does this logic exist? We won't expand on it here; it will be introduced later. For now, let's continue down the render pipeline.

After the PipelineOwner process described above ends, the compositeFrame function of RenderView is called for Layer composition. In compositeFrame we can see several very important classes, namely Scene and SceneBuilder. A Scene is the product of Layer composition and is constructed by a SceneBuilder.

[image]

As shown in the figure, it finally calls the _window.render function, where _window is the SingletonFlutterWindow singleton, which will be described in detail later. First, let's look at the Build Scene process.

[image]

Here we can see part of the Layer source code. As mentioned, RenderObject holds a ContainerLayer. Build Scene calls ContainerLayer's buildScene function (the right half of the figure above), which in turn calls Layer's addToScene function. Like RenderObject's paint, addToScene is an abstract function implemented by Layer's subclasses; for example, ContainerLayer's addToScene traverses the child tree and calls each child Layer's addToScene.

[image]

So what does addToScene do? It calls the pushXXX functions provided by SceneBuilder. The return values of these functions are also Layers, but EngineLayers: Layer is the Framework's abstraction of a layer, while EngineLayer is the Engine's. The Engine layer then combines these EngineLayers into a Scene.

2. Flutter Engine

That mostly covers the Framework layer; now let's look at the Engine layer.

To briefly review, our Widget goes through this conversion process: Widget -> RenderObject -> Layer -> EngineLayer -> Scene. So how is this Scene rendered?

[image]

Here we see the previously mentioned SingletonFlutterWindow. Its render function calls EnginePlatformDispatcher's render function, where we meet the familiar useCanvasKit again: depending on the check, the Scene is cast to a different concrete Scene type. So what does useCanvasKit mean? Let's keep going.

At this point we have to introduce a concept: the Web Renderer. Flutter Web has two rendering modes. One is based on HTML tags: it maps Flutter's Widgets to different tags, and content that cannot be expressed with plain tags is drawn with Canvas. This is somewhat similar to how React Native represents UI.

The other is the CanvasKit rendering mode, which downloads a roughly 2MB wasm file to call the Skia rendering engine, and Widgets are drawn by that engine.

[image]

We can specify the rendering mode through command-line parameters during flutter build or flutter run. It is worth mentioning that the default mode is auto, which means CanvasKit on desktop browsers and HTML on mobile browsers.
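For example (flags as they existed in Flutter 2; auto remains the default if the flag is omitted):

```
# Force the HTML renderer during development
flutter run -d chrome --web-renderer html

# Force CanvasKit for a release build
flutter build web --web-renderer canvaskit
```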

First, let's look at the HTML rendering mode, taking the API Example of our Flutter SDK as an example. Through the Elements tree we can see that its tag hierarchy is quite deep. The <canvas> tag in the picture corresponds to the "Basic" text, which shows that even text is rendered with Canvas in this mode. So why draw text with Canvas instead of using the browser's default text rendering? Because, to erase platform differences in rendering behavior, especially things like word-wrap handling, Flutter has a built-in text layout engine and renders text with it. One aside: the input box component behaves much like Text when unfocused; when it gains focus, Flutter adds an <input> tag to receive the typed text, and hides it again when focus is lost. It is a very clever scheme.

[image]

Next we look at some details of the HTML rendering mode. The Canvas construction hinted at earlier shows up here: in HTML rendering mode a SurfaceCanvas is constructed. In the picture on the right you can see a List, which is the collection of drawing instructions.

[image]

As for SceneBuilder, here its subclass is SurfaceSceneBuilder. Let's first look at PersistedSurface on the right side of the figure below.

[image]

PersistedSurface is a subclass of EngineLayer with a rootElement property and a visitChildren function, which is again abstract. PersistedLeafSurface is an EngineLayer without children, so its visitChildren is an empty implementation; PersistedPicture and PersistedPlatformView derive from it, corresponding to pictures and text (recall that text is drawn with Canvas) and to platform views. PersistedContainerSurface is the EngineLayer of a container; it too has many subclasses, such as PersistedClipPath, PersistedTransform, etc. These EngineLayers correspond to the custom tags in the deep Elements tree of the API Example shown earlier.

After the build function of SurfaceSceneBuilder is executed, the webOnlyRootElement in the generated SurfaceScene already contains our entire HTML Element tree.

Finally, we can see that SurfaceScene will call DomRenderer's renderScene function to add these Elements to _sceneHostElement.

[image]

At this point, the HTML rendering mode is over.

Now let's look at the CanvasKit rendering mode. From the Elements tree we can see that the hierarchy in this mode is very simple: all rendering happens in a single canvas. The #shadow-root used here is an HTML feature that provides style isolation.

[image]

Similarly, we start with Canvas. Here it is a CanvasKitCanvas, and the drawing instructions are stored in the _commands attribute of a CkPictureSnapshot.

[image]

As for SceneBuilder, the CanvasKit-mode subclass is LayerSceneBuilder. The Layer here plays the same role as PersistedSurface in HTML rendering mode: it derives from EngineLayer, there is a ContainerLayer that contains all children, and there are corresponding PictureLayer and PlatformViewLayer classes. The difference is that it has a paint function, and that paint function is what actually drives the GPU to draw.

The LayerScene generated by LayerSceneBuilder's build function contains a root node called the LayerTree, which corresponds to webOnlyRootElement in HTML rendering mode.

[image]

Since the paint function mentioned here does the real drawing, let's see when it is called.

As mentioned earlier under How To Render Scene, LayerScene draws by calling the rasterizer's draw function. Rasterizer is the class responsible for rasterization for GPU rendering. Here acquireFrame is called to obtain the frameSize from the LayerTree and build a SurfaceFrame; internally an SkSurface is built, along with a series of Skia scheduling operations such as binding the WebGL context.

The Frame generated by context.acquireFrame is just a simple aggregation class, so don't worry about it too much; next the Frame's raster function is called for rasterization. The final addToScene adds the canvas HTML tag held by baseSurface to skiaSceneHost.

[image]

The rasterization stage consists of preroll and paint: the former computes the drawing boundaries, and the latter traverses the LayerTree and calls the paint function of every layer. The PaintContext here is different from the Framework's PaintingContext: it holds all the canvases so that the different layers can operate on them.

[image]

That nearly completes the CanvasKit rendering flow; finally, let's see how it ends up displayed in HTML. In fact, DomRenderer is used in CanvasKit rendering mode as well. In Flutter's initialization we can see that the first half of the initializeCanvasKit function loads the wasm resource and corresponding JavaScript file that we mentioned when introducing Skia, and the second half creates a skiaSceneHost root node, the Element referenced by baseSurface.addToScene earlier.

[image]

That concludes the rendering principles. Of course there are many more details in the full rendering process; for example, besides baseSurface, SurfaceFactory also has a backupSurface so that rendering can be cached. Each of these points could be a topic of its own. Finally, here is a summary flow chart; you can review the entire process with it alongside the article.

[image]

At the end, I will attach the GitHub link of our Flutter RTC SDK. At present we have done the Flutter2 adaptation on the dev/flutter branch, and screen sharing is also supported on the Web and desktop; you can try it yourself. If you have any questions or suggestions, feel free to give feedback, and if the experience is good you are welcome to give our repository a star.

[image]


RTE开发者社区

The RTE Developer Community is a neutral developer community focused on the real-time engagement space. Beyond pure technical exchange, we believe developers carry richer individual value; amid industry change, career development, and innovation and entrepreneurship resources, we accompany developers to share, build, and grow together.