
I. Overview

In scenarios such as video conferencing, online classes, and game live streaming, screen sharing is one of the most common features. Screen sharing means sharing the screen image with others in real time, and the end-to-end process involves several steps: screen recording and capture, video encoding and packaging, real-time transmission, video unpackaging and decoding, and video rendering.

Generally, in real-time screen sharing, the sharing initiator captures frames from a specified source (a specific screen, a specific region, a specific program, etc.) at a fixed sampling rate (usually 8-15 frames per second is sufficient), encodes and compresses them (preferably with a scheme that keeps text and graphic edges free of distortion), and then distributes them over the real-time network at the corresponding frame rate.

Screen capture is therefore the foundation of real-time screen sharing, and it has a wide range of application scenarios.

Flutter is now being used more and more widely, and pure Flutter projects are increasingly common. In this article, we focus on how to implement screen capture in Flutter.

II. Implementation Process

Before going into the implementation in detail, let's look at what screen recording capabilities the native systems provide.

1. iOS 11.0 provides ReplayKit 2 for capturing global screen content across apps, but it can only be launched from the Control Center; iOS 12.0 builds on this by adding the ability to launch ReplayKit from within an app.

2. Android 5.0 provides the MediaProjection capability; global screen content can be captured once the user grants consent in a pop-up dialog.

Now let's look at how the screen capture capabilities of Android and iOS differ.

1. iOS ReplayKit captures screen data by launching a Broadcast Upload Extension as a separate sub-process. This requires solving the problem of communication and interaction between the main App process and the screen capture sub-process, and the sub-process is also subject to restrictions such as a maximum runtime memory limit of 50 MB.

2. Android's MediaProjection runs directly in the App's main process, and the Surface carrying the screen data can be obtained easily.

Although native code cannot be avoided entirely, we can implement Flutter screen capture with as little native code as possible: the screen capture capabilities of both platforms are abstracted and encapsulated behind a common Dart-layer interface. Once that one-time integration is done, you can happily start and stop screen capture from the Dart layer.
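
For reference, here is a minimal sketch of what such a unified Dart-layer interface could look like. The channel and method names are hypothetical placeholders rather than part of any particular SDK; the native handlers behind them are exactly what the iOS and Android sections below implement.

import 'package:flutter/services.dart';

/// A minimal, hypothetical sketch of a unified Dart-layer screen capture API.
/// The native side maps these calls to ReplayKit (iOS) or MediaProjection (Android).
class ScreenCapture {
  static const MethodChannel _channel = MethodChannel('screen_capture');

  /// Ask the native side to start screen capture.
  static Future<bool?> startScreenCapture() async {
    return await _channel.invokeMethod<bool>('startScreenCapture');
  }

  /// Ask the native side to stop screen capture.
  static Future<bool?> stopScreenCapture() async {
    return await _channel.invokeMethod<bool>('stopScreenCapture');
  }
}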

Next, we will introduce the implementation process of iOS and Android respectively.

1. iOS

Open the Runner Xcode project in the ios directory of the Flutter App project, create a new Broadcast Upload Extension target, and handle the business logic of the ReplayKit sub-process there.

First, we need to handle cross-process communication between the main App process and the ReplayKit sub-process. Since the audio/video buffer callbacks from screen capture are very frequent, handling the audio and video buffers on the native side is clearly the most reliable solution at present, both for performance and given the state of the Flutter plug-in ecosystem. What remains is the start/stop signaling and the transfer of the necessary configuration information.

To start ReplayKit, you can use a Flutter MethodChannel to create an RPSystemBroadcastPickerView on the native side. This system-provided view contains a button that, when tapped, pops up the window for starting screen capture. By traversing the subviews to find that button and triggering its tap action, the problem of launching ReplayKit is solved.

static Future<bool?> launchReplayKitBroadcast(String extensionName) async {
    return await _channel.invokeMethod(
        'launchReplayKitBroadcast', {'extensionName': extensionName});
}

- (void)handleMethodCall:(FlutterMethodCall*)call result:(FlutterResult)result {
    if ([@"launchReplayKitBroadcast" isEqualToString:call.method]) {
        [self launchReplayKitBroadcast:call.arguments[@"extensionName"] result:result];
    } else {
        result(FlutterMethodNotImplemented);
    }
}

- (void)launchReplayKitBroadcast:(NSString *)extensionName result:(FlutterResult)result {
    if (@available(iOS 12.0, *)) {
        RPSystemBroadcastPickerView *broadcastPickerView = [[RPSystemBroadcastPickerView alloc] initWithFrame:CGRectMake(0, 0, 44, 44)];
        NSString *bundlePath = [[NSBundle mainBundle] pathForResource:extensionName ofType:@"appex" inDirectory:@"PlugIns"];
        if (!bundlePath) {
            NSString *nullBundlePathErrorMessage = [NSString stringWithFormat:@"Can not find path for bundle `%@.appex`", extensionName];
            NSLog(@"%@", nullBundlePathErrorMessage);
            result([FlutterError errorWithCode:@"NULL_BUNDLE_PATH" message:nullBundlePathErrorMessage details:nil]);
            return;
        }

        NSBundle *bundle = [NSBundle bundleWithPath:bundlePath];
        if (!bundle) {
            NSString *nullBundleErrorMessage = [NSString stringWithFormat:@"Can not find bundle at path: `%@`", bundlePath];
            NSLog(@"%@", nullBundleErrorMessage);
            result([FlutterError errorWithCode:@"NULL_BUNDLE" message:nullBundleErrorMessage details:nil]);
            return;
        }

        broadcastPickerView.preferredExtension = bundle.bundleIdentifier;
        for (UIView *subView in broadcastPickerView.subviews) {
            if ([subView isMemberOfClass:[UIButton class]]) {
                UIButton *button = (UIButton *)subView;
                [button sendActionsForControlEvents:UIControlEventAllEvents];
            }
        }
        result(@(YES));
    } else {
        NSString *notAvailiableMessage = @"RPSystemBroadcastPickerView is only available on iOS 12.0 or above";
        NSLog(@"%@", notAvailiableMessage);
        result([FlutterError errorWithCode:@"NOT_AVAILIABLE" message:notAvailiableMessage details:nil]);
    }
}

Next comes the problem of synchronizing configuration information:

The first solution is to use iOS's App Group capability and share configuration between processes through NSUserDefaults persistent storage. Enable the App Group capability in both the Runner target and the Broadcast Upload Extension target and set the same App Group ID for both; then the configuration in this App Group can be read and written via -[NSUserDefaults initWithSuiteName:].

Future<void> setParamsForCreateEngine(int appID, String appSign, bool onlyCaptureVideo) async {
    await SharedPreferenceAppGroup.setInt('ZG_SCREEN_CAPTURE_APP_ID', appID);
    await SharedPreferenceAppGroup.setString('ZG_SCREEN_CAPTURE_APP_SIGN', appSign);
    await SharedPreferenceAppGroup.setInt("ZG_SCREEN_CAPTURE_SCENARIO", 0);
    await SharedPreferenceAppGroup.setBool("ZG_SCREEN_CAPTURE_ONLY_CAPTURE_VIDEO", onlyCaptureVideo);
}

- (void)syncParametersFromMainAppProcess {
    // Get parameters for [createEngine]
    self.appID = [(NSNumber *)[self.userDefaults valueForKey:@"ZG_SCREEN_CAPTURE_APP_ID"] unsignedIntValue];
    self.appSign = (NSString *)[self.userDefaults valueForKey:@"ZG_SCREEN_CAPTURE_APP_SIGN"];
    self.scenario = (ZegoScenario)[(NSNumber *)[self.userDefaults valueForKey:@"ZG_SCREEN_CAPTURE_SCENARIO"] intValue];
}

The second solution is to use the cross-process Darwin notification center (CFNotificationCenterGetDarwinNotifyCenter) to carry the configuration information and implement inter-process communication.

The next step is stopping ReplayKit. This also uses the cross-process CFNotification mechanism described above: the main Flutter App posts a notification to end screen capture, and when the ReplayKit sub-process receives it, it calls -[RPBroadcastSampleHandler finishBroadcastWithError:] to end the capture.

static Future<bool?> finishReplayKitBroadcast(String notificationName) async {
    return await _channel.invokeMethod(
        'finishReplayKitBroadcast', {'notificationName': notificationName});
}

- (void)handleMethodCall:(FlutterMethodCall*)call result:(FlutterResult)result {
    if ([@"finishReplayKitBroadcast" isEqualToString:call.method]) {
        NSString *notificationName = call.arguments[@"notificationName"];
        CFNotificationCenterPostNotification(CFNotificationCenterGetDarwinNotifyCenter(), (CFStringRef)notificationName, NULL, nil, YES);
        result(@(YES));
    } else {
        result(FlutterMethodNotImplemented);
    }
}

// Add an observer for stop broadcast notification
CFNotificationCenterAddObserver(CFNotificationCenterGetDarwinNotifyCenter(),
                                (__bridge const void *)(self),
                                onBroadcastFinish,
                                (CFStringRef)@"ZGFinishReplayKitBroadcastNotificationName",
                                NULL,
                                CFNotificationSuspensionBehaviorDeliverImmediately);

// Handle stop broadcast notification from main app process
static void onBroadcastFinish(CFNotificationCenterRef center, void *observer, CFStringRef name, const void *object, CFDictionaryRef userInfo) {

    // Stop broadcast
    [[ZGScreenCaptureManager sharedManager] stopBroadcast:^{
        RPBroadcastSampleHandler *handler = [ZGScreenCaptureManager sharedManager].sampleHandler;
        if (handler) {
            // Finish broadcast extension process with no error
            #pragma clang diagnostic push
            #pragma clang diagnostic ignored "-Wnonnull"
            [handler finishBroadcastWithError:nil];
            #pragma clang diagnostic pop
        } else {
            NSLog(@"⚠️ RPBroadcastSampleHandler is null, can not stop broadcast upload extension process");
        }
    }];
}

(Figure: iOS implementation flow)

2. Android

The Android implementation is relatively simple compared with iOS. To start screen capture, you can directly use a Flutter MethodChannel to have the native side pop up a dialog via MediaProjectionManager asking the user for screen capture permission. Once the user confirms, you can call the MediaProjectionManager.getMediaProjection() function to obtain the MediaProjection object.
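
On the Dart side, this again boils down to a single MethodChannel call that resolves once the user has responded to the system dialog. Below is a minimal sketch, assuming a hypothetical method name that is not from any specific SDK; it would live in the same class as the unified interface sketched earlier.

/// Dart side (hypothetical method name): asks the native side to show the
/// MediaProjection permission dialog and reports whether the user granted it.
static Future<bool?> requestMediaProjectionPermission() async {
  return await _channel.invokeMethod<bool>('requestMediaProjectionPermission');
}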

It should be noted that, because Android has tightened permission management, if your App's target API level (Target SDK) is 29 or above, i.e. Android Q (10.0), you need to start an additional foreground service. According to the Android Q migration documentation, features such as MediaProjection that require a foreground service must run in a separate foreground service.

First, implement your own class that extends android.app.Service, and call the getMediaProjection() function mentioned above in its onStartCommand callback to obtain the MediaProjection object.

@Override
public int onStartCommand(Intent intent, int flags, int startId) {

    int resultCode = intent.getIntExtra("code", -1);
    Intent resultData = intent.getParcelableExtra("data");

    String notificationText = intent.getStringExtra("notificationText");
    int notificationIcon = intent.getIntExtra("notificationIcon", -1);
    createNotificationChannel(notificationText, notificationIcon);

    MediaProjectionManager manager = (MediaProjectionManager)getSystemService(Context.MEDIA_PROJECTION_SERVICE);
    MediaProjection mediaProjection = manager.getMediaProjection(resultCode, resultData);
    RequestMediaProjectionPermissionManager.getInstance().onMediaProjectionCreated(mediaProjection, RequestMediaProjectionPermissionManager.ERROR_CODE_SUCCEED);

    return super.onStartCommand(intent, flags, startId);
}

Then you also need to register this class in AndroidManifest.xml.

<!-- An app that starts a foreground service must also declare the FOREGROUND_SERVICE permission (at the <manifest> level, required since Android 9.0). -->
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />

<service
    android:name=".internal.MediaProjectionService"
    android:enabled="true"
    android:foregroundServiceType="mediaProjection"
/>

Then, when screen capture is started, check the system version: on Android Q and later, start the foreground service; otherwise, the MediaProjection object can be obtained directly.

@RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
private void createMediaProjection(int resultCode, Intent intent) {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
        service = new Intent(this.context, MediaProjectionService.class);
        service.putExtra("code", resultCode);
        service.putExtra("data", intent);
        service.putExtra("notificationIcon", this.foregroundNotificationIcon);
        service.putExtra("notificationText", this.foregroundNotificationText);
        this.context.startForegroundService(service);
    } else {
        MediaProjectionManager manager = (MediaProjectionManager) context.getSystemService(Context.MEDIA_PROJECTION_SERVICE);
        MediaProjection mediaProjection = manager.getMediaProjection(resultCode, intent);
        this.onMediaProjectionCreated(mediaProjection, ERROR_CODE_SUCCEED);
    }
}

Next, obtain a Surface from whichever consumer will receive the screen capture buffer, depending on the business scenario. For example, if you want to save a screen recording, you can get the Surface from MediaRecorder; if you want to live-stream the screen, you can get the Surface from the interface of an audio/video live-streaming SDK.

With the MediaProjection object and the consumer's Surface in hand, the next step is to call MediaProjection.createVirtualDisplay(), passing in the Surface, to create a VirtualDisplay instance and thereby obtain the screen capture buffer.

VirtualDisplay virtualDisplay = mediaProjection.createVirtualDisplay("ScreenCapture", width, height, 1,
    DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC, surface, null, handler);

The last step is ending the screen capture. Compared with the relatively involved process on iOS, Android only needs to release the VirtualDisplay and MediaProjection instances.

III. Practical Example

Here is a sample Demo that implements iOS/Android screen capture and uses Zego RTC Flutter SDK for live streaming.

Download link: https://github.com/zegoim/zego-express-example-screen-capture-flutter

The Zego RTC Flutter SDK provides an entry point on the native side for feeding in video frame data, so the screen capture buffer obtained in the process above can be sent to the RTC SDK to quickly implement screen sharing and stream publishing.

On iOS, after obtaining the SampleBuffer from the system, you can send it directly to the RTC SDK, and the SDK will automatically handle the video and audio frames.

- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
    [[ZGScreenCaptureManager sharedManager] handleSampleBuffer:sampleBuffer withType:sampleBufferType];
}

On Android, you first need to obtain a SurfaceTexture from the RTC SDK, initialize the required Surface and Handler, and then create a VirtualDisplay object using the MediaProjection object obtained in the process above. At that point, the RTC SDK can obtain the screen capture video frame data.

SurfaceTexture texture = ZegoCustomVideoCaptureManager.getInstance().getSurfaceTexture(0);
texture.setDefaultBufferSize(width, height);
Surface surface = new Surface(texture);
HandlerThread handlerThread = new HandlerThread("ZegoScreenCapture");
handlerThread.start();
Handler handler = new Handler(handlerThread.getLooper());

VirtualDisplay virtualDisplay = mediaProjection.createVirtualDisplay("ScreenCapture", width, height, 1,
    DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC, surface, null, handler);

IV. Summary and Outlook

Finally, let's summarize the main points of this Flutter screen capture implementation.

First, we needed to understand, in principle, the screen capture capabilities that iOS and Android provide natively; then we covered the interaction between Flutter and the native side and how to control starting and stopping screen capture from the Flutter layer; finally, an example showed how to integrate with the Zego RTC SDK to implement screen sharing and stream publishing.

At present, Flutter on Desktop is stabilizing, and the Zego RTC Flutter SDK already provides preliminary support for Windows. We will continue to explore the application of Flutter on the desktop, so stay tuned!

