For many people, screen sharing still evokes the scene of presenting PPT slides on a PC. In fact, screen sharing has long since crossed that boundary. Take a scenario everyone knows well, game live streaming: the streamer shows his or her screen to the audience through screen sharing, with very high requirements on latency and smoothness.
Many mobile game streamers still share their gameplay by relaying it through a PC. In fact, by calling the Rongyun screen sharing SDK, you can get real-time screen sharing directly on the phone.
This article focuses on screen sharing on iOS: the evolution of the iOS ReplayKit framework, the capabilities added at each stage, and the code and ideas for implementing the corresponding features with the Rongyun screen sharing SDK.
01 The history of ReplayKit
ReplayKit, Apple's screen recording framework for iOS, first appeared in iOS9.
iOS9
WWDC15 introduced the ReplayKit framework for the first time. Initially it was mainly used to record videos and save them to the photo album.
The start and stop recording APIs in iOS9 come with major limitations:
Only the MP4 file generated by the system can be obtained, and not directly: it must first be saved to the album and then fetched from the album;
The raw source data (PCM audio and YUV video) cannot be accessed;
Developers are given limited permissions: other apps cannot be recorded, recording stops once the app goes to the background, and only the current app's screen can be recorded.
The behavior you can control:
On stopping the recording, a video preview window pops up where the user can save, discard, or share the video file;
After recording, the video can be viewed, edited, or shared through the designated preview controller.
The API to start recording video is shown below.
/*!
 Deprecated. Use startRecordingWithHandler: instead.
 @abstract Starts app recording with a completion handler. Note that before recording actually starts, the user may be prompted with UI to confirm recording.
 @param microphoneEnabled Determines whether the microphone input should be included in the recorded movie audio.
 @discussion handler Called after user interactions are complete. Will be passed an optional NSError in the RPRecordingErrorDomain domain if there was an issue starting the recording.
 */
[[RPScreenRecorder sharedRecorder] startRecordingWithMicrophoneEnabled:YES handler:^(NSError * _Nullable error) {
    if (error) {
        // TODO: handle the error
    }
}];
When start recording is called, the system shows a confirmation alert; recording only begins after the user confirms.
The API to stop recording video is shown below.
/*! @abstract Stops app recording with a completion handler.
 @discussion handler Called when the movie is ready. Will return an instance of RPPreviewViewController on success which should be presented using [UIViewController presentViewController:animated:completion:]. Will be passed an optional NSError in the RPRecordingErrorDomain domain if there was an issue stopping the recording.
 */
[[RPScreenRecorder sharedRecorder] stopRecordingWithHandler:^(RPPreviewViewController *previewViewController, NSError *error) {
    [self presentViewController:previewViewController animated:YES completion:^{
        // TODO: handle completion
    }];
}];
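When the user taps Save or Cancel in the preview, the system calls back through RPPreviewViewControllerDelegate, and the preview should be dismissed there. A minimal sketch, assuming previewViewController.previewControllerDelegate = self was set before presenting:

#pragma mark - RPPreviewViewControllerDelegate
// Called when the user finishes with the preview UI (save, cancel, or share).
- (void)previewControllerDidFinish:(RPPreviewViewController *)previewController {
    [previewController dismissViewControllerAnimated:YES completion:nil];
}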
iOS10
After WWDC16, Apple upgraded ReplayKit, opened up channels for acquiring the source data, and added two extension targets. Specifically:
Two extension targets were added, a UI extension and an Upload extension;
Developer permissions were increased, allowing users to log in to a broadcast service and the extension to handle live-broadcast setup and the source data;
The screen can only be recorded through the extension, but it can record not only your own app's screen, it can record other apps as well;
Only app screens can be recorded, not the iOS system screen.
The extensions are created in Xcode via File > New > Target, using the Broadcast Upload Extension template and its companion setup UI extension.
UI Extension
/*
 These two APIs can be understood as the event callbacks triggered by the setup pop-up.
 */
- (void)userDidFinishSetup {
    // Triggers the host app's RPBroadcastActivityViewControllerDelegate
}

- (void)userDidCancelSetup {
    // Triggers the host app's RPBroadcastActivityViewControllerDelegate
}
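The template leaves these bodies empty. To actually hand control back to the system, the setup view controller calls the completion APIs on RPBroadcastSetupViewController. A minimal sketch, where the broadcast URL and setup dictionary are placeholder values of mine; the variant shown is the iOS 11+ one (on iOS 10, the variant that also takes an RPBroadcastConfiguration is used instead):

- (void)userDidFinishSetup {
    // Hand a broadcast URL and optional setup info back to the system;
    // this in turn triggers the host app's delegate callback.
    NSURL *broadcastURL = [NSURL URLWithString:@"https://broadcast.example.com/stream"]; // placeholder
    [self completeRequestWithBroadcastURL:broadcastURL
                                setupInfo:@{@"endpoint" : @"placeholder"}];
}

- (void)userDidCancelSetup {
    NSError *error = [NSError errorWithDomain:RPRecordingErrorDomain
                                         code:RPRecordingErrorUserDeclined
                                     userInfo:nil];
    [self cancelRequestWithError:error];
}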
Upload Extension
- (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *,NSObject *> *)setupInfo {
    // User has requested to start the broadcast. Setup info from the UI extension can be supplied but is optional.
    // This is where the initialization work is done.
}

- (void)broadcastPaused {
    // User has requested to pause the broadcast. Samples will stop being delivered.
    // Receives the system's pause signal.
}

- (void)broadcastResumed {
    // User has requested to resume the broadcast. Sample delivery will resume.
    // Receives the system's resume signal.
}

- (void)broadcastFinished {
    // User has requested to finish the broadcast.
    // Receives the system's finish signal.
}
// This is the biggest highlight of the update: we can now get the raw sample data,
// and the system splits it into three types: video frames, in-app audio, and microphone audio.
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo:
            // Handle video sample buffer
            break;
        case RPSampleBufferTypeAudioApp:
            // Handle audio sample buffer for app audio
            break;
        case RPSampleBufferTypeAudioMic:
            // Handle audio sample buffer for mic audio
            break;
        default:
            break;
    }
}
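For the video case, the payload is a CVPixelBuffer that can be pulled out of the sample buffer for encoding or forwarding. A minimal sketch; encodePixelBuffer:timestamp: is a hypothetical hook for your own pipeline:

#import <CoreMedia/CoreMedia.h>

- (void)handleVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    // Extract the raw frame and its presentation timestamp.
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) {
        return;
    }
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    [self encodePixelBuffer:pixelBuffer timestamp:pts]; // hypothetical hook
}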
Host App
In the host app, the broadcast is started via RPBroadcastActivityViewController and observed through RPBroadcastControllerDelegate:
// start
if (![RPScreenRecorder sharedRecorder].isRecording) {
    [RPBroadcastActivityViewController loadBroadcastActivityViewControllerWithHandler:^(RPBroadcastActivityViewController * _Nullable broadcastActivityViewController, NSError * _Nullable error) {
        if (error) {
            NSLog(@"RPBroadcast err %@", [error localizedDescription]);
        }
        broadcastActivityViewController.delegate = self; /* RPBroadcastActivityViewControllerDelegate */
        [self presentViewController:broadcastActivityViewController animated:YES completion:nil];
    }];
}
#pragma mark - RPBroadcastActivityViewControllerDelegate
- (void)broadcastActivityViewController:(RPBroadcastActivityViewController *)broadcastActivityViewController didFinishWithBroadcastController:(RPBroadcastController *)broadcastController error:(NSError *)error {
    // Dismiss the system picker before starting the broadcast.
    [broadcastActivityViewController dismissViewControllerAnimated:YES completion:nil];
    if (error) {
        NSLog(@"broadcastActivityViewController: %@", error.localizedDescription);
        return;
    }
    [broadcastController startBroadcastWithHandler:^(NSError * _Nullable error) {
        if (!error) {
            NSLog(@"success");
        } else {
            NSLog(@"startBroadcast: %@", error.localizedDescription);
        }
    }];
}

#pragma mark - RPBroadcastControllerDelegate
- (void)broadcastController:(RPBroadcastController *)broadcastController didFinishWithError:(nullable NSError *)error {
    NSLog(@"didFinishWithError: %@", error);
}

- (void)broadcastController:(RPBroadcastController *)broadcastController didUpdateServiceInfo:(NSDictionary<NSString *, NSObject<NSCoding> *> *)serviceInfo {
    NSLog(@"didUpdateServiceInfo: %@", serviceInfo);
}
iOS11
After WWDC17, Apple upgraded the framework again as ReplayKit 2 and added in-app data acquisition: the captured data can now be obtained directly in the host app. Specifically:
The host app can directly process the recorded screen data of its own app;
The iOS system screen can also be recorded, but this has to be started manually from Control Center.
Start in-app screen recording:
[[RPScreenRecorder sharedRecorder] startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error) {
    [self.videoOutputStream write:sampleBuffer error:nil];
} completionHandler:^(NSError * _Nullable error) {
    NSLog(@"startCaptureWithHandler: %@", error.localizedDescription);
}];
Stop in-app screen recording:
[[RPScreenRecorder sharedRecorder] stopCaptureWithHandler:^(NSError * _Nullable error) {
    [self.assetWriter finishWritingWithCompletionHandler:^{
        // TODO: handle the finished file
    }];
}];
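The two snippets above assume an AVAssetWriter pipeline behind self.assetWriter. A minimal sketch of that assumed setup; the output path, video settings, and the videoInput property are placeholders of mine, not the article's code:

#import <AVFoundation/AVFoundation.h>

// A sketch of the assumed asset-writer setup; path and settings are placeholders.
- (void)setupAssetWriter {
    NSURL *outputURL = [NSURL fileURLWithPath:@"/tmp/screen.mp4"]; // placeholder path
    NSError *error = nil;
    self.assetWriter = [AVAssetWriter assetWriterWithURL:outputURL
                                                fileType:AVFileTypeMPEG4
                                                   error:&error];
    NSDictionary *settings = @{
        AVVideoCodecKey  : AVVideoCodecTypeH264,
        AVVideoWidthKey  : @720,
        AVVideoHeightKey : @1280,
    };
    self.videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                         outputSettings:settings];
    self.videoInput.expectsMediaDataInRealTime = YES;
    [self.assetWriter addInput:self.videoInput];
}

// Append one captured video sample; call this from the capture handler.
- (void)appendVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    if (self.assetWriter.status == AVAssetWriterStatusUnknown) {
        [self.assetWriter startWriting];
        [self.assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }
    if (self.videoInput.isReadyForMoreMediaData) {
        [self.videoInput appendSampleBuffer:sampleBuffer];
    }
}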
iOS12
At WWDC18, Apple updated ReplayKit and added RPSystemBroadcastPickerView, a view used to start the system broadcast from inside the app, which greatly simplifies the screen recording flow.
if (@available(iOS 12.0, *)) {
    self.systemBroadcastPickerView = [[RPSystemBroadcastPickerView alloc] initWithFrame:CGRectMake(0, 0, 50, 80)];
    // preferredExtension is the bundle identifier of the Broadcast Upload Extension.
    self.systemBroadcastPickerView.preferredExtension = ScreenShareBuildID;
    self.systemBroadcastPickerView.showsMicrophoneButton = NO;
    self.navigationItem.rightBarButtonItem = [[UIBarButtonItem alloc] initWithCustomView:self.systemBroadcastPickerView];
} else {
    // Fallback on earlier versions
}
02 Rongyun screen sharing SDK
To reduce the integration burden on developers, Rongyun built the RongRTCReplayKitExt library specifically to serve the screen sharing business.
Design ideas
Upload Extension
SampleHandler receives the sample data and performs the initialization and configuration of RCRTCReplayKitEngine;
RCRTCReplayKitEngine initializes the socket communication, converts the YUV data to I420 (an illustrative sketch of that conversion follows this list), and keeps the memory peak under control.
App
The publishing flow is the same as usual:
IM connection, join the room, publish the resource (RCRTCScreenShareOutputStream);
The stream initializes the socket internally, receives the processed data over the protocol, and pushes the stream.
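The internals of RCRTCReplayKitEngine are not public. Purely as an illustration of the YUV-to-I420 step, here is a minimal sketch that converts an NV12 pixel buffer (the biplanar format ReplayKit typically delivers) into I420 by copying the Y plane and de-interleaving the UV plane:

#import <CoreVideo/CoreVideo.h>
#include <string.h>

// Illustrative only: convert an NV12 pixel buffer into caller-provided I420 planes.
// dstY/dstU/dstV must hold width*height, (width/2)*(height/2), (width/2)*(height/2) bytes.
static void NV12ToI420(CVPixelBufferRef pixelBuffer, uint8_t *dstY, uint8_t *dstU, uint8_t *dstV) {
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    // Plane 0: Y, copied row by row to drop any row padding.
    uint8_t *srcY = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    size_t strideY = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
    for (size_t row = 0; row < height; row++) {
        memcpy(dstY + row * width, srcY + row * strideY, width);
    }
    // Plane 1: interleaved UV, split into separate U and V planes.
    uint8_t *srcUV = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
    size_t strideUV = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1);
    for (size_t row = 0; row < height / 2; row++) {
        uint8_t *uv = srcUV + row * strideUV;
        for (size_t col = 0; col < width / 2; col++) {
            dstU[row * (width / 2) + col] = uv[col * 2];
            dstV[row * (width / 2) + col] = uv[col * 2 + 1];
        }
    }
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}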
Code example
Upload extension
#import "SampleHandler.h"
#import <RongRTCReplayKitExt/RongRTCReplayKitExt.h>
static NSString *const ScreenShareGroupID = @"group.cn.rongcloud.rtcquickdemo.screenshare";
@interface SampleHandler ()<RongRTCReplayKitExtDelegate>
@end
@implementation SampleHandler
- (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *, NSObject *> *)setupInfo {
    // User has requested to start the broadcast. Setup info from the UI extension can be supplied but is optional.
    [[RCRTCReplayKitEngine sharedInstance] setupWithAppGroup:ScreenShareGroupID delegate:self];
}

- (void)broadcastPaused {
    // User has requested to pause the broadcast. Samples will stop being delivered.
}

- (void)broadcastResumed {
    // User has requested to resume the broadcast. Sample delivery will resume.
}

- (void)broadcastFinished {
    [[RCRTCReplayKitEngine sharedInstance] broadcastFinished];
}

- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType API_AVAILABLE(ios(10.0)) {
    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo:
            [[RCRTCReplayKitEngine sharedInstance] sendSampleBuffer:sampleBuffer withType:RPSampleBufferTypeVideo];
            break;
        case RPSampleBufferTypeAudioApp:
            // Handle audio sample buffer for app audio
            break;
        case RPSampleBufferTypeAudioMic:
            // Handle audio sample buffer for mic audio
            break;
        default:
            break;
    }
}
#pragma mark - RongRTCReplayKitExtDelegate
- (void)broadcastFinished:(RCRTCReplayKitEngine *)broadcast reason:(RongRTCReplayKitExtReason)reason {
    NSString *tip = @"";
    switch (reason) {
        case RongRTCReplayKitExtReasonRequestedByMain:
            tip = @"Screen sharing has ended";
            break;
        case RongRTCReplayKitExtReasonDisconnected:
            tip = @"The main app disconnected";
            break;
        case RongRTCReplayKitExtReasonVersionMismatch:
            tip = @"Integration error (SDK version mismatch)";
            break;
    }
    NSError *error = [NSError errorWithDomain:NSStringFromClass(self.class)
                                         code:0
                                     userInfo:@{NSLocalizedFailureReasonErrorKey : tip}];
    [self finishBroadcastWithError:error];
}
Host App
- (void)joinRoom {
    RCRTCVideoStreamConfig *videoConfig = [[RCRTCVideoStreamConfig alloc] init];
    videoConfig.videoSizePreset = RCRTCVideoSizePreset720x480;
    videoConfig.videoFps = RCRTCVideoFPS30;
    [[RCRTCEngine sharedInstance].defaultVideoStream setVideoConfig:videoConfig];

    RCRTCRoomConfig *config = [[RCRTCRoomConfig alloc] init];
    config.roomType = RCRTCRoomTypeNormal;
    [self.engine enableSpeaker:YES];

    __weak typeof(self) weakSelf = self;
    [self.engine joinRoom:self.roomId
                   config:config
               completion:^(RCRTCRoom *_Nullable room, RCRTCCode code) {
                   __strong typeof(weakSelf) strongSelf = weakSelf;
                   if (code == RCRTCCodeSuccess) {
                       strongSelf.room = room;
                       room.delegate = strongSelf;
                       [strongSelf publishScreenStream];
                   } else {
                       [UIAlertController alertWithString:@"Failed to join the room" inCurrentViewController:strongSelf];
                   }
               }];
}
- (void)publishScreenStream {
    self.videoOutputStream = [[RCRTCScreenShareOutputStream alloc] initWithAppGroup:ScreenShareGroupID];
    RCRTCVideoStreamConfig *videoConfig = self.videoOutputStream.videoConfig;
    videoConfig.videoSizePreset = RCRTCVideoSizePreset1280x720;
    videoConfig.videoFps = RCRTCVideoFPS24;
    [self.videoOutputStream setVideoConfig:videoConfig];
    [self.room.localUser publishStream:self.videoOutputStream
                            completion:^(BOOL isSuccess, RCRTCCode desc) {
                                if (isSuccess) {
                                    NSLog(@"Published the custom stream successfully");
                                } else {
                                    NSLog(@"Failed to publish the custom stream: %ld", (long)desc);
                                }
                            }];
}
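For the shared cache and socket channel between the host app and the extension to work, both targets must belong to the same App Group, and the group ID must match the ScreenShareGroupID string used above. The relevant entitlements entry looks like this (the group ID follows the demo; substitute your own):

<!-- In both the host app's and the extension's .entitlements files -->
<key>com.apple.security.application-groups</key>
<array>
    <string>group.cn.rongcloud.rtcquickdemo.screenshare</string>
</array>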
03 Some notes
First, a ReplayKit2 broadcast upload extension is limited to 50MB of memory; once that peak is exceeded, the system forcibly kills the extension. So when processing data in the extension, pay special attention to releasing memory promptly.
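One common mitigation, sketched below purely as an illustration (not part of the Rongyun SDK): cap the number of in-flight frames with a semaphore and drop the excess instead of letting them queue up in memory.

// Illustrative frame dropping: allow at most 3 frames in flight.
static dispatch_semaphore_t sFrameSemaphore;
static dispatch_queue_t sFrameQueue;

- (void)enqueueVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        sFrameSemaphore = dispatch_semaphore_create(3);
        sFrameQueue = dispatch_queue_create("screenshare.frame", DISPATCH_QUEUE_SERIAL);
    });
    // If 3 frames are already being processed, drop this one instead of buffering it.
    if (dispatch_semaphore_wait(sFrameSemaphore, DISPATCH_TIME_NOW) != 0) {
        return;
    }
    CFRetain(sampleBuffer); // keep the buffer alive across the async hop
    dispatch_async(sFrameQueue, ^{
        // ... process / forward the frame here ...
        CFRelease(sampleBuffer);
        dispatch_semaphore_signal(sFrameSemaphore);
    });
}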
Second, regarding communication between the processes: the Darwin notification center (CFNotificationCenterGetDarwinNotifyCenter) cannot carry parameters, it can only deliver the notification itself; if parameters need to be carried, they have to be cached in a shared local file. One pitfall to note: the data can be printed when running in debug mode, but under a release build the local file data cannot be read. For the concrete implementation, refer to the guide on GitHub.
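A minimal sketch of that pattern, with placeholder notification and file names of mine: write the payload into the App Group container, then post a parameterless Darwin notification so the other process knows to read it.

#import <CoreFoundation/CoreFoundation.h>
#import <Foundation/Foundation.h>

// Write the payload into the shared App Group container, then ping the other process.
static void NotifyOtherProcess(NSData *payload) {
    NSURL *container = [[NSFileManager defaultManager]
        containerURLForSecurityApplicationGroupIdentifier:@"group.cn.rongcloud.rtcquickdemo.screenshare"];
    [payload writeToURL:[container URLByAppendingPathComponent:@"frame.bin"] atomically:YES];
    // Darwin notifications cross the process boundary but cannot carry a userInfo payload.
    CFNotificationCenterPostNotification(CFNotificationCenterGetDarwinNotifyCenter(),
                                         CFSTR("cn.rongcloud.screenshare.frameReady"),
                                         NULL, NULL, true);
}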
Finally, a small gripe: when a screen recording ends abnormally, the system often shows an alert that cannot be dismissed, and the only way out is to restart the device. This has to count as one of the more annoying bugs in iOS.