iOS Real-Time Screen Sharing in Practice (with Detailed Code)


For many people, screen sharing still brings to mind PowerPoint presentations on a PC, but in reality screen sharing has long since outgrown that scenario. Take a familiar one: game live streaming, where the streamer needs to present their own screen to viewers in a "screen sharing" fashion, with very high demands on latency and smoothness.

For many mobile-game streamers, the common approach today is to relay the phone's game footage through a PC for broadcasting. In fact, by calling the RongCloud screen-sharing SDK, the phone itself can share its screen in real time, with no PC in the middle.

This article focuses on screen sharing on iOS: the evolution of the ReplayKit framework, the capabilities added at each stage, and the code and approach for implementing screen sharing with the RongCloud screen-sharing SDK.

01 A Brief History of ReplayKit

Screen recording on iOS via ReplayKit first appeared in iOS 9.

iOS9

The ReplayKit framework was first introduced at WWDC15. In its early form it was mainly used to record video and save it to the photo library.

The start-recording and stop-recording APIs in iOS 9 had significant limitations:

You could only obtain the MP4 file generated by the system, and not directly: it had to be saved to the photo library first and then fetched from there;

Raw data, i.e. the PCM and YUV samples, was not accessible;

Developer permissions were limited: other apps could not be recorded, and recording stopped as soon as your app entered the background, so only the current app's screen could be captured.

What the developer could control:

Stopping the recording could present a video preview window, where the user could save, discard, or share the video file;

After recording finished, the video could be viewed, edited, or shared through the designated channels.

The API to start recording is shown below.

/*!
 Deprecated. Use startRecordingWithHandler: instead.

 @abstract Starts app recording with a completion handler. Note that before recording actually starts, the user may be prompted with UI to confirm recording.
 @param microphoneEnabled Determines whether the microphone input should be included in the recorded movie audio.
 @discussion handler Called after user interactions are complete. Will be passed an optional NSError in the RPRecordingErrorDomain domain if there was an issue starting the recording.
 */
[[RPScreenRecorder sharedRecorder] startRecordingWithMicrophoneEnabled:YES handler:^(NSError * _Nullable error) {
    if (error) {
        // Handle the error
    }
}];

When you call the start-recording API, the system presents a confirmation dialog; recording begins only after the user confirms.

The API to stop recording is shown below.


/*! @abstract Stops app recording with a completion handler.
 @discussion handler Called when the movie is ready. Will return an instance of RPPreviewViewController on success which should be presented using [UIViewController presentViewController:animated:completion:]. Will be passed an optional NSError in the RPRecordingErrorDomain domain if there was an issue stopping the recording.
 */
[[RPScreenRecorder sharedRecorder] stopRecordingWithHandler:^(RPPreviewViewController *previewViewController, NSError *error) {
    [self presentViewController:previewViewController animated:YES completion:^{
        // Handle completion
    }];
}];

iOS10

At WWDC16 Apple upgraded ReplayKit, opening up access to the raw data and adding two Extension targets. Specifically:

Two new Extension targets, UI and Upload;

Expanded developer permissions, allowing users to sign in to a service and set up live broadcasting and raw-data handling;

Recording happens only through the extension; not only your own app but other apps can be recorded as well;

Only app screens can be recorded, not the iOS system screen.

The Extension targets are created by adding new Targets in Xcode (the original article illustrated this with a screenshot).

UI Extension

/*
 These two APIs can be understood as callbacks triggered by the setup UI's dialog;
 */
- (void)userDidFinishSetup {
    // Triggers the Host App's RPBroadcastActivityViewControllerDelegate
}

- (void)userDidCancelSetup {
    // Triggers the Host App's RPBroadcastActivityViewControllerDelegate
}

Upload Extension

- (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *,NSObject *> *)setupInfo {
    // User has requested to start the broadcast. Setup info from the UI extension can be supplied but optional.
    // This is where initialization work is done
}

- (void)broadcastPaused {
    // User has requested to pause the broadcast. Samples will stop being delivered.
    // Receives the system's pause signal
}

- (void)broadcastResumed {
    // User has requested to resume the broadcast. Samples delivery will resume.
    // Receives the system's resume signal
}

- (void)broadcastFinished {
    // User has requested to finish the broadcast.
    // Receives the system's finish signal
}

// This is the biggest win of the update: we can now receive the system's raw
// sample data, split into three types: video frames, in-app audio, and mic audio
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo:
            // Handle video sample buffer
            break;
        case RPSampleBufferTypeAudioApp:
            // Handle audio sample buffer for app audio
            break;
        case RPSampleBufferTypeAudioMic:
            // Handle audio sample buffer for mic audio
            break;
            
        default:
            break;
    }
}

Host APP

// Start
if (![RPScreenRecorder sharedRecorder].isRecording) {
    [RPBroadcastActivityViewController loadBroadcastActivityViewControllerWithHandler:^(RPBroadcastActivityViewController * _Nullable broadcastActivityViewController, NSError * _Nullable error) {
        if (error) {
            NSLog(@"RPBroadcast err %@", [error localizedDescription]);
            return;
        }
        broadcastActivityViewController.delegate = self; /* RPBroadcastActivityViewControllerDelegate */
        [self presentViewController:broadcastActivityViewController animated:YES completion:nil];
    }];
}

#pragma mark - RPBroadcastActivityViewControllerDelegate
- (void)broadcastActivityViewController:(RPBroadcastActivityViewController *)broadcastActivityViewController didFinishWithBroadcastController:(RPBroadcastController *)broadcastController error:(NSError *)error {
    if (error) {
        NSLog(@"broadcastActivityViewController:%@", error.localizedDescription);
        return;
    }

    [broadcastController startBroadcastWithHandler:^(NSError * _Nullable error) {
        if (!error) {
            NSLog(@"success");
        } else {
            NSLog(@"startBroadcast:%@", error.localizedDescription);
        }
    }];
}

#pragma mark - RPBroadcastControllerDelegate
- (void)broadcastController:(RPBroadcastController *)broadcastController didFinishWithError:(nullable NSError *)error {
    NSLog(@"didFinishWithError: %@", error);
}

- (void)broadcastController:(RPBroadcastController *)broadcastController didUpdateServiceInfo:(NSDictionary <NSString *, NSObject <NSCoding> *> *)serviceInfo {
    NSLog(@"didUpdateServiceInfo: %@", serviceInfo);
}

iOS11

At WWDC17 Apple upgraded the framework again as ReplayKit 2, adding capture that can be handled directly in the Host App. Specifically:

The recorded app-screen data can be processed directly in the Host App;

The iOS system screen can also be recorded, but that must be started manually from Control Center.

Starting app-screen capture

[[RPScreenRecorder sharedRecorder] startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error) {
    [self.videoOutputStream write:sampleBuffer error:nil];
} completionHandler:^(NSError * _Nullable error) {
    NSLog(@"startCaptureWithHandler:%@", error.localizedDescription);
}];

Stopping app-screen capture


[[RPScreenRecorder sharedRecorder] stopCaptureWithHandler:^(NSError * _Nullable error) {
    [self.assetWriter finishWritingWithCompletionHandler:^{
        // Handle completion
    }];
}];

iOS12

At WWDC18 Apple updated ReplayKit with the new RPSystemBroadcastPickerView, a class that lets an app launch system-wide recording from inside the app, greatly simplifying the screen-recording flow.


if (@available(iOS 12.0, *)) {
    self.systemBroadcastPickerView = [[RPSystemBroadcastPickerView alloc] initWithFrame:CGRectMake(0, 0, 50, 80)];
    self.systemBroadcastPickerView.preferredExtension = ScreenShareBuildID;
    self.systemBroadcastPickerView.showsMicrophoneButton = NO;
    self.navigationItem.rightBarButtonItem = [[UIBarButtonItem alloc] initWithCustomView:self.systemBroadcastPickerView];
} else {
    // Fallback on earlier versions
}

02 RongCloud RongRTCReplayKitExt

To lighten the integration burden for developers, RongCloud built the RongRTCReplayKitExt library specifically for the screen-sharing use case.

Design approach

Upload Extension

SampleHandler receives the sample data and initializes the RCRTCReplayKitEngine configuration;

RCRTCReplayKitEngine sets up the socket communication, converts the YUV data to I420, and keeps the extension's peak memory under control.
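The SDK's internal conversion is not public, but the YUV-to-I420 step it describes can be illustrated with a minimal C sketch. ReplayKit typically delivers biplanar NV12 pixel buffers (one Y plane plus one interleaved UV plane); converting to I420 means de-interleaving UV into separate U and V planes. The function name and the assumption of even dimensions are mine, not the SDK's:

```c
#include <stdint.h>
#include <string.h>

/* Convert one NV12 frame (Y plane + interleaved UV plane) into I420
   (Y plane + separate U and V planes). Width and height assumed even,
   planes assumed tightly packed (no row padding). */
static void nv12_to_i420(const uint8_t *y, const uint8_t *uv,
                         int width, int height,
                         uint8_t *dst_y, uint8_t *dst_u, uint8_t *dst_v) {
    memcpy(dst_y, y, (size_t)width * height);   /* Y plane is unchanged */
    int chroma_pixels = (width / 2) * (height / 2);
    for (int i = 0; i < chroma_pixels; i++) {
        dst_u[i] = uv[2 * i];                   /* even bytes carry U */
        dst_v[i] = uv[2 * i + 1];               /* odd bytes carry V  */
    }
}
```

A real implementation would also honor `CVPixelBufferGetBytesPerRowOfPlane` padding rather than assuming tightly packed rows.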

App

The existing publishing flow:

IM connect – join the room – publish the resource (RCRTCScreenShareOutputStream);

Internally it initializes the socket, receives the processed data over the protocol, and pushes the stream.

Code examples

Upload extension

#import "SampleHandler.h"
#import <RongRTCReplayKitExt/RongRTCReplayKitExt.h>

static NSString *const ScreenShareGroupID = @"group.cn.rongcloud.rtcquickdemo.screenshare";

@interface SampleHandler ()<RongRTCReplayKitExtDelegate>
@end

@implementation SampleHandler

- (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *, NSObject *> *)setupInfo {
    // User has requested to start the broadcast. Setup info from the UI extension can be supplied but optional.
    [[RCRTCReplayKitEngine sharedInstance] setupWithAppGroup:ScreenShareGroupID delegate:self];
}

- (void)broadcastPaused {
    // User has requested to pause the broadcast. Samples will stop being delivered.
}

- (void)broadcastResumed {
    // User has requested to resume the broadcast. Samples delivery will resume.
}

- (void)broadcastFinished {
    [[RCRTCReplayKitEngine sharedInstance] broadcastFinished];
}

- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType API_AVAILABLE(ios(10.0)) {
    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo:
            [[RCRTCReplayKitEngine sharedInstance] sendSampleBuffer:sampleBuffer withType:RPSampleBufferTypeVideo];
            break;
        case RPSampleBufferTypeAudioApp:
            // Handle audio sample buffer for app audio
            break;
        case RPSampleBufferTypeAudioMic:
            // Handle audio sample buffer for mic audio
            break;

        default:
            break;
    }
}

#pragma mark - RongRTCReplayKitExtDelegate
- (void)broadcastFinished:(RCRTCReplayKitEngine *)broadcast reason:(RongRTCReplayKitExtReason)reason {
    NSString *tip = @"";
    switch (reason) {
        case RongRTCReplayKitExtReasonRequestedByMain:
            tip = @"Screen sharing has ended";
            break;
        case RongRTCReplayKitExtReasonDisconnected:
            tip = @"The app disconnected";
            break;
        case RongRTCReplayKitExtReasonVersionMismatch:
            tip = @"Integration error (SDK version mismatch)";
            break;
    }

    NSError *error = [NSError errorWithDomain:NSStringFromClass(self.class)
                                         code:0
                                     userInfo:@{NSLocalizedFailureReasonErrorKey:tip}];
    [self finishBroadcastWithError:error];
}

Host App

- (void)joinRoom {
    RCRTCVideoStreamConfig *videoConfig = [[RCRTCVideoStreamConfig alloc] init];
    videoConfig.videoSizePreset = RCRTCVideoSizePreset720x480;
    videoConfig.videoFps = RCRTCVideoFPS30;
    [[RCRTCEngine sharedInstance].defaultVideoStream setVideoConfig:videoConfig];

    RCRTCRoomConfig *config = [[RCRTCRoomConfig alloc] init];
    config.roomType = RCRTCRoomTypeNormal;

    [self.engine enableSpeaker:YES];

    __weak typeof(self) weakSelf = self;
    [self.engine joinRoom:self.roomId
                   config:config
               completion:^(RCRTCRoom *_Nullable room, RCRTCCode code) {
                   __strong typeof(weakSelf) strongSelf = weakSelf;
                   if (code == RCRTCCodeSuccess) {
                       strongSelf.room = room;
                       room.delegate = strongSelf;
                       [strongSelf publishScreenStream];
                   } else {
                       [UIAlertController alertWithString:@"Failed to join the room" inCurrentViewController:strongSelf];
                   }
               }];
}

- (void)publishScreenStream {
    self.videoOutputStream = [[RCRTCScreenShareOutputStream alloc] initWithAppGroup:ScreenShareGroupID];

    RCRTCVideoStreamConfig *videoConfig = self.videoOutputStream.videoConfig;
    videoConfig.videoSizePreset = RCRTCVideoSizePreset1280x720;
    videoConfig.videoFps = RCRTCVideoFPS24;
    [self.videoOutputStream setVideoConfig:videoConfig];

    [self.room.localUser publishStream:self.videoOutputStream
                            completion:^(BOOL isSuccess, RCRTCCode desc) {
                                if (isSuccess) {
                                    NSLog(@"Published the custom stream successfully");
                                } else {
                                    NSLog(@"Failed to publish the custom stream: %ld", (long)desc);
                                }
                            }];
}

03 Caveats

First, a ReplayKit 2 extension must not exceed 50 MB of memory; once it crosses that limit the system kills it, so pay particular attention to releasing memory when processing data inside the Extension.
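To see why 50 MB is tight, consider the raw size of the frames the extension receives. A quick back-of-the-envelope helper (the 828x1792 screen size is just an example, and NV12 layout is assumed):

```c
#include <stddef.h>

/* Bytes for one NV12 frame: a full-resolution Y plane plus a
   half-resolution interleaved UV plane, i.e. 1.5 bytes per pixel. */
static size_t nv12_frame_bytes(size_t width, size_t height) {
    return width * height + (width / 2) * (height / 2) * 2;
}

/* Example: an 828x1792-pixel screen yields 2,225,664 bytes (about 2.1 MB)
   per frame, so queuing roughly two dozen frames already crosses the 50 MB
   cap. Frames must therefore be forwarded and released immediately. */
```

This is why the extension forwards each buffer over the socket right away instead of buffering frames locally.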

Second, for inter-process communication, the Darwin notification center (CFNotificationCenter) cannot carry a payload; it can only deliver the notification itself. If parameters need to be passed, they must be cached in a local file. One pitfall to watch for: the data may print fine when running in debug mode, yet the local file cannot be read in release builds; see the guide on GitHub for the concrete implementation.
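The pattern described above can be sketched in plain C: the sender writes the parameters to a file in the shared App Group container and then posts a payload-free notification, and the receiver reads the file when the notification arrives. The helper names below are illustrative, the path would in practice come from the App Group container, and the Darwin-notification step itself is omitted since it is platform-specific:

```c
#include <stdio.h>
#include <string.h>

/* Sender side: persist the parameters before posting the notification. */
static int write_params(const char *path, const char *params) {
    FILE *f = fopen(path, "w");
    if (!f) return -1;
    fputs(params, f);
    fclose(f);
    return 0;
}

/* Receiver side: called once the payload-free notification arrives. */
static int read_params(const char *path, char *out, size_t cap) {
    FILE *f = fopen(path, "r");
    if (!f) return -1;
    size_t n = fread(out, 1, cap - 1, f);
    out[n] = '\0';
    fclose(f);
    return 0;
}
```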

Finally, one small gripe: when screen recording ends abnormally, the system often shows an alert that cannot be dismissed short of restarting the device. This has to count as one of iOS's more annoying bugs.
