
Screen Sharing

With the NERTC SDK, you can share your screen during a video call or an interactive live stream. The host or a co-streamer can share their screen content, as a video stream, with remote participants or online viewers, which improves communication efficiency. Typical scenarios include multi-party video chat, online meetings, and online education.

The NERTC SDK implements screen sharing as a substream: a separate upstream video stream is opened just for the shared screen. The camera feed is the main stream and the shared screen is the substream. The two streams run in parallel, so the host uploads both the camera picture and the screen picture at the same time.

Android


Sharing the local screen

  1. After joining the room, call startScreenCapture to start screen sharing.
  2. Call setupLocalSubStreamVideoCanvas to set the local substream video canvas.
  3. Call stopScreenCapture to stop substream-based screen sharing.

Sample code:

    // Request the screen-capture permission first
    @TargetApi(Build.VERSION_CODES.LOLLIPOP)
    private void startScreenCapture() { 
        MediaProjectionManager mediaProjectionManager =
                (MediaProjectionManager) getApplication().getSystemService(
                        Context.MEDIA_PROJECTION_SERVICE);
        startActivityForResult(
                mediaProjectionManager.createScreenCaptureIntent(), CAPTURE_PERMISSION_REQUEST_CODE);
    }

    // Start screen sharing once the permission request returns
    @TargetApi(Build.VERSION_CODES.LOLLIPOP)
    @Override
    public void onActivityResult(int requestCode, int resultCode, Intent data) {
        if (requestCode != CAPTURE_PERMISSION_REQUEST_CODE)
            return;
        if(resultCode != Activity.RESULT_OK) {
            showToast("Screen capture request was denied!");
            getUiKitButtons().find("screen_cast", Boolean.class).setState(false);
            return;
        }
        NERtcScreenConfig screenProfile = new NERtcScreenConfig();
        screenProfile.videoProfile = mScreenProfile;
        screenProfile.contentPrefer = mScreenContent;
        screenProfile.frameRate = mScreenFps;
        screenProfile.minFramerate = mScreenMinFps;
        screenProfile.bitrate = mScreenEncodeBitrate;
        screenProfile.minBitrate = mScreenEncodeMinBitrate;
        mScreenService.startScreenCapture(screenProfile, data, new MediaProjection.Callback() {
            @Override
            public void onStop() {
                super.onStop();
                showToast("Screen capture stopped");
            }
        });
        NERtcEx.getInstance().setupLocalSubStreamVideoCanvas(mScreenView);
    }

    // Stop screen sharing
    NERtcEx.getInstance().stopScreenCapture();

Watching a remote screen share

  1. A remote user joins the room.
  2. Call setupRemoteSubStreamVideoCanvas to set the playback canvas for the remote substream.
  3. Receive the onUserSubStreamVideoStart callback, indicating that another user has started the screen-share substream.
  4. Call subscribeRemoteSubStreamVideo to subscribe to the remote screen-share substream; you only receive remote substream data after subscribing.
  5. Receive the onUserSubStreamVideoStop callback when the other user stops the substream, ending the screen share.

Sample code:

public void onUserSubStreamVideoStart(long uid, int maxProfile) {
    Log.i(TAG, "onUserSubStreamVideoStart uid: " + uid);
    // Set the playback canvas for the remote substream, then subscribe to it
    NERtcEx.getInstance().setupRemoteSubStreamVideoCanvas(view, uid);
    NERtcEx.getInstance().subscribeRemoteSubStreamVideo(uid, true);
}

public void onUserSubStreamVideoStop(long uid) {
    Log.i(TAG, "onUserSubStreamVideoStop uid: " + uid);
    // The remote user stopped sharing; release the substream canvas
    NERtcEx.getInstance().setupRemoteSubStreamVideoCanvas(null, uid);
}

iOS

Screen sharing on iOS requires a recording process implemented in an App Extension with the native iOS ReplayKit framework, working together with the main app process, which publishes the stream. When screen sharing is needed, ReplayKit records the screen, receives the screen images captured by the system, and sends them to the SDK for video-stream transmission.

The main screen-sharing flow is:

  1. (Optional) Create an App Group. The App Group is used to pass video data and control commands between the main app process and the extension.
  2. In Xcode, create a Target of type Broadcast Upload Extension in the project, which hosts the screen-sharing process.
  3. Add the ReplayKit extension and use Apple's ReplayKit framework to record the screen.
  4. Send the recorded screen data to the SDK as a custom video source and let the SDK transmit the video stream.


Adding ReplayKit

Step 1 (optional): Create an App Group

  1. Register an App Group on the Certificates, Identifiers & Profiles page.

    For details, see Register an App Group.

  2. Enable the App Group capability for your App ID.

    For details, see Enable App Group.

  3. Download the Provisioning Profile again and configure it in Xcode.

Step 2: Create the Extension recording process

Create a Target of type Broadcast Upload Extension to hold the screen-sharing implementation.

  1. Open your project in Xcode.
  2. From the menu, choose Editor > Add Target...
  3. On the iOS tab, select Broadcast Upload Extension and click Next.

    Add Target

  4. Enter a name for the extension in Product Name, then click Finish.

Step 3: Create the App Group data store

The data store is used for communication between the ReplayKit extension and the main project.

- (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *,NSObject *> *)setupInfo {
    self.userDefaults = [[NSUserDefaults alloc] initWithSuiteName:<#kAppGroupName#>];
}

Step 4: Implement screen sharing with ReplayKit

Compress and crop the captured frames, send them to the host app, and implement screen sharing through ReplayKit.

  1. The captured screen video data is delivered through processSampleBuffer:withType:. Audio is captured by the NERTC SDK itself, so the audio callbacks can be ignored.
  2. Compress the video data and write it into the shared data store.
  3. When the main app detects that the video data has changed, it sends the frame through the SDK's custom video input.
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {

    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo: {
            @autoreleasepool {
                CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
                NSDictionary *frame = [self createI420VideoFrameFromPixelBuffer:pixelBuffer];
                [self.userDefaults setObject:frame forKey:<#KeyPath#>];
                [self.userDefaults synchronize];
            }
            break;
        }
        case RPSampleBufferTypeAudioApp:
            // Handle the app audio sample buffer
            break;
        case RPSampleBufferTypeAudioMic:
            // Handle the mic audio sample buffer
            break;

        default:
            break;
    }
}

Frame compression uses the third-party libyuv library.

- (NSDictionary *)createI420VideoFrameFromPixelBuffer:(CVPixelBufferRef)pixelBuffer
{
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    // Convert NV12 to I420
    int psrc_w = (int)CVPixelBufferGetWidth(pixelBuffer);
    int psrc_h = (int)CVPixelBufferGetHeight(pixelBuffer);
    uint8 *src_y = (uint8 *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    uint8 *src_uv = (uint8 *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
    int y_stride = (int)CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
    int uv_stride = (int)CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1);
    uint8 *i420_buf = (uint8 *)malloc((psrc_w * psrc_h * 3) >> 1);

    libyuv::NV12ToI420(&src_y[0],                              y_stride,
                       &src_uv[0],                             uv_stride,
                       &i420_buf[0],                           psrc_w,
                       &i420_buf[psrc_w * psrc_h],             psrc_w >> 1,
                       &i420_buf[(psrc_w * psrc_h * 5) >> 2],  psrc_w >> 1,
                       psrc_w, psrc_h);

    // Scale down to a width of 720
    int pdst_w = 720;
    int pdst_h = psrc_h * (pdst_w/(double)psrc_w);
    libyuv::FilterMode filter = libyuv::kFilterNone;
    uint8 *pdst_buf = (uint8 *)malloc((pdst_w * pdst_h * 3) >> 1);
    libyuv::I420Scale(&i420_buf[0],                          psrc_w,
                      &i420_buf[psrc_w * psrc_h],            psrc_w >> 1,
                      &i420_buf[(psrc_w * psrc_h * 5) >> 2], psrc_w >> 1,
                      psrc_w, psrc_h,
                      &pdst_buf[0],                          pdst_w,
                      &pdst_buf[pdst_w * pdst_h],            pdst_w >> 1,
                      &pdst_buf[(pdst_w * pdst_h * 5) >> 2], pdst_w >> 1,
                      pdst_w, pdst_h,
                      filter);

    free(i420_buf);

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    NSUInteger dataLength = pdst_w * pdst_h * 3 >> 1;
    NSData *data = [NSData dataWithBytesNoCopy:pdst_buf length:dataLength];

    NSDictionary *frame = @{
        @"width": @(pdst_w),
        @"height": @(pdst_h),
        @"data": data,
        @"timestamp": @(CACurrentMediaTime() * 1000)
    };
    return frame;
}
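One caveat in the scale step above: pdst_h is derived from the source aspect ratio and can come out odd, while the I420 plane offsets used here (pdst_w * pdst_h for U, (pdst_w * pdst_h * 5) / 4 for V) assume even dimensions. The following is a minimal, self-contained sketch of computing an even-aligned target size; the helper names are ours, not part of the SDK or libyuv:

```cpp
#include <cstddef>
#include <utility>

// Scale (src_w, src_h) to the target width dst_w, keeping the aspect ratio
// and rounding the resulting height down to an even value, so that the
// half-resolution chroma planes of I420 line up exactly.
std::pair<int, int> EvenScaledSize(int src_w, int src_h, int dst_w) {
    int dst_h = static_cast<int>(src_h * (dst_w / static_cast<double>(src_w)));
    dst_h &= ~1;  // force an even height
    return {dst_w, dst_h};
}

// Total byte size of an I420 buffer with even width/height:
// one full-size Y plane plus two quarter-size chroma planes.
std::size_t I420BufferSize(int w, int h) {
    return static_cast<std::size_t>(w) * h * 3 / 2;
}
```

For example, a 1125 x 2436 screen scaled to a width of 720 would give a height of 1559; rounding down to 1558 keeps the U/V plane offsets valid.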

Screen sharing in the main app

  1. Initialize the SDK and enable the external video source so that video calls work normally.
// Enable the external video source and configure it as the screen-share source
NERtcEngine *coreEngine = [NERtcEngine sharedEngine];
[coreEngine enableLocalAudio:YES];
[[[NTESDemoLogic sharedLogic] getCoreEngine] setExternalVideoSource:YES isScreen:YES];
NERtcEngineContext *context = [[NERtcEngineContext alloc] init];
context.engineDelegate = self;
context.appKey = <#Your AppKey#>;
[coreEngine setupEngineWithContext:context];
  2. Add the extension program through RPSystemBroadcastPickerView.
- (void)addSystemBroadcastPickerIfPossible
{
    if (@available(iOS 12.0, *)) {
        // Not recommended: relies on the picker's private view hierarchy
        RPSystemBroadcastPickerView *picker = [[RPSystemBroadcastPickerView alloc] initWithFrame:CGRectMake(0, 0, 120, 64)];
        picker.showsMicrophoneButton = NO;
        picker.preferredExtension = <#Extension bundle ID#>;
        [self.view addSubview:picker];
        picker.center = self.view.center;

        UIButton *button = [picker.subviews filteredArrayUsingPredicate:[NSPredicate predicateWithBlock:^BOOL(id  _Nullable evaluatedObject, NSDictionary<NSString *,id> * _Nullable bindings) {
            return [evaluatedObject isKindOfClass:UIButton.class];
        }]].firstObject;
        [button setImage:nil forState:UIControlStateNormal];
        [button setTitle:@"Start Share" forState:UIControlStateNormal];
        [button setTitleColor:self.navigationController.navigationBar.tintColor forState:UIControlStateNormal];

        UIBarButtonItem *leftItem = [[UIBarButtonItem alloc] initWithCustomView:picker];
        self.navigationItem.leftBarButtonItem = leftItem;
    }
}
  3. Add the observer.
- (void)setupUserDefaults
{
    // Use NSUserDefaults with the shared App Group as the data channel to receive video frames from the extension
    self.userDefaults = [[NSUserDefaults alloc] initWithSuiteName:<#AppGroupName#>];
    [self.userDefaults addObserver:self forKeyPath:<#KeyPath#> options:NSKeyValueObservingOptionNew context:KVOContext];
}
  4. When a data-frame change is observed, validate the frame and push it to the SDK as an external video frame.
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary<NSKeyValueChangeKey,id> *)change context:(void *)context
{
    if ([keyPath isEqualToString:<#KeyPath#>]) {
        if (self.currentUserID) {
            NSDictionary *i420Frame = change[NSKeyValueChangeNewKey];
            NERtcVideoFrame *frame = [[NERtcVideoFrame alloc] init];
            frame.format = kNERtcVideoFormatI420;
            frame.width = [i420Frame[@"width"] unsignedIntValue];
            frame.height = [i420Frame[@"height"] unsignedIntValue];
            frame.buffer = (void *)[i420Frame[@"data"] bytes];
            frame.timestamp = [i420Frame[@"timestamp"] unsignedLongLongValue];
            // Push the external video frame to the SDK
            int ret = [NERtcEngine.sharedEngine pushExternalVideoFrame:frame];
            if (ret != 0) {
                NSLog(@"Failed to push the video frame: %d", ret);
                return;
            }
        }
    }
}
  5. Set the video playback canvas and start screen sharing. The shared content is sent as a substream.

    1. Call setupLocalSubStreamVideoCanvas to set the local substream video canvas.
    2. After joining the room, call startScreenCapture to start screen sharing; the content is sent as a substream.
    3. If needed, call setLocalRenderSubStreamScaleMode to set the local substream render scale mode.
// Set the local substream video canvas
NERtcVideoCanvas *subStreamCanvas = nil;
if([NTESDemoSettings boolForKey:keyNRTCDemoLocalSubStreamExternalRender])
{
   NTESExternalRenderView *externalview = [[NTESExternalRenderView alloc] initWithFrame:CGRectZero format:SDL_FCC_I420];
   subStreamCanvas = [NERtcVideoCanvas localCanvasWithExternalRender:externalview];
   [NTESDemoLogic sharedLogic].userManager.me.screenRenderView = externalview;
 }else{
   UIView *view = [[UIView alloc] initWithFrame:CGRectZero];
   subStreamCanvas = [NERtcVideoCanvas localSubStreamCanvasWithView:view];
   [NTESDemoLogic sharedLogic].userManager.me.screenRenderView = (NTESExternalRenderView *)view;
 }
 [[[NTESDemoLogic sharedLogic] getCoreEngine] setupLocalSubStreamVideoCanvas:subStreamCanvas];


// Set the local substream render scale mode
NSString *key = keyNRTCDemoLocalSubStreamRenderScaleMode;
if (settings[key]) {
   NERtcVideoRenderScaleMode renderMode = (NERtcVideoRenderScaleMode)[settings jsonInteger:key];
   [[[NTESDemoLogic sharedLogic] getCoreEngine] setLocalRenderSubStreamScaleMode:renderMode];
}


// Start and stop screen sharing
- (void)onMenuMySubStream:(id)sender {
    NTESUser *me = findMe();
    int result = 0;
    BOOL toStart = !me.screenConnected;
    if (toStart) {
        NERtcVideoSubStreamEncodeConfiguration *config = [[NERtcVideoSubStreamEncodeConfiguration alloc] init];
        if([NTESDemoSettings objectForKey:keyNRTCDemoLocalVideoSubStreamProfileType]) {
            NSInteger value = [NTESDemoSettings integerForKey:keyNRTCDemoLocalVideoSubStreamProfileType];
            config.maxProfile = (NERtcVideoProfileType)value;
        }

        if ([NTESDemoSettings objectForKey:keyNRTCDemoLocalSubStreamEncodeFrameRate]) {
            NSInteger value = [NTESDemoSettings integerForKey:keyNRTCDemoLocalSubStreamEncodeFrameRate];
            config.frameRate = value;
        }

        if ([NTESDemoSettings objectForKey:keyNRTCDemoLocalSubStreamEncodeMinFrameRate]) {
            NSInteger value = [NTESDemoSettings integerForKey:keyNRTCDemoLocalSubStreamEncodeMinFrameRate];
            config.minFrameRate = value;
        }

        if ([NTESDemoSettings objectForKey:keyNRTCDemoLocalSubStreamEncodeBitrate]) {
            NSInteger value = [NTESDemoSettings integerForKey:keyNRTCDemoLocalSubStreamEncodeBitrate];
            config.bitrate = value;
        }

        if ([NTESDemoSettings objectForKey:keyNRTCDemoLocalSubStreamEncodeMinBitrate]) {
            NSInteger value = [NTESDemoSettings integerForKey:keyNRTCDemoLocalSubStreamEncodeMinBitrate];
            config.minBitrate = value;
        }

        if ([NTESDemoSettings objectForKey:keyNRTCDemoLocalSubStreamEncodeContentPrefer]) {
            NSInteger value = [NTESDemoSettings integerForKey:keyNRTCDemoLocalSubStreamEncodeContentPrefer];
            config.contentPrefer = value;
        }

        // Start screen sharing
        result = [[[NTESDemoLogic sharedLogic] getCoreEngine] startScreenCapture:config];
    } else {
        // Stop screen sharing
        result = [[[NTESDemoLogic sharedLogic] getCoreEngine] stopScreenCapture];
    }
    NTESCheckResultAndReturn(result, nil);
    me.screenConnected = toStart;
    if (self.eventDelegate && [self.eventDelegate respondsToSelector:@selector(handlerEventSubStreamStart:)]) {
        [self.eventDelegate handlerEventSubStreamStart:me.screenConnected];
    }
}
  6. When the feature is no longer needed, remove the observer and stop screen sharing.
[self.userDefaults removeObserver:self forKeyPath:<#KeyPath#>];
[[[NTESDemoLogic sharedLogic] getCoreEngine] stopScreenCapture];

Watching a remote screen share

  1. A remote user joins the room.
  2. Call setupRemoteSubStreamVideoCanvas to set the playback canvas for the remote substream.
  3. Receive the onNERtcEngineUserSubStreamDidStartWithUserID callback, indicating that another user has started the screen-share substream.
  4. Call subscribeRemoteSubStreamVideo to subscribe to (or unsubscribe from) the remote screen-share substream; you only receive remote substream data after subscribing.
  5. Manage the screen-sharing session.
  6. Receive the onNERtcEngineUserSubStreamDidStop callback when the other user stops the substream, ending the screen share.

Sample code:

// Callback: another user started the screen-share substream
- (void)onNERtcEngineUserSubStreamDidStartWithUserID:(uint64_t)userID subStreamProfile:(NERtcVideoProfileType)profile {
    NTESUser *user = [[NTESDemoLogic sharedLogic].userManager userWithID:userID];
    if (!user || user.isMe) {
        return;
    }

    // Set the playback canvas for the remote substream
    NERtcVideoCanvas *subCanvas = nil;
    if([NTESDemoSettings boolForKey:keyNRTCDemoRemoteSubStreamExternalRender]){
        NTESExternalRenderView *externalview = [[NTESExternalRenderView alloc] initWithFrame:CGRectZero format:SDL_FCC_I420];
        subCanvas = [NERtcVideoCanvas remoteCanvasWithExternalRender:externalview];
        user.screenRenderView = externalview;
    }
    else{
        VIEW_CLASS *view = [[VIEW_CLASS alloc] initWithFrame:CGRectZero];
        subCanvas = [NERtcVideoCanvas remoteSubStreamCanvasWithView:view];
        user.screenRenderView = (NTESExternalRenderView *)view;
    }
    [[[NTESDemoLogic sharedLogic] getCoreEngine] setupRemoteSubStreamVideoCanvas:subCanvas forUserID:userID];


    VideoUserCell *cell = [self.mainView cellForUserID:userID];
    cell.screenRenderView = (UIView *)user.screenRenderView;

    if ([NTESDemoSettings boolForKey:keyNRTCDemoChanelEnableMeetingScene defaultVal:NO] && [self.mainView cellForUserID:userID]) {
        [self subscribeSubStreamWithUserID:userID];
        return;
    }

    // The substream was already subscribed before the screen share started
    if (user.isScreenSubscribed) {
        return;
    }

    BOOL autoSubscribe = [NTESDemoSettings boolForKey:keyNRTCDemoAutoSubscribeRemoteSubStream defaultVal:YES];
    if (autoSubscribe && [self.mainView cellForUserID:userID]) {
        // Subscribe to the remote screen-share substream
        [self subscribeSubStreamWithUserID:userID];
    }
}


// Callback: another user stopped the substream
- (void)onNERtcEngineUserSubStreamDidStop:(uint64_t)userID {
    NTESUser *user = [[NTESDemoLogic sharedLogic].userManager userWithID:userID];
    if (!user || user.isMe) {
        return;
    }

    VideoUserCell *cell = [self.mainView cellForUserID:userID];
    cell.screenRenderView = nil;

    // Detach the remote substream canvas
    NERtcVideoCanvas *subCanvas = [NERtcVideoCanvas remoteSubStreamCanvasWithView:nil];
    [[[NTESDemoLogic sharedLogic] getCoreEngine] setupRemoteSubStreamVideoCanvas:subCanvas forUserID:userID];
}


// Set the render scale mode for remote screen-share substream video
NSString *key = keyNRTCDemoRemoteSubStreamRenderScaleMode;
if (settings[key]) {
   NERtcVideoRenderScaleMode renderMode = (NERtcVideoRenderScaleMode)[settings jsonInteger:key];
   NSArray *users = [NTESDemoLogic sharedLogic].userManager.users;
   for (NTESUser *user in users) {
      if (user.userID != [NTESUser selfID]) {
         [[[NTESDemoLogic sharedLogic] getCoreEngine] setRemoteRenderSubStreamVideoScaleMode:renderMode forUserID:user.userID];
      }
    }
}

Windows/macOS


Sharing the local screen

  1. After initialization, call setupLocalSubStreamVideoCanvas to set the local substream video canvas.
  2. After joining the room, start screen sharing and choose the capture mode as needed; the shared content is sent as a substream.
  3. Manage the screen-sharing session.
  4. Call stopScreenCapture to stop substream-based screen sharing.

Windows sample code

// Start screen sharing (Windows) -------------------------------------------
// Set the local substream canvas
nertc::NERtcVideoCanvas canvas; 
canvas.cb = nullptr; 
canvas.user_data = nullptr; 
canvas.window = window; 
canvas.scaling_mode = kNERtcVideoScaleFit; 
rtc_engine_->setupLocalSubStreamVideoCanvas(&canvas); 
// Update the substream canvas scale mode
rtc_engine_->setLocalSubStreamRenderMode(kNERtcVideoScaleCropFill); 

// Gather the parameters for starting the local substream
int res = 0; 
HWND hwnd = GetSelectCaptureWindow(); 
if ((!IsWindow(hwnd) || !IsWindowVisible(hwnd) || IsIconic(hwnd)) && hwnd != NULL) { 
    ShowLogWarning("the window has been destroyed, hide, or minimized, please select a window again."); 
    RefreshCaptureWindow(); 
    return; 
} 
int fps = 5; 
int max_profile_cur_sel = (int)SendDlgItemMessage(m_hWnd, IDC_CaptureProfile, CB_GETCURSEL, 0, 0); 
nertc::NERtcScreenCaptureParameters capture_params; 
capture_params.profile = nertc::kNERtcScreenProfileHD720P; 
switch (max_profile_cur_sel) 
{ 
    case kNERtcScreenProfile480P: 
        capture_params.profile = nertc::kNERtcScreenProfile480P; 
        capture_params.dimensions = { 640,480 }; 
        fps = 5; 
        break; 
    case kNERtcScreenProfileHD720P: 
        capture_params.profile = nertc::kNERtcScreenProfileHD720P; 
        capture_params.dimensions = { 1280,720 }; 
        fps = 5; 
        break; 
    case kNERtcScreenProfileHD1080P: 
        capture_params.profile = nertc::kNERtcScreenProfileHD1080P; 
        capture_params.dimensions = { 1920,1080 }; 
        fps = 5; 
        break; 
    case kNERtcScreenProfileCustom: 
    { 
        capture_params.profile = nertc::kNERtcScreenProfileCustom; 
        int w = GetDlgItemInt(m_hWnd, IDC_Capture_Width, NULL, FALSE); 
        if (w <= 0) { 
            return; 
        } 
        int h = GetDlgItemInt(m_hWnd, IDC_Capture_Height, NULL, FALSE); 
        if (h <= 0) { 
            return; 
        } 
        capture_params.dimensions = { w,h }; 
        fps = GetDlgItemInt(m_hWnd, IDC_Capture_Fps, NULL, FALSE); 
        if (fps <= 0) { 
            return; 
        } 
    } 
        break; 
    default: 
        break; 
} 

capture_params.frame_rate = fps; 
int bps = 0; 
GetDlgItemInt(m_hWnd, IDC_SCREEN_BPS, &bps, FALSE); 
capture_params.bitrate = bps; 
capture_params.capture_mouse_cursor = true; 
capture_params.window_focus = false; 
capture_params.excluded_window_list = nullptr; 
capture_params.excluded_window_count = 0; 
capture_params.prefer = kNERtcSubStreamContentPreferDetails; 
// Start the substream
if (hwnd == nullptr) { 
    // Depending on your scenario, set the handles of the windows to exclude
    HWND* wnd_list = new HWND[exclude_wnd_list_.size()];
    for (auto e : exclude_wnd_list_) {
        *(wnd_list + capture_params.excluded_window_count++) = e;
    }
    capture_params.excluded_window_list = wnd_list; 

    res = nrtc_engine_->startScreenCaptureByScreenRect({ 0,0,0,0 }, { 0,0,0,0 }, capture_params); 

    // Free the allocated memory promptly
    delete[] wnd_list; 
    wnd_list = nullptr; 
} else { 
    res = nrtc_engine_->startScreenCaptureByWindowId(hwnd, { 0,0,0,0 }, capture_params); 
} 

// Pause screen sharing
nrtc_engine_->pauseScreenCapture(); 

// Resume screen sharing
nrtc_engine_->resumeScreenCapture(); 

// Update the capture region
nrtc_engine_->updateScreenCaptureRegion({ 0,0,640,480 }); 

// Stop screen sharing
nrtc_engine_->stopScreenCapture();
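The excluded-window bookkeeping above pairs a raw new[] with a delete[]; the same copy can be sketched with std::vector so the storage is freed automatically when it goes out of scope. The types and names here are ours (WindowHandle stands in for HWND) so the sketch stays self-contained:

```cpp
#include <cstddef>
#include <vector>

// Stand-in for HWND so the sketch compiles anywhere.
using WindowHandle = void*;

// Holds the windows to exclude from capture in a contiguous buffer that can
// be handed to excluded_window_list / excluded_window_count. The vector owns
// the memory; keep this object alive until startScreenCaptureByScreenRect
// (or the equivalent call) has returned.
class ExcludedWindows {
public:
    void add(WindowHandle wnd) { handles_.push_back(wnd); }
    WindowHandle* list() { return handles_.empty() ? nullptr : handles_.data(); }
    int count() const { return static_cast<int>(handles_.size()); }
private:
    std::vector<WindowHandle> handles_;
};
```

Because the SDK only reads the list during the start call, scoping the object to the function that starts capture is enough; no manual delete[] is needed.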

macOS sample code

// Set the local substream canvas
nertc::NERtcVideoCanvas canvas; 
canvas.cb = nullptr; 
canvas.user_data = nullptr; 
canvas.window = window; 
canvas.scaling_mode = kNERtcVideoScaleFit; 
rtc_engine_->setupLocalSubStreamVideoCanvas(&canvas); 

// Update the substream canvas scale mode
rtc_engine_->setLocalSubStreamRenderMode(kNERtcVideoScaleCropFill); 

// Configure the substream (screen-share) parameters (macOS)
int res = 0;
int max_profile_cur_sel = kNERtcVideoProfileHD720P;
nertc::NERtcScreenCaptureParameters capture_params;
capture_params.profile = nertc::kNERtcScreenProfileHD720P;

switch (max_profile_cur_sel)
{
    case kNERtcScreenProfile480P:
        capture_params.profile = nertc::kNERtcScreenProfile480P;
        capture_params.dimensions = { 640,480 };
        capture_params.frame_rate = 5;
        break;
    case kNERtcScreenProfileHD720P:
        capture_params.profile = nertc::kNERtcScreenProfileHD720P;
        capture_params.dimensions = { 1280,720 };
        capture_params.frame_rate = 5;
        break;
    case kNERtcScreenProfileHD1080P:
        capture_params.profile = nertc::kNERtcScreenProfileHD1080P;
        capture_params.dimensions = { 1920,1080 };
        capture_params.frame_rate = 5;
        break;
    case kNERtcScreenProfileCustom:
        capture_params.profile = nertc::kNERtcScreenProfileCustom;
        // Obtain the custom width and height from your own UI logic
        capture_params.dimensions = { width,height };
        capture_params.frame_rate = 10;
        break;
    default:
        break;
}

capture_params.bitrate = 1500000;
capture_params.capture_mouse_cursor = true;
int errCode = nertc::kNERtcNoError;

// First create a nertc::IRtcEngineEx instance
nertc::IRtcEngineEx* nrtc_engine_ = ...;

// Share the whole desktop (true) or a single application window (false)
bool isDisplayShare = true;

if (isDisplayShare) {
    // Depending on your scenario, obtain the list of window IDs to exclude; assume there are N
    intptr_t excluded_wnd_list[N] = [...]; 
    capture_params.excluded_window_list = excluded_wnd_list;
    capture_params.excluded_window_count = N;

    errCode = nrtc_engine_->startScreenCaptureByDisplayId((uint32_t)displayId, region_rect, capture_params);
} else {
    errCode = nrtc_engine_->startScreenCaptureByWindowId((void *)&windowId, region_rect, capture_params);
}
}

// Pause screen sharing
nrtc_engine_->pauseScreenCapture();

// Resume screen sharing
nrtc_engine_->resumeScreenCapture();

// Update the capture region
nrtc_engine_->updateScreenCaptureRegion({ 0,0,640,480 });

// Stop screen sharing
nrtc_engine_->stopScreenCapture();

Watching a remote screen share

  1. A remote user joins the room.
  2. Call setupRemoteSubStreamVideoCanvas to set the playback canvas for the remote substream.
  3. Receive the onUserSubStreamVideoStart callback, indicating that another user has started the screen-share substream.
  4. Call subscribeRemoteVideoSubStream to subscribe to the remote screen-share substream; you only receive remote substream data after subscribing.
  5. Manage the screen-sharing session.
  6. Receive the onUserSubStreamVideoStop callback when the other user stops the substream, ending the screen share.

Windows sample code

// Remote substream handling -------------------------------------------
// Set the remote substream canvas
nertc::NERtcVideoCanvas canvas; 
canvas.cb = nullptr; 
canvas.user_data = nullptr; 
canvas.window = window; 
canvas.scaling_mode = kNERtcVideoScaleFit; 
rtc_engine_->setupRemoteSubStreamVideoCanvas(uid, &canvas); 
// Update the remote substream canvas scale mode
rtc_engine_->setRemoteSubSteamRenderMode(uid, kNERtcVideoScaleCropFill); 

// Listen for the remote substream starting
void onUserSubStreamVideoStart(uid_t uid, NERtcVideoProfileType max_profile) override { 
        // Subscribe to the remote substream
        rtc_engine_->subscribeRemoteVideoSubStream(uid, true); 
        // To unsubscribe later, pass false instead:
        // rtc_engine_->subscribeRemoteVideoSubStream(uid, false);
} 

// Listen for the remote substream stopping
void onUserSubStreamVideoStop(uid_t uid) override { 
        // Detach the remote substream canvas
        rtc_engine_->setupRemoteSubStreamVideoCanvas(uid, nullptr); 
}

macOS sample code

// Remote substream handling -------------------------------------------
// Set the remote substream canvas
nertc::NERtcVideoCanvas canvas; 
canvas.cb = nullptr; 
canvas.user_data = nullptr; 
canvas.window = window; 
canvas.scaling_mode = kNERtcVideoScaleFit; 
rtc_engine_->setupRemoteSubStreamVideoCanvas(uid, &canvas); 

// Update the remote substream canvas scale mode
rtc_engine_->setRemoteSubSteamRenderMode(uid, kNERtcVideoScaleCropFill); 

// Listen for the remote substream starting
void onUserSubStreamVideoStart(uid_t uid, NERtcVideoProfileType max_profile) override { 
        // Subscribe to the remote substream
        rtc_engine_->subscribeRemoteVideoSubStream(uid, true); 
        // To unsubscribe later, pass false instead:
        // rtc_engine_->subscribeRemoteVideoSubStream(uid, false);
} 

// Listen for the remote substream stopping
void onUserSubStreamVideoStop(uid_t uid) override { 
        // Detach the remote substream canvas
        rtc_engine_->setupRemoteSubStreamVideoCanvas(uid, nullptr); 
}

Setting the capture range for screen sharing

When sharing a screen, you may need to limit what is shared, for example excluding window areas that contain sensitive information. The NERTC SDK supports restricting the captured range of the shared screen.

If you build your audio/video features on Qt, note that the WId value returned by QWidget::winId() differs across platforms. On macOS, Qt's implementation actually returns an NSView pointer, while on Windows it returns the window handle (HWND); the SDK interface, however, expects the NSWindow ID, that is, its windowNumber member. You can use the following code to obtain the macOS window ID from a WId:

/////// file: macx_helpers.h
#ifndef MACX_HELPERS_H
#define MACX_HELPERS_H

#include <QQuickWindow>
#include <QScreen>
#include <QGuiApplication>

class MacXHelpers : public QObject
{
    Q_OBJECT
public:
    MacXHelpers() {}

public slots:
    int getWindowId(WId wid);
};

#endif // MACX_HELPERS_H

/////// file: macx_helpers.mm
#include "macx_helpers.h"

#import <AppKit/AppKit.h>

int MacXHelpers::getWindowId(WId wid)
{
    NSView *nativeView = reinterpret_cast<NSView *>(wid);
    NSWindow* nativeWindow = nativeView.window;
    if (nativeWindow)
    {
        return nativeWindow.windowNumber;
    }

    return 0;
}

Web


Starting screen sharing at initialization

On Chrome, you can enable screen sharing directly when calling createStream: set the video field to false and the screen field to true. Screen sharing then starts as part of initialization.

Note: the camera is not turned on in this case.

// When creating the local stream, set screen to true
let localStream = NRTC.createStream({
  uid: uid,
  audio: true,
  video: false,
  screen: true,
})

localStream
.init()
.catch(error => {
  console.error('Failed to initialize the local stream: ' + error);
})
.then(() => {
  console.log('Local stream initialized');
});

Starting screen sharing mid-call

If createStream was called with the video field set to true and the screen field set to false, the camera is opened first. To start screen sharing mid-call, you can use code like the following.


localStream.close({
  type: 'video'
}).then(() => {
  console.log('Camera closed')

  let config = {
    type: 'screen', // video: camera, screen: screen share, audio: microphone
  }
  localStream.open(config).then(() => {
    // Screen sharing opened successfully
  }).catch(err => {
    // Failed to open screen sharing
  })
})

Setting screen-share properties

Before starting screen sharing you can set related properties to adjust the clarity of the shared picture to the user's preference. Call this before localStream.init() or localStream.open().

let resolution = WebRTC2.VIDEO_QUALITY_1080p
localStream.setScreenProfile(resolution)

Parameters:

Parameter   Type    Description
resolution  Number  The screen-share resolution

The available resolution values are:

Value                        Type    Description
WebRTC2.VIDEO_QUALITY_480p   number  Low screen-share resolution, 640 x 480
WebRTC2.VIDEO_QUALITY_720p   number  Medium screen-share resolution, 1280 x 720
WebRTC2.VIDEO_QUALITY_1080p  number  High screen-share resolution, 1920 x 1080