
WebRTC Audio/Video Calls: Implementing an iOS Client Against the ossrs Video Call Service

For setting up the ossrs service, see: https://blog.csdn.net/gloryFlow/article/details/132257196
Here the iOS client uses GoogleWebRTC to implement video calling against ossrs.

1. Screenshots of the iOS Client Calling the ossrs Video Service

iOS client screenshot (image omitted)

ossrs console screenshot (image omitted)

2. What Is WebRTC?

WebRTC (Web Real-Time Communications) is a real-time communication technology that lets web applications and sites establish peer-to-peer connections between browsers without an intermediary, enabling the transmission of video streams, audio streams, or any other data.

See https://zhuanlan.zhihu.com/p/421503695 for background.

Key concepts to understand (a configuration sketch follows the list):

  • NAT
    Network Address Translation
  • STUN
    Session Traversal Utilities for NAT
  • TURN
    Traversal Using Relays around NAT
  • ICE
    Interactive Connectivity Establishment
  • SDP
    Session Description Protocol
  • WebRTC
    Web Real-Time Communications
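
To make these terms concrete, here is a minimal sketch of how STUN/TURN/ICE settings map onto the GoogleWebRTC Objective-C API. The server URLs and credentials are hypothetical placeholders; when publishing to SRS over its HTTP signaling API on a LAN, the ICE server list is commonly left empty (as the WebRTCClient code later in this post does).

// A minimal sketch of ICE/STUN/TURN configuration with GoogleWebRTC.
// The server URLs and credentials below are hypothetical placeholders.
RTCConfiguration *config = [[RTCConfiguration alloc] init];
// A STUN server lets the client discover its public address behind NAT.
RTCIceServer *stun = [[RTCIceServer alloc] initWithURLStrings:@[@"stun:stun.example.com:3478"]];
// A TURN server relays media when no direct path can be established.
RTCIceServer *turn = [[RTCIceServer alloc] initWithURLStrings:@[@"turn:turn.example.com:3478"]
                                                     username:@"user"
                                                   credential:@"pass"];
config.iceServers = @[stun, turn];
// When signaling with SRS over HTTP on a LAN, an empty iceServers list is common.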

The WebRTC offer exchange flow is shown below (diagram omitted).

3. Implementing the iOS Client Call to the ossrs Video Service

Create the project for the iOS client. For a pure P2P audio/video call you would have to run your own signaling server plus STUN/TURN traversal and relay servers; ossrs already includes STUN/TURN-style traversal and relaying, so here the iOS client talks to the ossrs service directly.

3.1 Permissions

Calling the ossrs video service from iOS requires camera and microphone permissions.

Add the following to Info.plist:

<key>NSCameraUsageDescription</key>
<string>The app needs access to the camera</string>
<key>NSMicrophoneUsageDescription</key>
<string>The app needs access to the microphone</string>
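
Beyond the Info.plist entries, it is worth requesting (or at least checking) authorization at runtime before starting capture. A minimal sketch using standard AVFoundation calls; the method name is just an illustration:

#import <AVFoundation/AVFoundation.h>

// Request camera and microphone access before starting WebRTC capture.
- (void)requestCaptureAuthorization:(void (^)(BOOL granted))completion {
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL videoGranted) {
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio completionHandler:^(BOOL audioGranted) {
            dispatch_async(dispatch_get_main_queue(), ^{
                if (completion) {
                    completion(videoGranted && audioGranted);
                }
            });
        }];
    }];
}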

3.2 Adding the GoogleWebRTC Dependency

The project uses the GoogleWebRTC library; add it in the Podfile. Note that the API differs somewhat between GoogleWebRTC versions.

target 'WebRTCApp' do
  pod 'GoogleWebRTC'
  pod 'ReactiveObjC'
  pod 'SocketRocket'
  pod 'HGAlertViewController', '~> 1.0.1'
end

Then run pod install.

3.3 Main GoogleWebRTC APIs

Before using GoogleWebRTC, a look at the main classes:

  • RTCPeerConnection

RTCPeerConnection is the WebRTC object for building a point-to-point (peer-to-peer) connection.

  • RTCPeerConnectionFactory

RTCPeerConnectionFactory is the factory class that creates RTCPeerConnection instances.

  • RTCVideoCapturer

RTCVideoCapturer is the camera capturer that produces the video frames. It can be swapped out for a custom capturer, which makes it easy to get at the CMSampleBufferRef for beauty filters, virtual avatars, and similar processing.

  • RTCVideoTrack

RTCVideoTrack is the video track.

  • RTCAudioTrack

RTCAudioTrack is the audio track.

  • RTCDataChannel

RTCDataChannel establishes a high-throughput, low-latency channel for transmitting arbitrary data.

  • RTCMediaStream

RTCMediaStream is a synchronized stream of media tracks (camera video, microphone audio).

  • SDP

SDP is the Session Description Protocol.
An SDP blob consists of one or more lines of UTF-8 text. Each line starts with a one-character type, followed by an equals sign (=) and structured text whose format depends on the type. A sample SDP body:

v=0
o=alice 2890844526 2890844526 IN IP4
s=
c=IN IP4
t=0 0
m=audio 49170 RTP/AVP 0
a=rtpmap:0 PCMU/8000
m=video 51372 RTP/AVP 31
a=rtpmap:31 H261/90000
m=video 53000 RTP/AVP 32
a=rtpmap:32 MPV/90000

These are the main WebRTC API classes we will use.
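
As a small illustration of how an SDP blob like the one above enters the GoogleWebRTC API: a raw SDP string (here a placeholder) is wrapped in an RTCSessionDescription together with its type. The HttpClient code later in this post uses exactly this pattern for the answer returned by SRS.

// Wrap raw SDP text in a session description; RTCSdpTypeOffer and RTCSdpTypePranswer also exist.
NSString *sdpString = @"v=0\r\n..."; // placeholder for SDP text received from a server
RTCSessionDescription *answer = [[RTCSessionDescription alloc] initWithType:RTCSdpTypeAnswer sdp:sdpString];
NSLog(@"type:%@ sdp:%@", [RTCSessionDescription stringForType:answer.type], answer.sdp);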

3.4 Implementation with WebRTC

The P2P audio/video flow implemented with WebRTC is shown below (diagram omitted).

The steps for calling ossrs are as follows (diagram omitted).

Key setup points

Initialize the RTCPeerConnectionFactory:

#pragma mark - Lazy
- (RTCPeerConnectionFactory *)factory {
    if (!_factory) {
        RTCInitializeSSL();
        RTCDefaultVideoEncoderFactory *videoEncoderFactory = [[RTCDefaultVideoEncoderFactory alloc] init];
        RTCDefaultVideoDecoderFactory *videoDecoderFactory = [[RTCDefaultVideoDecoderFactory alloc] init];
        _factory = [[RTCPeerConnectionFactory alloc] initWithEncoderFactory:videoEncoderFactory decoderFactory:videoDecoderFactory];
    }
    return _factory;
}

Create the RTCPeerConnection from the factory:

self.peerConnection = [self.factory peerConnectionWithConfiguration:newConfig constraints:constraints delegate:nil];

Add the RTCAudioTrack and RTCVideoTrack to the peer connection:

NSString *streamId = @"stream";

// Audio
RTCAudioTrack *audioTrack = [self createAudioTrack];
self.localAudioTrack = audioTrack;
RTCRtpTransceiverInit *audioTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];
audioTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;
audioTrackTransceiver.streamIds = @[streamId];
[self.peerConnection addTransceiverWithTrack:audioTrack init:audioTrackTransceiver];

// Video
RTCVideoTrack *videoTrack = [self createVideoTrack];
self.localVideoTrack = videoTrack;
RTCRtpTransceiverInit *videoTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];
videoTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;
videoTrackTransceiver.streamIds = @[streamId];
[self.peerConnection addTransceiverWithTrack:videoTrack init:videoTrackTransceiver];

Set up the camera capturer (RTCCameraVideoCapturer) or the file-based capturer:

- (RTCVideoTrack *)createVideoTrack {
    RTCVideoSource *videoSource = [self.factory videoSource];
    // In testing, resolutions larger than 1920x1080 could not be played through SRS.
    [videoSource adaptOutputFormatToWidth:1920 height:1080 fps:20];
    // Use a file-based capturer on the simulator, the camera on a device.
    if (TARGET_IPHONE_SIMULATOR) {
        self.videoCapturer = [[RTCFileVideoCapturer alloc] initWithDelegate:videoSource];
    } else {
        self.videoCapturer = [[RTCCameraVideoCapturer alloc] initWithDelegate:videoSource];
    }
    RTCVideoTrack *videoTrack = [self.factory videoTrackWithSource:videoSource trackId:@"video0"];
    return videoTrack;
}

Display the locally captured camera frames; the renderer passed to startCaptureLocalVideo is an RTCEAGLVideoView:

- (void)startCaptureLocalVideo:(id<RTCVideoRenderer>)renderer {
    if (!self.isPublish) {
        return;
    }
    if (!renderer) {
        return;
    }
    if (!self.videoCapturer) {
        return;
    }
    RTCVideoCapturer *capturer = self.videoCapturer;
    if ([capturer isKindOfClass:[RTCCameraVideoCapturer class]]) {
        if (!([RTCCameraVideoCapturer captureDevices].count > 0)) {
            return;
        }
        AVCaptureDevice *frontCamera = RTCCameraVideoCapturer.captureDevices.firstObject;
//        if (frontCamera.position != AVCaptureDevicePositionFront) {
//            return;
//        }
        RTCCameraVideoCapturer *cameraVideoCapturer = (RTCCameraVideoCapturer *)capturer;
        AVCaptureDeviceFormat *formatNilable;
        NSArray *supportDeviceFormats = [RTCCameraVideoCapturer supportedFormatsForDevice:frontCamera];
        NSLog(@"supportDeviceFormats:%@", supportDeviceFormats);
        // Hard-coded format index; the commented-out block below picks a 16:9 format instead.
        formatNilable = supportDeviceFormats[4];
//        if (supportDeviceFormats && supportDeviceFormats.count > 0) {
//            NSMutableArray *formats = [NSMutableArray arrayWithCapacity:0];
//            for (AVCaptureDeviceFormat *format in supportDeviceFormats) {
//                CMVideoDimensions videoVideoDimensions = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
//                float width = videoVideoDimensions.width;
//                float height = videoVideoDimensions.height;
//                // only use 16:9 format.
//                if ((width / height) >= (16.0/9.0)) {
//                    [formats addObject:format];
//                }
//            }
//
//            if (formats.count > 0) {
//                NSArray *sortedFormats = [formats sortedArrayUsingComparator:^NSComparisonResult(AVCaptureDeviceFormat *obj1, AVCaptureDeviceFormat *obj2) {
//                    CMVideoDimensions f1VD = CMVideoFormatDescriptionGetDimensions(obj1.formatDescription);
//                    CMVideoDimensions f2VD = CMVideoFormatDescriptionGetDimensions(obj2.formatDescription);
//                    float width1 = f1VD.width;
//                    float width2 = f2VD.width;
//                    float height2 = f2VD.height;
//                    // only use 16:9 format.
//                    if ((width2 / height2) >= (1.7)) {
//                        return NSOrderedAscending;
//                    } else {
//                        return NSOrderedDescending;
//                    }
//                }];
//
//                if (sortedFormats && sortedFormats.count > 0) {
//                    formatNilable = sortedFormats.lastObject;
//                }
//            }
//        }
        if (!formatNilable) {
            return;
        }
        NSArray *formatArr = [RTCCameraVideoCapturer supportedFormatsForDevice:frontCamera];
        for (AVCaptureDeviceFormat *format in formatArr) {
            NSLog(@"AVCaptureDeviceFormat format:%@", format);
        }
        [cameraVideoCapturer startCaptureWithDevice:frontCamera format:formatNilable fps:20 completionHandler:^(NSError *error) {
            NSLog(@"startCaptureWithDevice error:%@", error);
        }];
    }
    if ([capturer isKindOfClass:[RTCFileVideoCapturer class]]) {
        RTCFileVideoCapturer *fileVideoCapturer = (RTCFileVideoCapturer *)capturer;
        [fileVideoCapturer startCapturingFromFileNamed:@"beautyPicture.mp4" onError:^(NSError * _Nonnull error) {
            NSLog(@"startCaptureLocalVideo startCapturingFromFileNamed error:%@", error);
        }];
    }
    [self.localVideoTrack addRenderer:renderer];
}

Create the offer:

- (void)offer:(void (^)(RTCSessionDescription *sdp))completion {
    if (self.isPublish) {
        self.mediaConstrains = self.publishMediaConstrains;
    } else {
        self.mediaConstrains = self.playMediaConstrains;
    }
    RTCMediaConstraints *constrains = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:self.mediaConstrains optionalConstraints:self.optionalConstraints];
    NSLog(@"peerConnection:%@", self.peerConnection);
    __weak typeof(self) weakSelf = self;
    [weakSelf.peerConnection offerForConstraints:constrains completionHandler:^(RTCSessionDescription * _Nullable sdp, NSError * _Nullable error) {
        if (error) {
            NSLog(@"offer offerForConstraints error:%@", error);
        }
        if (sdp) {
            [weakSelf.peerConnection setLocalDescription:sdp completionHandler:^(NSError * _Nullable error) {
                if (error) {
                    NSLog(@"offer setLocalDescription error:%@", error);
                }
                if (completion) {
                    completion(sdp);
                }
            }];
        }
    }];
}

Set the remote description:

- (void)setRemoteSdp:(RTCSessionDescription *)remoteSdp completion:(void (^)(NSError * _Nullable error))completion {
    [self.peerConnection setRemoteDescription:remoteSdp completionHandler:completion];
}

The complete code follows.

WebRTCClient.h

#import <Foundation/Foundation.h>
#import <WebRTC/WebRTC.h>
#import <UIKit/UIKit.h>

@protocol WebRTCClientDelegate;

@interface WebRTCClient : NSObject

@property (nonatomic, weak) id<WebRTCClientDelegate> delegate;
/** Peer connection factory */
@property (nonatomic, strong) RTCPeerConnectionFactory *factory;
/** Whether this client publishes (push) */
@property (nonatomic, assign) BOOL isPublish;
/** Peer connection */
@property (nonatomic, strong) RTCPeerConnection *peerConnection;
/** RTCAudioSession */
@property (nonatomic, strong) RTCAudioSession *rtcAudioSession;
/** Dispatch queue for audio configuration */
@property (nonatomic) dispatch_queue_t audioQueue;
/** mediaConstrains */
@property (nonatomic, strong) NSDictionary *mediaConstrains;
/** publishMediaConstrains */
@property (nonatomic, strong) NSDictionary *publishMediaConstrains;
/** playMediaConstrains */
@property (nonatomic, strong) NSDictionary *playMediaConstrains;
/** optionalConstraints */
@property (nonatomic, strong) NSDictionary *optionalConstraints;
/** RTCVideoCapturer camera capturer */
@property (nonatomic, strong) RTCVideoCapturer *videoCapturer;
/** Local audio track */
@property (nonatomic, strong) RTCAudioTrack *localAudioTrack;
/** Local video track */
@property (nonatomic, strong) RTCVideoTrack *localVideoTrack;
/** Remote video track */
@property (nonatomic, strong) RTCVideoTrack *remoteVideoTrack;
/** Remote video renderer */
@property (nonatomic, weak) id<RTCVideoRenderer> remoteRenderView;
/** Local data channel */
@property (nonatomic, strong) RTCDataChannel *localDataChannel;
/** Remote data channel */
@property (nonatomic, strong) RTCDataChannel *remoteDataChannel;

- (instancetype)initWithPublish:(BOOL)isPublish;

- (void)startCaptureLocalVideo:(id<RTCVideoRenderer>)renderer;

- (void)answer:(void (^)(RTCSessionDescription *sdp))completionHandler;
- (void)offer:(void (^)(RTCSessionDescription *sdp))completionHandler;

#pragma mark - Hide or show video
- (void)hidenVideo;
- (void)showVideo;

#pragma mark - Mute or unmute audio
- (void)muteAudio;
- (void)unmuteAudio;
- (void)speakOff;
- (void)speakOn;

- (void)changeSDP2Server:(RTCSessionDescription *)sdp
                  urlStr:(NSString *)urlStr
               streamUrl:(NSString *)streamUrl
                 closure:(void (^)(BOOL isServerRetSuc))closure;

@end

@protocol WebRTCClientDelegate <NSObject>

- (void)webRTCClient:(WebRTCClient *)client didDiscoverLocalCandidate:(RTCIceCandidate *)candidate;
- (void)webRTCClient:(WebRTCClient *)client didChangeConnectionState:(RTCIceConnectionState)state;
- (void)webRTCClient:(WebRTCClient *)client didReceiveData:(NSData *)data;

@end

WebRTCClient.m

#import "WebRTCClient.h"
#import "HttpClient.h"@interface WebRTCClient ()<RTCPeerConnectionDelegate, RTCDataChannelDelegate>@property (nonatomic, strong) HttpClient *httpClient;@end@implementation WebRTCClient- (instancetype)initWithPublish:(BOOL)isPublish {self = [super init];if (self) {self.isPublish = isPublish;self.httpClient = [[HttpClient alloc] init];RTCMediaConstraints *constraints = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:nil optionalConstraints:self.optionalConstraints];RTCConfiguration *newConfig = [[RTCConfiguration alloc] init];newConfig.sdpSemantics = RTCSdpSemanticsUnifiedPlan;self.peerConnection = [self.factory peerConnectionWithConfiguration:newConfig constraints:constraints delegate:nil];[self createMediaSenders];[self createMediaReceivers];// srs not support data channel.// self.createDataChannel()[self configureAudioSession];self.peerConnection.delegate = self;}return self;
}- (void)createMediaSenders {if (!self.isPublish) {return;}NSString *streamId = @"stream";// AudioRTCAudioTrack *audioTrack = [self createAudioTrack];self.localAudioTrack = audioTrack;RTCRtpTransceiverInit *audioTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];audioTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;audioTrackTransceiver.streamIds = @[streamId];[self.peerConnection addTransceiverWithTrack:audioTrack init:audioTrackTransceiver];// VideoRTCVideoTrack *videoTrack = [self createVideoTrack];self.localVideoTrack = videoTrack;RTCRtpTransceiverInit *videoTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];videoTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;videoTrackTransceiver.streamIds = @[streamId];[self.peerConnection addTransceiverWithTrack:videoTrack init:videoTrackTransceiver];
}- (void)createMediaReceivers {if (!self.isPublish) {return;}if (self.peerConnection.transceivers.count > 0) {RTCRtpTransceiver *transceiver = self.peerConnection.transceivers.firstObject;if (transceiver.mediaType == RTCRtpMediaTypeVideo) {RTCVideoTrack *track = (RTCVideoTrack *)transceiver.receiver.track;self.remoteVideoTrack = track;}}
}- (void)configureAudioSession {[self.rtcAudioSession lockForConfiguration];@try {NSError *error;[self.rtcAudioSession setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error:&error];NSError *modeError;[self.rtcAudioSession setMode:AVAudioSessionModeVoiceChat error:&modeError];NSLog(@"configureAudioSession error:%@, modeError:%@", error, modeError);} @catch (NSException *exception) {NSLog(@"configureAudioSession exception:%@", exception);}[self.rtcAudioSession unlockForConfiguration];
}- (RTCAudioTrack *)createAudioTrack {/// enable google 3A algorithm.NSDictionary *mandatory = @{@"googEchoCancellation": kRTCMediaConstraintsValueTrue,@"googAutoGainControl": kRTCMediaConstraintsValueTrue,@"googNoiseSuppression": kRTCMediaConstraintsValueTrue,};RTCMediaConstraints *audioConstrains = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:mandatory optionalConstraints:self.optionalConstraints];RTCAudioSource *audioSource = [self.factory audioSourceWithConstraints:audioConstrains];RTCAudioTrack *audioTrack = [self.factory audioTrackWithSource:audioSource trackId:@"audio0"];return audioTrack;
}- (RTCVideoTrack *)createVideoTrack {RTCVideoSource *videoSource = [self.factory videoSource];// 经过测试比1920*1080大的尺寸,无法通过srs播放[videoSource adaptOutputFormatToWidth:1920 height:1080 fps:20];// 如果是模拟器if (TARGET_IPHONE_SIMULATOR) {self.videoCapturer = [[RTCFileVideoCapturer alloc] initWithDelegate:videoSource];} else{self.videoCapturer = [[RTCCameraVideoCapturer alloc] initWithDelegate:videoSource];}RTCVideoTrack *videoTrack = [self.factory videoTrackWithSource:videoSource trackId:@"video0"];return videoTrack;
}- (void)offer:(void (^)(RTCSessionDescription *sdp))completion {if (self.isPublish) {self.mediaConstrains = self.publishMediaConstrains;} else {self.mediaConstrains = self.playMediaConstrains;}RTCMediaConstraints *constrains = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:self.mediaConstrains optionalConstraints:self.optionalConstraints];NSLog(@"peerConnection:%@",self.peerConnection);__weak typeof(self) weakSelf = self;[weakSelf.peerConnection offerForConstraints:constrains completionHandler:^(RTCSessionDescription * _Nullable sdp, NSError * _Nullable error) {if (error) {NSLog(@"offer offerForConstraints error:%@", error);}if (sdp) {[weakSelf.peerConnection setLocalDescription:sdp completionHandler:^(NSError * _Nullable error) {if (error) {NSLog(@"offer setLocalDescription error:%@", error);}if (completion) {completion(sdp);}}];}}];
}- (void)answer:(void (^)(RTCSessionDescription *sdp))completion {RTCMediaConstraints *constrains = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:self.mediaConstrains optionalConstraints:self.optionalConstraints];__weak typeof(self) weakSelf = self;[weakSelf.peerConnection answerForConstraints:constrains completionHandler:^(RTCSessionDescription * _Nullable sdp, NSError * _Nullable error) {if (error) {NSLog(@"answer answerForConstraints error:%@", error);}if (sdp) {[weakSelf.peerConnection setLocalDescription:sdp completionHandler:^(NSError * _Nullable error) {if (error) {NSLog(@"answer setLocalDescription error:%@", error);}if (completion) {completion(sdp);}}];}}];
}- (void)setRemoteSdp:(RTCSessionDescription *)remoteSdp completion:(void (^)(NSError * _Nullable error))completion {[self.peerConnection setRemoteDescription:remoteSdp completionHandler:completion];
}- (void)setRemoteCandidate:(RTCIceCandidate *)remoteCandidate {[self.peerConnection addIceCandidate:remoteCandidate];
}- (void)setMaxBitrate:(int)maxBitrate {NSMutableArray *videoSenders = [NSMutableArray arrayWithCapacity:0];for (RTCRtpSender *sender in self.peerConnection.senders) {if (sender.track && [kRTCMediaStreamTrackKindVideo isEqualToString:sender.track.kind]) {[videoSenders addObject:sender];}}if (videoSenders.count > 0) {RTCRtpSender *firstSender = [videoSenders firstObject];RTCRtpParameters *parameters = firstSender.parameters;NSNumber *maxBitrateBps = [NSNumber numberWithInt:maxBitrate];parameters.encodings.firstObject.maxBitrateBps = maxBitrateBps;}
}- (void)setMaxFramerate:(int)maxFramerate {NSMutableArray *videoSenders = [NSMutableArray arrayWithCapacity:0];for (RTCRtpSender *sender in self.peerConnection.senders) {if (sender.track && [kRTCMediaStreamTrackKindVideo isEqualToString:sender.track.kind]) {[videoSenders addObject:sender];}}if (videoSenders.count > 0) {RTCRtpSender *firstSender = [videoSenders firstObject];RTCRtpParameters *parameters = firstSender.parameters;NSNumber *maxFramerateNum = [NSNumber numberWithInt:maxFramerate];// 该版本暂时没有maxFramerate,需要更新到最新版本parameters.encodings.firstObject.maxFramerate = maxFramerateNum;}
}- (void)startCaptureLocalVideo:(id<RTCVideoRenderer>)renderer {if (!self.isPublish) {return;}if (!renderer) {return;}if (!self.videoCapturer) {return;}RTCVideoCapturer *capturer = self.videoCapturer;if ([capturer isKindOfClass:[RTCCameraVideoCapturer class]]) {if (!([RTCCameraVideoCapturer captureDevices].count > 0)) {return;}AVCaptureDevice *frontCamera = RTCCameraVideoCapturer.captureDevices.firstObject;
//        if (frontCamera.position != AVCaptureDevicePositionFront) {
//            return;
//        }RTCCameraVideoCapturer *cameraVideoCapturer = (RTCCameraVideoCapturer *)capturer;AVCaptureDeviceFormat *formatNilable;NSArray *supportDeviceFormats = [RTCCameraVideoCapturer supportedFormatsForDevice:frontCamera];NSLog(@"supportDeviceFormats:%@",supportDeviceFormats);formatNilable = supportDeviceFormats[4];
//        if (supportDeviceFormats && supportDeviceFormats.count > 0) {
//            NSMutableArray *formats = [NSMutableArray arrayWithCapacity:0];
//            for (AVCaptureDeviceFormat *format in supportDeviceFormats) {
//                CMVideoDimensions videoVideoDimensions = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
//                float width = videoVideoDimensions.width;
//                float height = videoVideoDimensions.height;
//                // only use 16:9 format.
//                if ((width / height) >= (16.0/9.0)) {
//                    [formats addObject:format];
//                }
//            }
//
//            if (formats.count > 0) {
//                NSArray *sortedFormats = [formats sortedArrayUsingComparator:^NSComparisonResult(AVCaptureDeviceFormat *obj1, AVCaptureDeviceFormat *obj2) {
//                    CMVideoDimensions f1VD = CMVideoFormatDescriptionGetDimensions(obj1.formatDescription);
//                    CMVideoDimensions f2VD = CMVideoFormatDescriptionGetDimensions(obj2.formatDescription);
//                    float width1 = f1VD.width;
//                    float width2 = f2VD.width;
//                    float height2 = f2VD.height;
//                    // only use 16:9 format.
//                    if ((width2 / height2) >= (1.7)) {
//                        return NSOrderedAscending;
//                    } else {
//                        return NSOrderedDescending;
//                    }
//                }];
//
//                if (sortedFormats && sortedFormats.count > 0) {
//                    formatNilable = sortedFormats.lastObject;
//                }
//            }
//        }
        if (!formatNilable) {
            return;
        }
        NSArray *formatArr = [RTCCameraVideoCapturer supportedFormatsForDevice:frontCamera];
        for (AVCaptureDeviceFormat *format in formatArr) {
            NSLog(@"AVCaptureDeviceFormat format:%@", format);
        }
        [cameraVideoCapturer startCaptureWithDevice:frontCamera format:formatNilable fps:20 completionHandler:^(NSError *error) {
            NSLog(@"startCaptureWithDevice error:%@", error);
        }];
    }
    if ([capturer isKindOfClass:[RTCFileVideoCapturer class]]) {
        RTCFileVideoCapturer *fileVideoCapturer = (RTCFileVideoCapturer *)capturer;
        // The mp4 file must be bundled with the app.
        [fileVideoCapturer startCapturingFromFileNamed:@"beautyPicture.mp4" onError:^(NSError * _Nonnull error) {
            NSLog(@"startCaptureLocalVideo startCapturingFromFileNamed error:%@", error);
        }];
    }
    [self.localVideoTrack addRenderer:renderer];
}

- (void)renderRemoteVideo:(id<RTCVideoRenderer>)renderer {
    if (!self.isPublish) {
        return;
    }
    self.remoteRenderView = renderer;
}

- (RTCDataChannel *)createDataChannel {
    RTCDataChannelConfiguration *config = [[RTCDataChannelConfiguration alloc] init];
    RTCDataChannel *dataChannel = [self.peerConnection dataChannelForLabel:@"WebRTCData" configuration:config];
    if (!dataChannel) {
        return nil;
    }
    dataChannel.delegate = self;
    self.localDataChannel = dataChannel;
    return dataChannel;
}

- (void)sendData:(NSData *)data {
    RTCDataBuffer *buffer = [[RTCDataBuffer alloc] initWithData:data isBinary:YES];
    [self.remoteDataChannel sendData:buffer];
}

- (void)changeSDP2Server:(RTCSessionDescription *)sdp
                  urlStr:(NSString *)urlStr
               streamUrl:(NSString *)streamUrl
                 closure:(void (^)(BOOL isServerRetSuc))closure {
    __weak typeof(self) weakSelf = self;
    [self.httpClient changeSDP2Server:sdp urlStr:urlStr streamUrl:streamUrl closure:^(NSDictionary *result) {
        BOOL isServerRetSuc = NO;
        if (result && [result isKindOfClass:[NSDictionary class]]) {
            NSString *remoteSdpStr = [result objectForKey:@"sdp"];
            if (remoteSdpStr && [remoteSdpStr isKindOfClass:[NSString class]] && remoteSdpStr.length > 0) {
                isServerRetSuc = YES;
                RTCSessionDescription *remoteSDP = [[RTCSessionDescription alloc] initWithType:RTCSdpTypeAnswer sdp:remoteSdpStr];
                [weakSelf setRemoteSdp:remoteSDP completion:^(NSError * _Nullable error) {
                    NSLog(@"changeSDP2Server setRemoteDescription error:%@", error);
                }];
            }
        }
        // Report the outcome back to the caller.
        if (closure) {
            closure(isServerRetSuc);
        }
    }];
}

#pragma mark - Hide or show video
- (void)hidenVideo {
    [self setVideoEnabled:NO];
}

- (void)showVideo {
    [self setVideoEnabled:YES];
}

- (void)setVideoEnabled:(BOOL)isEnabled {
    [self setTrackEnabled:[RTCVideoTrack class] isEnabled:isEnabled];
}

- (void)setTrackEnabled:(Class)track isEnabled:(BOOL)isEnabled {
    for (RTCRtpTransceiver *transceiver in self.peerConnection.transceivers) {
        // Check the class of the sender's track (RTCVideoTrack or RTCAudioTrack).
        if (transceiver.sender.track && [transceiver.sender.track isKindOfClass:track]) {
            transceiver.sender.track.isEnabled = isEnabled;
        }
    }
}

#pragma mark - Mute or unmute audio
- (void)muteAudio {
    [self setAudioEnabled:NO];
}

- (void)unmuteAudio {
    [self setAudioEnabled:YES];
}

- (void)speakOff {
    __weak typeof(self) weakSelf = self;
    dispatch_async(self.audioQueue, ^{
        [weakSelf.rtcAudioSession lockForConfiguration];
        @try {
            NSError *error;
            [weakSelf.rtcAudioSession setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error:&error];
            NSError *ooapError;
            [weakSelf.rtcAudioSession overrideOutputAudioPort:AVAudioSessionPortOverrideNone error:&ooapError];
            NSLog(@"speakOff error:%@, ooapError:%@", error, ooapError);
        } @catch (NSException *exception) {
            NSLog(@"speakOff exception:%@", exception);
        }
        [weakSelf.rtcAudioSession unlockForConfiguration];
    });
}

- (void)speakOn {
    __weak typeof(self) weakSelf = self;
    dispatch_async(self.audioQueue, ^{
        [weakSelf.rtcAudioSession lockForConfiguration];
        @try {
            NSError *error;
            [weakSelf.rtcAudioSession setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error:&error];
            NSError *ooapError;
            [weakSelf.rtcAudioSession overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:&ooapError];
            NSError *activeError;
            [weakSelf.rtcAudioSession setActive:YES error:&activeError];
            NSLog(@"speakOn error:%@, ooapError:%@, activeError:%@", error, ooapError, activeError);
        } @catch (NSException *exception) {
            NSLog(@"speakOn exception:%@", exception);
        }
        [weakSelf.rtcAudioSession unlockForConfiguration];
    });
}

- (void)setAudioEnabled:(BOOL)isEnabled {
    [self setTrackEnabled:[RTCAudioTrack class] isEnabled:isEnabled];
}

#pragma mark - RTCPeerConnectionDelegate
/** Called when the SignalingState changed. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didChangeSignalingState:(RTCSignalingState)stateChanged {
    NSLog(@"peerConnection didChangeSignalingState:%ld", (long)stateChanged);
}

/** Called when media is received on a new stream from remote peer. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didAddStream:(RTCMediaStream *)stream {
    NSLog(@"peerConnection didAddStream");
    if (self.isPublish) {
        return;
    }
    NSArray *videoTracks = stream.videoTracks;
    if (videoTracks && videoTracks.count > 0) {
        RTCVideoTrack *track = videoTracks.firstObject;
        self.remoteVideoTrack = track;
    }
    if (self.remoteVideoTrack && self.remoteRenderView) {
        id<RTCVideoRenderer> remoteRenderView = self.remoteRenderView;
        RTCVideoTrack *remoteVideoTrack = self.remoteVideoTrack;
        [remoteVideoTrack addRenderer:remoteRenderView];
    }
}

/** Called when a remote peer closes a stream.
 *  This is not called when RTCSdpSemanticsUnifiedPlan is specified.
 */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didRemoveStream:(RTCMediaStream *)stream {
    NSLog(@"peerConnection didRemoveStream");
}

/** Called when negotiation is needed, for example ICE has restarted. */
- (void)peerConnectionShouldNegotiate:(RTCPeerConnection *)peerConnection {
    NSLog(@"peerConnection peerConnectionShouldNegotiate");
}

/** Called any time the IceConnectionState changes. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didChangeIceConnectionState:(RTCIceConnectionState)newState {
    NSLog(@"peerConnection didChangeIceConnectionState:%ld", (long)newState);
    if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didChangeConnectionState:)]) {
        [self.delegate webRTCClient:self didChangeConnectionState:newState];
    }
}

/** Called any time the IceGatheringState changes. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didChangeIceGatheringState:(RTCIceGatheringState)newState {
    NSLog(@"peerConnection didChangeIceGatheringState:%ld", (long)newState);
}

/** New ice candidate has been found. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didGenerateIceCandidate:(RTCIceCandidate *)candidate {
    NSLog(@"peerConnection didGenerateIceCandidate:%@", candidate);
    if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didDiscoverLocalCandidate:)]) {
        [self.delegate webRTCClient:self didDiscoverLocalCandidate:candidate];
    }
}

/** Called when a group of local Ice candidates have been removed. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didRemoveIceCandidates:(NSArray<RTCIceCandidate *> *)candidates {
    NSLog(@"peerConnection didRemoveIceCandidates:%@", candidates);
}

/** New data channel has been opened. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didOpenDataChannel:(RTCDataChannel *)dataChannel {
    NSLog(@"peerConnection didOpenDataChannel:%@", dataChannel);
    self.remoteDataChannel = dataChannel;
}

/** Called when signaling indicates a transceiver will be receiving media from
 *  the remote endpoint. This is only called with RTCSdpSemanticsUnifiedPlan specified.
 */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didStartReceivingOnTransceiver:(RTCRtpTransceiver *)transceiver {
    NSLog(@"peerConnection didStartReceivingOnTransceiver:%@", transceiver);
}

/** Called when a receiver and its track are created. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didAddReceiver:(RTCRtpReceiver *)rtpReceiver streams:(NSArray<RTCMediaStream *> *)mediaStreams {
    NSLog(@"peerConnection didAddReceiver");
}

/** Called when the receiver and its track are removed. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didRemoveReceiver:(RTCRtpReceiver *)rtpReceiver {
    NSLog(@"peerConnection didRemoveReceiver");
}

#pragma mark - RTCDataChannelDelegate
/** The data channel state changed. */
- (void)dataChannelDidChangeState:(RTCDataChannel *)dataChannel {
    NSLog(@"dataChannelDidChangeState:%@", dataChannel);
}

/** The data channel successfully received a data buffer. */
- (void)dataChannel:(RTCDataChannel *)dataChannel didReceiveMessageWithBuffer:(RTCDataBuffer *)buffer {
    if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didReceiveData:)]) {
        [self.delegate webRTCClient:self didReceiveData:buffer.data];
    }
}

#pragma mark - Lazy
- (RTCPeerConnectionFactory *)factory {
    if (!_factory) {
        RTCInitializeSSL();
        RTCDefaultVideoEncoderFactory *videoEncoderFactory = [[RTCDefaultVideoEncoderFactory alloc] init];
        RTCDefaultVideoDecoderFactory *videoDecoderFactory = [[RTCDefaultVideoDecoderFactory alloc] init];
        // Prefer H.264 constrained baseline (profile-level-id 42e01f) for SRS compatibility.
        for (RTCVideoCodecInfo *codec in videoEncoderFactory.supportedCodecs) {
            if (codec.parameters) {
                NSString *profile_level_id = codec.parameters[@"profile-level-id"];
                if (profile_level_id && [profile_level_id isEqualToString:@"42e01f"]) {
                    videoEncoderFactory.preferredCodec = codec;
                    break;
                }
            }
        }
        _factory = [[RTCPeerConnectionFactory alloc] initWithEncoderFactory:videoEncoderFactory decoderFactory:videoDecoderFactory];
    }
    return _factory;
}

- (dispatch_queue_t)audioQueue {
    if (!_audioQueue) {
        _audioQueue = dispatch_queue_create("cn.ifour.webrtc", NULL);
    }
    return _audioQueue;
}

- (RTCAudioSession *)rtcAudioSession {
    if (!_rtcAudioSession) {
        _rtcAudioSession = [RTCAudioSession sharedInstance];
    }
    return _rtcAudioSession;
}

- (NSDictionary *)mediaConstrains {
    if (!_mediaConstrains) {
        _mediaConstrains = @{
            kRTCMediaConstraintsOfferToReceiveAudio: kRTCMediaConstraintsValueFalse,
            kRTCMediaConstraintsOfferToReceiveVideo: kRTCMediaConstraintsValueFalse,
            @"IceRestart": kRTCMediaConstraintsValueTrue,
        };
    }
    return _mediaConstrains;
}

- (NSDictionary *)publishMediaConstrains {
    if (!_publishMediaConstrains) {
        _publishMediaConstrains = @{
            kRTCMediaConstraintsOfferToReceiveAudio: kRTCMediaConstraintsValueFalse,
            kRTCMediaConstraintsOfferToReceiveVideo: kRTCMediaConstraintsValueFalse,
            @"IceRestart": kRTCMediaConstraintsValueTrue,
        };
    }
    return _publishMediaConstrains;
}

- (NSDictionary *)playMediaConstrains {
    if (!_playMediaConstrains) {
        _playMediaConstrains = @{
            kRTCMediaConstraintsOfferToReceiveAudio: kRTCMediaConstraintsValueTrue,
            kRTCMediaConstraintsOfferToReceiveVideo: kRTCMediaConstraintsValueTrue,
            @"IceRestart": kRTCMediaConstraintsValueTrue,
        };
    }
    return _playMediaConstrains;
}

- (NSDictionary *)optionalConstraints {
    if (!_optionalConstraints) {
        _optionalConstraints = @{
            @"DtlsSrtpKeyAgreement": kRTCMediaConstraintsValueTrue,
        };
    }
    return _optionalConstraints;
}

@end

4. Displaying the Local Video

Use an RTCEAGLVideoView to show the local camera feed:

self.localRenderer = [[RTCEAGLVideoView alloc] initWithFrame:CGRectZero];
//        self.localRenderer.videoContentMode = UIViewContentModeScaleAspectFill;
[self addSubview:self.localRenderer];
[self.webRTCClient startCaptureLocalVideo:self.localRenderer];

The code:

PublishView.h

#import <UIKit/UIKit.h>
#import "WebRTCClient.h"

@interface PublishView : UIView

- (instancetype)initWithFrame:(CGRect)frame webRTCClient:(WebRTCClient *)webRTCClient;

@end

PublishView.m

#import "PublishView.h"@interface PublishView ()@property (nonatomic, strong) WebRTCClient *webRTCClient;
@property (nonatomic, strong) RTCEAGLVideoView *localRenderer;@end@implementation PublishView- (instancetype)initWithFrame:(CGRect)frame webRTCClient:(WebRTCClient *)webRTCClient {self = [super initWithFrame:frame];if (self) {self.webRTCClient = webRTCClient;self.localRenderer = [[RTCEAGLVideoView alloc] initWithFrame:CGRectZero];
//        self.localRenderer.videoContentMode = UIViewContentModeScaleAspectFill;[self addSubview:self.localRenderer];[self.webRTCClient startCaptureLocalVideo:self.localRenderer];}return self;
}- (void)layoutSubviews {[super layoutSubviews];self.localRenderer.frame = self.bounds;NSLog(@"self.localRenderer frame:%@", NSStringFromCGRect(self.localRenderer.frame));
}@end

5. The ossrs RTC Publish Service

Here I obtain the remote SDP from ossrs by calling rtc/v1/publish/; the request URL is https://192.168.10.100:1990/rtc/v1/publish/
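
For reference, the JSON body that the HttpClient below sends to this endpoint carries the fields api, tid, streamurl, clientip, and sdp, and a successful response returns the answer SDP in its sdp field. A sketch of the request shape as an Objective-C literal, with placeholder values:

// Shape of the publish request sent to SRS (values are placeholders).
NSDictionary *publishRequest = @{
    @"api": @"https://192.168.10.100:1990/rtc/v1/publish/",
    @"tid": @"1234567",                                           // short transaction id
    @"streamurl": @"webrtc://192.168.10.100:1990/live/livestream",
    @"clientip": @"192.168.10.50",                                // hypothetical LAN address
    @"sdp": @"v=0\r\n...",                                        // the local offer SDP
};
// A successful response is JSON whose @"sdp" field holds the answer SDP,
// which is then applied via setRemoteDescription (see changeSDP2Server above).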

The HTTP request is made with NSURLSessionDataTask; the code follows.

HttpClient.h

#import <Foundation/Foundation.h>
#import <WebRTC/WebRTC.h>

@interface HttpClient : NSObject<NSURLSessionDelegate>

- (void)changeSDP2Server:(RTCSessionDescription *)sdp
                  urlStr:(NSString *)urlStr
               streamUrl:(NSString *)streamUrl
                 closure:(void (^)(NSDictionary *result))closure;

@end

HttpClient.m

#import "HttpClient.h"
#import "IPUtil.h"@interface HttpClient ()@property (nonatomic, strong) NSURLSession *session;@end@implementation HttpClient- (instancetype)init
{self = [super init];if (self) {self.session = [NSURLSession sessionWithConfiguration:[NSURLSessionConfiguration defaultSessionConfiguration] delegate:self delegateQueue:[NSOperationQueue mainQueue]];}return self;
}- (void)changeSDP2Server:(RTCSessionDescription *)sdpurlStr:(NSString *)urlStrstreamUrl:(NSString *)streamUrlclosure:(void (^)(NSDictionary *result))closure {//设置URLNSURL *urlString = [NSURL URLWithString:urlStr];//创建可变请求对象NSMutableURLRequest* mutableRequest = [[NSMutableURLRequest alloc] initWithURL:urlString];//设置请求类型[mutableRequest setHTTPMethod:@"POST"];//创建字典,存放要上传的数据NSMutableDictionary *dict = [[NSMutableDictionary alloc] init];[dict setValue:urlStr forKey:@"api"];[dict setValue:[self createTid] forKey:@"tid"];[dict setValue:streamUrl forKey:@"streamurl"];[dict setValue:sdp.sdp forKey:@"sdp"];[dict setValue:[IPUtil localWiFiIPAddress] forKey:@"clientip"];//将字典转化NSData类型NSData *dictPhoneData = [NSJSONSerialization dataWithJSONObject:dict options:0 error:nil];//设置请求体[mutableRequest setHTTPBody:dictPhoneData];//设置请求头[mutableRequest addValue:@"application/json" forHTTPHeaderField:@"Content-Type"];[mutableRequest addValue:@"application/json" forHTTPHeaderField:@"Accept"];//创建任务NSURLSessionDataTask *dataTask = [self.session dataTaskWithRequest:mutableRequest completionHandler:^(NSData * _Nullable data, NSURLResponse * _Nullable response, NSError * _Nullable error) {if (error == nil) {NSLog(@"请求成功:%@",data);NSString *dataString = [[NSString alloc] initWithData:data encoding:kCFStringEncodingUTF8];NSLog(@"请求成功 dataString:%@",dataString);NSDictionary *result = [NSJSONSerialization JSONObjectWithData:data options:NSJSONReadingAllowFragments error:nil];NSLog(@"NSURLSessionDataTask result:%@", result);if (closure) {closure(result);}} else {NSLog(@"网络请求失败!");}}];//启动任务[dataTask resume];
}- (NSString *)createTid {NSDate *date = [[NSDate alloc] init];int timeInterval = (int)([date timeIntervalSince1970]);int random = (int)(arc4random());NSString *str = [NSString stringWithFormat:@"%d*%d", timeInterval, random];if (str.length > 7) {NSString *tid = [str substringToIndex:7];return tid;}return @"";
}#pragma mark -session delegate
-(void)URLSession:(NSURLSession *)session didReceiveChallenge:(NSURLAuthenticationChallenge *)challenge completionHandler:(void (^)(NSURLSessionAuthChallengeDisposition, NSURLCredential * _Nullable))completionHandler {NSURLSessionAuthChallengeDisposition disposition = NSURLSessionAuthChallengePerformDefaultHandling;__block NSURLCredential *credential = nil;if ([challenge.protectionSpace.authenticationMethod isEqualToString:NSURLAuthenticationMethodServerTrust]) {credential = [NSURLCredential credentialForTrust:challenge.protectionSpace.serverTrust];if (credential) {disposition = NSURLSessionAuthChallengeUseCredential;} else {disposition = NSURLSessionAuthChallengePerformDefaultHandling;}} else {disposition = NSURLSessionAuthChallengePerformDefaultHandling;}if (completionHandler) {completionHandler(disposition, credential);}
}@end

A helper class is used to get the device IP address:

IPUtil.h

#import <Foundation/Foundation.h>

@interface IPUtil : NSObject

+ (NSString *)localWiFiIPAddress;

@end

IPUtil.m

#import "IPUtil.h"#include <arpa/inet.h>
#include <netdb.h>#include <net/if.h>#include <ifaddrs.h>
#import <dlfcn.h>#import <SystemConfiguration/SystemConfiguration.h>@implementation IPUtil+ (NSString *)localWiFiIPAddress
{BOOL success;struct ifaddrs * addrs;const struct ifaddrs * cursor;success = getifaddrs(&addrs) == 0;if (success) {cursor = addrs;while (cursor != NULL) {// the second test keeps from picking up the loopback addressif (cursor->ifa_addr->sa_family == AF_INET && (cursor->ifa_flags & IFF_LOOPBACK) == 0){NSString *name = [NSString stringWithUTF8String:cursor->ifa_name];if ([name isEqualToString:@"en0"])  // Wi-Fi adapterreturn [NSString stringWithUTF8String:inet_ntoa(((struct sockaddr_in *)cursor->ifa_addr)->sin_addr)];}cursor = cursor->ifa_next;}freeifaddrs(addrs);}return nil;
}@end

6. Calling the ossrs RTC Publish Service

Publishing to ossrs works by calling createOffer locally, applying the offer with setLocalDescription, then POSTing it to rtc/v1/publish/ and applying the returned answer.

The code:

- (void)publishBtnClick {
    __weak typeof(self) weakSelf = self;
    [self.webRTCClient offer:^(RTCSessionDescription *sdp) {
        [weakSelf.webRTCClient changeSDP2Server:sdp
                                         urlStr:@"https://192.168.10.100:1990/rtc/v1/publish/"
                                      streamUrl:@"webrtc://192.168.10.100:1990/live/livestream"
                                        closure:^(BOOL isServerRetSuc) {
            NSLog(@"isServerRetSuc:%@", (isServerRetSuc ? @"YES" : @"NO"));
        }];
    }];
}

The UI and the publish action live in the view controller.

PublishViewController.h

#import <UIKit/UIKit.h>
#import "PublishView.h"

@interface PublishViewController : UIViewController

@end

PublishViewController.m

#import "PublishViewController.h"@interface PublishViewController ()<WebRTCClientDelegate>@property (nonatomic, strong) WebRTCClient *webRTCClient;@property (nonatomic, strong) PublishView *publishView;@property (nonatomic, strong) UIButton *publishBtn;@end@implementation PublishViewController- (void)viewDidLoad {[super viewDidLoad];// Do any additional setup after loading the view.self.view.backgroundColor = [UIColor whiteColor];self.publishView = [[PublishView alloc] initWithFrame:CGRectZero webRTCClient:self.webRTCClient];[self.view addSubview:self.publishView];self.publishView.backgroundColor = [UIColor lightGrayColor];self.publishView.frame = self.view.bounds;CGFloat screenWidth = CGRectGetWidth(self.view.bounds);CGFloat screenHeight = CGRectGetHeight(self.view.bounds);self.publishBtn = [UIButton buttonWithType:UIButtonTypeCustom];self.publishBtn.frame = CGRectMake(50, screenHeight - 160, screenWidth - 2*50, 46);self.publishBtn.layer.cornerRadius = 4;self.publishBtn.backgroundColor = [UIColor grayColor];[self.publishBtn setTitle:@"publish" forState:UIControlStateNormal];[self.publishBtn addTarget:self action:@selector(publishBtnClick) forControlEvents:UIControlEventTouchUpInside];[self.view addSubview:self.publishBtn];self.webRTCClient.delegate = self;
}- (void)publishBtnClick {__weak typeof(self) weakSelf = self;[self.webRTCClient offer:^(RTCSessionDescription *sdp) {[weakSelf.webRTCClient changeSDP2Server:sdp urlStr:@"https://192.168.10.100:1990/rtc/v1/publish/" streamUrl:@"webrtc://192.168.10.100:1990/live/livestream" closure:^(BOOL isServerRetSuc) {NSLog(@"isServerRetSuc:%@",(isServerRetSuc?@"YES":@"NO"));}];}];
}#pragma mark - WebRTCClientDelegate
- (void)webRTCClient:(WebRTCClient *)client didDiscoverLocalCandidate:(RTCIceCandidate *)candidate {NSLog(@"webRTCClient didDiscoverLocalCandidate");
}- (void)webRTCClient:(WebRTCClient *)client didChangeConnectionState:(RTCIceConnectionState)state {NSLog(@"webRTCClient didChangeConnectionState");/**RTCIceConnectionStateNew,RTCIceConnectionStateChecking,RTCIceConnectionStateConnected,RTCIceConnectionStateCompleted,RTCIceConnectionStateFailed,RTCIceConnectionStateDisconnected,RTCIceConnectionStateClosed,RTCIceConnectionStateCount,*/UIColor *textColor = [UIColor blackColor];BOOL openSpeak = NO;switch (state) {case RTCIceConnectionStateCompleted:case RTCIceConnectionStateConnected:textColor = [UIColor greenColor];openSpeak = YES;break;case RTCIceConnectionStateDisconnected:textColor = [UIColor orangeColor];break;case RTCIceConnectionStateFailed:case RTCIceConnectionStateClosed:textColor = [UIColor redColor];break;case RTCIceConnectionStateNew:case RTCIceConnectionStateChecking:case RTCIceConnectionStateCount:textColor = [UIColor blackColor];break;default:break;}dispatch_async(dispatch_get_main_queue(), ^{NSString *text = [NSString stringWithFormat:@"%ld", state];[self.publishBtn setTitle:text forState:UIControlStateNormal];[self.publishBtn setTitleColor:textColor forState:UIControlStateNormal];if (openSpeak) {[self.webRTCClient speakOn];}
//        if textColor == .green {
//            self?.webRTCClient.speakerOn()
//        }});
}- (void)webRTCClient:(WebRTCClient *)client didReceiveData:(NSData *)data {NSLog(@"webRTCClient didReceiveData");
}#pragma mark - Lazy
- (WebRTCClient *)webRTCClient {if (!_webRTCClient) {_webRTCClient = [[WebRTCClient alloc] initWithPublish:YES];}return _webRTCClient;
}@end

Tapping the button starts RTC publishing. The result looks like this (screenshot omitted).

7. Publishing a Video File with WebRTC

WebRTC also provides RTCFileVideoCapturer for publishing from a local video file (the file, here beautyPicture.mp4, must be bundled with the app):

if ([capturer isKindOfClass:[RTCFileVideoCapturer class]]) {
    RTCFileVideoCapturer *fileVideoCapturer = (RTCFileVideoCapturer *)capturer;
    [fileVideoCapturer startCapturingFromFileNamed:@"beautyPicture.mp4" onError:^(NSError * _Nonnull error) {
        NSLog(@"startCaptureLocalVideo startCapturingFromFileNamed error:%@", error);
    }];
}

The published local video looks like this (screenshot omitted).

This completes WebRTC audio/video calling from the iOS client against the ossrs video call service.

8. Summary

WebRTC audio/video calls: an iOS client against the ossrs video call service. This post covers a lot of ground, so some descriptions may be imprecise. Original post: https://blog.csdn.net/gloryFlow/article/details/132262724

A learning log: a little progress every day.
