Interactive live video streaming is a popular format today. We often see co-hosted PK battles, live quizzes, group KTV, e-commerce live streaming, large interactive classes, video dating, and more.
This article shows you how to build a live video streaming app on iOS with the Agora Video SDK. After registering an Agora account, developers get 10,000 free minutes of usage every month, enough to build a wide range of real-time audio and video scenarios.
Without further ado, let's get started.
1. Experience live video through the open-source demo
If you are not yet sure what the finished feature will look like, you can try the open-source basic live video streaming example project on GitHub before you start developing.
Agora provides the open-source interactive live streaming example projects OpenLive-iOS-Objective-C and OpenLive-iOS-Swift on GitHub. Before implementing the relevant functions, you can download them and browse the source code.
Objective-C GitHub link: Basic-Video-Broadcasting/OpenLive-iOS-Objective-C at master AgoraIO/Basic-Video-Broadcasting GitHub
Swift GitHub link: Basic-Video-Broadcasting/OpenLive-iOS at master AgoraIO/Basic-Video-Broadcasting GitHub
2. The technical principles of live video
What we want to implement here is live video streaming. Agora's live streaming supports interaction, so it is often called interactive live streaming. You can think of it as multiple users joining the same channel to exchange audio and video, with the channel's data transmitted at low latency over Agora's SD-RTN real-time network.
Note that interactive live streaming is different from a video call. A video call does not distinguish between hosts and audience: all users can speak and see each other. In interactive live streaming, users are divided into hosts and audience: only hosts can speak freely and be seen by other users.
The following figure shows the basic workflow of integrating Agora Interactive Live in the App:
As shown in the figure, the steps to realize live video are as follows:
- Get a Token: When an app client joins a channel, it needs a token to authenticate the user. In a test or production environment, the app client gets the token from your app server.
- Join a channel: Call joinChannel to create and join a channel. App clients that pass the same channel name join the same channel. You can think of a channel as a dedicated pipe for transmitting real-time audio and video data.
- Publish and subscribe to audio and video streams in the channel: After joining a channel, the app client can publish and subscribe to audio and video streams in the channel.
App clients need the following information to join a channel:
- App ID: A string randomly generated by Agora to identify your app. You can obtain it from the Agora console.
- User ID: The unique identification of the user. You need to set the user ID yourself and make sure it is unique within the channel.
- Token: In a test or production environment, the app client obtains a token from your server. In the process described in this article, you can obtain a temporary token from the Agora console. Temporary tokens are valid for 24 hours.
- Channel Name: A string that identifies the live video channel.
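Putting these four pieces together, a join call looks roughly like the following sketch. The values shown are placeholders, not real credentials, and the full working code appears later in the implementation section.

```swift
import AgoraRtcKit

// Placeholder credentials — copy the real App ID and temporary token from the Agora console
let appID = "Your App ID"
let token = "Your Temp Token"
let channelName = "demoChannel"
let userID: UInt = 0   // 0 lets the SDK assign a uid that is unique within the channel

// Initialize the engine with the App ID, then join the channel with the token,
// the channel name, and the user ID
let engine = AgoraRtcEngineKit.sharedEngine(withAppId: appID, delegate: nil)
engine.joinChannel(byToken: token, channelId: channelName, info: nil, uid: userID, joinSuccess: nil)
```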
3. Development environment
The Agora SDK has good compatibility and does not place high demands on hardware or software. Your development and test environments should meet the following requirements:
• Xcode 9.0 or later
• A physical device that supports audio and video
• The app requires an iOS device running iOS 8.0 or later
The following are the development and test environments for this article:
Development environment:
• macOS 11.6
• Xcode 13.1
Test environment:
• iPhone 7 (iOS 15.3)
If you have not used the Agora SDK before, you need to do the following preparation:
• Register an Agora account, create an App ID in the console, and obtain a token.
• Download the latest official Agora live video SDK. (Video SDK link: Download - Video Call - Documentation Center - Agora)
4. Project Settings
1. Before implementing live video, set up your project with the following steps:
a) Create a new project: open Xcode and click Create a new Xcode project. (Creating an iOS project: https://developer.apple.com/documentation/xcode/creating-an-xcode-project-for-an-app)
b) Select the platform type as iOS and the project type as Single View App, and click Next.
c) Enter the project information such as the project name (Product Name), development team information (Team), organization name (Organization Name) and language (Language), and click Next.
Note: If you have not added development team information, you will see the Add account… button. Click this button and follow the on-screen prompts to log in to your Apple ID. When done, you can select your Apple account as the development team.
d) Select the project storage path and click Create.
2. Integrate the SDK
Choose one of the following methods to obtain the latest version of the Agora iOS SDK.
Method 1: Use CocoaPods to get the SDK
a) Make sure you have CocoaPods installed before starting. See the CocoaPods Getting Started guide for installation instructions. (Link: CocoaPods Guides - Getting Started)
b) In the terminal, go to the project root directory and run the pod init command. A text file named Podfile will be generated in the project folder.
c) Open the Podfile and modify it as follows. Be sure to replace Your App with your target name.
# platform :ios, '9.0'
target 'Your App' do
  pod 'AgoraRtcEngine_iOS'
end
d) In the terminal, run pod install to install the SDK. When it finishes, open the generated .xcworkspace file in Xcode.
Method 2: Get the SDK from the official website
a) Go to the SDK download page, download the latest version of the Agora iOS SDK, and unzip it. (Live video SDK link: Download - Video Call - Documentation Center - Agora)
b) Copy the dynamic libraries you need from the libs folder into the ./project_name folder of your project (project_name is your project name).
c) Open Xcode, go to TARGETS > Project Name > Build Phases > Link Binary with Libraries, and click + to add the following libraries. When adding the AgoraRtcEngineKit.framework file, click + and then Add Other… to locate the local file and open it.
A total of 11 library files need to be added:
i. AgoraRtcEngineKit.framework
ii. Accelerate.framework
iii. AudioToolbox.framework
iv. AVFoundation.framework
v. CoreMedia.framework
vi. CoreML.framework
vii. CoreTelephony.framework
viii. libc++.tbd
ix. libresolv.tbd
x. SystemConfiguration.framework
xi. VideoToolbox.framework
Note: To support iOS 9.0 or lower devices, set the dependency on CoreML.framework to Optional in Xcode.
d) Open Xcode and go to the TARGETS > Project Name > General > Frameworks, Libraries, and Embedded Content menu.
e) Click + > Add Other… > Add Files to add the corresponding dynamic library, and make sure that the Embed property of the added dynamic library is set to Embed & Sign. After the addition is complete, the project will automatically link the required system libraries.
Notice:
· Per Apple's official requirements, app extensions may not contain dynamic libraries. If an Extension in your project needs to integrate the SDK, set the file status to Do Not Embed when adding the dynamic library.
· The Agora SDK uses libc++ (LLVM) by default. If you need libstdc++ (GNU), please contact sales@agora.io. The libraries the SDK provides are FAT images that include 32-bit/64-bit simulator and 32-bit/64-bit device versions.
3. Permission settings
In Xcode, go to the TARGETS > Project Name > General > Signing menu, select Automatically manage signing, and click Enable Automatic in the pop-up.
Add media device permissions: depending on your scenario, in the Info.plist file, click the + icon and add entries such as Privacy - Microphone Usage Description and Privacy - Camera Usage Description to request the corresponding device permissions.
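For live video the relevant entries are the microphone and camera permissions. Viewed as Info.plist source code, a minimal fragment looks like this (the description strings are placeholders you should adapt to your app):

```xml
<!-- Request microphone access -->
<key>NSMicrophoneUsageDescription</key>
<string>This app uses the microphone for live audio streaming.</string>
<!-- Request camera access -->
<key>NSCameraUsageDescription</key>
<string>This app uses the camera for live video streaming.</string>
```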
4. Import Agora related classes
Import the AgoraRtcEngineKit class in the project:
// Objective-C
// Import the AgoraRtcKit module
// Since v3.0.0, the AgoraRtcEngineKit framework has been renamed AgoraRtcKit
// If your SDK is earlier than 3.0.0, use #import <AgoraRtcEngineKit/AgoraRtcEngineKit.h> instead
#import <AgoraRtcKit/AgoraRtcEngineKit.h>
// Declare AgoraRtcEngineDelegate to listen for callbacks
@interface ViewController : UIViewController <AgoraRtcEngineDelegate>
// Declare the agoraKit property
@property (strong, nonatomic) AgoraRtcEngineKit *agoraKit;
// Swift
// Import the AgoraRtcKit module
// Since v3.0.0, the AgoraRtcEngineKit framework has been renamed AgoraRtcKit
// If your SDK is earlier than 3.0.0, use import AgoraRtcEngineKit instead
import AgoraRtcKit
class ViewController: UIViewController {
    ...
    // Declare the agoraKit variable
    var agoraKit: AgoraRtcEngineKit?
}
5. Set Agora account information
In the KeyCenter.swift file (or AppID.m for Objective-C), fill in your App ID in the corresponding position, replacing "Your App ID":
// Objective-C
// AppID.m
// Agora iOS Tutorial
NSString *const appID = <#Your App ID#>;
// Swift
// AppID.swift
// Agora iOS Tutorial
static let AppID: String = <#Your App ID#>
5. Client implementation
This section walks through how to use the Agora Video SDK to implement live video streaming in your app:
1. Create the user interface
Create a live video user interface for your project as your scenario requires. We recommend adding two elements: a local video window and a remote video window.
You can refer to the following code to create a basic user interface.
// Objective-C
// Import UIKit
#import <UIKit/UIKit.h>
@interface ViewController ()
// Declare the localView property
@property (nonatomic, strong) UIView *localView;
// Declare the remoteView property
@property (nonatomic, strong) UIView *remoteView;
@end
@implementation ViewController
...
- (void)viewDidLoad {
    [super viewDidLoad];
    // Initialize the video windows
    [self initViews];
    // Functions that call Agora APIs in the following steps
    [self initializeAgoraEngine];
    [self setChannelProfile];
    [self setClientRole];
    [self setupLocalVideo];
    [self joinChannel];
}
// Lay out the video windows
- (void)viewDidLayoutSubviews {
    [super viewDidLayoutSubviews];
    self.remoteView.frame = self.view.bounds;
    self.localView.frame = CGRectMake(self.view.bounds.size.width - 90, 0, 90, 160);
}
- (void)initViews {
    // Initialize the remote video window. Video is rendered only when the remote user is a host
    self.remoteView = [[UIView alloc] init];
    [self.view addSubview:self.remoteView];
    // Initialize the local video window. Video is rendered only when the local user is a host
    self.localView = [[UIView alloc] init];
    [self.view addSubview:self.localView];
}
// Swift
// Import UIKit
import UIKit
class ViewController: UIViewController {
    ...
    // Declare the localView variable
    var localView: UIView!
    // Declare the remoteView variable
    var remoteView: UIView!
    override func viewDidLoad() {
        super.viewDidLoad()
        // Initialize the video windows
        initView()
        // Functions that call Agora APIs in the following steps
        initializeAgoraEngine()
        setChannelProfile()
        setClientRole()
        setupLocalVideo()
        joinChannel()
    }
    // Lay out the video windows
    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        remoteView.frame = self.view.bounds
        localView.frame = CGRect(x: self.view.bounds.width - 90, y: 0, width: 90, height: 160)
    }
    func initView() {
        // Initialize the remote video window. Video is rendered only when the remote user is a host
        remoteView = UIView()
        self.view.addSubview(remoteView)
        // Initialize the local video window. Video is rendered only when the local user is a host
        localView = UIView()
        self.view.addSubview(localView)
    }
}
2. Implement the live video logic
Now that the Agora iOS SDK is integrated into the project, we need to call the core APIs provided by the SDK in the ViewController to implement basic live video streaming. Agora provides the open-source interactive live streaming example projects OpenLive-iOS-Objective-C and OpenLive-iOS-Swift on GitHub; you can download them and consult the source code while implementing these functions.
The API call sequence is shown in the following figure:
Follow these steps to implement this logic:
a) Initialize the AgoraRtcEngineKit object <br>Before calling any other Agora API, you need to create and initialize an AgoraRtcEngineKit object. Call the sharedEngineWithAppId method and pass in your App ID to initialize AgoraRtcEngineKit.
// Objective-C
// Pass in the App ID and initialize the AgoraRtcEngineKit object.
- (void)viewDidLoad {
    [super viewDidLoad];
    self.rtcEngine = [AgoraRtcEngineKit sharedEngineWithAppId:[KeyCenter AppId] delegate:self];
}
// Swift
// Pass in the App ID and initialize the AgoraRtcEngineKit object.
private lazy var agoraKit: AgoraRtcEngineKit = {
    let engine = AgoraRtcEngineKit.sharedEngine(withAppId: KeyCenter.AppId, delegate: nil)
    return engine
}()
Depending on your scenario, you can also register the callback events you want to listen for during initialization, such as the local user joining the channel, or the first video frame of a remote user being decoded.
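As a sketch, those two callbacks can be handled in an AgoraRtcEngineDelegate extension like the one below. This assumes the engine was created with the view controller as its delegate; the delegate method signatures are from the 3.x SDK.

```swift
import UIKit
import AgoraRtcKit

extension ViewController: AgoraRtcEngineDelegate {
    // Fired when the local user successfully joins the channel
    func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinChannel channel: String,
                   withUid uid: UInt, elapsed: Int) {
        print("Joined channel \(channel) as uid \(uid)")
    }
    // Fired when the first video frame of a remote user has been decoded
    func rtcEngine(_ engine: AgoraRtcEngineKit, firstRemoteVideoDecodedOfUid uid: UInt,
                   size: CGSize, elapsed: Int) {
        print("First remote video frame decoded for uid \(uid)")
    }
}
```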
b) Set the channel profile <br>Call the setChannelProfile method to set the channel profile to live streaming. An AgoraRtcEngineKit object uses only one channel profile at a time. To switch to another profile, call the destroy method to destroy the current AgoraRtcEngineKit object, create a new one with the sharedEngineWithAppId method, and then call setChannelProfile to set the new profile.
// Objective-C
// 设置频道场景为直播模式
[self.rtcEngine setChannelProfile:AgoraChannelProfileLiveBroadcasting];
// Swift
// 设置频道场景为直播模式
agoraKit.setChannelProfile(.liveBroadcasting)
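The destroy-and-recreate switch described above could be sketched as follows. The switchChannelProfile helper is hypothetical; agoraKit and KeyCenter.AppId are assumed from the earlier setup steps.

```swift
import AgoraRtcKit

// A hypothetical helper that switches the engine to a different channel profile.
func switchChannelProfile(to profile: AgoraChannelProfile) {
    // An engine object supports only one profile, so destroy the current one first
    AgoraRtcEngineKit.destroy()
    // Create a fresh engine instance
    agoraKit = AgoraRtcEngineKit.sharedEngine(withAppId: KeyCenter.AppId, delegate: self)
    // Apply the new channel profile, e.g. .liveBroadcasting or .communication
    agoraKit?.setChannelProfile(profile)
}
```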
c) Set the user role <br>A live streaming channel has two user roles: host and audience; the default role is audience. After setting the channel profile to live streaming, set the user role in the app with the following steps:
- Let the user choose whether their role is host or audience;
- Call the setClientRole method and pass in the role the user selected.
Note that users in a live streaming channel can only see the hosts' video and hear the hosts' audio. If you want to switch user roles after joining the channel, call the setClientRole method again.
// Objective-C
// Set the user role
- (IBAction)doBroadcastPressed:(UIButton *)sender {
    if (self.isBroadcaster) {
        // Currently a host: switch the role to audience
        self.clientRole = AgoraClientRoleAudience;
        if (self.fullSession.uid == 0) {
            self.fullSession = nil;
        }
    } else {
        // Currently an audience member: switch the role to host
        self.clientRole = AgoraClientRoleBroadcaster;
    }
    [self.rtcEngine setClientRole:self.clientRole];
    [self updateInterfaceWithAnimation:YES];
}
// Swift
// Choose the user role
@IBAction func doBroadcasterTap(_ sender: UITapGestureRecognizer) {
    // Choose the host role
    selectedRoleToLive(role: .broadcaster)
}
@IBAction func doAudienceTap(_ sender: UITapGestureRecognizer) {
    // Choose the audience role
    selectedRoleToLive(role: .audience)
}
// Set the user role
agoraKit.setClientRole(settings.role)
// When the role is host
if settings.role == .broadcaster {
    addLocalSession()
    agoraKit.startPreview()
}
// When the role is audience
let isHidden = settings.role == .audience
d) Set the local view <br>After successfully initializing the AgoraRtcEngineKit object, set the local view before joining the channel so that you can see your own image during the session. Follow these steps:
· Call the enableVideo method to enable the video module.
· Call the setupLocalVideo method to set the local view.
// Objective-C
// Enable the video module.
[self.rtcEngine enableVideo];
// Set the local view.
- (void)addLocalSession {
    VideoSession *localSession = [VideoSession localSession];
    [self.videoSessions addObject:localSession];
    // Set the local view.
    [self.rtcEngine setupLocalVideo:localSession.canvas];
    [self updateInterfaceWithAnimation:YES];
}
// VideoSession part
// VideoSession.m
#import "VideoSession.h"
@implementation VideoSession
- (instancetype)initWithUid:(NSUInteger)uid {
    if (self = [super init]) {
        self.uid = uid;
        self.hostingView = [[UIView alloc] init];
        self.hostingView.translatesAutoresizingMaskIntoConstraints = NO;
        self.canvas = [[AgoraRtcVideoCanvas alloc] init];
        self.canvas.uid = uid;
        self.canvas.view = self.hostingView;
        self.canvas.renderMode = AgoraVideoRenderModeHidden;
    }
    return self;
}
+ (instancetype)localSession {
    return [[VideoSession alloc] initWithUid:0];
}
@end
// Swift
// Enable the video module.
agoraKit.enableVideo()
// Set the local view.
agoraKit.setupLocalVideo(videoCanvas)
// VideoSession part
// VideoSession.swift
hostingView = VideoView(frame: CGRect(x: 0, y: 0, width: 100, height: 100))
hostingView.translatesAutoresizingMaskIntoConstraints = false
canvas = AgoraRtcVideoCanvas()
canvas.uid = uid
canvas.view = hostingView.videoView
canvas.renderMode = .hidden
e) Join a channel <br>A channel is the shared space where everyone in the same live stream meets. After initializing the engine and setting up the local view (for the live streaming scenario), you can call the joinChannelByToken method to join a channel. You need to pass the following parameters to this method:
- channelId: The channel ID that identifies the channel. Users who pass the same channel ID enter the same channel.
- token: A token that identifies the user's role and permissions. You can set it to one of the following values:
  a) nil.
  b) A temporary token generated in the console. A temporary token is valid for 24 hours. For details, see Obtaining a Temporary Token.
  c) An official token generated by your server, suitable for production environments with high security requirements. For details, see Generate Token. If the project has an App Certificate enabled, you must use a token.
- uid: The ID of the local user. The data type is integer, and the uid of each user within the channel must be unique. If uid is set to 0, the SDK automatically assigns one and reports it in the joinSuccessBlock callback.
- joinSuccessBlock: The callback for joining the channel successfully. joinSuccessBlock has a higher priority than didJoinChannel; when both exist, didJoinChannel is ignored. If you need the didJoinChannel callback, set joinSuccessBlock to nil.
For more notes on the parameters, see the parameter descriptions of the joinChannelByToken interface.
// Objective-C
// Join the channel.
[self.rtcEngine joinChannelByToken:[KeyCenter Token] channelId:self.roomName info:nil uid:0 joinSuccess:nil];
// Swift
// Join the channel.
agoraKit.joinChannel(byToken: KeyCenter.Token, channelId: channelId, info: nil, uid: 0, joinSuccess: nil)
f) Set the remote view <br>In interactive video live streaming you usually also need to see the other hosts. After a remote host joins the channel successfully, the SDK triggers the didJoinedOfUid callback, which carries the remote host's uid. In that callback, call the setupRemoteVideo method and pass in the uid to set the remote host's view.
// Objective-C
// Listen for the didJoinedOfUid callback
// Triggered when a remote host joins the channel
- (void)rtcEngine:(AgoraRtcEngineKit *)engine didJoinedOfUid:(NSUInteger)uid elapsed:(NSInteger)elapsed {
    AgoraRtcVideoCanvas *videoCanvas = [[AgoraRtcVideoCanvas alloc] init];
    videoCanvas.uid = uid;
    videoCanvas.renderMode = AgoraVideoRenderModeHidden;
    videoCanvas.view = self.remoteView;
    // Set the remote view
    [self.agoraKit setupRemoteVideo:videoCanvas];
}
// Swift
// Additionally, add the following code
extension LiveRoomViewController: AgoraRtcEngineDelegate {
    // Listen for the didJoinedOfUid callback
    // Triggered when a remote host joins the channel
    func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinedOfUid uid: UInt, elapsed: Int) {
        guard videoSessions.count <= maxVideoSession else {
            return
        }
        let userSession = videoSession(of: uid)
        // Set the remote view
        agoraKit.setupRemoteVideo(userSession.canvas)
    }
}
g) Leave the channel <br>When your scenario requires it, such as ending the session, closing the app, or the app switching to the background, call leaveChannel to leave the current channel.
// Objective-C
// Steps to leave the channel
- (void)leaveChannel {
    [self setIdleTimerActive:YES];
    [self.rtcEngine setupLocalVideo:nil]; // nil means unbind
    // Leave the channel.
    [self.rtcEngine leaveChannel:nil]; // leave the channel, callback = nil
    if (self.isBroadcaster) {
        [self.rtcEngine stopPreview];
    }
    for (VideoSession *session in self.videoSessions) {
        [session.hostingView removeFromSuperview];
    }
    [self.videoSessions removeAllObjects];
    if ([self.delegate respondsToSelector:@selector(liveVCNeedClose:)]) {
        [self.delegate liveVCNeedClose:self];
    }
}
// Swift
// Steps to leave the channel
func leaveChannel() {
    // Step 1, release the local AgoraRtcVideoCanvas instance
    agoraKit.setupLocalVideo(nil)
    // Step 2, leave the channel and end the session
    agoraKit.leaveChannel(nil)
    // Step 3, if the current role is broadcaster, stop the preview after leaving the channel
    if settings.role == .broadcaster {
        agoraKit.stopPreview()
    }
    setIdleTimerActive(true)
    navigationController?.popViewController(animated: true)
}
h) Destroy the AgoraRtcEngineKit object <br>Finally, once you have left the channel and no longer need the engine, call destroy to destroy the AgoraRtcEngineKit object and release all the resources used by the Agora SDK.
// Objective-C
// Put the following code in a function you define
[AgoraRtcEngineKit destroy];
// Swift
// Put the following code in a function you define
AgoraRtcEngineKit.destroy()
That's it. Build and run to see the result: install the compiled app on two phones, join the same channel name, and choose the host role on one and the audience role on the other. If the audience phone can see the host's video, you have succeeded.
If you run into problems during development, you can ask questions and talk to Agora engineers on the forum (link: https://rtcdeveloper.agora.io/).
You can also submit a ticket in the console for further technical support (link: https://agora-ticket.agora.io/).