
Interactive live video streaming is a popular format today: we often see PK co-hosting, live quiz shows, group KTV, e-commerce live streams, large interactive classes, video dating, and more. This article demonstrates how to implement a live video streaming application on Android with the Agora Video SDK.

After clicking here to register an Agora account, developers get 10,000 minutes of free usage every month, enough to implement a variety of real-time audio and video scenarios.

Without further ado, let's get started.

Some prerequisites

1. Experience live video through the open-source demo

Some readers may not yet know what the finished feature will look like, so we provide an open-source basic live video streaming example project on GitHub. Before starting development, you can use it to try out the live video experience.
GitHub: GitHub - Meherdeep/agora-android-live-streaming

Here, I started two live streams, and multiple viewers can subscribe to them at the same time.

2. The technical principles of live video

What we are implementing here is live video. Agora's live video supports interaction between participants, so it is often called interactive live streaming. You can think of it as multiple users joining the same channel to exchange audio and video with each other, with the channel's data transmitted at low latency over Agora's SD-RTN™ real-time network.

Note that Agora's interactive live streaming is different from a video call. A video call does not distinguish between hosts and audience: all users can speak and see each other. In interactive live streaming, users are divided into hosts and audience, and only hosts can speak freely and be seen by other users.
The following figure shows the basic workflow of integrating Agora interactive live streaming into an app:

The steps to implement interactive live streaming are as follows:

1. Set the role: in an interactive live channel, a user's role can be host or audience. Hosts publish audio and video streams in the channel; audience members can only subscribe to those streams.

2. Obtain a token: when an app client joins a channel, its user identity is verified with a token. The app client sends a request to your app server to obtain a token, then uses it for authentication when joining the channel.

3. Join the channel: call joinChannel to create and join a channel. App clients that pass in the same channel name join the same channel by default.

4. Publish and subscribe to audio and video in the channel: after joining the channel, app clients whose role is host can publish audio and video. An audience client that wants to publish can call setClientRole to switch roles.

App clients need the following information to join a channel:

  • Channel Name: A string that identifies the live channel.
  • App ID: a randomly generated string that identifies your app. You can obtain it from the Agora console ( Dashboard ).
  • User ID: The unique identification of the user. You need to set the user ID yourself and make sure it is unique within the channel.
  • Token: In a test or production environment, your app client will obtain a token from your server. For quick testing, you can also get a temporary token. Temporary tokens are valid for 24 hours.
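To make these four pieces of information concrete, here is a small plain-Java sketch that bundles them with a basic sanity check. ChannelInfo and all of its field names are illustrative assumptions, not part of the Agora SDK; the 64-byte limit on channel names follows Agora's documented constraint:

```java
// Illustrative holder for the information an app client needs to join a channel.
// This class is NOT part of the Agora SDK; the names and checks are assumptions.
class ChannelInfo {
    final String appId;       // from the Agora console
    final String channelName; // identifies the live channel
    final int uid;            // 0 lets the SDK assign a unique ID
    final String token;       // a temporary token is fine for quick tests

    ChannelInfo(String appId, String channelName, int uid, String token) {
        // Agora channel names are strings of at most 64 bytes.
        if (channelName == null || channelName.isEmpty()
                || channelName.getBytes().length > 64) {
            throw new IllegalArgumentException("channel name must be 1-64 bytes");
        }
        this.appId = appId;
        this.channelName = channelName;
        this.uid = uid;
        this.token = token;
    }
}
```

You would fill such a holder from user input (channel name) and your server (token) before calling joinChannel.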

3. Development environment

The Agora SDK has good compatibility and modest requirements for hardware devices and software systems. The development and test environments only need to meet the following conditions:
• Android SDK API Level >= 16
• Android Studio 2.0 or later
• A real phone with audio and video capability
• The app requires a device running Android 4.1 or later

The following are the development and test environments for this article:

Development environment

• Windows 10 Home Chinese Edition
• Java SE 8
• Android Studio 3.2 Canary 4

Test environment

• Samsung Nexus (Android 4.4.2 API 19)
• Mi Note 3 (Android 7.1.1 API 25)

If you have not worked with the Agora SDK before, make the following preparations:

• Register an Agora account, create an App ID in the console, and obtain a token;
• Download the latest official Agora interactive live streaming SDK.

4. Project Settings

1. Before implementing interactive live streaming, set up your project by following these steps:

To create a new project, open Android Studio and choose Phone and Tablet > Empty Activity. (See: https://developer.android.com/studio/projects/create-project )
After the project is created, Android Studio automatically starts a Gradle sync. Make sure the sync succeeds before proceeding to the next step.

2. Integrate the SDK. This article recommends using Gradle to integrate the Agora SDK:

a. Add the following code to the /Gradle Scripts/build.gradle(Project) file to add the jcenter dependency:

buildscript {
    repositories {
        ...
        jcenter()
    }
    ...
}

allprojects {
    repositories {
        ...
        jcenter()
    }
}

b. Add the following code to the /Gradle Scripts/build.gradle(Module: app) file to integrate the Agora Video SDK into your Android project:

...
dependencies {
    ...
    // Replace x.y.z with a specific SDK version number, for example 3.5.0.
    // Get the latest version number from the release notes.
    implementation 'io.agora.rtc:full-sdk:x.y.z'
    // This example uses ConstraintLayout for its layout
    implementation 'androidx.constraintlayout:constraintlayout:2.0.4'
}

3. Permission settings

Add the following network and device permissions to the /App/Manifests/AndroidManifest.xml file:

 <uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.BLUETOOTH" />

4. Import Agora related classes

In the /app/src/main/java/com/agora/samtan/agorabroadcast/VideoActivity file, add the following code:

 package com.agora.samtan.agorabroadcast;
import io.agora.rtc.Constants;
import io.agora.rtc.IRtcEngineEventHandler;
import io.agora.rtc.RtcEngine;
import io.agora.rtc.video.VideoCanvas;
import io.agora.rtc.video.VideoEncoderConfiguration;

5. Set up your Agora account information

In the /app/src/main/res/values/strings.xml file, fill in your App ID as the value of private_App_id:

<resources>
    ...
    <string name="private_App_id">Your App ID</string>
    ...
</resources>

5. Client implementation

This section describes how to use the Agora Video SDK to implement live video in your app:

1. Check and obtain necessary permissions

When launching the app, check that the necessary permissions for live video have been granted in the app. Call the following code in the onCreate function:

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    int MY_PERMISSIONS_REQUEST_CAMERA = 0;
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED
            || ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO}, MY_PERMISSIONS_REQUEST_CAMERA);
    }
}
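After requestPermissions returns, Android delivers the user's decision to onRequestPermissionsResult. As a hedged sketch (not part of the original sample), the pure-Java helper below checks the grant results; it relies on the fact that PackageManager.PERMISSION_GRANTED is the constant 0:

```java
// Hypothetical helper: returns true only if every requested permission was granted.
class PermissionCheck {
    static boolean allGranted(int[] grantResults) {
        if (grantResults == null || grantResults.length == 0) {
            return false; // an empty array means the request was cancelled
        }
        for (int result : grantResults) {
            if (result != 0) { // 0 == PackageManager.PERMISSION_GRANTED
                return false;
            }
        }
        return true;
    }
}
```

In onRequestPermissionsResult you could call PermissionCheck.allGranted(grantResults) and, for example, show an explanation or finish the activity when it returns false.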

2. Implement the interactive live streaming logic

Open the app, create an RtcEngine instance, enable video, and join the channel. If the local user is a host, the local video is published to the view at the bottom of the UI. If another host joins the channel, your app captures the join event and renders the remote video in the view in the upper right corner of the UI.
The API usage sequence of interactive live broadcast is shown in the following figure:

(Figure: the API call sequence for interactive live streaming)

Follow these steps to implement this logic:

a) Initialize RtcEngine
The RtcEngine class contains the main methods called by the application. It is best to call RtcEngine interfaces on a single thread; calling them concurrently from different threads is not recommended.

Currently, the Agora Native SDK supports only one RtcEngine instance, and each application creates only one RtcEngine object. Unless otherwise specified, all interface methods of the RtcEngine class are called asynchronously, and it is recommended to call them on the same thread. For all APIs that return an int, unless otherwise specified, a return value of 0 means the call succeeded and a value less than 0 means it failed.
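This return-value convention can be centralized in a small helper. AgoraResult.checkResult below is a hypothetical utility written for this article, not an SDK API:

```java
// Hypothetical utility reflecting the SDK convention for int-returning APIs:
// 0 means success, a negative value means failure.
class AgoraResult {
    static int checkResult(String api, int code) {
        if (code < 0) {
            throw new IllegalStateException(api + " failed with error code " + code);
        }
        return code;
    }
}
```

For example, you could wrap SDK calls as AgoraResult.checkResult("enableVideo", mRtcEngine.enableVideo()) so failures surface immediately instead of being silently ignored.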

In the VideoActivity file, the initalizeAgoraEngine method initializes the RtcEngine:

private void initalizeAgoraEngine() {
    try {
        mRtcEngine = RtcEngine.create(getBaseContext(), getString(R.string.private_App_id), mRtcEventHandler);
    } catch (Exception e) {
        e.printStackTrace();
    }
}

In addition, the IRtcEngineEventHandler interface class is important: the SDK uses it to deliver callback event notifications to the application, and the application receives SDK events by overriding the methods of this interface class.

All methods of the interface class have default (empty) implementations, so applications can override only the events they care about. Inside a callback, the application should not perform time-consuming operations or call APIs that may block (such as SendMessage); otherwise the SDK's operation may be affected. The handler looks like this:

 private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler()
{
    /**Reports a warning during SDK runtime.
     * Warning code: https://docs.agora.io/en/Voice/API%20Reference/java/classio_1_1agora_1_1rtc_1_1_i_rtc_engine_event_handler_1_1_warn_code.html*/
    @Override
    public void onWarning(int warn)
    {
        Log.w(TAG, String.format("onWarning code %d message %s", warn, RtcEngine.getErrorDescription(warn)));
    }
 
    /**Reports an error during SDK runtime.
     * Error code: https://docs.agora.io/en/Voice/API%20Reference/java/classio_1_1agora_1_1rtc_1_1_i_rtc_engine_event_handler_1_1_error_code.html*/
    @Override
    public void onError(int err)
    {
        Log.e(TAG, String.format("onError code %d message %s", err, RtcEngine.getErrorDescription(err)));
        showAlert(String.format("onError code %d message %s", err, RtcEngine.getErrorDescription(err)));
    }
 
    /**Occurs when a user leaves the channel.
     * @param stats With this callback, the Application retrieves the channel information,
     *              such as the call duration and statistics.*/
    @Override
    public void onLeaveChannel(RtcStats stats)
    {
        super.onLeaveChannel(stats);
        Log.i(TAG, String.format("local user %d leaveChannel!", myUid));
        showLongToast(String.format("local user %d leaveChannel!", myUid));
    }
 
    /**Occurs when the local user joins a specified channel.
     * The channel name assignment is based on channelName specified in the joinChannel method.
     * If the uid is not specified when joinChannel is called, the server automatically assigns a uid.
     * @param channel Channel name
     * @param uid User ID
     * @param elapsed Time elapsed (ms) from the user calling joinChannel until this callback is triggered*/
    @Override
    public void onJoinChannelSuccess(String channel, int uid, int elapsed)
    {
        Log.i(TAG, String.format("onJoinChannelSuccess channel %s uid %d", channel, uid));
        showLongToast(String.format("onJoinChannelSuccess channel %s uid %d", channel, uid));
        myUid = uid;
        joined = true;
        handler.post(new Runnable()
        {
            @Override
            public void run()
            {
                join.setEnabled(true);
                join.setText(getString(R.string.leave));
            }
        });
    }
 
    @Override
    public void onRemoteAudioStats(io.agora.rtc.IRtcEngineEventHandler.RemoteAudioStats remoteAudioStats) {
        statisticsInfo.setRemoteAudioStats(remoteAudioStats);
        updateRemoteStats();
    }
 
    @Override
    public void onLocalAudioStats(io.agora.rtc.IRtcEngineEventHandler.LocalAudioStats localAudioStats) {
        statisticsInfo.setLocalAudioStats(localAudioStats);
        updateLocalStats();
    }
 
    @Override
    public void onRemoteVideoStats(io.agora.rtc.IRtcEngineEventHandler.RemoteVideoStats remoteVideoStats) {
        statisticsInfo.setRemoteVideoStats(remoteVideoStats);
        updateRemoteStats();
    }
 
    @Override
    public void onLocalVideoStats(io.agora.rtc.IRtcEngineEventHandler.LocalVideoStats localVideoStats) {
        statisticsInfo.setLocalVideoStats(localVideoStats);
        updateLocalStats();
    }
 
    @Override
    public void onRtcStats(io.agora.rtc.IRtcEngineEventHandler.RtcStats rtcStats) {
        statisticsInfo.setRtcStats(rtcStats);
    }
};

So, in our initialization function, we pass mRtcEventHandler as one of the parameters to the create method. It registers a series of callbacks that fire when, for example, a user joins or leaves the channel:

 private IRtcEngineEventHandler mRtcEventHandler = new IRtcEngineEventHandler() {

        @Override
        public void onUserJoined(final int uid, int elapsed) {
            super.onUserJoined(uid, elapsed);
            runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    setupRemoteVideo(uid);
                }
            });
        }

        @Override
        public void onUserOffline(int uid, int reason) {
            runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    onRemoteUserLeft();
                }
            });
        }
    };

b) Set the channel profile and role
setChannelProfile() is a method on our RtcEngine object. Agora provides several channel profiles (such as communication and live broadcasting) that can be selected through this method.

The setClientRole() method sets the user's role to host or audience (the default). Call this method before joining a channel; after joining, you can call it again to switch the client role.

To make it easy to experience both the host role and the audience role in interactive live streaming, we add two methods to our MainActivity class:
• The first method is called when the user selects an option from a radio button. It sets a variable whose value determines whether the user is a host or an audience member.

 public void onRadioButtonClicked(View view) {
        boolean checked = ((RadioButton) view).isChecked();
        switch (view.getId()) {
            case R.id.host:
                if (checked) {
                    channelProfile = Constants.CLIENT_ROLE_BROADCASTER;
                }
                break;
            case R.id.audience:
                if (checked) {
                    channelProfile = Constants.CLIENT_ROLE_AUDIENCE;
                }
                break;
        }
}

• Then we implement a method that is called when the user submits the details. Here we gather everything we need and send it to the next activity.

 public void onSubmit(View view) {
        EditText channel = (EditText) findViewById(R.id.channel);
        String channelName = channel.getText().toString();
        Intent intent = new Intent(this, VideoActivity.class);
        intent.putExtra(channelMessage, channelName);
        intent.putExtra(profileMessage, channelProfile);
        startActivity(intent);
}

c) Start video
The setupVideoProfile() function defines how the video is rendered. You can customize properties such as frame rate, bit rate, orientation, mirror mode, and degradation preference.

 private void setupVideoProfile() {
        mRtcEngine.enableVideo();

        mRtcEngine.setVideoEncoderConfiguration(new VideoEncoderConfiguration(VideoEncoderConfiguration.VD_640x480, VideoEncoderConfiguration.FRAME_RATE.FRAME_RATE_FPS_15,
                VideoEncoderConfiguration.STANDARD_BITRATE,
                VideoEncoderConfiguration.ORIENTATION_MODE.ORIENTATION_MODE_FIXED_PORTRAIT));
}

d) Set up local video
The setupLocalVideo() function calls the setupLocalVideo method of RtcEngine, which sets up a SurfaceView that renders the local user's video in the live stream:

 private void setupLocalVideo() {
        FrameLayout container = (FrameLayout) findViewById(R.id.local_video_view_container);
        SurfaceView surfaceView = RtcEngine.CreateRendererView(getBaseContext());
        surfaceView.setZOrderMediaOverlay(true);
        container.addView(surfaceView);
        mRtcEngine.setupLocalVideo(new VideoCanvas(surfaceView, VideoCanvas.RENDER_MODE_FIT, 0));
    }

e) Join a channel
A channel is a common space shared by everyone in the same live session. The joinChannel() method can be called like this:

 private void joinChannel() {
        mRtcEngine.joinChannel(token, channelName, "Optional Data", 0);
}

The method takes four parameters:
• Token: token authentication is recommended for all RTE apps running in production. For more on token-based authentication on the Agora platform, see https://docs.agora.io/cn/Video/token?platform=All%20Platforms .
• Channel name: a string that identifies the channel the user joins.
• Optional information: an optional field for passing additional information about the channel.
• uid: the unique ID of each user joining the channel. If you pass 0, Agora automatically assigns a uid to each user.

Note: this project is for reference and development use only, not for production. Token authentication is recommended for all RTE apps running in a production environment.

In this example, the app initializes the engine and calls the core methods to create and join an Agora live channel. In the VideoActivity file, the code looks like this:

 package com.agora.samtan.agorabroadcast;
import android.content.Intent;
import android.graphics.PorterDuff;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceView;
import android.view.View;
import android.widget.FrameLayout;
import android.widget.ImageView;
import androidx.appcompat.app.AppCompatActivity;
import io.agora.rtc.Constants;
import io.agora.rtc.IRtcEngineEventHandler;
import io.agora.rtc.RtcEngine;
import io.agora.rtc.video.VideoCanvas;
import io.agora.rtc.video.VideoEncoderConfiguration;

public class VideoActivity extends AppCompatActivity {
  private RtcEngine mRtcEngine;
  private String channelName;
  private int channelProfile;
  
  @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_video);

        Intent intent = getIntent();
        channelName = intent.getStringExtra(MainActivity.channelMessage);
        channelProfile = intent.getIntExtra(MainActivity.profileMessage, -1);

        if (channelProfile == -1) {
            Log.e("TAG: ", "No profile");
        }

        initAgoraEngineAndJoinChannel();
    }

}

We declared a method called initAgoraEngineAndJoinChannel that calls all the other methods needed during the live broadcast. We also defined the event handlers that decide which methods are called when a remote user joins, leaves, or is muted.

 private void initAgoraEngineAndJoinChannel() {
        initalizeAgoraEngine();
        mRtcEngine.setChannelProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING);
        mRtcEngine.setClientRole(channelProfile);
        setupVideoProfile();
        setupLocalVideo();
        joinChannel();
}

f) Add the remote view when a remote host joins the channel
In the VideoActivity file, add the following code after the initAgoraEngineAndJoinChannel function:

 private void setupRemoteVideo(int uid) {
        FrameLayout container = (FrameLayout) findViewById(R.id.remote_video_view_container);
        SurfaceView surfaceView = RtcEngine.CreateRendererView(getBaseContext());
        container.addView(surfaceView);
        mRtcEngine.setupRemoteVideo(new VideoCanvas(surfaceView, VideoCanvas.RENDER_MODE_FIT, uid));
    }
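The event handler earlier calls onRemoteUserLeft when a remote user goes offline, and onDestroy below calls leaveChannel, but neither method is shown in the sample. A minimal sketch of both, assuming the same remote_video_view_container layout ID used above:

```java
// Sketch: the two helper methods referenced elsewhere in VideoActivity.
private void onRemoteUserLeft() {
    // Clear the remote host's SurfaceView when they leave the channel.
    FrameLayout container = (FrameLayout) findViewById(R.id.remote_video_view_container);
    container.removeAllViews();
}

private void leaveChannel() {
    if (mRtcEngine != null) {
        mRtcEngine.leaveChannel();
    }
}
```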

g) Release resources
Finally, we add the onDestroy method to release the resources we used. The relevant code is as follows:

@Override
protected void onDestroy() {
    super.onDestroy();

    leaveChannel();
    RtcEngine.destroy();
    mRtcEngine = null;
}

At this point you are done; run the app to see the effect. Install the compiled app on two phones, enter the same channel name, and choose the host role on one and the audience role on the other. If both phones show the same live stream, you have succeeded.

If you run into problems during development, you can visit the forum to ask questions and exchange ideas with Agora engineers.

RTE Developer Community

The RTE Developer Community is a neutral developer community focused on real-time engagement. Beyond pure technical exchange, we believe developers carry richer individual value. Industry change, developer career growth, and resources for technical innovation and entrepreneurship: we run alongside developers to share, build, and grow together.