As a multimedia application developer, do you want to quickly build innovative AI features for media players? For example:
- Apply super-resolution to low-quality video frame by frame during playback
- Make bullet comments (danmaku) flying across the screen automatically bypass the main character in the frame
The multimedia pipeline service opened up in HMS Core 6.0.0 (AV Pipeline Kit) lowers the bar for media application developers to build innovative features. It defines standard plugin interfaces and the data flow between plugins, so developers only need to implement their plugins against the standard interface to quickly assemble a new kind of media scenario.
AV Pipeline Kit defines a set of standard plugin interfaces and has built-in data flow management, thread management, memory management, message management, and so on, so developers only need to implement a plugin's core processing logic and do not have to deal with thread synchronization, flow control, or audio-video synchronization. Three pipelines for playback scenarios are currently preset: video playback, video super-resolution, and sound event detection. They are exposed through a Java interface, and a single preset plugin can also be called directly through the C++ interface. If the preset plugins or pipelines do not meet your requirements, you can implement custom plugins and custom pipelines.
Technical solution
Video super-resolution
Below we describe the built-in high-performance video super-resolution plugin in detail. It is inserted between the decoding and rendering stages of the video stream and converts low-resolution video into high-resolution video in real time, improving video clarity, enriching video detail, and enhancing the viewing experience.
Development preparation
1. Create a new Android Studio project and modify the project-level build.gradle file as follows
Add the Maven repository address under "allprojects > repositories".
allprojects {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
2. Modify the app-level build.gradle file as follows
Set targetSdkVersion to 28 and add the compile dependencies in dependencies (a sketch of the android block where targetSdkVersion lives follows the dependencies below).
dependencies {
    implementation 'com.huawei.hms:avpipelinesdk:6.0.0.302'
    implementation 'com.huawei.hms:avpipeline-aidl:6.0.0.302'
    implementation 'com.huawei.hms:avpipeline-fallback-base:6.0.0.302'
    implementation 'com.huawei.hms:avpipeline-fallback-cvfoundry:6.0.0.302'
}
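For reference, here is a minimal sketch of the android block in the same app-level build.gradle; the applicationId and compileSdkVersion values are placeholders and not part of the official guide, only targetSdkVersion 28 is required here.
android {
    compileSdkVersion 28
    defaultConfig {
        applicationId "com.example.avpipelinedemo" // placeholder package name
        targetSdkVersion 28 // required by this guide
    }
}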
3. Configure the manifest
Modify the AndroidManifest.xml file and add the permission to read external storage.
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
4. Sync the project
Click the Gradle sync icon in the toolbar to sync the build.gradle files and download the required dependencies locally.
Development steps
For detailed sample code, please see GitHub
1. Request the storage permission at runtime
String[] permissionLists = {
        Manifest.permission.READ_EXTERNAL_STORAGE
};
int requestPermissionCode = 1;
for (String permission : permissionLists) {
    if (ContextCompat.checkSelfPermission(this, permission) != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(this, permissionLists, requestPermissionCode);
    }
}
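If you want to wait for the user's decision before creating the player, you can handle the permission result callback as well. A minimal sketch follows; initPlayer() is a hypothetical helper wrapping steps 2 to 6 below, not part of the official sample.
@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    // Same requestPermissionCode as above
    if (requestCode == 1 && grantResults.length > 0
            && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
        initPlayer(); // hypothetical helper that runs the player setup below
    } else {
        Log.w("AVPipelineDemo", "READ_EXTERNAL_STORAGE denied; local media cannot be played");
    }
}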
2. Initialize the AV Pipeline framework
Context context = getApplicationContext();
boolean ret = AVPLoader.initFwk(context);
if(!ret) return;
3. Create a MediaPlayer instance
This instance controls the playback process.
MediaPlayer mPlayer = MediaPlayer.create(MediaPlayer.PLAYER_TYPE_AV);
if (mPlayer == null) return;
4. Set the graph configuration file
The AV Pipeline framework relies on this configuration file to orchestrate the plugins. In addition, you need to set MEDIA_ENABLE_CV to 1 to enable the video super-resolution plugin.
MediaMeta meta = new MediaMeta();
meta.setString(MediaMeta.MEDIA_GRAPH_PATH, getExternalFilesDir(null).getPath() + "/PlayerGraphCV.xml");
meta.setInt32(MediaMeta.MEDIA_ENABLE_CV, 1);
mPlayer.setParameter(meta);
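Note that PlayerGraphCV.xml has to exist at that path before setParameter is called. One possible approach (an assumption, not part of the official steps) is to bundle the file in the app's assets and copy it out at startup:
// Copies the graph configuration from assets to the external files directory.
// Assumes the file is bundled as app/src/main/assets/PlayerGraphCV.xml;
// requires java.io imports (File, InputStream, OutputStream, FileOutputStream, IOException).
private void copyGraphFile() throws IOException {
    File dst = new File(getExternalFilesDir(null), "PlayerGraphCV.xml");
    try (InputStream in = getAssets().open("PlayerGraphCV.xml");
         OutputStream out = new FileOutputStream(dst)) {
        byte[] buf = new byte[8192];
        int len;
        while ((len = in.read(buf)) > 0) {
            out.write(buf, 0, len);
        }
    }
}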
5. Set the following parameters, then call the prepare interface to start the MediaPlayer preparation work
(Optional) If you need to listen for certain events, set callbacks through interfaces such as setOnPreparedListener and setOnErrorListener.
// Set the surface for video rendering
SurfaceView mSurfaceVideo = findViewById(R.id.surfaceViewup);
SurfaceHolder mVideoHolder = mSurfaceVideo.getHolder();
mVideoHolder.addCallback(new SurfaceHolder.Callback() {
    // User-defined callback implementation; see the video playback codelab for reference
});
mPlayer.setVideoDisplay(mVideoHolder.getSurface());
// Set the path of the media file to play
mPlayer.setDataSource(mFilePath);
// To listen for certain events, set callbacks through the corresponding setXXXListener interfaces
// For example, to listen for the prepare-complete event, add the following
mPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp, int param1, int param2, MediaParcel parcel) {
        // User-defined callback implementation
    }
});
mPlayer.prepare();
6. Call start to start playing
mPlayer.start();
7. Call stop to stop playing
mPlayer.stop();
8. Destroy the player
mPlayer.reset();
mPlayer.release();
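To avoid leaking pipeline resources, it is reasonable to tie these calls to the Activity lifecycle; a minimal sketch using only the calls shown above, where mPlayer is the instance created in step 3:
@Override
protected void onDestroy() {
    super.onDestroy();
    if (mPlayer != null) {
        mPlayer.stop();    // stop playback if it is still running
        mPlayer.reset();   // reset the pipeline state
        mPlayer.release(); // free the underlying resources
        mPlayer = null;
    }
}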
9. Other notes
For details on the constraints of the video super-resolution plugin, please refer to the documentation.
Visit the official website of Huawei's multimedia pipeline service (AV Pipeline Kit) to learn more about it.
Huawei multimedia pipeline service development guide
Huawei multimedia pipeline service open-source repositories: GitHub, Gitee
Huawei HMS Core Official Forum
To solve integration problems, please go to Stack Overflow.
Click Follow next to the avatar in the upper right corner to be the first to learn about the latest HMS Core technologies~