Introduction
As a "connector" of the open source world, OpenAtom OpenHarmony (hereinafter "OpenHarmony") continuously provides a steady stream of momentum for the development of an intelligent society. Shenzhen Kaihong has long been actively involved in building the OpenHarmony community and continues to promote the growth of open source.
As an OS framework development engineer at Shenzhen Kaihong, I have taken an active part in community construction since the OpenHarmony open source project was established, and I am responsible for research and development of the OpenHarmony framework. In this article I present a source code analysis of the OpenHarmony multimedia subsystem, in the hope of providing a useful reference for other developers.
The multimedia subsystem is one of the more important subsystems in the OpenHarmony framework. OpenHarmony integrates ffmpeg as a third-party library, and many multimedia functions depend on it. Processing media files covers application scenarios such as audio/video cutting and audio/video separation, but the multimedia subsystem does not expose external interfaces for some of these functions. To fill this gap, a set of JS interfaces can be implemented through the NAPI mechanism and provided for the application layer to call, enabling more multimedia functionality.
Effect display <br>By implementing the function of cutting audio and video files, this article walks developers through the entire workflow of realizing such a feature. The following screenshots show the result:
First select the source file, then set the start time and end time (in seconds) in the cut settings. Once the parameters are set, click the cut button to cut the source file; after the cut succeeds, a play button is displayed.
Throughout the process, the play button in the source-file module plays the source file, while the play button in the cut module plays the cut file. By playing both video files we can compare the effect before and after cutting.
The code has been uploaded to the SIG repository, and the link is as follows:
https://gitee.com/openharmony-sig/knowledge_demo_entainment/tree/master/FA/MediaCuteDemo
https://gitee.com/openharmony-sig/knowledge_demo_entainment/tree/master/docs/MediaCuteDemo
Source code analysis
The source code analysis is divided into two parts: the native functionality implemented with NAPI, and the application functionality implemented in JS. 1. NAPI implementation <br>The core module is myffmpegsys, which provides the JS interface for the application side.
1. myffmpegsys is integrated into the OpenHarmony source code as a new subsystem, placed in the root directory of the OpenHarmony source tree, at the same level as foundation.
2. Configure build/subsystem_config.json.
"myffmpegsys": {
  "path": "myffmpegsys",
  "name": "myffmpegsys"
},
3. Configure the product file productdefine/common/products/XXXX.json (where XXXX corresponds to the device model).
"parts":{
  "myffmpegsys:myffmpegpart":{},
  "ace:ace_engine_standard":{},
  ......
}
After configuring the subsystem and the corresponding component, the source code of the myffmpegsys subsystem is analyzed below.
(1) Directory structure
myffmpegdemo mainly handles the NAPI-related interface conversion, while ffmpeg_utils performs the actual video file cutting by calling the ffmpeg third-party library.
(2) The ffmpeg third-party library integrated in OpenHarmony lives at third_party/ffmpeg. myffmpegdemo depends on ffmpeg and includes its header files, so the relevant dependencies and include paths are added to the BUILD.gn file.
import("//build/ohos.gni")
ohos_shared_library("myffmpegdemo") {
  include_dirs = [
    "//foundation/ace/napi/interfaces/kits",
    "//myffmpegsys/myffmpegpart/myffmpegdemo/include",
    "//third_party/ffmpeg",
  ]
  sources = [
    "myffmpegdemo.cpp",
    "ffmpeg_utils.cpp",
  ]
  public_deps = [
    "//foundation/ace/napi:ace_napi",
    "//third_party/ffmpeg:libohosffmpeg",
  ]
  external_deps = [
    "hiviewdfx_hilog_native:libhilog",
  ]
  relative_install_dir = "module"
  subsystem_name = "myffmpegsys"
  part_name = "myffmpegpart"
}
(3) Flowchart
(4) Code analysis. NAPI interface registration:
/***********************************************
* Module export and register
***********************************************/
static napi_value registerMyffmpegdemo(napi_env env, napi_value exports)
{
    static napi_property_descriptor desc[] = {
        DECLARE_NAPI_FUNCTION("videoCute", videoCute),
        DECLARE_NAPI_FUNCTION("videoToAacH264", videoToAacH264),
    };
    napi_define_properties(env, exports, sizeof(desc) / sizeof(desc[0]), desc);
    return exports;
}
The NAPI layer implements the videoCute interface: it converts the NAPI types to C++ types and then calls the videoCute interface of FfmpegUtils:
static void executeVideoCute(napi_env env, void* data) {
    VideoCuteAddOnData *addonData = (VideoCuteAddOnData *) data;
    // Call the video cutting function
    addonData->result = FfmpegUtils::videoCute((const char*)addonData->args0.c_str(), \
                                               addonData->args1, \
                                               addonData->args2, \
                                               (const char*)addonData->args3.c_str());
}
FfmpegUtils initializes the input and output format contexts:
// Initialize the input and output format contexts
ret = avformat_open_input(&ifmt_ctx, in_filename, 0, 0);
if (ret < 0) {
    ERROR_BUF(ret);
    HiLog::Error(LABEL, "gyf avformat_open_input error = %{public}s", errbuf);
    return ret;
}
ret = avformat_alloc_output_context2(&ofmt_ctx, NULL, NULL, out_filename);
if (ret < 0) {
    ERROR_BUF(ret);
    HiLog::Error(LABEL, "gyf avformat_alloc_output_context2 error = %{public}s", errbuf);
    goto end;
}
ofmt = ofmt_ctx->oformat;
Create output streams from the input streams and copy the codec parameters:
// Create output streams and copy codec parameters
for (int i = 0; i < (int)ifmt_ctx->nb_streams; i++) {
    in_stream = ifmt_ctx->streams[i];
    AVStream *out_stream = avformat_new_stream(ofmt_ctx, NULL);
    if (!out_stream) {
        ret = AVERROR_UNKNOWN;
        goto end;
    }
    avcodec_parameters_copy(out_stream->codecpar, in_stream->codecpar);
    out_stream->codecpar->codec_tag = 0;  // let the output muxer choose a suitable codec tag
}
Open the output file and write the file header:
// Open the output file
ret = avio_open(&ofmt_ctx->pb, out_filename, AVIO_FLAG_WRITE);
if (ret < 0) {
    ERROR_BUF(ret);
    HiLog::Error(LABEL, "gyf avio_open error = %{public}s", errbuf);
    goto end;
}
// Write the header information
ret = avformat_write_header(ofmt_ctx, NULL);
if (ret < 0) {
    ERROR_BUF(ret);
    HiLog::Error(LABEL, "gyf avformat_write_header error = %{public}s", errbuf);
    goto end;
}
Seek to the frame specified by the configured cut start time:
// Seek to the specified frame
ret = av_seek_frame(ifmt_ctx, -1, start_seconds * AV_TIME_BASE, AVSEEK_FLAG_ANY);
if (ret < 0) {
    ERROR_BUF(ret);
    HiLog::Error(LABEL, "gyf av_seek_frame error = %{public}s", errbuf);
    goto end;
}
Read packets in a loop, and exit the loop when the cut end time is reached:
// Read a packet
ret = av_read_frame(ifmt_ctx, &pkt);
if (ret < 0) {
    break;
}
in_stream = ifmt_ctx->streams[pkt.stream_index];
out_stream = ofmt_ctx->streams[pkt.stream_index];
// Exit the loop once the packet timestamp passes the cut end time
if (av_q2d(in_stream->time_base) * pkt.pts > end_seconds) {
    av_packet_unref(&pkt);
    break;
}
Finally, write the trailer at the end of the file:
// Write the file trailer
ret = av_write_trailer(ofmt_ctx);
2. JS application implementation
Directory Structure
The code mainly consists of two parts: index, which covers the cut-related settings, and player, the page that plays video files.
In index, the source file, cut start time, and cut end time are set, and the cut is triggered by the cut button. This part calls the interface provided by the underlying NAPI module.
cutevideo() {
    globalThis.isCuteSuccess = false;
    console.log('gyf cutevideo');
    myffmpegdemo.videoCute(this.src, this.startTime, this.endTime, this.srcOut,
        function (result) {
            console.log('gyf cutevideo callback result = ' + result);
            globalThis.showPrompt('videoCute finished!');
            if (0 === result) {
                globalThis.isCuteSuccess = true;
            } else {
                globalThis.isCuteSuccess = false;
            }
        }
    );
},
Once the video is cut successfully, a play button appears on the page; clicking it plays the cut file.
Summary <br>This article has explained how to use NAPI to expose more of the OpenHarmony system's capabilities. Developers can use the third-party libraries bundled with OpenHarmony to implement multimedia processing functions such as audio/video separation, transcoding, and encoding/decoding; these functions can be implemented at the system layer and exposed through NAPI interfaces for applications to call. For other capabilities integrated inside OpenHarmony, NAPI can likewise be used to provide external interfaces and enable more functionality.
Development is a long road; only by drawing inferences from one case to related ones can developers multiply the effect of their future work.