Welcome to my GitHub
https://github.com/zq2599/blog_demos
Content: all original articles are categorized and summarized there, together with supporting source code, covering Java, Docker, Kubernetes, DevOps, and more.
Overview of this article
- I have an MP4 file on my local disk; how can I let more people play it remotely? As shown below:
- A brief explanation of what the figure above shows:
- Deploy an open source streaming server <font color="blue">SRS</font>
- Develop a Java application named <font color="blue">PushMp4</font>; it reads an MP4 file from the local disk frame by frame and pushes each frame to SRS
- Anyone who wants to watch the video connects to SRS with a streaming media player (such as VLC) on their own computer and plays the video pushed by PushMp4
- In this article we will build the setup in the figure above. The whole process breaks down into the following steps:
- Environmental information
- Prepare MP4 files
- Deploy SRS with docker
- Java application development and running
- VLC playback
Environmental information
- The environment used in this walkthrough is listed below for your reference:
- Operating System: macOS Monterey
- JDK:1.8.0_211
- JavaCV:1.5.6
- SRS:3
Prepare MP4 files
- Any ordinary MP4 video file will do. I downloaded the Big Buck Bunny clip that is commonly used in video development; the address is:
https://www.learningcontainer.com/wp-content/uploads/2020/05/sample-mp4-file.mp4
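- If you prefer to fetch the file from code, here is a minimal sketch using only JDK APIs (not part of the original project; the target path is arbitrary, it just has to match the path configured later in PushMp4):

```java
import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

/**
 * Minimal sketch: download the sample MP4 to a local path with plain JDK APIs.
 */
public class DownloadSample {
    public static void main(String[] args) throws Exception {
        String url = "https://www.learningcontainer.com/wp-content/uploads/2020/05/sample-mp4-file.mp4";
        // target path: adjust to wherever you want the file; it only has to match MP4_FILE_PATH later
        Path target = Paths.get("sample-mp4-file.mp4");
        try (InputStream in = new URL(url).openStream()) {
            Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
        }
        System.out.println("Downloaded " + Files.size(target) + " bytes to " + target.toAbsolutePath());
    }
}
```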
Deploy SRS with docker
SRS is a well-known open-source streaming media server; streams pushed to it can be played online with a media player. For simplicity, I deploy it with a single command in a Docker environment:
docker run -p 1935:1935 -p 1985:1985 -p 8080:8080 ossrs/srs:3
- At this point the SRS service is running and ready to receive pushed streams (a quick way to verify this from Java is sketched below)
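Before moving on to coding, it can be handy to confirm that the container is actually reachable. The sketch below is not part of the original project: it calls SRS's HTTP API, which is served on the 1985 port mapped above, and prints the response. The host 192.168.50.43 is an assumption taken from my own LAN address used later in the article; replace it with yours. If SRS is up, /api/v1/versions returns a small JSON payload with the server version.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

/**
 * Minimal sketch: call the SRS HTTP API to confirm the server is reachable.
 * Replace the host with the address of your own SRS deployment.
 */
public class CheckSrs {
    public static void main(String[] args) throws Exception {
        // SRS serves an HTTP API on port 1985 by default; /api/v1/versions reports the server version
        URL url = new URL("http://192.168.50.43:1985/api/v1/versions");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        System.out.println("HTTP status: " + conn.getResponseCode());
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
        conn.disconnect();
    }
}
```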
Develop JavaCV applications
- Next comes the most important part, the coding. Create a new Maven project named <font color="blue">simple-grab-push</font>; its pom.xml is shown below (the parent project named <font color="blue">javacv-tutorials</font> has no real effect here, I only use it to manage the code of several projects, so feel free to delete the parent node):
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>javacv-tutorials</artifactId>
        <groupId>com.bolingcavalry</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.bolingcavalry</groupId>
    <version>1.0-SNAPSHOT</version>
    <artifactId>simple-grab-push</artifactId>
    <packaging>jar</packaging>

    <properties>
        <!-- current javacpp version -->
        <javacpp.version>1.5.6</javacpp.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
        </dependency>

        <dependency>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-classic</artifactId>
            <version>1.2.3</version>
        </dependency>

        <dependency>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-to-slf4j</artifactId>
            <version>2.13.3</version>
        </dependency>

        <!-- JavaCV dependencies: this single one is enough -->
        <dependency>
            <groupId>org.bytedeco</groupId>
            <artifactId>javacv-platform</artifactId>
            <version>${javacpp.version}</version>
        </dependency>
    </dependencies>
</project>
- As the file above shows, JavaCV needs only one dependency, <font color="blue">javacv-platform</font>, which is quite concise. A quick sanity check that the native libraries bundled by this dependency actually load is sketched below.
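The snippet below is not part of the original article; it is a minimal sketch that assumes only the javacv-platform dependency above, force-loads the bundled FFmpeg native libraries, and prints their version, which confirms the single Maven dependency works on your platform:

```java
import org.bytedeco.javacpp.Loader;
import org.bytedeco.ffmpeg.global.avutil;

/**
 * Minimal sketch: load the FFmpeg native libraries bundled by javacv-platform
 * and print libavutil's version number as a sanity check.
 */
public class CheckFFmpeg {
    public static void main(String[] args) {
        // Loader.load extracts and loads the native avutil library for the current platform
        Loader.load(avutil.class);
        // avutil_version() returns LIBAVUTIL_VERSION_INT, packed as major<<16 | minor<<8 | micro
        System.out.println("libavutil version: " + avutil.avutil_version());
    }
}
```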
- Next, let's start coding. Before writing any code, it helps to sketch the whole flow first, which makes the code much clearer:
- As the figure above shows, the flow is very simple. All of the code goes into a single Java class:
package com.bolingcavalry.grabpush;

import lombok.extern.slf4j.Slf4j;
import org.bytedeco.ffmpeg.avcodec.AVCodecParameters;
import org.bytedeco.ffmpeg.avformat.AVFormatContext;
import org.bytedeco.ffmpeg.avformat.AVStream;
import org.bytedeco.ffmpeg.global.avcodec;
import org.bytedeco.ffmpeg.global.avutil;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.FFmpegLogCallback;
import org.bytedeco.javacv.Frame;

/**
 * @author willzhao
 * @version 1.0
 * @description read the specified MP4 file and push it to the SRS server
 * @date 2021/11/19 8:49
 */
@Slf4j
public class PushMp4 {

    /**
     * Full path of the local MP4 file (a video of two minutes and five seconds)
     */
    private static final String MP4_FILE_PATH = "/Users/zhaoqin/temp/202111/20/sample-mp4-file.mp4";

    /**
     * Push address of the SRS server
     */
    private static final String SRS_PUSH_ADDRESS = "rtmp://192.168.50.43:11935/live/livestream";

    /**
     * Read the specified MP4 file and push it to the SRS server
     * @param sourceFilePath absolute path of the video file
     * @param pushAddress push address
     * @throws Exception
     */
    private static void grabAndPush(String sourceFilePath, String pushAddress) throws Exception {
        // ffmpeg log level
        avutil.av_log_set_level(avutil.AV_LOG_ERROR);
        FFmpegLogCallback.set();

        // instantiate the frame grabber with the file path
        FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(sourceFilePath);

        long startTime = System.currentTimeMillis();
        log.info("Starting frame grabber initialization");

        // initialize the frame grabber: data structures (timestamps, codec context, frame objects, etc.);
        // if the argument is true, avformat_find_stream_info is also called to fetch stream information
        // into the AVFormatContext member variable oc
        grabber.start(true);

        log.info("Frame grabber initialized, took [{}] ms", System.currentTimeMillis() - startTime);

        // in grabber.start, the initialized decoder information is stored in the grabber's member variable oc
        AVFormatContext avFormatContext = grabber.getFormatContext();

        // number of media streams in the file (usually a video stream plus an audio stream)
        int streamNum = avFormatContext.nb_streams();

        // no point continuing without any media stream
        if (streamNum < 1) {
            log.error("There is no media stream in the file");
            return;
        }

        // get the video frame rate
        int frameRate = (int) grabber.getVideoFrameRate();

        log.info("Video frame rate [{}], video duration [{}] seconds, number of media streams [{}]",
                frameRate,
                avFormatContext.duration() / 1000000,
                avFormatContext.nb_streams());

        // iterate over each stream and check its type
        for (int i = 0; i < streamNum; i++) {
            AVStream avStream = avFormatContext.streams(i);
            AVCodecParameters avCodecParameters = avStream.codecpar();
            log.info("Stream index [{}], codec type [{}], codec ID [{}]", i, avCodecParameters.codec_type(), avCodecParameters.codec_id());
        }

        // video width
        int frameWidth = grabber.getImageWidth();
        // video height
        int frameHeight = grabber.getImageHeight();
        // number of audio channels
        int audioChannels = grabber.getAudioChannels();

        log.info("Video width [{}], video height [{}], audio channels [{}]",
                frameWidth,
                frameHeight,
                audioChannels);

        // instantiate FFmpegFrameRecorder with the SRS push address
        FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(pushAddress,
                frameWidth,
                frameHeight,
                audioChannels);

        // video codec
        recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
        // container format
        recorder.setFormat("flv");
        // frames per second
        recorder.setFrameRate(frameRate);
        // number of frames between two key frames
        recorder.setGopSize(frameRate);
        // number of audio channels, same as the source
        recorder.setAudioChannels(grabber.getAudioChannels());

        startTime = System.currentTimeMillis();
        log.info("Starting frame recorder initialization");

        // initialize the frame recorder: data structures (audio/video stream pointers, encoders),
        // call av_guess_format to determine the output container format,
        // allocate the media context object,
        // and set the encoder parameters
        recorder.start();

        log.info("Frame recorder initialized, took [{}] ms", System.currentTimeMillis() - startTime);

        Frame frame;

        startTime = System.currentTimeMillis();
        log.info("Start pushing the stream");

        long videoTS = 0;

        int videoFrameNum = 0;
        int audioFrameNum = 0;
        int dataFrameNum = 0;

        // assuming 15 frames per second, the interval between two frames is (1000/15) ms
        int interVal = 1000 / frameRate;
        // the sleep after sending a frame must not be exactly (1000/frameRate), otherwise playback stutters;
        // it has to be smaller, here one eighth is used
        interVal /= 8;

        // keep grabbing frames from the video source
        while (null != (frame = grabber.grab())) {
            videoTS = 1000 * (System.currentTimeMillis() - startTime);

            // timestamp
            recorder.setTimestamp(videoTS);

            // if there is an image, count one more video frame
            if (null != frame.image) {
                videoFrameNum++;
            }

            // if there are samples, count one more audio frame
            if (null != frame.samples) {
                audioFrameNum++;
            }

            // if there is data, count one more data frame
            if (null != frame.data) {
                dataFrameNum++;
            }

            // push every grabbed frame to SRS
            recorder.record(frame);

            // pause a little before the next push
            Thread.sleep(interVal);
        }

        log.info("Push finished, video frames [{}], audio frames [{}], data frames [{}], took [{}] seconds",
                videoFrameNum,
                audioFrameNum,
                dataFrameNum,
                (System.currentTimeMillis() - startTime) / 1000);

        // close the frame recorder
        recorder.close();
        // close the frame grabber
        grabber.close();
    }

    public static void main(String[] args) throws Exception {
        grabAndPush(MP4_FILE_PATH, SRS_PUSH_ADDRESS);
    }
}
- Each line in the above code has detailed comments, so I won't go into details. Only the following four key points need attention:
- <font color="blue">MP4_FILE_PATH</font> is the location of the local MP4 file; change it to where the MP4 file is stored on your own computer
- <font color="blue">SRS_PUSH_ADDRESS</font> is the push address of the SRS service; change it to the address of your own SRS deployment (a sketch of passing both values in from the command line follows this list)
- The <font color="blue">grabber.start(true)</font> call performs the frame grabber's initialization and retrieves the relevant information from the MP4 file
- The <font color="blue">recorder.record(frame)</font> call pushes the frame to the SRS server
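Because hard-coded constants are inconvenient once the code moves to another machine, here is a minimal sketch (not in the original project) of a main method that reads the file path and push address from command-line arguments, falling back to the constants above when no arguments are given:

```java
public static void main(String[] args) throws Exception {
    // args[0]: absolute path of the MP4 file, args[1]: RTMP push address;
    // fall back to the constants above when nothing is passed in
    String sourceFilePath = args.length > 0 ? args[0] : MP4_FILE_PATH;
    String pushAddress = args.length > 1 ? args[1] : SRS_PUSH_ADDRESS;
    grabAndPush(sourceFilePath, pushAddress);
}
```

With this small change the class can be pointed at any file and any SRS instance without recompiling.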
- Once coding is done, run the class; the console log is shown below. You can see that the frame rate, duration, codec and media stream information of the MP4 file are obtained successfully, and then the push starts:
23:21:48.107 [main] INFO com.bolingcavalry.grabpush.PushMp4 - Starting frame grabber initialization
23:21:48.267 [main] INFO com.bolingcavalry.grabpush.PushMp4 - Frame grabber initialized, took [163] ms
23:21:48.277 [main] INFO com.bolingcavalry.grabpush.PushMp4 - Video frame rate [15], video duration [125] seconds, number of media streams [2]
23:21:48.277 [main] INFO com.bolingcavalry.grabpush.PushMp4 - Stream index [0], codec type [0], codec ID [27]
23:21:48.277 [main] INFO com.bolingcavalry.grabpush.PushMp4 - Stream index [1], codec type [1], codec ID [86018]
23:21:48.279 [main] INFO com.bolingcavalry.grabpush.PushMp4 - Video width [320], video height [240], audio channels [6]
23:21:48.294 [main] INFO com.bolingcavalry.grabpush.PushMp4 - Starting frame recorder initialization
23:21:48.727 [main] INFO com.bolingcavalry.grabpush.PushMp4 - Frame recorder initialized, took [433] ms
23:21:48.727 [main] INFO com.bolingcavalry.grabpush.PushMp4 - Start pushing the stream
- Next, let's see whether the stream can be pulled and played
Play with VLC
- Install the VLC player and open it
- As shown in the red box below, click <font color="blue">Open Network...</font> in the menu, then enter the push address used in the code above (mine is <font color="red">rtmp://192.168.50.43:11935/live/livestream</font>):
- As shown below, the video plays successfully and the sound is normal:
Additional knowledge points
- With the hands-on exercise above, we are now familiar with the basic operations of grabbing and pushing streams, and with reading general information and setting parameters. Beyond what is in the code, the following less obvious points also deserve attention:
- The code that sets the ffmpeg log level is <font color="blue">avutil.av_log_set_level(avutil.AV_LOG_ERROR)</font>. After changing the parameter to <font color="red">avutil.AV_LOG_INFO</font>, much richer logs appear in the console, as shown in the red area below, which reveals details of the MP4 file, such as its two media streams (an audio stream and a video stream):
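For reference, this is the only change needed; it is just the two existing lines at the top of grabAndPush with the level raised, nothing else in PushMp4 has to be touched:

```java
// switch FFmpeg's native log level from ERROR to INFO to see the detailed stream information
avutil.av_log_set_level(avutil.AV_LOG_INFO);
// route FFmpeg's native logs through JavaCV so they show up in the console
FFmpegLogCallback.set();
```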
- The second point concerns the codec type and codec ID. As shown in the figure below, the codec types of the two media streams (AVStream) are <font color="red">0</font> and <font color="red">1</font>, and the two codec IDs are <font color="red">27</font> and <font color="red">86018</font>. What do these four numbers mean?
- First look at the codec type: using IDEA's decompilation feature to open <font color="blue">avutil.class</font>, as shown in the figure below, a codec type of 0 means video (AVMEDIA_TYPE_VIDEO) and 1 means audio (AVMEDIA_TYPE_AUDIO):
- Then look at the codec ID: open <font color="blue">avcodec.java</font>, and you can see that codec ID <font color="red">27</font> means H264:
- The codec ID value <font color="red">86018</font> is <font color="red">0x15002</font> in hexadecimal; the corresponding codec (AV_CODEC_ID_AAC, i.e. AAC) is shown in the red box below:
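If you prefer not to dig through decompiled classes, the mapping can also be printed at runtime. The sketch below is not from the original article; it assumes the same javacv-platform dependency and that FFmpeg's avcodec_descriptor_get is exposed by the JavaCV presets as shown:

```java
import org.bytedeco.ffmpeg.avcodec.AVCodecDescriptor;
import org.bytedeco.ffmpeg.global.avcodec;
import org.bytedeco.ffmpeg.global.avutil;

/**
 * Minimal sketch: translate the raw numbers from the log (codec type 0/1, codec ID 27/86018)
 * into readable names at runtime instead of reading decompiled constants.
 */
public class PrintCodecNames {
    public static void main(String[] args) {
        // {codec type, codec ID} pairs as printed in the log above
        int[][] streams = {{0, 27}, {1, 86018}};
        for (int[] stream : streams) {
            // compare against the AVMEDIA_TYPE_* constants: 0 is video, 1 is audio
            String mediaType = stream[0] == avutil.AVMEDIA_TYPE_VIDEO ? "video"
                    : stream[0] == avutil.AVMEDIA_TYPE_AUDIO ? "audio" : "other";
            // avcodec_descriptor_get resolves a codec ID to its descriptor, e.g. 27 -> h264, 86018 -> aac
            AVCodecDescriptor descriptor = avcodec.avcodec_descriptor_get(stream[1]);
            System.out.println("type=" + mediaType + ", codec=" + descriptor.name().getString());
        }
    }
}
```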
- That completes this JavaCV streaming exercise of pushing an MP4 file. I hope this article helps you become familiar with the routine operations of handling push and pull streams with JavaCV.