Qt/C++ audio and video development 56 - udp push/pull and multicast/unicast push

1. Introduction

Previously, rtsp/rtmp push streaming and rtsp/rtmp/hls/flv/ws-flv/webrtc pull streaming were implemented. Both generally rely on an independent streaming media server program. Is there a more convenient way, one without this dependency, that can still push and pull streams? One option is udp streaming. Udp streaming can be either multicast or unicast. Multicast generally uses an address such as 224.0.0.1, while unicast specifies the unique IP address of a single machine, such as 192.168.0.8. With multicast, every LAN device behind the same switch receives the data. The advantage is that once the stream is pushed, it can be pulled from anywhere; the disadvantage is that multicast storms can easily occur, because every LAN device may receive a large number of video packets that are, for it, completely redundant.
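The multicast/unicast distinction above can be decided from the address alone: IPv4 multicast occupies 224.0.0.0-239.255.255.255, i.e. the first octet is in [224, 239]. A minimal sketch (the helper name is hypothetical, not from the project):

```cpp
#include <sstream>
#include <string>

//A push address such as 224.0.0.1 is multicast while 192.168.0.8 is unicast.
//IPv4 multicast addresses span 224.0.0.0-239.255.255.255, so only the
//first octet needs to be checked (hypothetical helper for illustration).
bool isMulticastAddress(const std::string &ip)
{
    std::istringstream in(ip);
    int firstOctet = 0;
    if (!(in >> firstOctet)) {
        return false;
    }
    return firstOctet >= 224 && firstOctet <= 239;
}
```

For example, `isMulticastAddress("224.0.0.1")` yields true and `isMulticastAddress("192.168.0.8")` yields false, matching the two addressing modes described above.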

Based on the previous ffmpeg push code, only one line needs to change to implement udp push: fill in the corresponding format when calling avformat_alloc_output_context2, which should be mpegts. In fact h264 also works, but I personally recommend mpegts. The equivalent ffmpeg command line push is ffmpeg -re -stream_loop -1 -i f:/mp4/push/1.mp4 -c copy -f mpegts udp://127.0.0.1:1234. As you can see, the corresponding port number must be specified; one push stream occupies one port, so pushing 10 video files occupies 10 ports.

2. Screenshots


3. Related addresses

  1. Domestic site: https://gitee.com/feiyangqingyun
  2. International site: https://github.com/feiyangqingyun
  3. Personal work: https://blog.csdn.net/feiyangqingyun/article/details/97565652
  4. Experience address: https://pan.baidu.com/s/1d7TH_GEYl5nOecuNlWJJ7g Extraction code: 01jf File name: bin_video_push.

4. Functional features

  1. Supports various local video files and network video files.
  2. Supports various network video streams, web cameras, protocols including rtsp, rtmp, http.
  3. Supports streaming from local camera devices, and can specify resolution, frame rate, etc.
  4. Supports streaming of local desktop, and can specify screen area and frame rate, etc.
  5. Automatically starts the streaming media server program; the default is mediamtx (formerly rtsp-simple-server), and srs, EasyDarwin, LiveQing, ZLMediaKit, etc. can also be selected.
  6. You can switch the previewed video file and its playback progress in real time, and pushing continues from wherever you switch to.
  7. The clarity and quality of the push stream are adjustable.
  8. Files, directories, and addresses can be added dynamically.
  9. Video files are automatically pushed in a loop. If the video source is a video stream, it will automatically reconnect after being disconnected.
  10. The network video stream will automatically reconnect, and the stream will continue to be pushed if the reconnection is successful.
  11. The network video stream has extremely high real-time performance and extremely low delay, with the delay time being about 100ms.
  12. Extremely low CPU usage: pushing 4 main-stream channels takes only about 0.2% CPU. In theory an ordinary PC can push 100 channels without pressure; the main performance bottleneck is the network.
  13. Two push protocols are available: rtsp/rtmp. The pushed data supports four direct access methods, rtsp/rtmp/hls/webrtc, and can be opened directly in a browser to view the real-time image.
  14. You can push the stream to an external network server, and then play the corresponding video stream through mobile phones, computers, tablets and other devices.
  15. Each push stream can be given a manually specified unique identifier (convenient for pulling; users do not need to remember complex addresses). If not specified, a hash value is generated automatically according to the strategy.
  16. Automatically generate a test webpage and open it directly for playback. You can see the real-time effect and automatically display it in a grid according to the number.
  17. During the streaming process, you can switch the corresponding streaming items in the table, preview the video being pushed in real time, and switch the playback progress of the video file.
  18. Audio and video are pushed simultaneously. Original data already in 264/265/aac format is pushed as-is, while non-conforming data is automatically transcoded before pushing (which occupies a certain amount of CPU).
  19. Three transcoding strategies are supported: automatic (push conforming data as-is, transcode the rest), file only (transcode video-file sources only), and transcode everything.
  20. The table displays the resolution and audio and video data status of each stream in real time. Gray represents no input stream, black represents no output stream, green represents the original data stream, and red represents the transcoded data stream.
  21. Automatically reconnect to the video source and the streaming media server to ensure that after startup, the push address and the open address are reconnected in real time. As long as they are restored, they will be connected immediately to continue collecting and pushing.
  22. A loop-push example is provided: one video source is pushed to multiple streaming media servers at the same time, for example a video opened and pushed to Douyin/Kuaishou/Bilibili simultaneously. It can be used for recorded-playback pushing with a looping list, which is very convenient and practical.
  23. According to different streaming media server types, the corresponding rtsp/rtmp/hls/flv/ws-flv/webrtc address is automatically generated. Users can directly copy the address to the player or web page for preview.
  24. The encoded video format can be automatically processed (if the source is 264, then 264/the source is 265, then 265), converted to H264 (forced conversion to 264), or converted to H265 (forced conversion to 265).
  25. Supports any version of Qt4/Qt5/Qt6 and any system (windows/linux/macos/android/embedded linux, etc.).
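Feature 23 above says the pull addresses for each protocol are generated automatically from the server type. A minimal sketch of the idea, assuming common default ports (rtsp 554, rtmp 1935, http 8080 for hls/flv/ws-flv, webrtc 8889 as in mediamtx) and hypothetical URL layouts; the real project derives these per selected media server:

```cpp
#include <map>
#include <string>

//Hypothetical helper: derive per-protocol pull addresses from a server
//host and a stream identifier. Ports and path layouts are assumptions
//based on common defaults; adjust them for the actual media server.
std::map<std::string, std::string> buildPullUrls(const std::string &host, const std::string &streamId)
{
    std::map<std::string, std::string> urls;
    urls["rtsp"] = "rtsp://" + host + ":554/" + streamId;
    urls["rtmp"] = "rtmp://" + host + ":1935/" + streamId;
    urls["hls"] = "http://" + host + ":8080/" + streamId + "/stream.m3u8";
    urls["flv"] = "http://" + host + ":8080/" + streamId + ".flv";
    urls["ws-flv"] = "ws://" + host + ":8080/" + streamId + ".flv";
    urls["webrtc"] = "http://" + host + ":8889/" + streamId;
    return urls;
}
```

With host 192.168.0.8 and identifier test1 this yields rtsp://192.168.0.8:554/test1, rtmp://192.168.0.8:1935/test1, and so on, which a user can copy straight into a player or web page.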

5. Related code

void FFmpegPushClient::initOsd()
{
    QList<OsdInfo> osds;
    OsdInfo osd;

    //date/time label
    osd.name = "datetime";
    osd.color = "#FFFFFF";
    osd.fontSize = 30;
    osd.format = OsdFormat_DateTime;
    osd.position = OsdPosition_LeftTop;
    osds << osd;

    //image label
    osd.image = QImage(":/image/bg_novideo.png");
    osd.format = OsdFormat_Image;
    osd.position = OsdPosition_LeftBottom;
    //set a unique name and save the image to a file (filters generally support specifying an image file)
    osd.name = "osd.png";
    QString file = QString("./%1").arg(osd.name);
    osd.image.save(file, "png");
    osds << osd;

    ffmpegThread->setOsdInfo(osds);
}

void FFmpegPushClient::start()
{
    if (ffmpegThread || videoUrl.isEmpty() || pushUrl.isEmpty()) {
        return;
    }

    //instantiate the video capture thread
    ffmpegThread = new FFmpegThread;
    //connect the play-start signal, used to start pushing
    connect(ffmpegThread, SIGNAL(receivePlayStart(int)), this, SLOT(receivePlayStart(int)));
    //connect recorder state changes, used to judge whether pushing succeeded
    connect(ffmpegThread, SIGNAL(recorderStateChanged(RecorderState, QString)), this, SLOT(recorderStateChanged(RecorderState, QString)));
    //set the playback address
    ffmpegThread->setVideoUrl(videoUrl);
    //set the decoding core
    ffmpegThread->setVideoCore(VideoCore_FFmpeg);
    //set the video display mode
    ffmpegThread->setVideoMode(VideoMode_Opengl);
    //set hardware decoding (unrelated to pushing/only speeds up display/pushing relates only to hardware encoding)
    //ffmpegThread->setHardware("dxva2");
    //set the decode strategy (fastest speed is recommended when pulling the pushed address)
    //ffmpegThread->setDecodeType(DecodeType_Fastest);
    //set the read timeout; it reconnects automatically after a timeout
    ffmpegThread->setReadTimeout(5 * 1000);
    //set the connect timeout (0 means keep trying)
    ffmpegThread->setConnectTimeout(0);
    //set repeated playback, which is equivalent to loop pushing
    ffmpegThread->setPlayRepeat(true);
    //do not play audio by default (enabled for whichever channel is selected in the UI)
    ffmpegThread->setPlayAudio(false);
    //do not preview video by default (enabled for whichever channel is selected in the UI)
    ffmpegThread->setPushPreview(false);

    //make the save class emit packet signals, used for saving to a file
    FFmpegSave *saveFile = ffmpegThread->getSaveFile();
    saveFile->setSendPacket(!fileName.isEmpty(), false);
    connect(saveFile, SIGNAL(receivePacket(AVPacket *)), this, SLOT(receivePacket(AVPacket *)));
    connect(saveFile, SIGNAL(receiveSaveStart()), this, SLOT(receiveSaveStart()));
    connect(saveFile, SIGNAL(receiveSaveFinsh()), this, SLOT(receiveSaveFinsh()));
    connect(saveFile, SIGNAL(receiveSaveError(int)), this, SLOT(receiveSaveError(int)));

    //for local devices or desktop recording, additional parameters must be extracted
    VideoHelper::initVideoPara(ffmpegThread, videoUrl, encodeVideoScale);

    //set encode strategy/video encode format/video compression ratio/video scale ratio
    ffmpegThread->setEncodeType((EncodeType)encodeType);
    ffmpegThread->setVideoFormat((VideoFormat)videoFormat);
    ffmpegThread->setEncodeVideoRatio(encodeVideoRatio);
    ffmpegThread->setEncodeVideoScale(encodeVideoScale);

    //start playback
    ffmpegThread->play();
}

void FFmpegPushClient::stop()
{
    //stop pushing and capturing, then release the object completely
    if (ffmpegThread) {
        ffmpegThread->recordStop();
        ffmpegThread->stop();
        ffmpegThread->deleteLater();
        ffmpegThread = NULL;
    }

    //stop recording
    if (ffmpegSave) {
        ffmpegSave->stop();
        ffmpegSave->deleteLater();
        ffmpegSave = NULL;
    }
}

Origin blog.csdn.net/feiyangqingyun/article/details/133266794