GStreamer-based applications

The upper layer of an application builds playback functionality on top of GStreamer's API; this section shows how.

2.1 MP3 playback based on the GStreamer API

Basic process:

1. Initialize gst and start a main loop.

2. Define a pipeline and the elements it needs, such as the source, decoder, and sink.

3. Set the necessary properties on the elements.

4. Get the bus and monitor the bus messages.

5. Call gst_bin_add_many() to add each element to the pipeline, then call gst_element_link_many() to link the elements together.

6. Set the state of the pipeline to PLAYING.

7. Start the main loop.

8. In the bus-watch callback, quit the main loop when an end-of-stream (EOS) message is received.

9. Code example:


  

    #include <gst/gst.h>

    /* bus watch callback (handles EOS/error messages; see step 8) */
    static gboolean bus_call(GstBus *bus, GstMessage *msg, gpointer data);

    int main(int argc, char *argv[])
    {
        GMainLoop *loop;
        GstElement *pipeline, *source, *decoder, *sink;  /* the elements */
        GstBus *bus;

        gst_init(&argc, &argv);

        /* create the main loop; it starts once g_main_loop_run() is called */
        loop = g_main_loop_new(NULL, FALSE);

        if (argc != 2) {
            g_printerr("Usage: %s <mp3 filename>\n", argv[0]);
            return -1;
        }

        /* create the pipeline and its elements */
        pipeline = gst_pipeline_new("audio-player");
        source  = gst_element_factory_make("filesrc", "file-source");
        decoder = gst_element_factory_make("mad", "mad-decoder");
        sink    = gst_element_factory_make("autoaudiosink", "audio-output");
        if (!pipeline || !source || !decoder || !sink) {
            g_printerr("One element could not be created. Exiting.\n");
            return -1;
        }

        /* set the source's "location" property, i.e. the file path */
        g_object_set(G_OBJECT(source), "location", argv[1], NULL);

        /* get the pipeline's message bus and add a message watch */
        bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline));
        gst_bus_add_watch(bus, bus_call, loop);
        gst_object_unref(bus);

        /* add the elements to the pipeline; the pipeline is a special
           element (a bin) that manages the data flow */
        gst_bin_add_many(GST_BIN(pipeline), source, decoder, sink, NULL);

        /* link the elements in order */
        gst_element_link_many(source, decoder, sink, NULL);

        /* start playing */
        gst_element_set_state(pipeline, GST_STATE_PLAYING);
        g_print("Running\n");

        /* run the main loop */
        g_main_loop_run(loop);

        g_print("Returned, stopping playback\n");
        gst_element_set_state(pipeline, GST_STATE_NULL);
        gst_object_unref(GST_OBJECT(pipeline));
        return 0;
    }

2.2 Qt Multimedia

  Qt performs media playback through QMediaPlayer, which encapsulates the low-level operations, including audio output and media file reading. It is a high-level, encapsulated player core: by calling it you can load audio or video in any supported format, play it, and adjust the playback state.

    Code to play a piece of music with QMediaPlayer:

player = new QMediaPlayer;
player->setMedia(QMediaContent(QUrl::fromLocalFile("coolsong.mp3")));
player->play();

Step 1: create a QMediaPlayer object. Step 2: set the media file to play. Step 3: call play().

Under the hood, QMediaPlayer is implemented on top of GStreamer through a series of encapsulation layers.

  

Specific process:

    After a series of calls, every QMediaPlayer operation ends up in QGstreamerPlayerSession. In its constructor, QGstreamerPlayerSession calls gst_element_factory_make("playbin3", NULL) to create GStreamer's playbin pipeline, sets playbin's videoSink and audioSink, watches the bus, and installs callback functions for certain messages. As shown in the figure below, all of QMediaPlayer's playback-control operations (pause, play, stop, and so on) ultimately just set a different state on this playbin. For example, the play operation is gst_element_set_state(m_playbin, GST_STATE_PLAYING).

 

After understanding the basic concepts and workflow of GStreamer, the next step is to dig into the GStreamer source code to understand its working mechanism.


Origin blog.csdn.net/H2008066215019910120/article/details/112564981