Phytec China wiki
support@phytec.cn
Hotline: 0755-61802110-803
GStreamer documentation home page: https://gstreamer.freedesktop.org/documentation/index.html
What is GStreamer
Figure from http://maemo.org/development/sdks/maemo_5_beta_docs/using_multimedia_components/
https://zh.wikipedia.org/wiki/GStreamer
GStreamer is a pipeline-based multimedia framework, built on GObject and written in C.
With GStreamer, programmers can easily build all kinds of multimedia components: simple audio playback, audio and video playback, recording, streaming, and audio editing. Thanks to the pipeline design, it can serve as the basis for many multimedia applications such as video editors, streaming broadcasters, and media players.
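As a rough analogy (not the GStreamer API — every name below is invented purely for illustration), the pipeline idea can be modeled in Python with chained generators, where each "element" consumes the stream produced by the one before it:

```python
# Toy model of a media pipeline: elements linked source -> filter -> sink.
# None of these functions exist in GStreamer; they only mirror the shape
# of a pipeline like "videotestsrc ! videoconvert ! kmssink".

def source():
    """Produce raw 'frames' (stand-in: plain integers)."""
    for frame in range(5):
        yield frame

def convert(stream):
    """Transform each frame (stand-in for an element like videoconvert)."""
    for frame in stream:
        yield frame * 2

def sink(stream):
    """Consume the final output (stand-in for a display or file sink)."""
    return list(stream)

# Equivalent of linking: source ! convert ! sink
result = sink(convert(source()))
print(result)  # [0, 2, 4, 6, 8]
```

Each stage only pulls data as the downstream stage asks for it, which is the same lazy, flow-controlled behavior that makes real pipelines efficient.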
https://blog.csdn.net/jack0106/article/details/5909935
Writing your own GStreamer plugin is fairly difficult, and I am not up to it yet. Fortunately I discovered appsink and appsrc, which to a large extent avoid the need to write a plugin while still letting you call third-party code from inside a pipeline. GStreamer really is the king of multimedia development on Linux!
Streaming I/O
Our VPU hardware does not expose a simple read/write style interface (such as vpu_decode(input, output);); it only supports stream I/O. The advantages of stream I/O:
https://blog.csdn.net/coroutines/article/details/70141086
Applications based on V4L2 usually face the problem of reading and copying large blocks of data. Especially in embedded systems with tight real-time requirements, a single copy can take tens of milliseconds, which at best degrades the user experience and at worst makes the product miss its quality targets. The V4L2 framework defines several different ways of reading data from a device; this article briefly introduces how to obtain and use data in Streaming I/O mode. Streaming I/O was designed precisely to reduce the number of copies at each stage of data processing, so that the hardware at each stage can cooperate seamlessly.
GStreamer itself is designed around streaming I/O, so developing with GStreamer is the easiest way to use the VPU conveniently and efficiently.
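A minimal illustration of why avoiding copies matters, in plain Python (this only models the idea behind V4L2 streaming I/O and dmabuf sharing; it does not touch V4L2 itself):

```python
# Copying vs. referencing a frame-sized buffer.
# bytes(frame) allocates and copies the whole payload;
# memoryview(frame) is zero-copy -- it references the same storage,
# which is the same idea (in miniature) as passing mmap'd/dmabuf
# buffers between pipeline stages instead of copying each frame.

frame = bytearray(1280 * 720 * 2)  # a fake YUY2 frame, ~1.8 MB

copy = bytes(frame)        # full copy: a second 1.8 MB allocation
view = memoryview(frame)   # zero-copy: shares storage with `frame`

frame[0] = 0xFF            # "hardware" writes into the buffer

print(copy[0])   # 0   -- the copy did not see the change
print(view[0])   # 255 -- the view observes it immediately
```

In a real capture/encode chain this difference is paid once per frame, which is exactly the tens-of-milliseconds cost the quoted article describes.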
Features GStreamer supports
The table below introduces commonly used features; GStreamer can do far more than this.
Feature | Description |
---|---|
USB UVC cameras / SoC camera interfaces | both are supported by GStreamer |
Hardware video encoding/decoding | implemented on our platform through v4l2 devices |
Software video encoding/decoding | mostly provided via libav |
Muxing/demuxing | e.g. matroskamux, avimux, mp4mux |
Audio encoding/decoding | essentially all common formats: mp3, ogg, etc. |
RTP/RTSP | server side, client side, payloading and depayloading |
Test signal sources | the video source videotestsrc can output snow, SMPTE test patterns, etc. |
Reading/writing files | a file can easily serve as a source, and data from any point in the pipeline can be written to a file |
Display | video can be output to a framebuffer or a DRM device |
Processing stream data in software | appsrc and appsink make it easy to generate and consume data in an application |
Splitting one video stream into two | easy to do with GStreamer (the tee element) |
Handling audio and video separately | supported natively |
GStreamer command-line test/debug tools
GStreamer ships with a set of tools that make it easy to verify ideas, or even to serve directly as applications.
The two most commonly used commands are:
Command | Description |
---|---|
gst-launch | Builds and runs a GStreamer pipeline; can implement all kinds of functionality |
gst-inspect | Lists all plugins/elements, or shows detailed information about one element |
For example, to find the available decoders:
```
root@phyboard-mira-imx6-3:~# gst-inspect-1.0 | grep Decoder
video4linux2:  v4l2mpeg4dec: V4L2 MPEG4 Decoder
video4linux2:  v4l2mpeg2dec: V4L2 MPEG2 Decoder
video4linux2:  v4l2h264dec: V4L2 H264 Decoder
playback:  uridecodebin3: URI Decoder
playback:  uridecodebin: URI Decoder
playback:  decodebin3: Decoder Bin 3
playback:  decodebin: Decoder Bin
rtp:  rtpulpfecdec: RTP FEC Decoder
rtp:  rtpreddec: Redundant Audio Data (RED) Decoder
rtsp:  rtpdec: RTP Decoder
```
To view the details of a hardware decoder:
```
root@phyboard-mira-imx6-3:~# gst-inspect-1.0 v4l2mpeg4dec
Factory Details:
  Rank                     primary + 1 (257)
  Long-name                V4L2 MPEG4 Decoder
  Klass                    Codec/Decoder/Video
  Description              Decodes MPEG4 streams via V4L2 API
  Author                   Nicolas Dufresne <nicolas.dufresne@collabora.com>

Plugin Details:
  Name                     video4linux2
  Description              elements for Video 4 Linux
  Filename                 /usr/lib/gstreamer-1.0/libgstvideo4linux2.so
  Version                  1.14.1
  License                  LGPL
  Source module            gst-plugins-good
  Source release date      2018-05-17
  Binary package           GStreamer Good Plug-ins source release
  Origin URL               Unknown package origin

GObject
 +----GInitiallyUnowned
       +----GstObject
             +----GstElement
                   +----GstVideoDecoder
                         +----GstV4l2VideoDec
                               +----v4l2mpeg4dec

Pad Templates:
  SINK template: 'sink'
    Availability: Always
    Capabilities:
      video/mpeg
         mpegversion: 4
        systemstream: false

  SRC template: 'src'
    Availability: Always
    Capabilities:
      video/x-raw
              format: { (string)NV12, (string)I420, (string)YV12, (string)YUY2 }
               width: [ 1, 32768 ]
              height: [ 1, 32768 ]
           framerate: [ 0/1, 2147483647/1 ]

Element has no clocking capabilities.
Element has no URI handling capabilities.

Pads:
  SINK: 'sink'
    Pad Template: 'sink'
  SRC: 'src'
    Pad Template: 'src'

Element Properties:
  name                : The name of the object
                        flags: readable, writable
                        String. Default: "v4l2mpeg4dec0"
  parent              : The parent of the object
                        flags: readable, writable
                        Object of type "GstObject"
  device              : Device location
                        flags: readable
                        String. Default: "/dev/video9"
  device-name         : Name of the device
                        flags: readable
                        String. Default: "CODA960"
  device-fd           : File descriptor of the device
                        flags: readable
                        Integer. Range: -1 - 2147483647 Default: -1
  output-io-mode      : Output side I/O mode (matches sink pad)
                        flags: readable, writable
                        Enum "GstV4l2IOMode" Default: 0, "auto"
                           (0): auto             - GST_V4L2_IO_AUTO
                           (1): rw               - GST_V4L2_IO_RW
                           (2): mmap             - GST_V4L2_IO_MMAP
                           (3): userptr          - GST_V4L2_IO_USERPTR
                           (4): dmabuf           - GST_V4L2_IO_DMABUF
                           (5): dmabuf-import    - GST_V4L2_IO_DMABUF_IMPORT
  capture-io-mode     : Capture I/O mode (matches src pad)
                        flags: readable, writable
                        Enum "GstV4l2IOMode" Default: 0, "auto"
                           (0): auto             - GST_V4L2_IO_AUTO
                           (1): rw               - GST_V4L2_IO_RW
                           (2): mmap             - GST_V4L2_IO_MMAP
                           (3): userptr          - GST_V4L2_IO_USERPTR
                           (4): dmabuf           - GST_V4L2_IO_DMABUF
                           (5): dmabuf-import    - GST_V4L2_IO_DMABUF_IMPORT
  extra-controls      : Extra v4l2 controls (CIDs) for the device
                        flags: readable, writable
                        Boxed pointer of type "GstStructure"
```
Playing a video, for example:
```
gst-launch-1.0 -m -vvvv filesrc location=/usr/share/phytec-qtdemo/videos/caminandes.webm ! \
    decodebin ! \
    videoconvert ! \
    videoscale ! \
    video/x-raw,format=YUY2,height=480,width=800 ! \
    kmssink connector-id=32 driver-name="imx-drm"
```
Since hardware decoding is not explicitly requested in this pipeline, you can check whether the VPU was actually used by looking at its interrupt count:
```
root@phyboard-mira-imx6-3:~/udisk# cat /proc/interrupts | grep vpu
 24:          0          0          0          0       GPC  12 Edge      2040000.vpu
```
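If you want to script such a check, the count can be parsed out of the /proc/interrupts text. The helper below is a sketch (vpu_irq_count is our own name, not part of the BSP), fed with the sample line shown above:

```python
# Sum the per-CPU interrupt counts for a device line in /proc/interrupts.
# Line layout: "IRQ:", one count per CPU, then controller/name fields.
# vpu_irq_count() is an illustrative helper, not a BSP or kernel API.

def vpu_irq_count(interrupts_text, tag="vpu"):
    for line in interrupts_text.splitlines():
        if tag in line:
            counts = []
            for field in line.split()[1:]:  # skip the "24:" IRQ number
                if field.isdigit():
                    counts.append(int(field))
                else:
                    break                   # first non-numeric field ends the counts
            return sum(counts)
    return None  # no matching device line found

sample = "24:          0          0          0          0       GPC  12 Edge      2040000.vpu"
print(vpu_irq_count(sample))  # 0
```

Reading the count before and after running the pipeline, and seeing it increase, confirms the hardware decoder was exercised.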
The following command line compresses the test video to h264 in hardware and sends it over the network:
```
gst-launch-1.0 v4l2src device=/dev/video10 ! \
    videoconvert ! \
    video/x-raw,width=1280,height=720,colorimetry=bt709 ! \
    v4l2h264enc ! \
    rtph264pay config-interval=10 pt=96 ! \
    udpsink host=192.168.3.10 port=5000 sync=true
```
Then, on the host PC, open an rtp.sdp file with VLC. File contents:
```
v=0
m=video 5000 RTP/AVP 96
c=IN IP4 192.168.3.11
a=rtpmap:96 H264/90000
```
VLC can then play the h264 network video stream provided by the board.
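If you need such .sdp files for several boards or ports, they are simple enough to template. The helper below is purely illustrative (make_sdp is our own name, not part of the BSP) and reproduces the file contents shown above:

```python
# Generate the minimal SDP description VLC needs to receive the RTP/H264
# stream: media port, sender IP, and the dynamic payload type mapping.
# make_sdp() is an illustrative helper, not part of any tool.

def make_sdp(board_ip, port, payload_type=96):
    return (
        "v=0\n"
        f"m=video {port} RTP/AVP {payload_type}\n"
        f"c=IN IP4 {board_ip}\n"
        f"a=rtpmap:{payload_type} H264/90000\n"
    )

# Matches the rtp.sdp contents above (board at 192.168.3.11, port 5000):
print(make_sdp("192.168.3.11", 5000))
```

Note that pt=96 in the gst-launch line and the "96" in the a=rtpmap line must agree, as must the udpsink port and the m=video port.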
- Getting started: https://gstreamer.freedesktop.org/documentation/frequently-asked-questions/using.html
- Overview of handy elements: https://gstreamer.freedesktop.org/documentation/tutorials/basic/handy-elements.html
- Assorted examples: http://wiki.oz9aec.net/index.php/Gstreamer_cheat_sheet
GStreamer application development
As a software development framework, GStreamer makes audio/video application development very convenient. It is based on C, and development involves GObject and GLib.
GObject can be thought of as bringing object-oriented features to C, while GLib is a general-purpose utility library.
Once a pipeline has been validated with the command-line tools, it is easy to rewrite it as an application.
- GStreamer programming tutorial: https://gstreamer.freedesktop.org/documentation/tutorials/basic/hello-world.html
- GStreamer application development manual: https://gstreamer.freedesktop.org/documentation/application-development/basics/helloworld.html
- How to use GStreamer from Qt: https://gstreamer.freedesktop.org/data/doc/gstreamer/head/qt-gstreamer/html/examples.html
How to turn a gst-launch pipeline into code
A key point is how to feed data into, or pull data out of, a GStreamer pipeline; in most cases appsrc and appsink are all you need.
https://blog.csdn.net/coroutines/article/details/43987357
An application can inject data into a pipeline in several ways, and using AppSrc is the simplest.
AppSrc works in two modes: pull and push. In pull mode, AppSrc asks the application for data whenever it needs it (via the need-data signal). In push mode, the application actively pushes data into AppSrc, and AppSrc emits the enough-data signal to tell the application to pause when its internal queue is full.
https://blog.csdn.net/sakulafly/article/details/21340263
The element that lets an application feed data into a pipeline is appsrc, and appsink is the exact opposite: it lets the application take data out of the pipeline. To avoid confusion, think of it this way: appsrc is an ordinary source element, except that its data comes from outer space; appsink is an ordinary sink element, and data that leaves through it simply vanishes.
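The push-mode flow control described above can be modeled without GStreamer at all. The toy class below (none of these names are GStreamer API; it is only a conceptual sketch) mimics appsrc's bounded internal queue: a push is refused and "enough-data" is recorded when the queue is full, and "need-data" is recorded once draining brings it back below the limit:

```python
# Conceptual model of appsrc push-mode flow control.
# Real appsrc emits the GObject signals "need-data"/"enough-data";
# here we just append their names to an event list.

from collections import deque

class ToyAppSrc:
    def __init__(self, max_queued=3):
        self.queue = deque()
        self.max_queued = max_queued
        self.events = []  # record of emitted "signals"

    def push_buffer(self, buf):
        """Application pushes a buffer; refused when the queue is full."""
        if len(self.queue) >= self.max_queued:
            self.events.append("enough-data")  # ask the app to pause
            return False
        self.queue.append(buf)
        return True

    def drain_one(self):
        """Pipeline consumes one buffer downstream."""
        buf = self.queue.popleft()
        if len(self.queue) < self.max_queued:
            self.events.append("need-data")    # ask the app for more
        return buf

src = ToyAppSrc()
for i in range(4):
    src.push_buffer(i)  # the 4th push is refused: queue already holds 3
print(src.events)       # ['enough-data']

src.drain_one()         # downstream consumes one buffer
print(src.events)       # ['enough-data', 'need-data']
```

A well-behaved application connects to both signals and starts/stops feeding accordingly, instead of pushing blindly.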
An example
The following application was written from the hardware-encoded RTP example above and implements the same function.
```c
#include <gst/gst.h>
#include <stdio.h>
#include <signal.h>
#include <string.h>

#define CLIENT_IP "192.168.3.10"
#define TARGET_BITRATE 15000000
#define WIDTH 1280
#define HEIGHT 720

static GMainLoop *loop;
GstElement *pipeline, *source, *convert, *encoder, *rtp264, *sink, *capsfilter;
GstCaps *source_caps, *bayer_filter_caps, *YUV_filter_caps, *Ethernet_filter_caps;
GstBus *bus;
GstStateChangeReturn ret;
guint bus_watch_id;

static void sigint_restore (void)
{
  struct sigaction action;

  memset (&action, 0, sizeof (action));
  action.sa_handler = SIG_DFL;
  sigaction (SIGINT, &action, NULL);
}

/* Signal handler for ctrl+c */
void intHandler (int dummy)
{
  /* Emit the EOS signal which tells all the elements to shut down properly: */
  printf ("Sending EOS signal to shutdown pipeline cleanly\n");
  gst_element_send_event (pipeline, gst_event_new_eos ());
  sigint_restore ();
  return;
}

static gboolean bus_call (GstBus *bus, GstMessage *msg, gpointer data)
{
  GMainLoop *loop = (GMainLoop *) data;

  switch (GST_MESSAGE_TYPE (msg)) {
    case GST_MESSAGE_EOS:
      g_print ("End of stream\n");
      g_main_loop_quit (loop);
      break;
    case GST_MESSAGE_ERROR: {
      gchar *debug;
      GError *error;

      gst_message_parse_error (msg, &error, &debug);
      g_free (debug);
      g_printerr ("Error: %s\n", error->message);
      g_error_free (error);
      g_main_loop_quit (loop);
      break;
    }
    default:
      break;
  }
  return TRUE;
}

int watcher_make (void)
{
  /* we add a message handler */
  bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
  bus_watch_id = gst_bus_add_watch (bus, bus_call, loop);
  gst_object_unref (bus);
  return 0;
}

int main (int argc, char *argv[])
{
  signal (SIGINT, intHandler);

  /* Initialize GStreamer */
  gst_init (&argc, &argv);
  loop = g_main_loop_new (NULL, FALSE);

  /* Create the elements */
  // source = gst_element_factory_make ("appsrc", "source");
  source     = gst_element_factory_make ("v4l2src", "source");
  convert    = gst_element_factory_make ("videoconvert", "convert");
  encoder    = gst_element_factory_make ("v4l2h264enc", "encoder");
  rtp264     = gst_element_factory_make ("rtph264pay", "rtp264");
  sink       = gst_element_factory_make ("udpsink", "sink");
  capsfilter = gst_element_factory_make ("capsfilter", "video_caps");

  /* Create the empty pipeline */
  pipeline = gst_pipeline_new ("pipeline");

  if (!pipeline || !source || !encoder || !rtp264 || !sink) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Modify the source properties */
  g_object_set (G_OBJECT (source), "device", "/dev/video10", NULL);

  /* Modify the caps properties */
  source_caps = gst_caps_from_string ("video/x-raw,width=1280,height=720,colorimetry=bt709");
  g_object_set (G_OBJECT (capsfilter), "caps", source_caps, NULL);
  gst_caps_unref (source_caps);

  g_object_set (G_OBJECT (rtp264), "pt", 96, "config-interval", 10, NULL);
  g_object_set (G_OBJECT (sink), "host", CLIENT_IP, "port", 5000, "sync", FALSE, NULL);

  /* Add function to watch bus */
  if (watcher_make () != 0)
    return -1;

  /* Build the pipeline */
  gst_bin_add_many (GST_BIN (pipeline), source, convert, capsfilter, encoder, rtp264, sink, NULL);

  /* Link the elements together */
  gst_element_link_many (source, convert, capsfilter, encoder, rtp264, sink, NULL);

  /* Start playing */
  ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state.\n");
    gst_object_unref (pipeline);
    return -1;
  }

  g_main_loop_run (loop);

  /* Free resources */
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (GST_OBJECT (pipeline));
  g_main_loop_unref (loop);
  return 0;
}
```
How to compile:
```
$CC $CFLAGS -I=/usr/include/gstreamer-1.0 -I=/usr/include/glib-2.0 \
    -I=/usr/lib/glib-2.0/include -L=/usr/lib \
    -lgstreamer-1.0 -lgobject-2.0 -lglib-2.0 gst.c -o gst
```
VPU and GStreamer on the phyCORE i.MX6 platform
NXP's VPU introduction slides: http://cache.freescale.com/files/training/doc/ftf/2014/FTF-CON-F0165.pdf
In the phytec-qt5demo-image root filesystem, the home directory (i.e. the home directory of the board's root user) contains two folders, gstreamer_examples and v4l2_c-examples.
They contain encoding and decoding examples; please read the readme files and scripts inside.
```
root@phyboard-mira-imx6-3:~# ls gstreamer_examples    # GStreamer scripts for operating the cameras
bwcam-fbdev_640x480.sh       func.sh                phytec_usb_cam                 register-settings-tw9910.txt
bwcam-save_jpg_full_res.sh   master-clock_vm006.sh  port_cam-1_via_ipu2-csi1       register-settings-vita1300.txt
bwcam-save_raw_full_res.sh   more_mt9m131_scripts   readme.txt                     remove_qt_demo.sh
camera_select_i.MX6.txt      more_mt9p031_scripts   register-settings-mt9m001.txt  tools
colcam-fbdev_640x480.sh      more_mt9v024_scripts   register-settings-mt9m131.txt  vpu_enc_dec_scripts
colcam-save_jpg_full_res.sh  more_vita1300_scripts  register-settings-mt9p031.txt
colcam-save_raw_full_res.sh  phytec_thermal_cam     register-settings-mt9v02x.txt
root@phyboard-mira-imx6-3:~# ls gstreamer_examples/vpu_enc_dec_scripts/    # GStreamer scripts for the VPU
VLC_Setup.txt            bwcam_network_stream.sh  colcam_network_stream.sh  readme.txt
bwcam-save_h264.sh       colcam-save_h264.sh      play_h264_mkv-file.sh     show_network_stream.sh
root@phyboard-mira-imx6-3:~# ls gstreamer_examples/phytec_usb_cam/    # GStreamer scripts for USB cameras
func.sh     usb_bwcam-fbdev_640x480.sh      usb_bwcam-save_raw_full_res.sh
readme.txt  usb_bwcam-save_jpg_full_res.sh
root@phyboard-mira-imx6-3:~# ls v4l2_c-examples    # scripts that operate the cameras directly via v4l2
anleitung_instruction.txt     mt9p031_bw_full_save-raw.sh  vita1300_bw_full_save-raw.sh
mt9m131_col_full_save-raw.sh  mt9v02x_bw_full_save-raw.sh  vita1300_col_full_save-raw.sh
mt9p006_col_full_save-raw.sh  mt9v02x_col_full_save-raw.sh vm05x_full_save-raw.sh
```
They can also be downloaded here:
BSP version | URL |
---|---|
16.x | |
18.x | |
Differences between our i.MX6 software platform and NXP's solution
We use the open-source VPU driver and the open-source GStreamer plugins, as shown in the figure:
In other words, both we and NXP provide a GStreamer interface.
At the lower level, NXP provides proprietary interfaces, while we provide the V4L2 interface. In the later v4l2 examples, we will write an application that opens the VPU device and reads its parameters.