Merge pull request #3127 from randaz81/gstreamer_plugins
Gstreamer plugins
randaz81 authored Aug 28, 2024
2 parents 61a4a05 + a376db4 commit c5e4d05
Showing 38 changed files with 2,757 additions and 1,394 deletions.
2 changes: 1 addition & 1 deletion .ci/initial-cache.gh.linux.cmake
@@ -24,7 +24,7 @@ set(ENABLE_yarpcar_portmonitor ON CACHE BOOL "")
set(ENABLE_yarppm_depthimage_to_mono ON CACHE BOOL "")
set(ENABLE_yarppm_depthimage_to_rgb ON CACHE BOOL "")
set(ENABLE_yarppm_segmentationimage_to_rgb ON CACHE BOOL "")
set(ENABLE_yarpcar_h264 ON CACHE BOOL "")
set(ENABLE_yarpcar_gstreamer ON CACHE BOOL "")
set(ENABLE_yarpcar_unix_stream ON CACHE BOOL "")
set(ENABLE_yarppm_image_compression_ffmpeg ON CACHE BOOL "")
set(ENABLE_yarppm_sound_compression_mp3 ON CACHE BOOL "")
7 changes: 6 additions & 1 deletion cmake/YarpFindDependencies.cmake
@@ -558,7 +558,11 @@ yarp_dependent_option(
)
yarp_dependent_option(
YARP_COMPILE_yarplaserscannergui "Do you want to compile yarplaserscannergui?" ON
"YARP_COMPILE_EXECUTABLES;YARP_COMPILE_GUIS;YARP_HAS_Qt5;YARP_HAS_OpenCV" OFF
"YARP_COMPILE_EXECUTABLES;YARP_COMPILE_GUIS;YARP_HAS_OpenCV" OFF
)
yarp_dependent_option(
YARP_COMPILE_yarpopencvdisplay "Do you want to compile yarpopencvdisplay?" ON
"YARP_COMPILE_EXECUTABLES;YARP_COMPILE_GUIS;YARP_HAS_OpenCV" OFF
)
yarp_dependent_option(
YARP_COMPILE_yarpviz "Do you want to compile yarpviz?" ON
@@ -684,6 +688,7 @@ yarp_print_feature(YARP_COMPILE_yarpdataplayer 2 "Compile yarpdataplayer${YARP_C
yarp_print_feature("YARP_COMPILE_yarpdataplayer AND YARP_HAS_OpenCV" 3 "yarpdataplayer video support")
yarp_print_feature(YARP_COMPILE_yarpmotorgui 2 "Compile yarpmotorgui${YARP_COMPILE_yarpmotorgui_disable_reason}")
yarp_print_feature(YARP_COMPILE_yarplaserscannergui 2 "Compile yarplaserscannergui${YARP_COMPILE_yarplaserscannergui_disable_reason}")
yarp_print_feature(YARP_COMPILE_yarpopencvdisplay 2 "Compile yarpopencvdisplay${YARP_COMPILE_yarpopencvdisplay_disable_reason}")
yarp_print_feature(YARP_COMPILE_yarpbatterygui 2 "Compile yarpbatterygui${YARP_COMPILE_yarpbatterygui_disable_reason}")
yarp_print_feature(YARP_COMPILE_yarpviz 2 "Compile yarpviz${YARP_COMPILE_yarpviz_disable_reason}")

2 changes: 1 addition & 1 deletion doc/001_installation/3_install_linux.md
@@ -168,7 +168,7 @@ sudo apt-get install libjpeg-dev

### GStreamer {#install_gstreamer_debian}

GStreamer is required to enable the h264 carrier
GStreamer is required to enable the gstreamer carrier

~~~{.sh}
sudo apt-get install libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev \
2 changes: 1 addition & 1 deletion doc/090_tutorials.dox
@@ -49,7 +49,7 @@ Here are a collection of tutorials on various topics in YARP.
- \ref yarp_pointcloud
- \ref yarp_code_examples
- \ref using_cmake
- \ref carrier_h264_howto
- \ref carrier_gstreamer_howto

\section tutorial_protocols Communication protocol details:
- \ref yarp_protocol
@@ -1,6 +1,6 @@
/**
\ingroup carriers_examples
\defgroup carrier_h264_howto h264 carrier
\defgroup carrier_gstreamer_howto Gstreamer carrier

\tableofcontents

@@ -13,44 +13,39 @@ This document contains a brief introduction to Gstreamer tool and explains how t
\note They are still work in progress and should be considered experimental. Please report any problems.

\section gstreamer_introduction Gstreamer: brief introduction
Gstreamer is a free framework for media applications; it provides a set of plugins that let the user to build applications by connecting them as in a pipeline. It has been ported to a wide range of operating systems, processors and compilers. Also Nvidia developed plugins for its platforms and we employ its h264 encode in order to take advantage from its hardware codec.
Gstreamer is a free framework for media applications; it provides a set of plugins that let the user build applications by connecting them in a pipeline. It has been ported to a wide range of operating systems, compilers and processors, including Nvidia GPUs.

A Gstreamer application is composed of a chain of elements, which are its basic building blocks. An element takes an input stream from the previous element in the chain, carries out its function (for example encoding), and passes the modified stream to the next element. Usually each element is a plugin.

The user can develop an application in two ways: the first consists in writing an application in C/C++, where the elements are connected using the API, while the second uses the gst-launch command-line tool. The following is an example of how to use the gst-launch command:


\verbatim
gst-launch-1.0 -v videotestsrc ! ‘video/x-raw, format=(string)I420, width=(int)640, height=(int)480’ ! x264enc ! h264parse ! avdec_h264 ! autovideosink \endverbatim
gst-launch-1.0 -v videotestsrc ! 'video/x-raw, format=(string)I420, width=(int)640, height=(int)480' ! x264enc ! h264parse ! avdec_h264 ! autovideosink
\endverbatim

This command creates a test video source with the properties specified in the string <em>"video/x-raw, format=(string)I420, width=(int)640, height=(int)480"</em>; the stream is then encoded in h264, decoded and shown. Each element of this pipeline, except the property element, is a dynamically loaded plugin. The videotestsrc element lets the user see a stream without using a camera.

The previous command works on Linux, but since Gstreamer is platform independent, we can launch the same command on Windows taking care to change only hardware dependent plugin. So the same command on Window is:
The previous command works on Linux, but since Gstreamer is platform independent, we can launch the same command on Windows, taking care to change only the hardware-dependent plugins. The same command on Windows is:

\verbatim
gst-launch-1.0 -v videotestsrc ! “video/x-raw, format=(string)I420, width=(int)640, height=(int)480” !
openh264enc ! h264parse ! avdec_h264 ! autovideosink
gst-launch-1.0 -v videotestsrc ! "video/x-raw, format=(string)I420, width=(int)640, height=(int)480" ! openh264enc ! h264parse ! avdec_h264 ! autovideosink
\endverbatim

It is important to notice that the only changed element is the encoder (openh264enc), while the decoder is the same. This is because the decoder belongs to the plugin that wraps the libav library, a cross-platform library that converts streams into a wide range of multimedia formats [see the \ref references chapter].


<em> Please see the \ref notes section about the commands in this tutorial. </em>

\section how_to_stream_h264 How to stream in h264
\section how_to_stream_h264 How to stream using the h264 encoder
The server grabs images from the cameras, so it needs to run on the machine where the cameras are connected.
The server is a Gstreamer command pipeline, while the client can be a yarp application or a Gstreamer application connected to the robot's network.

Since Gstreamer is a widespread framework for media applications, wrapping it inside yarp would add little value.
It is more interesting that a yarp application can read "standard" streams using the h264 carrier.
For these reasons, the server streamer is a native Gstreamer pipeline, while the client side has been developed in yarp as a carrier.

\subsection server_side Server side:
The server application consists of the following Gstreamer command:

\verbatim
gst-launch-1.0 -v v4l2src device="/dev/video1" ! ‘video/x-raw, width=1280, height=480, format=(string)I420’ !
omxh264enc ! h264parse ! rtph264pay pt=96 config-interval=5 ! udpsink host=224.0.0.1 auto-multicast=true port=33000
gst-launch-1.0 -v v4l2src device="/dev/video1" ! 'video/x-raw, width=1280, height=480, format=(string)I420' ! omxh264enc ! h264parse ! rtph264pay pt=96 config-interval=5 ! udpsink host=224.0.0.1 auto-multicast=true port=33000
\endverbatim


@@ -62,48 +57,31 @@ gst-launch-1.0 -v v4l2src device="/dev/video1" ! 'video/x-raw, width=1280, hei
\li <em>udpsink</em>: this is the last element and sends out the stream. In this case we use multicast, but it is possible to send the stream using unicast in this way: udpsink host=IP_ADDRESS_OF_CLIENT port=NOT_WELL_KNOWN_PORT_NUMBER
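For example, a unicast variant of the server pipeline above could look like the following sketch (IP_ADDRESS_OF_CLIENT is a placeholder for the address of the receiving machine; the other elements are unchanged):
\verbatim
gst-launch-1.0 -v v4l2src device="/dev/video1" ! 'video/x-raw, width=1280, height=480, format=(string)I420' ! omxh264enc ! h264parse ! rtph264pay pt=96 config-interval=5 ! udpsink host=IP_ADDRESS_OF_CLIENT port=33000
\endverbatim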


Currently, a Gstreamer application implementing this pipeline is not available. In the near future, it could be developed in order to make the server easier to configure.


\subsection client_side Client side
The client can read the stream using Gstreamer native command or yarp.
The client can read the stream using a native Gstreamer command:

In the first case the Gstreamer command is:
\verbatim
gst-launch-1.0 -v udpsrc multicast-group=224.0.0.1 auto-multicast=true port=3000 caps="application/x-rtp, media=(string)video, encoding-name=(string)H264, payload=(int)96" !
rtph264depay ! h264parse ! avdec_h264! autovideosink
gst-launch-1.0 -v udpsrc multicast-group=224.0.0.1 auto-multicast=true port=3000 caps="application/x-rtp, media=(string)video, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
\endverbatim

If you want use a yarp application to read the stream you need to:
\verbatim
1) install Gstreamer (see \ref how_to_install_gstreamer )
2) compile yarp with “ENABLE_yarpcar_h264” option enabled.
3) register the stream server to the yarp server: “yarp name register <SERVER_NAME_PORT> h264 <SERVER_IP_ADRESS> <SERVER_IP_PORT>”
4) run your application
5) connect client and server port using h264 carrier: “yarp connect <SERVER_NAME_PORT> <CLIENT_PORT> h264”
\endverbatim


\subsection some_options Some options
\subsection some_options Some options and extra notes

\li <b>set the frame rate</b>: on the server side there is the parameter framerate=30/1, which configures the frame rate at which images are grabbed. Insert it in the property element: 'video/x-raw, format=(string)I420, width=(int)640, height=(int)480, framerate=30/1'
\li In some cases it could be useful for the server to stream video at a <b>constant rate</b>. You can achieve this by adding these parameters to the encoder plugin: control-rate=2 bitrate=5000000 (see the server sketch after this list)
\li <b>remove the jitter</b>: on the client side it is possible to add the rtpjitterbuffer plugin (see the client sketch after this list).
If you want to remove the jitter in h264 yarp carrier, please add parameter “+removeJitter.1” in connect command. (Note that the syntax is the usual used to specify parameters to yarp carriers). Therefore the connect command to your application could be:
\verbatim
yarp connect <SERVER_NAME_PORT> <CLIENT_PORT> h264+removeJitter.1
\endverbatim
\li The yarp carrier lets you to \b crop each frame specifying the number of pixel to crop on each side of image in carrier parameter in connection command. For example if you want to crop 200 pixel on top the command appears like:
\verbatimyarp connect <SERVER_NAME_PORT> <CLIENT_PORT> h264+cropTop.200.\endverbatim
Instead, if you want to use native Gstreamer client the plugin “videocrop” performs this operation.
\li You can use the native Gstreamer plugin "videocrop" to crop the video:
\verbatim
gst-launch-1.0 -v udpsrc port=33000 caps="application/x-rtp, media=(string)video, encoding-name=(string)H264, payload=(int)96" !
rtph264depay ! h264parse ! avdec_h264 ! videocrop left=10, right=30, top=50, bottom=50 ! autovideosink
gst-launch-1.0 -v udpsrc port=33000 caps="application/x-rtp, media=(string)video, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! h264parse ! avdec_h264 ! videocrop left=10 right=30 top=50 bottom=50 ! autovideosink
\endverbatim

\li the <b>fakevideosink</b> plugin can be used instead of <b>autovideosink</b> to test the pipeline without displaying anything.
\li <b>videotestsrc</b> followed by the format specifier can be used to generate a test image.
\li the <b>filesrc / filesink</b> plugins can be used to read the stream from / write the stream to a file.
\li other useful encoder plugins: <b>x264enc, x265enc, avenc_mjpeg</b>. Suggested decoders: <b>avdec_h264, avdec_h265, avdec_mjpeg</b>.
\li official plugins list: https://gstreamer.freedesktop.org/documentation/plugins_doc.html?gi-language=c
\li another common format for video/x-raw instead of <b>I420</b> is <b>RGB</b>.
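
As a reference for the options above, here are two hedged sketches based on the server and client pipelines used earlier in this tutorial (the encoder, addresses and ports are taken from those examples and may need to be adapted to your setup). A server pipeline with an explicit frame rate and a constant bitrate:
\verbatim
gst-launch-1.0 -v v4l2src device="/dev/video1" ! 'video/x-raw, width=1280, height=480, format=(string)I420, framerate=30/1' ! omxh264enc control-rate=2 bitrate=5000000 ! h264parse ! rtph264pay pt=96 config-interval=5 ! udpsink host=224.0.0.1 auto-multicast=true port=33000
\endverbatim
A client pipeline with the rtpjitterbuffer plugin inserted between udpsrc and rtph264depay (latency is expressed in milliseconds):
\verbatim
gst-launch-1.0 -v udpsrc multicast-group=224.0.0.1 auto-multicast=true port=33000 caps="application/x-rtp, media=(string)video, encoding-name=(string)H264, payload=(int)96" ! rtpjitterbuffer latency=200 ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
\endverbatim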

\section how_to_install_gstreamer How to install Gstreamer
Currently we are using 1.8.3 version
Currently we are using version 1.24.4.

\subsection ubuntu On Ubuntu
\li Packages required to build
@@ -116,6 +94,16 @@ Currently we are using 1.8.3 version
- gstreamer1.0-libav (for avdec_h264)
\li Useful packages but not required
- gstreamer1.0-tools

\verbatim
sudo apt-get install libgstreamer1.0-dev \
libgstreamer-plugins-base1.0-dev \
gstreamer1.0-plugins-base \
gstreamer1.0-plugins-good \
gstreamer1.0-plugins-bad \
gstreamer1.0-libav \
gstreamer1.0-tools
\endverbatim

\subsection windows On Windows
You need to download both the main package and the devel package from here:
@@ -135,37 +123,54 @@ Installation of gstreamer devel package:
\li Add the path to the Gstreamer executables (usually C:\\gstreamer\\1.0\\x86_64\\bin) to the PATH environment variable
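For example, to do this for the current command prompt session only (a sketch; adjust the folder if you installed Gstreamer elsewhere):
\verbatim
set PATH=%PATH%;C:\gstreamer\1.0\x86_64\bin
\endverbatim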

\subsection check_installation Verify your installation
First of all, you need to verify if Gstreamer has been installed successfully by using these commands:

You can verify the installation by running a simple test application composed of a server and a client:

\subsubsection server Server side (example on Windows)

\verbatim
gst-launch-1.0 -v videotestsrc ! "video/x-raw, format=(string)I420, width=(int)640, height=(int)480" !
openh264enc ! h264parse ! rtph264pay pt=96 config-interval=5 ! udpsink host=<YOUR_IP_ADDRESS> port=<A_PORT_NUMBER>
gst-launch-1.0 -v videotestsrc ! "video/x-raw, format=(string)I420, width=(int)640, height=(int)480" ! openh264enc ! h264parse ! rtph264pay pt=96 config-interval=5 ! udpsink host=<YOUR_IP_ADDRESS> port=<A_PORT_NUMBER>
\endverbatim

\subsubsection client Client side
\verbatim
gst-launch-1.0 -v udpsrc port=<A_PORT_NUMBER> caps="application/x-rtp, media=(string)video, encoding-name=(string)H264, payload=(int)96" !
rtph264depay ! h264parse ! avdec_h264! autovideosink
gst-launch-1.0 -v udpsrc port=<A_PORT_NUMBER> caps="application/x-rtp, media=(string)video, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
\endverbatim

After you can substitute the client side with a yarp application, for example yarpview.
So, after yarp server had been launched, please run:
\section yarp_usage Usage with yarp

\li yarpview
\li yarp name register /gst h264 <YOUR_IP_ADDRESS> <A_PORT_NUMBER>
\li yarp connect /gst /yarpview/img:i h264
If you want to use a yarp application to read the stream you need to:
-# install Gstreamer (see \ref how_to_install_gstreamer )
-# compile yarp with the "ENABLE_yarpcar_gstreamer" option enabled (see the configuration sketch after this list).
-# run the server as mentioned above (see the example server pipeline after this list).
-# register a fake port with the yarp server; in this example it is called /gstreamer_src:
\verbatim
yarp name register /gstreamer_src gstreamer <SERVER_IP_ADDRESS> <SERVER_IP_PORT>
\endverbatim
-# set up your decoding pipeline by setting an environment variable containing the pipeline description string.
For example, on Windows:
\verbatim
set GSTREAMER_ENV=udpsrc port=<A_PORT_NUMBER> caps="application/x-rtp, media=(string)video, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! h264parse ! avdec_h264
\endverbatim
On Linux:
\verbatim
export GSTREAMER_ENV="udpsrc port=15000 caps=\"application/x-rtp, media=(string)video, encoding-name=(string)H264, payload=(int)96\" ! rtph264depay ! h264parse ! avdec_h264"
\endverbatim
-# run your application (e.g. yarpview or your own application). It must be started after setting the environment variable, i.e. the variable must be accessible to the executable.
\verbatim
yarpview --name /view
\endverbatim
-# connect the client and server ports using the gstreamer carrier:
\verbatim
yarp connect /gstreamer_src /view gstreamer+pipelineEnv.GSTREAMER_ENV
\endverbatim
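
For reference, here are two hedged sketches for steps 2 and 3 above; paths, addresses and ports are placeholders and must be adapted to your setup. Enabling the carrier when configuring yarp (a minimal sketch, assuming an out-of-source build directory and <YARP_SOURCE_DIR> as the source tree):
\verbatim
cmake -DENABLE_yarpcar_gstreamer=ON <YARP_SOURCE_DIR>
cmake --build .
\endverbatim
A possible server pipeline matching the Linux GSTREAMER_ENV example above: it reuses the videotestsrc/openh264enc pipeline of the verification section and sends the stream to port 15000 of the machine running yarpview, indicated here with the placeholder <CLIENT_IP_ADDRESS>:
\verbatim
gst-launch-1.0 -v videotestsrc ! "video/x-raw, format=(string)I420, width=(int)640, height=(int)480" ! openh264enc ! h264parse ! rtph264pay pt=96 config-interval=5 ! udpsink host=<CLIENT_IP_ADDRESS> port=15000
\endverbatim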

Now in yarpview you can see the following image, where the bottom-right box shows a snow pattern.

\image html h264GstVideoTestSrc.png ""

\section notes Notes
\subsection tricks Tricks
\li On Ubuntu 18.04 the plugin "autovideosink" has a bug; please use "xvimagesink" instead
\li On Windows the property element uses " instead of '
\li Another, completely different, way to use Yarp with Gstreamer is the `Yarp Gstreamer Plugins`. They also allow feeding a yarp image into a gstreamer pipeline. See the documentation: \ref gstreamerplugins_module

\section references References
[1] Gstreamer documentation
Expand Down