A sample showing how you can integrate Basler's pylon API into GStreamer's AppSrc element (so you can use your camera as a source for a pipeline!). Applications include streaming video across a network, displaying to framebuffers, etc.
Software license: Apache 2.0
pylongstreamer (using the InstantCameraAppSrc class):
Example of using Basler's Pylon API with GStreamer's GstAppSrc element.
InstantCameraAppSrc Functional Description:
A typical image streaming pipeline works as follows:
The GStreamer AppSrc element is the source for the pipeline.
When AppSrc needs an image, it sends the "need-data" signal.
This triggers a callback function where the user can "push" their own data buffer to the src pad of the AppSrc element.
The data buffer is then picked up by the rest of the pipeline.
The InstantCameraAppSrc class provides a convenient way to access a Basler camera in your GStreamer application:
For example, InitCamera() can be used to configure the camera and pylon driver as you like.
And the source element for the pipeline can be created with: GstElement* source = myCamera.GetGstAppSrc();
By default, the InstantCameraAppSrc will use the camera in free-running mode, and the pylon driver's Grab Engine with the LatestImageOnly strategy, so that only the latest image received by the camera is available to be retrieved. If a new image arrives before the previous one is retrieved, it is overwritten in the Grab Engine. This strategy makes for the "live-ist" of streams. :-)
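Using plain pylon API calls, the default grab setup described above looks roughly like the sketch below. This requires the Basler pylon SDK and a connected camera, so it is not runnable stand-alone; it only illustrates the LatestImageOnly strategy, where a newly arriving image overwrites an unretrieved older one.

```cpp
#include <pylon/PylonIncludes.h>

int main()
{
    Pylon::PylonInitialize();
    {
        // First camera found (GigE or USB; pylon's generic API handles both).
        Pylon::CInstantCamera camera(
            Pylon::CTlFactory::GetInstance().CreateFirstDevice());

        // Free-running grab; the Grab Engine keeps only the newest image.
        camera.StartGrabbing(Pylon::GrabStrategy_LatestImageOnly);

        Pylon::CGrabResultPtr result;
        while (camera.IsGrabbing())
        {
            // Wait up to 5000 ms for the latest image.
            camera.RetrieveResult(5000, result,
                Pylon::TimeoutHandling_ThrowException);
            if (result->GrabSucceeded())
            {
                // result->GetBuffer() is what would be pushed to AppSrc.
            }
        }
    }
    Pylon::PylonTerminate();
    return 0;
}
```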
The CInstantCameraAppSrc class is an extension of the Pylon::CInstantCamera class.
It accesses the camera using pylon's GenICam-compliant generic API, so it will work with both USB and GigE cameras out of the box. :-)
pylongstreamer example program usage:
-aoi width height (set the camera's area of interest)
-rescale width height (rescale the image in the pipeline)
Pipelines (created with the CPipelineHelper class):
Quick-Start (Just pick a pipeline. It will use the first camera found, and set it to 640x480, 30fps):
Save a video:
pylongstreamer -camera 12345678 -aoi 320 240 -framerate 15 -h264file mymovie.h264 100
Display stream in a window:
pylongstreamer -framerate 30 -display
Display stream on a secondary screen:
pylongstreamer -aoi 640 480 -framebuffer /dev/fb1
Use a TTL trigger to control the camera, rescale and rotate image, compress the images to h264, and stream across the network:
pylongstreamer -camera 21234567 -aoi 1024 768 -rescale 320 240 -rotate 90 -framerate 60 -usetrigger -h264stream 192.168.1.150
-camera: Use a specific camera by entering the serial number.
-ondemand: Camera will be software triggered with each need-data signal instead of freerunning. May lower CPU load, but may be less 'real-time'.
-usetrigger: Camera will expect to be hardware triggered by user via IO ports (cannot be used with -ondemand).
Pipeline Example Details:
-display (displays the raw image stream in a window on the local machine.)
This is built and tested with GStreamer 1.0 and Pylon 5.0.9 (Linux) / Pylon 5.0.10 (Windows).
The project supports Windows and Linux on x86, x64, and ARM. The repo includes a Visual Studio 2013 solution and a Linux makefile.
Note for Linux users: You may need to install the following libraries:
(on Ubuntu systems, all can be installed using apt-get install)
Note about GStreamer Plugins (elements):
Some GStreamer elements (plugins) used in the pipeline examples may not be available on all systems. Consult GStreamer for more information: https://gstreamer.freedesktop.org/
Please share your experience, feature requests, and support requests in the project's forum.
Thanks and I hope you enjoy! :-)