A sample showing how you can integrate Basler's pylon API into GStreamer's AppSrc element (so you can use your camera as a source for a pipeline!). Applications include streaming video across a network, displaying to framebuffers, etc.

Participation wanted!

  Software license: Apache 2.0


pylongstreamer (using the InstantCameraAppSrc class):
Example of using Basler's Pylon API with GStreamer's GstAppSrc element.

InstantCameraAppSrc Functional Description:
A typical image streaming pipeline works as follows:

The GStreamer AppSrc element is the source for the pipeline.
When AppSrc needs an image, it emits the "need-data" signal.
This triggers a callback function where the user can "push" their own data buffer to the src pad of the AppSrc element.
The data buffer is then picked up by the rest of the pipeline.
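The need-data mechanism described above can be sketched with the plain GStreamer API. This is an illustrative fragment, not the project's actual code: the callback name and the image-filling step are placeholders for wherever your camera data comes from.

```cpp
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>
#include <cstring>

// Hypothetical callback: appsrc emitted "need-data", so we fill a
// GstBuffer and push it to the element's src pad.
static void need_data_cb(GstElement *src, guint unused_size, gpointer user_data)
{
    const gsize imageSize = 640 * 480 * 3; // assumed RGB image size

    GstBuffer *buffer = gst_buffer_new_allocate(NULL, imageSize, NULL);

    GstMapInfo map;
    gst_buffer_map(buffer, &map, GST_MAP_WRITE);
    memset(map.data, 0, map.size); // a real program copies the camera image here
    gst_buffer_unmap(buffer, &map);

    // Hand the buffer to appsrc; the rest of the pipeline picks it up.
    gst_app_src_push_buffer(GST_APP_SRC(src), buffer); // takes ownership
}

// Wiring it up, somewhere after creating the appsrc element:
// g_signal_connect(appsrc, "need-data", G_CALLBACK(need_data_cb), NULL);
```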
The InstantCameraAppSrc class provides a convenient way to access a Basler camera in your GStreamer application:
For example, InitCamera() can be used to configure the camera and pylon driver as you like.
And the source element for the pipeline can be created with: GstElement* source = myCamera.GetGstAppSrc();

By default, the InstantCameraAppSrc will use the camera in free-running mode, and the pylon driver's Grab Engine with the LatestImageOnly strategy, so that only the latest image received by the camera is available to be retrieved. If a new image arrives before the previous one is retrieved, it is overwritten in the Grab Engine. This strategy makes for the "live-ist" of streams. :-)
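Putting the pieces together, usage looks roughly like the sketch below. The constructor arguments and the InitCamera() parameter list are assumptions for illustration; check the class source for the exact signatures. Only GetGstAppSrc() is taken directly from the description above.

```cpp
#include <gst/gst.h>
#include "CInstantCameraAppSrc.h" // header name assumed

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    CInstantCameraAppSrc myCamera;     // constructor args assumed
    myCamera.InitCamera(640, 480, 30); // width, height, fps (signature assumed)

    // The camera becomes an ordinary source element for any pipeline.
    GstElement *pipeline = gst_pipeline_new("mypipeline");
    GstElement *source   = myCamera.GetGstAppSrc();
    GstElement *convert  = gst_element_factory_make("videoconvert", "convert");
    GstElement *sink     = gst_element_factory_make("autovideosink", "sink");

    gst_bin_add_many(GST_BIN(pipeline), source, convert, sink, NULL);
    gst_element_link_many(source, convert, sink, NULL);

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    // ... run a GMainLoop here, then shut the pipeline down ...
    return 0;
}
```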

The CInstantCameraAppSrc class is an extension of the Pylon::CInstantCamera class.
It accesses the camera using pylon's GenICam-compliant generic API, so it will work with both USB and GigE cameras out of the box. :-)
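Because the class goes through pylon's GenICam-compliant generic API, camera features are addressed by node name rather than by transport-specific code, which is why USB and GigE both work unchanged. A minimal sketch of that style of access (feature names are standard SFNC names; error handling omitted):

```cpp
#include <pylon/PylonIncludes.h>
using namespace Pylon;
using namespace GenApi;

// Works the same for USB and GigE cameras: features are looked up
// by name in the camera's GenICam node map.
void ConfigureCamera(CInstantCamera &camera)
{
    camera.Open();
    INodeMap &nodemap = camera.GetNodeMap();

    CIntegerPtr width(nodemap.GetNode("Width"));
    CIntegerPtr height(nodemap.GetNode("Height"));
    width->SetValue(640);
    height->SetValue(480);
}
```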

pylongstreamer example program usage:
pylongstreamer -(options) -(pipeline)

Options:
-camera serialnumber
-width pixels
-height pixels
-framerate framespersecond
-ondemand
-usetrigger

Pipelines (created with the CPipelineHelper class):
-display
-framebuffer /dev/fbX
-h264stream ipaddress
-h264file filename numberofframes

Quick-Start (Just pick a pipeline. It will use the first camera found, and set it to 640x480, 30fps):
pylongstreamer -display

Save a video:
  pylongstreamer -camera 12345678 -width 320 -height 240 -framerate 15 -h264file mymovie.h264 100
Display stream in a window:
  pylongstreamer -framerate 30 -display
Display stream on a secondary screen:
  pylongstreamer -width 100 -height 100 -framebuffer /dev/fb1
Use a TTL trigger to control the camera, compress the images to h264, and stream across the network:
  pylongstreamer -camera 21234567 -width 1024 -height 768 -framerate 60 -usetrigger -h264stream

Option Details:
-camera: Use a specific camera by entering the serial number.
-ondemand: Camera will be software triggered with each need-data signal instead of freerunning. May lower CPU load, but may be less 'real-time'.
-usetrigger: Camera will expect to be hardware triggered by user via IO ports (cannot be used with -ondemand).
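The -ondemand behavior can be pictured as a software trigger fired once per image request, rather than a free-running acquisition. The sketch below is illustrative only (the project's actual implementation may differ), but every pylon call in it is standard CInstantCamera API:

```cpp
#include <pylon/PylonIncludes.h>
using namespace Pylon;

// Hypothetical sketch: grab exactly one image per request, as -ondemand
// would do inside the need-data handler.
bool RetrieveImageOnDemand(CInstantCamera &camera, CGrabResultPtr &result)
{
    // Fire a software trigger when the camera is ready, then wait for the image.
    if (camera.WaitForFrameTriggerReady(1000, TimeoutHandling_ThrowException))
        camera.ExecuteSoftwareTrigger();

    return camera.RetrieveResult(5000, result, TimeoutHandling_ThrowException)
        && result->GrabSucceeded();
}
```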

Pipeline Example Details:
-h264stream (Encodes images as h264 and transmits stream to another PC running a GStreamer receiving pipeline.)
-h264file (Encodes images as h264 and saves stream to local file.)
-display (displays the raw image stream in a window on the local machine.)
-framebuffer (directs raw image stream to Linux framebuffer, e.g. /dev/fb0)
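For -h264stream, the receiving PC needs a matching GStreamer pipeline. The exact caps depend on how CPipelineHelper packetizes the stream; the following is only an assumed, typical RTP/H.264 receiver (port, caps, and decoder element are guesses to adjust to the sender):

```cpp
#include <gst/gst.h>

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    // Assumed receiver pipeline: adjust port and caps to match the sender.
    GstElement *pipeline = gst_parse_launch(
        "udpsrc port=5000 caps=\"application/x-rtp,media=video,encoding-name=H264\" "
        "! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink", NULL);

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    g_main_loop_run(g_main_loop_new(NULL, FALSE));
    return 0;
}
```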

This is built and tested with GStreamer 1.0 and Pylon 5.0.9 (Linux) / Pylon 5.0.10 (Windows).
The project supports Windows and Linux on x86, x64, and ARM. The repo includes a Visual Studio 2013 solution and a Linux makefile.
Note for Linux users: You may need to install the GStreamer development libraries and plugins (on Ubuntu systems, these can be installed using apt-get install).

Note about GStreamer Plugins (elements):
Some GStreamer elements (plugins) used in the pipeline examples may not be available on all systems. Consult the GStreamer documentation for more information.

Please share your experience, feature requests, and support requests in the project's forum.

Thanks and I hope you enjoy! :-)


Streaming video to browser?

Would anyone like an additional pipeline built which would allow browser-based access to a camera's video stream?
If so, do you have a preference on which webserver should be used as part of this reference sample?
(Browser access requires sinking the pipeline to a webserver on the device with the camera. This could be Apache, Icecast, etc.)

Let me know your thoughts!

- Matt

2017-08-26 02:13:08 UTC mattb - 87 points
