Examples
- Image capture
- Raw image capture
- Timelapse mode
- Timeout mode
- Video recording
- Segmented video recording
- Quantization parameter
- Change encoding type
- Resizer component
- Splitter component
- Print pipeline
- Store to memory
If you want to change any of the default configuration settings, you can do so by modifying the static properties on the MMALCameraConfig class. The main class, MMALCamera, which interfaces with the rest of the functionality the library provides, is a singleton and is accessed as follows: MMALCamera cam = MMALCamera.Instance.
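As a minimal sketch (using only configuration properties mentioned elsewhere on this page), changing a setting and obtaining the singleton looks like this:
// Adjust static configuration before (re)configuring the camera component.
MMALCameraConfig.InlineHeaders = true;

MMALCamera cam = MMALCamera.Instance;

// Apply the configuration changes to the camera component.
cam.ConfigureCameraSettings();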
Note: the await Task.Delay(2000); is required to allow the camera sensor to "warm up". Due to the rolling shutter used in the Raspberry Pi camera modules, we need to wait a few seconds before valid image data can be used, otherwise your images will likely be under-exposed. The value of 2 seconds is a safe amount of time to wait, but is only required after enabling the camera component, either on first run or after a manual disable.
Additionally, the call to ConfigureCameraSettings() is only required if you have made changes to the camera's configuration.
Image capture
The examples below describe how to take a simple JPEG image, either via the built-in helper method or in manual mode. Here we are using an Image Encoder component which encodes the raw image data into JPEG format; you can change the encoding format to one of the following: JPEG, BMP, PNG, GIF. In addition, you can also change the pixel format you would like to encode with - in the examples below we are using YUV420.
Note: support for these encoders was added in later firmware releases, so you will likely need a sudo rpi-update for them to work. Please see this issue for reference.
Helper mode
public async Task TakePictureHelper()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    {
        await cam.TakePicture(imgCaptureHandler, MMALEncoding.JPEG, MMALEncoding.I420);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
Manual mode
public async Task TakePictureManual()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    using (var imgEncoder = new MMALImageEncoder(imgCaptureHandler))
    using (var nullSink = new MMALNullSinkComponent())
    {
        cam.ConfigureCameraSettings();

        var portConfig = new MMALPortConfig(MMALEncoding.JPEG, MMALEncoding.I420, 90);

        // Create our component pipeline.
        imgEncoder.ConfigureOutputPort(portConfig);

        cam.Camera.StillPort.ConnectTo(imgEncoder);
        cam.Camera.PreviewPort.ConnectTo(nullSink);

        // Camera warm up time
        await Task.Delay(2000);

        await cam.ProcessAsync(cam.Camera.StillPort);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
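Switching to one of the other encoding formats only requires a different port configuration. A hypothetical variation (the parameter values are illustrative, mirroring the BMP configuration later on this page) encoding PNG with an RGBA pixel format:
// Lossless PNG output with an RGBA pixel format; as with the BMP example
// further down, a quality value of 0 is passed for the lossless format.
var portConfig = new MMALPortConfig(MMALEncoding.PNG, MMALEncoding.RGBA, 0);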
Raw image capture
In this example we capture raw, unencoded image data directly from the camera sensor. You can change the pixel format of the raw data via the MMALCameraConfig.StillEncoding and MMALCameraConfig.StillSubFormat properties.
Helper mode
public async Task TakeRawPictureHelper()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "raw"))
    {
        await cam.TakeRawPicture(imgCaptureHandler);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
Note: I422 encoding will prevent the native callback handler from being called, ultimately requiring a reboot of your Pi. I have tested I420, RGB24 and RGBA, which work as expected.
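A minimal sketch of switching the raw pixel format (assuming RGB24 is wanted; set both properties before configuring the camera):
// Set both the format and sub-format to RGB24 for raw RGB captures,
// using the MMALCameraConfig properties described above.
MMALCameraConfig.StillEncoding = MMALEncoding.RGB24;
MMALCameraConfig.StillSubFormat = MMALEncoding.RGB24;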
Timelapse mode
The timelapse mode example describes how to take an image every 10 seconds for 4 hours. You can change the frequency and duration of the timelapse by changing the various properties of the Timelapse object.
public async Task TakeTimelapsePicture()
{
    MMALCamera cam = MMALCamera.Instance;

    // This example will take an image every 10 seconds for 4 hours.
    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    {
        var cts = new CancellationTokenSource(TimeSpan.FromHours(4));
        var tl = new Timelapse { Mode = TimelapseMode.Second, CancellationToken = cts.Token, Value = 10 };

        await cam.TakePictureTimelapse(imgCaptureHandler, MMALEncoding.JPEG, MMALEncoding.I420, tl);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
Timeout mode
The timeout mode example shows how to take continuous image captures for a set duration. This is done via a helper method in the MMALCamera class. We pass in a CancellationToken which signals when image capturing should stop.
public async Task TakeTimeoutPicture()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    {
        var cts = new CancellationTokenSource(TimeSpan.FromHours(4));

        await cam.TakePictureTimeout(imgCaptureHandler, MMALEncoding.JPEG, MMALEncoding.I420, cts.Token);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
Video recording
The examples below show how to capture video using MMALSharp. For basic video recording, there is a built-in helper method which uses H.264 encoding. If you wish to use a different encoding type, or would like to customise additional parameters such as bitrate, you can do this manually.
Helper mode
// Self-contained method for recording H.264 video for a specified amount of time. Records at 30fps, 25Mb/s at the highest quality.
public async Task TakeVideoHelper()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/", "h264"))
    {
        var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));

        // Take video for 3 minutes.
        await cam.TakeVideo(vidCaptureHandler, cts.Token);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
Manual mode
public async Task TakeVideoManual()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/", "h264"))
    using (var vidEncoder = new MMALVideoEncoder(vidCaptureHandler))
    using (var renderer = new MMALVideoRenderer())
    {
        cam.ConfigureCameraSettings();

        var portConfig = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.I420, 25, 10, MMALVideoEncoder.MaxBitrateLevel4, null);

        // Create our component pipeline. Here we are using the H.264 standard with a YUV420 pixel format. The video will be taken at 25Mb/s.
        vidEncoder.ConfigureOutputPort(portConfig);

        cam.Camera.VideoPort.ConnectTo(vidEncoder);
        cam.Camera.PreviewPort.ConnectTo(renderer);

        // Camera warm up time
        await Task.Delay(2000);

        var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));

        // Take video for 3 minutes.
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
Segmented video recording
The segmented recording mode allows us to split a video recording into multiple files. You can specify the frequency at which the split occurs via the Split object.
Note: MMALCameraConfig.InlineHeaders must be set to true for this to work.
public async Task TakeSegmentedVideo()
{
    // Required for segmented recording mode
    MMALCameraConfig.InlineHeaders = true;

    MMALCamera cam = MMALCamera.Instance;

    using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/", "h264"))
    using (var vidEncoder = new MMALVideoEncoder(vidCaptureHandler, null, new Split { Mode = TimelapseMode.Second, Value = 30 }))
    using (var renderer = new MMALVideoRenderer())
    {
        cam.ConfigureCameraSettings();

        var portConfig = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.I420, 25, 10, MMALVideoEncoder.MaxBitrateLevel4, null);

        // Create our component pipeline. Here we are using the H.264 standard with a YUV420 pixel format. The video will be taken at 25Mb/s.
        vidEncoder.ConfigureOutputPort(portConfig);

        cam.Camera.VideoPort.ConnectTo(vidEncoder);
        cam.Camera.PreviewPort.ConnectTo(renderer);

        // Camera warm up time
        await Task.Delay(2000);

        var cts = new CancellationTokenSource(TimeSpan.FromMinutes(1));

        // Record video for 1 minute, using segmented video record to split into multiple files every 30 seconds.
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
Quantization parameter
The quantization parameter allows us to set a variable bitrate when recording with H.264 encoding. To enable this behaviour, set the bitrate parameter to 0 and the quality parameter to a value between 1 and 10. Note: this only applies to H.264; MJPEG makes use of both the quality and bitrate values.
public async Task QuantizationParameterExample()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/", "h264"))
    using (var vidEncoder = new MMALVideoEncoder(vidCaptureHandler))
    using (var renderer = new MMALVideoRenderer())
    {
        cam.ConfigureCameraSettings();

        // Bitrate is 0 and quality is 10, enabling variable bitrate via the quantization parameter.
        var portConfig = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.I420, 25, 10, 0, null);

        // Create our component pipeline. Here we are using the H.264 standard with a YUV420 pixel format.
        vidEncoder.ConfigureOutputPort(portConfig);

        cam.Camera.VideoPort.ConnectTo(vidEncoder);
        cam.Camera.PreviewPort.ConnectTo(renderer);

        // Camera warm up time
        await Task.Delay(2000);

        var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));

        // Take video for 3 minutes.
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
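For contrast, a hypothetical MJPEG port configuration where both the quality (90) and bitrate (MMALVideoEncoder.MaxBitrateMJPEG) values are honoured, mirroring the values used in the encoding-type example below:
// Unlike H.264 in variable bitrate mode, MJPEG makes use of both quality and bitrate.
var mjpegConfig = new MMALPortConfig(MMALEncoding.MJPEG, MMALEncoding.I420, 25, 90, MMALVideoEncoder.MaxBitrateMJPEG, null);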
Change encoding type
Due to the way MMALSharp handles the lifecycle of each component, an encoder's unmanaged resources are only freed once you leave the scope of its using block, at which point you can create a fresh instance with a different encoding type. However, as the example below shows, you can also change the encoding type of an existing encoder between captures by re-configuring its output port.
public async Task ChangeImageEncodingType()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    using (var imgEncoder = new MMALImageEncoder(imgCaptureHandler))
    using (var nullSink = new MMALNullSinkComponent())
    {
        cam.ConfigureCameraSettings();

        // Camera warm up time
        await Task.Delay(2000);

        var portConfigJPEG = new MMALPortConfig(MMALEncoding.JPEG, MMALEncoding.I420, 90);
        var portConfigBMP = new MMALPortConfig(MMALEncoding.BMP, MMALEncoding.RGBA, 0);

        // Create our component pipeline.
        imgEncoder.ConfigureOutputPort(portConfigJPEG);

        cam.Camera.StillPort.ConnectTo(imgEncoder);
        cam.Camera.PreviewPort.ConnectTo(nullSink);

        await cam.ProcessAsync(cam.Camera.StillPort);

        // Switch to BMP encoding for the next capture.
        imgCaptureHandler.Extension = "bmp";
        imgEncoder.ConfigureOutputPort(portConfigBMP);

        await cam.ProcessAsync(cam.Camera.StillPort);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
The same applies to video encoders too.
public async Task ChangeVideoEncodingType()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/", "h264"))
    using (var vidEncoder = new MMALVideoEncoder(vidCaptureHandler))
    using (var renderer = new MMALVideoRenderer())
    {
        cam.ConfigureCameraSettings();

        // Camera warm up time
        await Task.Delay(2000);

        var portConfigH264 = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.I420, 25, 10, MMALVideoEncoder.MaxBitrateLevel4, null);
        var portConfigMJPEG = new MMALPortConfig(MMALEncoding.MJPEG, MMALEncoding.I420, 25, 90, MMALVideoEncoder.MaxBitrateMJPEG, null);

        // Create our component pipeline. Here we are using the H.264 standard with a YUV420 pixel format. The video will be taken at 25Mb/s.
        vidEncoder.ConfigureOutputPort(portConfigH264);

        cam.Camera.VideoPort.ConnectTo(vidEncoder);
        cam.Camera.PreviewPort.ConnectTo(renderer);

        var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));

        // Take video for 3 minutes.
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);

        // Switch to MJPEG encoding for the next recording.
        vidCaptureHandler.Extension = "mjpeg";
        vidEncoder.ConfigureOutputPort(portConfigMJPEG);

        cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));

        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
Resizer component
The MMALResizerComponent can be connected to your pipeline to change the width/height and encoding type/pixel format of the frames captured by the camera component. The resizer is itself an MMALDownstreamHandlerComponent, meaning it can process data to a file directly without the need to connect an encoder.
public async Task ResizerComponentExample()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    using (var resizer = new MMALResizerComponent(null))
    using (var imgEncoder = new MMALImageEncoder(imgCaptureHandler))
    using (var nullSink = new MMALNullSinkComponent())
    {
        cam.ConfigureCameraSettings();

        var resizerPortConfig = new MMALPortConfig(MMALEncoding.I420, MMALEncoding.I420, 800, 600, 0, 0, 0, false, null);
        var encoderPortConfig = new MMALPortConfig(MMALEncoding.JPEG, MMALEncoding.I420, 90);

        // Create our component pipeline.
        resizer.ConfigureInputPort(MMALEncoding.OPAQUE, MMALEncoding.I420, cam.Camera.StillPort)
               .ConfigureOutputPort(resizerPortConfig);
        imgEncoder.ConfigureOutputPort(encoderPortConfig);

        cam.Camera.StillPort.ConnectTo(resizer);
        resizer.Outputs[0].ConnectTo(imgEncoder);
        cam.Camera.PreviewPort.ConnectTo(nullSink);

        // Camera warm up time
        await Task.Delay(2000);

        await cam.ProcessAsync(cam.Camera.StillPort);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
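As a minimal sketch of that last point (assuming raw I420 output written straight to disk is acceptable; the resolution values are illustrative), the resizer can feed a capture handler directly with no encoder attached:
public async Task ResizerWithoutEncoder()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var rawCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "raw"))
    using (var resizer = new MMALResizerComponent(rawCaptureHandler))
    using (var nullSink = new MMALNullSinkComponent())
    {
        cam.ConfigureCameraSettings();

        // Resize to 800x600 and write the raw I420 frames straight to the capture handler.
        var resizerPortConfig = new MMALPortConfig(MMALEncoding.I420, MMALEncoding.I420, 800, 600, 0, 0, 0, false, null);

        resizer.ConfigureInputPort(MMALEncoding.OPAQUE, MMALEncoding.I420, cam.Camera.StillPort)
               .ConfigureOutputPort(resizerPortConfig);

        cam.Camera.StillPort.ConnectTo(resizer);
        cam.Camera.PreviewPort.ConnectTo(nullSink);

        // Camera warm up time
        await Task.Delay(2000);

        await cam.ProcessAsync(cam.Camera.StillPort);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}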
Splitter component
The MMALSplitterComponent connects exclusively to the video port of the camera component. From there, the splitter provides 4 output ports, allowing you to extend your pipeline further and produce up to 4 file outputs at any given time.
public async Task SplitterComponentExample()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var handler = new VideoStreamCaptureHandler("/home/pi/video/", "h264"))
    using (var handler2 = new VideoStreamCaptureHandler("/home/pi/video/", "h264"))
    using (var handler3 = new VideoStreamCaptureHandler("/home/pi/video/", "h264"))
    using (var handler4 = new VideoStreamCaptureHandler("/home/pi/video/", "h264"))
    using (var splitter = new MMALSplitterComponent(null))
    using (var vidEncoder = new MMALVideoEncoder(handler))
    using (var vidEncoder2 = new MMALVideoEncoder(handler2))
    using (var vidEncoder3 = new MMALVideoEncoder(handler3))
    using (var vidEncoder4 = new MMALVideoEncoder(handler4))
    using (var renderer = new MMALVideoRenderer())
    {
        cam.ConfigureCameraSettings();

        var splitterPortConfig = new MMALPortConfig(MMALEncoding.OPAQUE, MMALEncoding.I420, 10, 0, 13000000, null);
        var portConfig1 = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.I420, 10, 10, 13000000, DateTime.Now.AddSeconds(20));
        var portConfig2 = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.I420, 10, 20, 13000000, DateTime.Now.AddSeconds(15));
        var portConfig3 = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.I420, 25, 30, 13000000, DateTime.Now.AddSeconds(10));
        var portConfig4 = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.I420, 25, 40, 13000000, DateTime.Now.AddSeconds(10));

        // Create our component pipeline.
        splitter.ConfigureInputPort(MMALEncoding.OPAQUE, MMALEncoding.I420, cam.Camera.VideoPort);
        splitter.ConfigureOutputPort(0, splitterPortConfig);
        splitter.ConfigureOutputPort(1, splitterPortConfig);
        splitter.ConfigureOutputPort(2, splitterPortConfig);
        splitter.ConfigureOutputPort(3, splitterPortConfig);

        vidEncoder.ConfigureInputPort(MMALEncoding.OPAQUE, MMALEncoding.I420, splitter.Outputs[0]);
        vidEncoder.ConfigureOutputPort(0, portConfig1);
        vidEncoder2.ConfigureInputPort(MMALEncoding.OPAQUE, MMALEncoding.I420, splitter.Outputs[1]);
        vidEncoder2.ConfigureOutputPort(0, portConfig2);
        vidEncoder3.ConfigureInputPort(MMALEncoding.OPAQUE, MMALEncoding.I420, splitter.Outputs[2]);
        vidEncoder3.ConfigureOutputPort(0, portConfig3);
        vidEncoder4.ConfigureInputPort(MMALEncoding.OPAQUE, MMALEncoding.I420, splitter.Outputs[3]);
        vidEncoder4.ConfigureOutputPort(0, portConfig4);

        cam.Camera.VideoPort.ConnectTo(splitter);
        splitter.Outputs[0].ConnectTo(vidEncoder);
        splitter.Outputs[1].ConnectTo(vidEncoder2);
        splitter.Outputs[2].ConnectTo(vidEncoder3);
        splitter.Outputs[3].ConnectTo(vidEncoder4);
        cam.Camera.PreviewPort.ConnectTo(renderer);

        // Camera warm up time
        await Task.Delay(2000);

        await cam.ProcessAsync(cam.Camera.VideoPort);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
You can also use the splitter component to record video and capture images at the same time:
public async Task VideoAndImages()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/", "h264"))
    using (var splitter = new MMALSplitterComponent(null))
    using (var imgEncoder = new MMALImageEncoder(imgCaptureHandler, continuousCapture: true))
    using (var vidEncoder = new MMALVideoEncoder(vidCaptureHandler))
    using (var nullSink = new MMALNullSinkComponent())
    {
        cam.ConfigureCameraSettings();

        var imgEncoderPortConfig = new MMALPortConfig(MMALEncoding.JPEG, MMALEncoding.I420, 90);
        var vidEncoderPortConfig = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.I420, 25, 10, MMALVideoEncoder.MaxBitrateLevel4, null);

        // Create our component pipeline.
        imgEncoder.ConfigureOutputPort(imgEncoderPortConfig);
        vidEncoder.ConfigureOutputPort(vidEncoderPortConfig);

        cam.Camera.VideoPort.ConnectTo(splitter);
        splitter.Outputs[0].ConnectTo(imgEncoder);
        splitter.Outputs[1].ConnectTo(vidEncoder);
        cam.Camera.PreviewPort.ConnectTo(nullSink);

        // Camera warm up time
        await Task.Delay(2000);

        var cts = new CancellationTokenSource(TimeSpan.FromSeconds(15));

        // Process for 15 seconds.
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
Print pipeline
Version 0.3 brings the ability to print out the current component pipeline you have configured - this can be useful when using many components and encoders (such as the splitter). Calling the PrintPipeline() method on the MMALCamera instance will print your current pipeline to the console window.
public async Task PrintComponentPipeline()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    using (var imgEncoder = new MMALImageEncoder(imgCaptureHandler))
    using (var nullSink = new MMALNullSinkComponent())
    {
        cam.ConfigureCameraSettings();

        var portConfig = new MMALPortConfig(MMALEncoding.JPEG, MMALEncoding.I420, 90);

        // Create our component pipeline.
        imgEncoder.ConfigureOutputPort(portConfig);

        cam.Camera.StillPort.ConnectTo(imgEncoder);
        cam.Camera.PreviewPort.ConnectTo(nullSink);

        cam.PrintPipeline();

        // Camera warm up time
        await Task.Delay(2000);

        await cam.ProcessAsync(cam.Camera.StillPort);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
Store to memory
Sometimes you may wish to access the image data directly instead of saving it to disk; this is especially useful when capturing raw, unencoded image data from the camera. In this scenario, you can use either an InMemoryCaptureHandler (which wraps a List<byte>) or a MemoryStreamCaptureHandler.
public async Task StoreToMemory()
{
    MMALCamera cam = MMALCamera.Instance;

    var captureHandler = new InMemoryCaptureHandler();

    await cam.TakeRawPicture(captureHandler);

    // Access raw unencoded output.
    var outputFrames = captureHandler.WorkingData;

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
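A minimal sketch of the MemoryStreamCaptureHandler alternative (the parameterless constructor and the CurrentStream property are assumptions based on the library's stream-based capture handlers):
public async Task StoreToMemoryStream()
{
    MMALCamera cam = MMALCamera.Instance;

    // Assumption: MemoryStreamCaptureHandler exposes its underlying MemoryStream via CurrentStream.
    using (var captureHandler = new MemoryStreamCaptureHandler())
    {
        await cam.TakeRawPicture(captureHandler);

        // Copy the raw unencoded output out of the stream.
        byte[] data = captureHandler.CurrentStream.ToArray();
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}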