This repository has been archived by the owner on Feb 22, 2024. It is now read-only.

Advanced Examples

Ian Auty edited this page Oct 15, 2020 · 40 revisions

Note: You are currently viewing the v0.7 Alpha examples. If you are not cloning the latest code from this repository then you may wish to look at the v0.6 examples instead.


Contents

  1. Rapid image capture
  2. Raw video from resizer
  3. Raw video from splitter
  4. Raw video from resizer with splitter component
  5. Encode / Decode from Stream - Image
    1. Encode
    2. Decode
  6. Encode / Decode from Stream - Video
    1. Encode
    2. Decode
    3. Decode -> Encode
    4. Decode -> Splitter -> Encode
    5. Decode -> Splitter -> Resizer -> Encode
  7. Static render overlay
  8. External process e.g. FFmpeg, cvlc
    1. FFmpeg - RTMP streaming
    2. FFmpeg - Raw video convert
    3. FFmpeg - Images to video
  9. Video record with Circular Buffer
  10. Store motion vectors
  11. ImageFx component

Notes

FFmpeg

For FFmpeg functionality, you will need to install the latest version of FFmpeg from source - do not install from the Raspbian repositories as they don't have H.264 support.

A guide to installing FFmpeg from source including the H.264 codec can be found here

Rapid image capture

By utilising the camera's video port, we are able to retrieve image frames at a much higher rate than via the conventional still port. Images captured via the video port are of lower quality and do not support EXIF.

public async Task TakePictureFromVideoPort()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    using (var splitter = new MMALSplitterComponent())
    using (var imgEncoder = new MMALImageEncoder(continuousCapture: true))
    using (var nullSink = new MMALNullSinkComponent())
    {
        cam.ConfigureCameraSettings();
        
        var portConfig = new MMALPortConfig(MMALEncoding.JPEG, MMALEncoding.I420, quality: 90);

        // Create our component pipeline.         
        imgEncoder.ConfigureOutputPort(portConfig, imgCaptureHandler);
                
        cam.Camera.VideoPort.ConnectTo(splitter);
        splitter.Outputs[0].ConnectTo(imgEncoder);                    
        cam.Camera.PreviewPort.ConnectTo(nullSink);
        
        // Camera warm up time
        await Task.Delay(2000);
                
        CancellationTokenSource cts = new CancellationTokenSource(TimeSpan.FromSeconds(15));
        
        // Process images for 15 seconds.        
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Raw video capture from resizer component

Available in v0.5.1

The resizer component can adjust the resolution coming from the camera's video port, allowing you to record raw YUV420 frames. By passing VideoPort to the generic constraint on ConfigureOutputPort, we tell MMALSharp to use VideoPort behaviour on the resizer's output. By default, the resizer uses StillPort behaviour and would not work in this scenario.

public async Task RecordVideoDirectlyFromResizer()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/tests", "raw"))
    using (var resizer = new MMALResizerComponent())
    using (var preview = new MMALVideoRenderer())
    {
        cam.ConfigureCameraSettings();
    
        // Use the resizer to resize 1080p to 640x480.
        var portConfig = new MMALPortConfig(MMALEncoding.I420, MMALEncoding.I420, width: 640, height: 480);

        resizer.ConfigureOutputPort<VideoPort>(0, portConfig, vidCaptureHandler);

        // Create our component pipeline.         
        cam.Camera.VideoPort
            .ConnectTo(resizer);
        cam.Camera.PreviewPort
            .ConnectTo(preview);

        // Camera warm up time
        await Task.Delay(2000);

        CancellationTokenSource cts = new CancellationTokenSource(TimeSpan.FromSeconds(15));

        // Record video for 15 seconds.
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
    }
    
    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Raw video capture from splitter component

Available in v0.5.1

The splitter component can also be used to record raw video frames. As seen with the resizer example above, we can pass VideoPort as a generic constraint to the ConfigureOutputPort method, instructing MMALSharp to use VideoPort behaviour for the splitter's output port. If no type is passed in, the splitter will simply act as a pass-through component.

public async Task RecordVideoDirectlyFromSplitter()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/tests", "raw"))
    using (var vidCaptureHandler2 = new VideoStreamCaptureHandler("/home/pi/videos/tests", "raw"))
    using (var vidCaptureHandler3 = new VideoStreamCaptureHandler("/home/pi/videos/tests", "raw"))
    using (var vidCaptureHandler4 = new VideoStreamCaptureHandler("/home/pi/videos/tests", "raw"))
    using (var preview = new MMALVideoRenderer())
    using (var splitter = new MMALSplitterComponent())
    {
        cam.ConfigureCameraSettings();
    
        var splitterPortConfig = new MMALPortConfig(MMALEncoding.I420, MMALEncoding.I420);

        // Create our component pipeline.         
        splitter.ConfigureInputPort(new MMALPortConfig(MMALEncoding.OPAQUE, MMALEncoding.I420), cam.Camera.VideoPort, null);
        splitter.ConfigureOutputPort<SplitterVideoPort>(0, splitterPortConfig, vidCaptureHandler);
        splitter.ConfigureOutputPort<SplitterVideoPort>(1, splitterPortConfig, vidCaptureHandler2);
        splitter.ConfigureOutputPort<SplitterVideoPort>(2, splitterPortConfig, vidCaptureHandler3);
        splitter.ConfigureOutputPort<SplitterVideoPort>(3, splitterPortConfig, vidCaptureHandler4);
        
        cam.Camera.VideoPort.ConnectTo(splitter);
        cam.Camera.PreviewPort.ConnectTo(preview);

        // Camera warm up time
        await Task.Delay(2000);

        CancellationTokenSource cts = new CancellationTokenSource(TimeSpan.FromSeconds(15));

        // Record video for 15 seconds.
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
    }
    
    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Raw video from resizer with splitter component

Available in v0.5.1

You can combine both previous examples into one by using the resizer component with the splitter. By combining the components, you can potentially resize up to 4 separate raw video streams which adds a lot of flexibility to your application.

public async Task RecordVideoDirectlyFromResizerWithSplitterComponent()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/tests", "raw"))
    using (var vidCaptureHandler2 = new VideoStreamCaptureHandler("/home/pi/videos/tests", "raw"))
    using (var vidCaptureHandler3 = new VideoStreamCaptureHandler("/home/pi/videos/tests", "raw"))
    using (var vidCaptureHandler4 = new VideoStreamCaptureHandler("/home/pi/videos/tests", "raw"))
    using (var preview = new MMALVideoRenderer())
    using (var splitter = new MMALSplitterComponent())
    using (var resizer = new MMALResizerComponent())
    using (var resizer2 = new MMALResizerComponent())
    using (var resizer3 = new MMALResizerComponent())
    using (var resizer4 = new MMALResizerComponent())
    {
        cam.ConfigureCameraSettings();
    
        var splitterPortConfig = new MMALPortConfig(MMALEncoding.OPAQUE, MMALEncoding.I420);

        var portConfig = new MMALPortConfig(MMALEncoding.I420, MMALEncoding.I420, width: 1024, height: 768, timeout: DateTime.Now.AddSeconds(20));
        var portConfig2 = new MMALPortConfig(MMALEncoding.I420, MMALEncoding.I420, width: 800, height: 600, timeout: DateTime.Now.AddSeconds(20));
        var portConfig3 = new MMALPortConfig(MMALEncoding.I420, MMALEncoding.I420, width: 640, height: 480, timeout: DateTime.Now.AddSeconds(15));
        var portConfig4 = new MMALPortConfig(MMALEncoding.I420, MMALEncoding.I420, width: 320, height: 240, timeout: DateTime.Now.AddSeconds(20));

        // Create our component pipeline.         
        splitter.ConfigureInputPort(new MMALPortConfig(MMALEncoding.OPAQUE, MMALEncoding.I420), cam.Camera.VideoPort, null);
        splitter.ConfigureOutputPort(0, splitterPortConfig, null);
        splitter.ConfigureOutputPort(1, splitterPortConfig, null);
        splitter.ConfigureOutputPort(2, splitterPortConfig, null);
        splitter.ConfigureOutputPort(3, splitterPortConfig, null);
        
        resizer.ConfigureOutputPort<VideoPort>(0, portConfig, vidCaptureHandler);
        resizer2.ConfigureOutputPort<VideoPort>(0, portConfig2, vidCaptureHandler2);
        resizer3.ConfigureOutputPort<VideoPort>(0, portConfig3, vidCaptureHandler3);
        resizer4.ConfigureOutputPort<VideoPort>(0, portConfig4, vidCaptureHandler4);

        // Connect our components.
        cam.Camera.VideoPort.ConnectTo(splitter);

        splitter.Outputs[0].ConnectTo(resizer);
        splitter.Outputs[1].ConnectTo(resizer2);
        splitter.Outputs[2].ConnectTo(resizer3);
        splitter.Outputs[3].ConnectTo(resizer4);

        cam.Camera.PreviewPort.ConnectTo(preview);

        // Camera warm up time
        await Task.Delay(2000);

        await cam.ProcessAsync(cam.Camera.VideoPort);
    }
    
    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Encode / Decode from Stream - Image

MMALSharp provides the ability to encode/decode images fed from Streams. It supports the GIF, BMP, JPEG and PNG file formats, and decoding must output one of the following pixel formats:

  • JPEG -> YUV420/422 (I420/422)
  • GIF -> RGB565 (RGB16)
  • BMP/PNG -> RGBA

Note: Please note the <FileEncodeOutputPort> generic constraint when calling .ConfigureOutputPort. This is an important addition, because ports of this type can handle the MMAL_EVENT_FORMAT_CHANGED buffers that the component may produce.

Encode

public async Task EncodeFromFilestream()
{
    MMALStandalone standalone = MMALStandalone.Instance;

    using (var stream = File.OpenRead("/home/pi/raw_jpeg_decode.raw"))
    using (var inputCaptureHandler = new InputCaptureHandler(stream))
    using (var outputCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "raw"))
    using (var imgEncoder = new MMALImageEncoder())
    {
        var inputPortConfig = new MMALPortConfig(MMALEncoding.I420, null, width: 2560, height: 1920, zeroCopy: true);
        var outputPortConfig = new MMALPortConfig(MMALEncoding.BMP, MMALEncoding.I420, width: 2560, height: 1920, zeroCopy: true);

        // Create our component pipeline.
        imgEncoder.ConfigureInputPort(inputPortConfig, inputCaptureHandler)
                  .ConfigureOutputPort<FileEncodeOutputPort>(0, outputPortConfig, outputCaptureHandler);

        await standalone.ProcessAsync(imgEncoder);      
    }

    // Only call when you no longer require the MMAL library, i.e. on app shutdown.
    standalone.Cleanup();
}

Decode

public async Task DecodeFromFilestream()
{
    MMALStandalone standalone = MMALStandalone.Instance;

    using (var stream = File.OpenRead("/home/pi/test.jpg"))
    using (var inputCaptureHandler = new InputCaptureHandler(stream))
    using (var outputCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "raw"))
    using (var imgDecoder = new MMALImageDecoder())
    {
        var inputPortConfig = new MMALPortConfig(MMALEncoding.JPEG, MMALEncoding.I420, width: 2560, height: 1920, zeroCopy: true);
        var outputPortConfig = new MMALPortConfig(MMALEncoding.I420, null, width: 2560, height: 1920, zeroCopy: true);

        // Create our component pipeline.
        imgDecoder.ConfigureInputPort(inputPortConfig, inputCaptureHandler)
                  .ConfigureOutputPort<FileEncodeOutputPort>(0, outputPortConfig, outputCaptureHandler);

        await standalone.ProcessAsync(imgDecoder);    
    }

    // Only call when you no longer require the MMAL library, i.e. on app shutdown.
    standalone.Cleanup();
}

Decode -> Encode (and vice-versa)

Available in v0.6.

You can also connect encoder components to decoder components and process them as a single operation.

public async Task DecodeThenEncodeImageFromFilestream()
{
    MMALStandalone standalone = MMALStandalone.Instance;

    using (var stream = File.OpenRead("/home/pi/test.bmp"))
    using (var inputCaptureHandler = new InputCaptureHandler(stream))
    using (var outputCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpeg"))
    using (var imgDecoder = new MMALImageDecoder())
    using (var imgEncoder = new MMALImageEncoder())
    {
        var decoderInputPortConfig = new MMALPortConfig(MMALEncoding.BMP, MMALEncoding.RGB16, zeroCopy: true);
        var decoderOutputPortConfig = new MMALPortConfig(MMALEncoding.RGBA, null, width: 640, height: 480, zeroCopy: true);

        var encoderInputPortConfig = new MMALPortConfig(MMALEncoding.RGBA, null, width: 640, height: 480, zeroCopy: true);
        var encoderOutputPortConfig = new MMALPortConfig(MMALEncoding.JPEG, MMALEncoding.I420, width: 640, height: 480, zeroCopy: true);

        // Create our component pipeline.
        imgDecoder.ConfigureInputPort(decoderInputPortConfig, inputCaptureHandler)
                  .ConfigureOutputPort(0, decoderOutputPortConfig, null);

        imgEncoder.ConfigureInputPort(encoderInputPortConfig, imgDecoder.Outputs[0], null)
                  .ConfigureOutputPort<FileEncodeOutputPort>(0, encoderOutputPortConfig, outputCaptureHandler);

        imgDecoder.Outputs[0].ConnectTo(imgEncoder);

        standalone.PrintPipeline(imgDecoder);

        await standalone.ProcessAsync(imgDecoder);
    }

    // Only call when you no longer require the MMAL library, i.e. on app shutdown.
    standalone.Cleanup();
}

Encode / Decode from Stream - Video

You can also encode and decode video files fed from streams in MMALSharp.

Encode

public async Task EncodeVideoFromFilestream()
{
    MMALStandalone standalone = MMALStandalone.Instance;

    using (var stream = File.OpenRead("/home/pi/videos/decoded_rgb.raw"))
    using (var inputCaptureHandler = new InputCaptureHandler(stream))
    using (var outputCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/", "h264"))
    using (var vidEncoder = new MMALVideoEncoder())
    {
        var inputPortConfig = new MMALPortConfig(MMALEncoding.RGB16, null, width: 1280, height: 720, framerate: 25, bitrate: 1300000, zeroCopy: true);
        var outputPortConfig = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.RGB16, width: 1280, height: 720, framerate: 25, bitrate: 1300000, zeroCopy: true);

        // Create our component pipeline.
        vidEncoder.ConfigureInputPort(inputPortConfig, inputCaptureHandler)
            .ConfigureOutputPort<FileEncodeOutputPort>(0, outputPortConfig, outputCaptureHandler);

        await standalone.ProcessAsync(vidEncoder);
    }

    // Only call when you no longer require the MMAL library, i.e. on app shutdown.
    standalone.Cleanup();
}

Decode

public async Task DecodeVideoFromFilestream()
{
    MMALStandalone standalone = MMALStandalone.Instance;

    using (var stream = File.OpenRead("/home/pi/videos/test.h264"))
    using (var inputCaptureHandler = new InputCaptureHandler(stream))
    using (var outputCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/", "raw"))
    using (var vidDecoder = new MMALVideoDecoder())
    {
        var inputPortConfig = new MMALPortConfig(MMALEncoding.H264, null, width: 1280, height: 720, framerate: 25, bitrate: 1300000, zeroCopy: true);
        var outputPortConfig = new MMALPortConfig(MMALEncoding.RGB16, null, width: 1280, height: 720, framerate: 25, bitrate: 1300000, zeroCopy: true);

        // Create our component pipeline.
        vidDecoder.ConfigureInputPort(inputPortConfig, inputCaptureHandler)
            .ConfigureOutputPort<FileEncodeOutputPort>(0, outputPortConfig, outputCaptureHandler);

        await standalone.ProcessAsync(vidDecoder);
    }

    // Only call when you no longer require the MMAL library, i.e. on app shutdown.
    standalone.Cleanup();
}

Decode -> Encode (and vice-versa)

Available in v0.6.

Again, you can also connect encoder components to decoder components and process as a single operation.

public async Task DecodeThenEncodeVideoFromFilestream()
{
    MMALStandalone standalone = MMALStandalone.Instance;

    using (var stream = File.OpenRead("/home/pi/test.h264"))
    using (var inputCaptureHandler = new InputCaptureHandler(stream))
    using (var outputCaptureHandler = new VideoStreamCaptureHandler("/home/pi/Videos", "h264"))
    using (var imgDecoder = new MMALVideoDecoder())
    using (var imgEncoder = new MMALVideoEncoder())
    {
        var decoderInputConfig = new MMALPortConfig(MMALEncoding.H264, null, width: 1280, height: 720, framerate: 25, zeroCopy: true);
        var decoderOutputConfig = new MMALPortConfig(MMALEncoding.I420, null, width:  1280, height: 720, zeroCopy: true);

        var encoderInputConfig = new MMALPortConfig(MMALEncoding.I420, null, width: 1280, height: 720, zeroCopy: true);
        var encoderOutputConfig = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.I420, width: 1280, height: 720, framerate: 25, zeroCopy: true);

        imgDecoder.ConfigureInputPort(decoderInputConfig, inputCaptureHandler)
                  .ConfigureOutputPort(0, decoderOutputConfig, null);

        imgEncoder.ConfigureInputPort(encoderInputConfig, imgDecoder.Outputs[0], null)
                  .ConfigureOutputPort<FileEncodeOutputPort>(0, encoderOutputConfig, outputCaptureHandler);
        
        imgDecoder.Outputs[0].ConnectTo(imgEncoder);
        
        await standalone.ProcessAsync(imgDecoder);
    }

    // Only call when you no longer require the MMAL library, i.e. on app shutdown.
    standalone.Cleanup();
}

Decode -> Splitter -> Encode (and vice-versa)

Available in v0.6.

The flexibility of the MMAL pipeline allows you to also add a splitter component into the mix. You can start with a decoder component, connect it to a splitter (giving you 4 outputs), and then attach an encoder to each of the splitter's output ports.

In the example below, we are assuming that the input video file is H.264 YUV420 encoded with a resolution of 1280 x 720. This file is then decoded, sent to the splitter component and fed to 4 individual encoder components, re-encoding to exactly the same format. You are free to tinker around with the encodings/pixel formats.

public async Task DecodeThenEncodeVideoFromFilestreamWithSplitter()
{
    MMALStandalone standalone = MMALStandalone.Instance;

    using (var stream = File.OpenRead("/home/pi/test.h264"))
    using (var inputCaptureHandler = new InputCaptureHandler(stream))
    using (var outputCaptureHandler = new VideoStreamCaptureHandler("/home/pi/Videos", "h264"))
    using (var outputCaptureHandler2 = new VideoStreamCaptureHandler("/home/pi/Videos", "h264"))
    using (var outputCaptureHandler3 = new VideoStreamCaptureHandler("/home/pi/Videos", "h264"))
    using (var outputCaptureHandler4 = new VideoStreamCaptureHandler("/home/pi/Videos", "h264"))
    using (var splitter = new MMALSplitterComponent())
    using (var imgDecoder = new MMALVideoDecoder())
    using (var imgEncoder = new MMALVideoEncoder())
    using (var imgEncoder2 = new MMALVideoEncoder())
    using (var imgEncoder3 = new MMALVideoEncoder())
    using (var imgEncoder4 = new MMALVideoEncoder())
    {
        var decoderInputConfig = new MMALPortConfig(MMALEncoding.H264, null, width: 1280, height: 720, framerate: 25, zeroCopy: true);
        var decoderOutputConfig = new MMALPortConfig(MMALEncoding.I420, null, width: 1280, height: 720, zeroCopy: true);
        
        var splitterInputConfig = new MMALPortConfig(MMALEncoding.I420, null, framerate: 25, zeroCopy: true);
        var splitterOutputConfig = new MMALPortConfig(null, null, zeroCopy: true);

        var encoderInputConfig = new MMALPortConfig(MMALEncoding.I420, null, width: 1280, height: 720, zeroCopy: true);
        var encoderOutputConfig = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.I420, width: 1280, height: 720, framerate: 25, zeroCopy: true);

        imgDecoder.ConfigureInputPort(decoderInputConfig, inputCaptureHandler)
                  .ConfigureOutputPort(0, decoderOutputConfig, null);

        splitter.ConfigureInputPort(splitterInputConfig, null);
        splitter.ConfigureOutputPort(0, splitterOutputConfig, null);
        splitter.ConfigureOutputPort(1, splitterOutputConfig, null);
        splitter.ConfigureOutputPort(2, splitterOutputConfig, null);
        splitter.ConfigureOutputPort(3, splitterOutputConfig, null);

        imgEncoder.ConfigureInputPort(encoderInputConfig, splitter.Outputs[0], null)
                  .ConfigureOutputPort<FileEncodeOutputPort>(0, encoderOutputConfig, outputCaptureHandler);
                  
        imgEncoder2.ConfigureInputPort(encoderInputConfig, splitter.Outputs[1], null)
                  .ConfigureOutputPort<FileEncodeOutputPort>(0, encoderOutputConfig, outputCaptureHandler2);
        
        imgEncoder3.ConfigureInputPort(encoderInputConfig, splitter.Outputs[2], null)
                  .ConfigureOutputPort<FileEncodeOutputPort>(0, encoderOutputConfig, outputCaptureHandler3);
                  
        imgEncoder4.ConfigureInputPort(encoderInputConfig, splitter.Outputs[3], null)
                  .ConfigureOutputPort<FileEncodeOutputPort>(0, encoderOutputConfig, outputCaptureHandler4);
        
        imgDecoder.Outputs[0].ConnectTo(splitter);
        splitter.Outputs[0].ConnectTo(imgEncoder);
        splitter.Outputs[1].ConnectTo(imgEncoder2);
        splitter.Outputs[2].ConnectTo(imgEncoder3);
        splitter.Outputs[3].ConnectTo(imgEncoder4);
        
        await standalone.ProcessAsync(imgDecoder);
    }

    // Only call when you no longer require the MMAL library, i.e. on app shutdown.
    standalone.Cleanup();
}

Decode -> Splitter -> Resizer -> Encode (and vice-versa)

Available in v0.6.

In addition to the above, we can also introduce a resizer component too.

In the example below, we are assuming that the input video file is H.264 YUV420 encoded with a resolution of 1280 x 720. This file is then decoded, sent to the splitter component and fed to 4 individual encoder components, one of which via a resizer component where we resize the output to 640 x 480. We then re-encode to exactly the same format. You are free to tinker around with the encodings/pixel formats.

public async Task DecodeThenEncodeVideoFromFilestreamWithSplitterAndResizer()
{
    MMALStandalone standalone = MMALStandalone.Instance;

    using (var stream = File.OpenRead("/home/pi/test.h264"))
    using (var inputCaptureHandler = new InputCaptureHandler(stream))
    using (var outputCaptureHandler = new VideoStreamCaptureHandler("/home/pi/Videos", "h264"))
    using (var outputCaptureHandler2 = new VideoStreamCaptureHandler("/home/pi/Videos", "h264"))
    using (var outputCaptureHandler3 = new VideoStreamCaptureHandler("/home/pi/Videos", "h264"))
    using (var outputCaptureHandler4 = new VideoStreamCaptureHandler("/home/pi/Videos", "h264"))
    using (var splitter = new MMALSplitterComponent())
    using (var imgDecoder = new MMALVideoDecoder())
    using (var imgEncoder = new MMALVideoEncoder())
    using (var imgEncoder2 = new MMALVideoEncoder())
    using (var imgEncoder3 = new MMALVideoEncoder())
    using (var imgEncoder4 = new MMALVideoEncoder())
    using (var resizer = new MMALResizerComponent())
    {
        var decoderInputConfig = new MMALPortConfig(MMALEncoding.H264, null, width: 1280, height: 720, framerate: 25, zeroCopy: true);
        var decoderOutputConfig = new MMALPortConfig(MMALEncoding.I420, null, width: 1280, height: 720, zeroCopy: true);
        
        var splitterInputConfig = new MMALPortConfig(MMALEncoding.I420, null, framerate: 25, zeroCopy: true);
        var splitterOutputConfig = new MMALPortConfig(null, null, zeroCopy: true);

        var resizerInputConfig = new MMALPortConfig(MMALEncoding.I420, null, width: 1280, height: 720, framerate: 25, zeroCopy: true);
        var resizerOutputConfig = new MMALPortConfig(null, null, width: 640, height: 480, zeroCopy: true);

        var encoderInputConfig = new MMALPortConfig(MMALEncoding.I420, null, width: 1280, height: 720, zeroCopy: true);
        var encoderOutputConfig = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.I420, width: 1280, height: 720, framerate: 25, zeroCopy: true);

        imgDecoder.ConfigureInputPort(decoderInputConfig, inputCaptureHandler)
                  .ConfigureOutputPort(0, decoderOutputConfig, null);

        splitter.ConfigureInputPort(splitterInputConfig, null);
        splitter.ConfigureOutputPort(0, splitterOutputConfig, null);
        splitter.ConfigureOutputPort(1, splitterOutputConfig, null);
        splitter.ConfigureOutputPort(2, splitterOutputConfig, null);
        splitter.ConfigureOutputPort(3, splitterOutputConfig, null);

        imgEncoder.ConfigureInputPort(encoderInputConfig, splitter.Outputs[0], null)
                  .ConfigureOutputPort<FileEncodeOutputPort>(0, encoderOutputConfig, outputCaptureHandler);
                  
        imgEncoder2.ConfigureInputPort(encoderInputConfig, splitter.Outputs[1], null)
                  .ConfigureOutputPort<FileEncodeOutputPort>(0, encoderOutputConfig, outputCaptureHandler2);
        
        imgEncoder3.ConfigureInputPort(encoderInputConfig, splitter.Outputs[2], null)
                  .ConfigureOutputPort<FileEncodeOutputPort>(0, encoderOutputConfig, outputCaptureHandler3);
        
        resizer.ConfigureInputPort(resizerInputConfig, splitter.Outputs[3], null)
               .ConfigureOutputPort(0, resizerOutputConfig, null);
        
        imgEncoder4.ConfigureInputPort(encoderInputConfig, resizer.Outputs[0], null)
                  .ConfigureOutputPort<FileEncodeOutputPort>(0, encoderOutputConfig, outputCaptureHandler4);
        
        imgDecoder.Outputs[0].ConnectTo(splitter);
        splitter.Outputs[0].ConnectTo(imgEncoder);
        splitter.Outputs[1].ConnectTo(imgEncoder2);
        splitter.Outputs[2].ConnectTo(imgEncoder3);
        splitter.Outputs[3].ConnectTo(resizer);
        resizer.Outputs[0].ConnectTo(imgEncoder4);
        
        await standalone.ProcessAsync(imgDecoder);
    }

    // Only call when you no longer require the MMAL library, i.e. on app shutdown.
    standalone.Cleanup();
}

Static render overlay

MMAL allows you to create additional preview renderers which sit alongside the usual null sink or video renderers shown in previous examples. These additional renderers allow you to overlay static content onto the display your Pi is connected to.

The overlay renderers only work with unencoded images, and they must use one of the following pixel formats:

  • YUV420 (I420)
  • RGB888 (RGB24)
  • RGBA
  • BGR888 (BGR24)
  • BGRA

An easy way to get an unencoded image for use with the overlay renderers is to use the raw image capture functionality as described in this example, setting the MMALCameraConfig.Encoding and MMALCameraConfig.EncodingSubFormat properties to one of the accepted pixel formats. Once you have your test frame, follow the example below to overlay your image:

public async Task StaticOverlayExample()
{                        
    MMALCamera cam = MMALCamera.Instance;
    
    PreviewConfiguration previewConfig = new PreviewConfiguration
    {
        FullScreen = false,
        PreviewWindow = new Rectangle(160, 0, 640, 480),
        Layer = 2,
        Opacity = 1
    };

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    using (var imgEncoder = new MMALImageEncoder())
    using (var video = new MMALVideoRenderer(previewConfig))
    {                    
        cam.ConfigureCameraSettings();
        video.ConfigureRenderer();
                
        PreviewOverlayConfiguration overlayConfig = new PreviewOverlayConfiguration
        {
            FullScreen = true,
            PreviewWindow = new Rectangle(50, 0, 640, 480),
            Layer = 1,
            Resolution = new Resolution(640, 480),
            Encoding = MMALEncoding.I420,
            Opacity = 255
        };
                
        var overlay = cam.AddOverlay(video, overlayConfig, File.ReadAllBytes("/home/pi/test1.raw"));
        overlay.ConfigureRenderer();
        overlay.UpdateOverlay();
             
        var portConfig = new MMALPortConfig(MMALEncoding.JPEG, MMALEncoding.I420, quality: 90);

        // Create our component pipeline.
        imgEncoder.ConfigureOutputPort(portConfig, imgCaptureHandler);
                
        cam.Camera.StillPort.ConnectTo(imgEncoder);
        cam.Camera.PreviewPort.ConnectTo(video);
                
        cam.PrintPipeline();
                
        await cam.ProcessAsync(cam.Camera.StillPort);        
    }
    
    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

In this example, we are using an unencoded YUV420 image and configuring the renderer using the settings in overlayConfig.
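To produce the raw test frame used above, a minimal sketch of the capture step is shown below. This assumes the TakeRawPicture helper available on MMALCamera in v0.7; verify the call against the version you are using.

```csharp
// Sketch: capture a single unencoded I420 frame for use as an overlay source.
// Assumes MMALSharp v0.7 and its TakeRawPicture helper - verify against
// your version's API before use.
public async Task CaptureRawOverlayFrame()
{
    MMALCamera cam = MMALCamera.Instance;

    // The overlay renderers require one of the accepted unencoded formats.
    MMALCameraConfig.Encoding = MMALEncoding.I420;
    MMALCameraConfig.EncodingSubFormat = MMALEncoding.I420;

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "raw"))
    {
        await cam.TakeRawPicture(imgCaptureHandler);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
```

The resulting `.raw` file can then be passed to cam.AddOverlay via File.ReadAllBytes, as in the example above.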

External Processes

MMALSharp allows external processes such as FFmpeg and cvlc to be launched and piped to throughout the duration of camera/standalone processing. The ExternalProcessCaptureHandler class manages the lifetime of an external process, and its use differs somewhat from the other capture handlers in this library: it exposes a ProcessExternalAsync method which returns a task representing the external process. It is important to note that this method does not replace the ProcessAsync method found on the MMALCamera/MMALStandalone classes; the two are intended to be used together, held within a Task.WhenAll call. Internally, MMALSharp manages the lifetime of your external process so it is piped to and terminated cleanly. This functionality is new to v0.7 and brings both asynchrony and performance improvements (around 10-15% better than v0.6) when using external processes.

The below examples show how to use the ExternalProcessCaptureHandler class in order to pipe to FFmpeg.

FFmpeg - RTMP streaming

public async Task FFmpegRTMPStreaming()
{                        
    MMALCamera cam = MMALCamera.Instance;

    // An RTMP server needs to be listening on the address specified in the capture handler. I have used the Nginx RTMP module for testing.    
    using (var ffCaptureHandler = FFmpegCaptureHandler.RTMPStreamer("mystream", "rtmp://192.168.1.91:6767/live"))
    using (var vidEncoder = new MMALVideoEncoder())
    using (var renderer = new MMALVideoRenderer())
    {
        cam.ConfigureCameraSettings(); 
        
        // Quality has been left as 0 (default) as the Quantisation can have an impact on the requested bitrate which FFmpeg 
        // doesn't like.
        var portConfig = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.I420, bitrate: MMALVideoEncoder.MaxBitrateLevel4);

        // Create our component pipeline. Here we are using the H.264 standard with a YUV420 pixel format. The video will be taken at 25Mb/s.
        vidEncoder.ConfigureOutputPort(portConfig, ffCaptureHandler);

        cam.Camera.VideoPort.ConnectTo(vidEncoder);
        cam.Camera.PreviewPort.ConnectTo(renderer);
                                
        // Camera warm up time
        await Task.Delay(2000);

        var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));
          
        // Take video for 3 minutes.
        await Task.WhenAll(
            ffCaptureHandler.ProcessExternalAsync(cts.Token),
            cam.ProcessAsync(cam.Camera.VideoPort, cts.Token));
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Note:

If you intend to use the YouTube live streaming service, you will need to create a method like the one below to return your own capture handler, replacing the FFmpegCaptureHandler.RTMPStreamer call seen in the example above. The reason is that YouTube streaming requires your RTMP stream to contain an audio input, or it won't work. Internally, our RTMP streaming method does not include an audio stream, and at the current time we don't intend to change it for this specific purpose.

public static ExternalProcessCaptureHandler RTMPStreamerWithAudio(string streamName, string streamUrl)
{
    var opts = new ExternalProcessCaptureHandlerOptions
    {
        Filename = "ffmpeg",
        Arguments = $"-re -ar 44100 -ac 2 -acodec pcm_s16le -f s16le -ac 2 -i /dev/zero -f h264 -i - -vcodec copy -acodec aac -ab 128k -g 50 -strict experimental -f flv -metadata streamName={streamName} {streamUrl}",
        EchoOutput = true,
        DrainOutputDelayMs = 500, // default
        TerminationSignals = ExternalProcessCaptureHandlerOptions.SignalsFFmpeg
    };

    return new ExternalProcessCaptureHandler(opts);
}

Please see here for an in-depth discussion of the issue.

FFmpeg - Raw video convert

This is a useful capture mode as it will push the elementary H.264 stream into an AVI container which can be opened by media players such as VLC.

public async Task FFmpegRawVideoConvert()
{                        
    MMALCamera cam = MMALCamera.Instance;

    using (var ffCaptureHandler = FFmpegCaptureHandler.RawVideoToAvi("/home/pi/videos/", "testing1234"))
    using (var vidEncoder = new MMALVideoEncoder())
    using (var renderer = new MMALVideoRenderer())
    {
        cam.ConfigureCameraSettings(); 

        var portConfig = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.I420, quality: 10, bitrate: MMALVideoEncoder.MaxBitrateLevel4);

        // Create our component pipeline. Here we are using the H.264 standard with a YUV420 pixel format. The video will be taken at 25Mb/s.
        vidEncoder.ConfigureOutputPort(portConfig, ffCaptureHandler);

        cam.Camera.VideoPort.ConnectTo(vidEncoder);
        cam.Camera.PreviewPort.ConnectTo(renderer);
                          
        // Camera warm up time
        await Task.Delay(2000);

        var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));
                
        // Take video for 3 minutes.
        await Task.WhenAll(
            ffCaptureHandler.ProcessExternalAsync(cts.Token),
            cam.ProcessAsync(cam.Camera.VideoPort, cts.Token));
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

FFmpeg - Images to video

This example will push all images processed by an image capture handler into a playable video.

public async Task FFmpegImagesToVideo()
{                        
    MMALCamera cam = MMALCamera.Instance;
    
    // This example will take an image every 10 seconds for 4 hours
    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    {
        var cts = new CancellationTokenSource(TimeSpan.FromHours(4));

        var tl = new Timelapse { Mode = TimelapseMode.Second, CancellationToken = cts.Token, Value = 10 };
        await cam.TakePictureTimelapse(imgCaptureHandler, MMALEncoding.JPEG, MMALEncoding.I420, tl);

        // Process all images captured into a video at 2fps.
        imgCaptureHandler.ImagesToVideo("/home/pi/images/", 2);
    }
    
    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
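Under the hood, ImagesToVideo shells out to FFmpeg to stitch the captured JPEGs into a video at the requested frame rate. As a rough, language-neutral sketch of the kind of invocation involved (the exact arguments MMALSharp builds are an assumption here, not taken from its source, and `build_images_to_video_args` is a hypothetical helper):

```python
def build_images_to_video_args(image_dir: str, fps: int, output: str) -> list[str]:
    """Build an FFmpeg argument list that stitches JPEGs into a video."""
    return [
        "ffmpeg",
        "-framerate", str(fps),      # input frame rate, e.g. 2fps for a timelapse
        "-pattern_type", "glob",
        "-i", f"{image_dir}/*.jpg",  # consume every JPEG in the capture directory
        "-c:v", "libx264",
        "-pix_fmt", "yuv420p",       # widest player compatibility
        output,
    ]

args = build_images_to_video_args("/home/pi/images", 2, "/home/pi/images/out.mp4")
print(" ".join(args))
```

Running the equivalent ffmpeg command manually against the capture directory is a useful way to experiment with frame rates before settling on a value.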

Video record with Circular Buffer

A CircularBufferCaptureHandler class was added in v0.6 of MMALSharp. This capture handler records image frames to a circular buffer, overwriting the oldest contents once the buffer is full. It supports both MJPEG and H.264 encodings, the latter with some slight caveats. An example of how to use the capture handler can be seen below:
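As a rough sizing guide, the footage a circular buffer can hold is its size in bits divided by the encoder bitrate. The sketch below (plain Python arithmetic, not MMALSharp code) assumes the 25Mb/s figure quoted in these examples:

```python
def buffer_seconds(buffer_bytes: int, bitrate_bps: int) -> float:
    """Approximate seconds of encoded video a circular buffer can hold."""
    return buffer_bytes * 8 / bitrate_bps

# The 6MB buffer used below, at the examples' assumed 25Mb/s bitrate:
print(round(buffer_seconds(6 * 1024 * 1024, 25_000_000), 2))  # prints 2.01
```

So the 6MB buffer in the example below holds roughly two seconds of video at that bitrate; size it up if you need a longer pre-record window.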

public async Task VideoRecordCircularBufferMJPEG()
{
    MMALCamera cam = MMALCamera.Instance;
    MMALCameraConfig.Framerate = new MMAL_RATIONAL_T(25, 1);

    // Using a 6MB circular buffer.
    using (var vidCaptureHandler = new CircularBufferCaptureHandler(6291456, "/home/pi/videos/", "mjpeg"))
    using (var vidEncoder = new MMALVideoEncoder())
    using (var renderer = new MMALVideoRenderer())
    {
        cam.ConfigureCameraSettings();

        var portConfig = new MMALPortConfig(MMALEncoding.MJPEG, MMALEncoding.I420, bitrate: MMALVideoEncoder.MaxBitrateMJPEG);

        // Create our component pipeline. Here we are using MJPEG encoding with a YUV420 pixel format. The video will be taken at 25Mb/s.
        vidEncoder.ConfigureOutputPort(portConfig, vidCaptureHandler);

        cam.Camera.VideoPort.ConnectTo(vidEncoder);
        cam.Camera.PreviewPort.ConnectTo(renderer);

        // Camera warm up time
        await Task.Delay(2000);

        var fireOffRecordingTask = Task.Run(async () =>
        {
            // Wait for 10 seconds before instructing the capture handler to store image frames to the FileStream.
            await Task.Delay(10000);

            vidCaptureHandler.StartRecording();
            
            // Pause this for another 10 seconds. The capture handler will store image frames to the FileStream during this delay period.
            await Task.Delay(10000);

            // We now tell the capture handler to stop recording for the next 10 seconds.
            vidCaptureHandler.StopRecording();
            
            await Task.Delay(10000);

            // Continue storing image frames again until camera task stops.
            vidCaptureHandler.StartRecording();
        });

        var cts = new CancellationTokenSource(TimeSpan.FromSeconds(60));

        var camTask = cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);

        await Task.WhenAll(fireOffRecordingTask, camTask);
    }
    
    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

This example is rather trivial, but it demonstrates how to instruct the capture handler to start and stop recording to the FileStream. In a real application you would likely hold the capture handler as a long-lived object and dispose of it elsewhere once you're finished with it.

I mentioned earlier that when using H.264 there are some slight caveats. H.264 recordings require a key frame at the beginning of the stream in order to be decodable. Because of this, MMALCameraConfig.InlineHeaders must be set to true so that key frames are generated at regular intervals throughout your recording. The CircularBufferCaptureHandler will only begin writing to your FileStream once it has received a key frame; that key frame is stored at the beginning of your video file, and the contents of the circular buffer are written after it.

A new method called RequestIFrame has been added to the video encoder component which, as the name suggests, allows you to request that the video encoder immediately generate a new key frame. This is optional, as key frames are generated at regular intervals by the video encoder when MMALCameraConfig.InlineHeaders is set to true.
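To illustrate the key-frame gating just described, here is a simplified Python sketch (not MMALSharp's actual implementation) of a recorder that discards H.264 NAL units until the first IDR (key) frame arrives:

```python
IDR_NAL_TYPE = 5  # an IDR slice marks a decodable starting point

def nal_type(nal_unit: bytes) -> int:
    """Extract the NAL unit type from the low 5 bits of the header byte."""
    return nal_unit[0] & 0x1F

class KeyFrameGatedRecorder:
    """Records nothing until a key frame is seen, then keeps everything."""

    def __init__(self):
        self.recording = False
        self.data = bytearray()

    def write(self, nal_unit: bytes):
        if not self.recording and nal_type(nal_unit) == IDR_NAL_TYPE:
            self.recording = True  # first key frame: the file starts here
        if self.recording:
            self.data += nal_unit

rec = KeyFrameGatedRecorder()
rec.write(bytes([0x41, 0xAA]))  # non-IDR slice (type 1): dropped
rec.write(bytes([0x65, 0xBB]))  # IDR slice (type 5): recording starts
rec.write(bytes([0x41, 0xCC]))  # subsequent frames are kept
print(len(rec.data))  # prints 4
```

The real capture handler applies the same principle at the buffer level, which is why InlineHeaders matters: without periodic key frames, the gate might never open.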

An example using H.264 encoding can be seen below:

public async Task VideoRecordCircularBufferH264()
{
    MMALCamera cam = MMALCamera.Instance;
    MMALCameraConfig.InlineHeaders = true;
    MMALCameraConfig.Framerate = new MMAL_RATIONAL_T(25, 1);

    // Using a 6MB circular buffer.
    using (var vidCaptureHandler = new CircularBufferCaptureHandler(6291456, "/home/pi/videos/", "h264"))
    using (var vidEncoder = new MMALVideoEncoder())
    using (var renderer = new MMALVideoRenderer())
    {
        cam.ConfigureCameraSettings();

        var portConfig = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.I420, bitrate: MMALVideoEncoder.MaxBitrateLevel4);

        // Create our component pipeline. Here we are using H.264 encoding with a YUV420 pixel format. The video will be taken at 25Mb/s.
        vidEncoder.ConfigureOutputPort(portConfig, vidCaptureHandler);

        cam.Camera.VideoPort.ConnectTo(vidEncoder);
        cam.Camera.PreviewPort.ConnectTo(renderer);

        // Camera warm up time
        await Task.Delay(2000);

        var fireOffRecordingTask = Task.Run(async () =>
        {
            // Wait for 10 seconds before instructing the capture handler to store image frames to the FileStream.
            await Task.Delay(10000); 

            vidCaptureHandler.StartRecording();

            // (Optionally) Request a key frame to be immediately generated by the video encoder.
            vidEncoder.RequestIFrame();

            // Pause this for another 10 seconds. The capture handler will store image frames to the FileStream during this delay period.
            await Task.Delay(10000);

            // We now tell the capture handler to stop recording for the next 10 seconds.
            vidCaptureHandler.StopRecording();

            await Task.Delay(10000);

            // Continue storing image frames again until camera task stops.
            vidCaptureHandler.StartRecording();
            
            vidEncoder.RequestIFrame();
        });

        var cts = new CancellationTokenSource(TimeSpan.FromSeconds(60));

        var camTask = cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);

        await Task.WhenAll(fireOffRecordingTask, camTask);
    }
    
    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Store Motion Vectors

Starting with v0.6 of MMALSharp, users can store the motion vector data generated by a video encoder component using H.264 encoding. Three things are required: the MMALCameraConfig.InlineMotionVectors config property must be set to true to instruct the video encoder to generate motion vector data; the storeMotionVectors parameter on the MMALPortConfig constructor must be set to true; and finally, InitialiseMotionStore must be called on a capture handler implementing IMotionVectorCaptureHandler so the capture handler knows which stream you wish to store the data to.

An example of how to do this can be seen below:

public async Task StoreMotionVectors()
{
    MMALCamera cam = MMALCamera.Instance;
    MMALCameraConfig.Framerate = new MMAL_RATIONAL_T(25, 1);
    
    // Motion vector data will not be generated unless this config property is set to true.
    MMALCameraConfig.InlineMotionVectors = true;

    // A new file called motion.dat will be created containing the motion vector data.
    using (var motionVectorStore = File.Create("/home/pi/videos/motion.dat"))
    using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/", "h264"))
    using (var vidEncoder = new MMALVideoEncoder())
    using (var renderer = new MMALVideoRenderer())
    {
        cam.ConfigureCameraSettings();

        // The storeMotionVectors parameter must be set to true to instruct the port to store motion vector data.
        var portConfig = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.I420, bitrate: MMALVideoEncoder.MaxBitrateLevel4, storeMotionVectors: true);

        // Create our component pipeline. Here we are using H.264 encoding with a YUV420 pixel format. The video will be taken at 25Mb/s.
        vidEncoder.ConfigureOutputPort(portConfig, vidCaptureHandler);

        cam.Camera.VideoPort.ConnectTo(vidEncoder);
        cam.Camera.PreviewPort.ConnectTo(renderer);

        // Camera warm up time
        await Task.Delay(2000);

        var cts = new CancellationTokenSource(TimeSpan.FromSeconds(20));

        // Initialise the motion vector stream.
        vidCaptureHandler.InitialiseMotionStore(motionVectorStore);

        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
    }
    
    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
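The motion.dat file produced above contains the raw inline motion vectors emitted by the Pi's H.264 encoder. The layout is commonly described as one 4-byte record per macroblock (a signed x vector, a signed y vector and a 16-bit SAD value), with (width / 16 + 1) columns and (height / 16) rows per frame. The Python sketch below parses that layout; the helper name and assertions are my own, so verify against your own capture:

```python
import struct

def parse_motion_vectors(data: bytes, width: int, height: int):
    """Split raw inline motion vector data into per-frame (x, y, sad) lists."""
    cols = width // 16 + 1        # one extra column is emitted per row
    rows = height // 16
    frame_size = cols * rows * 4  # 4 bytes per macroblock record
    frames = []
    for offset in range(0, len(data) - frame_size + 1, frame_size):
        frame = [
            struct.unpack_from("<bbH", data, offset + i * 4)
            for i in range(cols * rows)
        ]
        frames.append(frame)
    return frames

# Synthetic single frame for a 64x32 video: 5 cols x 2 rows = 10 macroblocks.
blocks = b"".join(struct.pack("<bbH", 1, -2, 300) for _ in range(10))
frames = parse_motion_vectors(blocks, 64, 32)
print(len(frames), frames[0][0])  # prints: 1 (1, -2, 300)
```

Thresholding the vector magnitudes or SAD values per frame is a common starting point for simple motion detection on top of this data.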

ImageFx component

The ImageFx component is used to apply image effects to a YUV_UV image or video frame and is exposed through the MMALImageFxComponent class. The component supports the YUV420 and YUV422 packed planar image formats. Effects are already supported via the MMALCameraComponent class, configured using MMALCameraConfig.ImageFx; the ImageFx component additionally allows users who are not using the camera to apply effects to images and videos in a standalone environment.

Note: When working with video, the ImageFx component cannot be connected directly to a Video Encoder component, so you need to add a splitter component in between.

Image

In this example, we first take a JPEG image stored at /home/pi/images/imagefx/testi420.jpg, with a resolution of 640 x 480 and a YUV420 pixel format, and pass it to an Image Decoder component. The decoder decodes the image to raw YUV420. From there, the decoded image is passed to the ImageFx component, where the Solarize effect is applied, and finally to an Image Encoder component, where it is re-encoded as a JPEG with a YUV420 pixel format.

public async Task DecodeThenEncodePictureFromFilestreamWithImageFxComponent()
{
    MMALStandalone standalone = MMALStandalone.Instance;

    using (var stream = File.OpenRead("/home/pi/images/imagefx/testi420.jpg"))
    using (var inputCaptureHandler = new InputCaptureHandler(stream))
    using (var outputCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/imagefx", "jpeg"))
    using (var imgDecoder = new MMALImageDecoder())
    using (var imageFx = new MMALImageFxComponent())
    using (var imgEncoder = new MMALImageEncoder())
    {
        var decoderInputPortConfig = new MMALPortConfig(MMALEncoding.JPEG, MMALEncoding.I420, zeroCopy: true);
        var decoderOutputPortConfig = new MMALPortConfig(MMALEncoding.I420, null, width: 640, height: 480, zeroCopy: true);
        
        var imageFxConfig = new MMALPortConfig(MMALEncoding.I420, MMALEncoding.I420, width: 640, height: 480, zeroCopy: true);

        var encoderInputPortConfig = new MMALPortConfig(MMALEncoding.I420, null, width: 640, height: 480, zeroCopy: true);
        var encoderOutputPortConfig = new MMALPortConfig(MMALEncoding.JPEG, MMALEncoding.I420, width: 640, height: 480, zeroCopy: true);

        // Create our component pipeline.
        imgDecoder.ConfigureInputPort(decoderInputPortConfig, inputCaptureHandler)
                  .ConfigureOutputPort(0, decoderOutputPortConfig, null);

        imageFx.ConfigureInputPort(imageFxConfig, imgDecoder.Outputs[0], null)
               .ConfigureOutputPort(0, imageFxConfig, null);
        
        imgEncoder.ConfigureInputPort(encoderInputPortConfig, imageFx.Outputs[0], null)
                  .ConfigureOutputPort<FileEncodeOutputPort>(0, encoderOutputPortConfig, outputCaptureHandler);

        imageFx.ImageEffect = MMAL_PARAM_IMAGEFX_T.MMAL_PARAM_IMAGEFX_SOLARIZE;

        imgDecoder.Outputs[0].ConnectTo(imageFx);
        imageFx.Outputs[0].ConnectTo(imgEncoder);

        standalone.PrintPipeline(imgDecoder);

        await standalone.ProcessAsync(imgDecoder);
    }
}

Video

In this example we first take an H.264 video stored at /home/pi/videos/imagefx/testi420.h264, with a resolution of 640 x 480, a YUV420 pixel format and a frame rate of 25fps, and pass it to a Video Decoder component. The decoder decodes the video to raw YUV420. From there, the decoded video is passed to the ImageFx component, where the Solarize effect is applied. Because the ImageFx component cannot be connected directly to a Video Encoder, we employ the Splitter component as a middle-man. The Splitter is finally connected to the Video Encoder component, where the video is re-encoded as H.264 with a YUV420 pixel format.

public async Task DecodeThenEncodeVideoFromFilestreamWithImageFx()
{
    MMALStandalone standalone = MMALStandalone.Instance;

    using (var stream = File.OpenRead("/home/pi/videos/imagefx/testi420.h264"))
    using (var inputCaptureHandler = new InputCaptureHandler(stream))
    using (var outputCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/imagefx", "h264"))
    using (var splitter = new MMALSplitterComponent())
    using (var vidDecoder = new MMALVideoDecoder())
    using (var vidEncoder = new MMALVideoEncoder())
    using (var imageFx = new MMALImageFxComponent())
    {
        var decoderInputConfig = new MMALPortConfig(MMALEncoding.H264, null, width: 640, height: 480, framerate: 25, zeroCopy: true);
        var decoderOutputConfig = new MMALPortConfig(MMALEncoding.I420, null, width: 640, height: 480, zeroCopy: true);

        var imageFxConfig = new MMALPortConfig(MMALEncoding.I420, MMALEncoding.I420, width: 640, height: 480, zeroCopy: true);

        var encoderInputConfig = new MMALPortConfig(MMALEncoding.I420, null, width: 640, height: 480, zeroCopy: true);
        var encoderOutputConfig = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.I420, width: 640, height: 480, framerate: 25, zeroCopy: true);

        var splitterInputConfig = new MMALPortConfig(MMALEncoding.I420, null, framerate: 25, zeroCopy: true);
        var splitterOutputConfig = new MMALPortConfig(null, null, zeroCopy: true);

        vidDecoder.ConfigureInputPort(decoderInputConfig, inputCaptureHandler)
                  .ConfigureOutputPort(0, decoderOutputConfig, null);

        imageFx.ConfigureInputPort(imageFxConfig, vidDecoder.Outputs[0], null)
               .ConfigureOutputPort(0, imageFxConfig, null);

        splitter.ConfigureInputPort(splitterInputConfig, imageFx.Outputs[0], null)
                .ConfigureOutputPort(0, splitterOutputConfig, null);

        vidEncoder.ConfigureInputPort(encoderInputConfig, splitter.Outputs[0], null)
                  .ConfigureOutputPort<FileEncodeOutputPort>(0, encoderOutputConfig, outputCaptureHandler);
        
        vidDecoder.Outputs[0].ConnectTo(imageFx);
        imageFx.Outputs[0].ConnectTo(splitter);
        splitter.Outputs[0].ConnectTo(vidEncoder);

        await standalone.ProcessAsync(vidDecoder);
    }
}

Colour Enhancement

The ImageFx component also allows a user to apply a constant colour enhancement, which is applied after the effect. This works for both images and videos. The example below demonstrates how to apply a blue colour to your effect using the image example seen earlier.

public async Task DecodeThenEncodePictureFromFilestreamWithImageFxComponentAndColourEnhancement()
{
    MMALStandalone standalone = MMALStandalone.Instance;

    using (var stream = File.OpenRead("/home/pi/images/imagefx/testi420.jpg"))
    using (var inputCaptureHandler = new InputCaptureHandler(stream))
    using (var outputCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/imagefx", "jpeg"))
    using (var imgDecoder = new MMALImageDecoder())
    using (var imageFx = new MMALImageFxComponent())
    using (var imgEncoder = new MMALImageEncoder())
    {
        var decoderInputPortConfig = new MMALPortConfig(MMALEncoding.JPEG, MMALEncoding.I420, zeroCopy: true);
        var decoderOutputPortConfig = new MMALPortConfig(MMALEncoding.I420, null, width: 640, height: 480, zeroCopy: true);
        
        var imageFxConfig = new MMALPortConfig(MMALEncoding.I420, MMALEncoding.I420, width: 640, height: 480, zeroCopy: true);

        var encoderInputPortConfig = new MMALPortConfig(MMALEncoding.I420, null, width: 640, height: 480, zeroCopy: true);
        var encoderOutputPortConfig = new MMALPortConfig(MMALEncoding.JPEG, MMALEncoding.I420, width: 640, height: 480, zeroCopy: true);

        // Create our component pipeline.
        imgDecoder.ConfigureInputPort(decoderInputPortConfig, inputCaptureHandler)
                  .ConfigureOutputPort(0, decoderOutputPortConfig, null);

        imageFx.ConfigureInputPort(imageFxConfig, imgDecoder.Outputs[0], null)
               .ConfigureOutputPort(0, imageFxConfig, null);
        
        imgEncoder.ConfigureInputPort(encoderInputPortConfig, imageFx.Outputs[0], null)
                  .ConfigureOutputPort<FileEncodeOutputPort>(0, encoderOutputPortConfig, outputCaptureHandler);

        imageFx.ImageEffect = MMAL_PARAM_IMAGEFX_T.MMAL_PARAM_IMAGEFX_SOLARIZE;
        imageFx.ColourEnhancement = new ColourEffects(true, Color.Blue);
        
        imgDecoder.Outputs[0].ConnectTo(imageFx);
        imageFx.Outputs[0].ConnectTo(imgEncoder);

        standalone.PrintPipeline(imgDecoder);

        await standalone.ProcessAsync(imgDecoder);
    }
}