Overview:

This is a research project exploring the power of FFmpegKit to enable live streaming from the device's camera to RTMP or RTSP servers. FFmpegKit provides a convenient wrapper around FFmpeg, making it easy to capture, encode, and transmit audio and video streams.

It is also a chance to check how well this approach performs against existing live streaming packages like HaishinKit.

Features:

  • Live stream video and audio from the device's camera to RTMP or RTSP servers.
  • Customize FFmpeg commands to meet specific streaming requirements.
  • Seamless integration with AVCaptureSession for camera and microphone access.
  • Asynchronous execution for smooth streaming without blocking the main thread.

Motivation

I have worked on many live streaming apps, using libraries such as HaishinKit and LFLiveKit. I always wondered whether we could publish live feeds using FFmpeg from a mobile app. FFmpeg is a powerful multimedia processing tool that can capture, encode, and transmit audio and video in real time, and it is commonly used for live streaming to a server. But I was not sure whether this could be done on the mobile side.

I am using FFmpegKit: https://github.com/arthenica/ffmpeg-kit

Stage 1 (AVFoundation)

The FFmpeg avfoundation input format allows you to capture video and audio from macOS and iOS devices using AVFoundation.

ffmpeg -f avfoundation -i "0:0" -c:v libx264 -c:a aac -f flv rtmp://your-rtmp-server/app/stream

Although FFmpeg supports avfoundation as an input device, it doesn't inherently provide a preview of the camera feed; avfoundation is focused on capturing and processing audio and video data rather than rendering a live preview.

Stage 2 (Named Pipe)

While researching how to use named pipes with FFmpeg on iOS, I found a wonderful example of this approach built with Flutter by dji_flutter.


You can create a named pipe as follows:

let videoPipe = FFmpegKitConfig.registerNewFFmpegPipe()
let audioPipe = FFmpegKitConfig.registerNewFFmpegPipe()
let ffmpegCommand = "-re -f rawvideo -pixel_format bgra -video_size 1920x1080 -framerate 30 -i \(videoPipe!) " +
    "-f s16le -ar 48000 -ac 1 -itsoffset -5 -i \(audioPipe!) " +
    "-framerate 30 -pixel_format yuv420p -c:v h264 -c:a aac -vf \"transpose=1,scale=360:640\" -b:v 640k -b:a 64k -vsync 1 " +
    "-f flv \(url!)"

// Execute FFmpeg command
FFmpegKit.executeAsync(ffmpegCommand) { session in
    // Handle FFmpeg execution completion
    print("FFmpeg execution completed with return code \(session.returnCode)")
}

Writing to pipe

To write to the pipe, we simply use FileHandle and specify the pipe path.

if let currentPipe = self.videoPipe, let fileHandle = try? FileHandle(forWritingTo: URL(fileURLWithPath: currentPipe)) {
    if #available(iOS 13.4, *) {
        try? fileHandle.write(contentsOf: data)
    } else {
        fileHandle.write(data)
    }
    fileHandle.closeFile()
} else {
    print("Failed to open file handle for writing")
}

The output video was laggy because I was not using any kind of buffering. But using a buffer led to another problem: FFmpeg would quit mid-stream because the named pipe reached EOF during the buffering process.
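The EOF behavior described above can be reproduced with a plain POSIX FIFO, independent of FFmpegKit. This is a minimal sketch (the pipe path and chunk contents are made up for illustration): once the last writer closes the pipe, the reader's next read() returns 0, i.e. EOF — which is what makes FFmpeg stop while the writer side is busy buffering.

```swift
import Foundation
#if canImport(Glibc)
import Glibc
#else
import Darwin
#endif

// Create a throwaway FIFO, standing in for the FFmpegKit pipe.
let path = NSTemporaryDirectory() + "eof-demo.pipe"
unlink(path)
precondition(mkfifo(path, 0o600) == 0, "mkfifo failed")

// Stand-in for the app writing one buffered chunk and then disconnecting.
let writer = Thread {
    let fd = open(path, O_WRONLY)   // blocks until a reader opens the FIFO
    _ = "chunk".withCString { write(fd, $0, 5) }
    close(fd)                       // last writer gone
}
writer.start()

// Stand-in for ffmpeg reading the pipe.
let fd = open(path, O_RDONLY)           // blocks until the writer opens
var buf = [UInt8](repeating: 0, count: 16)
let first = read(fd, &buf, buf.count)   // 5 bytes: "chunk"
let second = read(fd, &buf, buf.count)  // 0 bytes: EOF -- ffmpeg would quit here
print("first read: \(first) bytes, second read: \(second) bytes")
close(fd)
unlink(path)
```

The second read returning 0 rather than blocking is the whole problem: any pause between buffered writes looks like end-of-stream to the reader.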

Stage 3

I looked up solutions to this problem and found this: https://unix.stackexchange.com/questions/483359/how-can-i-stop-ffmpeg-from-quitting-when-it-reaches-the-end-of-a-named-pipe

We just need to open the named pipe ourselves with O_RDWR and keep the descriptor open. In Swift:

let videoFileDescriptor = open(videoPipe!, O_RDWR)
let audioFileDescriptor = open(audioPipe!, O_RDWR)

This worked very well!!
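Why this works can again be demonstrated with a plain POSIX FIFO (a hypothetical sketch; the path and chunk are made up). Because the process itself holds a read-write descriptor, the FIFO always has at least one writer registered, so the reader blocks waiting for more data instead of seeing EOF when a transient writer disconnects:

```swift
import Foundation
#if canImport(Glibc)
import Glibc
#else
import Darwin
#endif

let path = NSTemporaryDirectory() + "ordwr-demo.pipe"
unlink(path)
precondition(mkfifo(path, 0o600) == 0, "mkfifo failed")

// The trick: this descriptor counts as a writer for the FIFO's lifetime.
let keepAlive = open(path, O_RDWR)

// A transient writer that connects, writes one chunk, and disconnects.
let writer = Thread {
    let fd = open(path, O_WRONLY)
    _ = "chunk".withCString { write(fd, $0, 5) }
    close(fd)   // would normally trigger EOF for the reader
}
writer.start()

let rd = open(path, O_RDONLY | O_NONBLOCK)
Thread.sleep(forTimeInterval: 0.3)       // let the writer run and exit
var buf = [UInt8](repeating: 0, count: 16)
let n = read(rd, &buf, buf.count)        // 5 bytes: "chunk"
let again = read(rd, &buf, buf.count)    // -1 (EAGAIN): no data *yet*, not EOF (0)
print("read \(n) bytes; next read returned \(again)")
close(rd)
close(keepAlive)
unlink(path)
```

With the keep-alive descriptor in place, the second read reports "no data yet" (-1/EAGAIN here because the reader is non-blocking) instead of EOF (0), so a streaming reader like FFmpeg keeps waiting for the next frame.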

Usage

let cameraSource = CameraSource(position: .front)
let microphoneSource = MicrophoneSource()
let ffLiveKit = FFLiveKit()
try? ffLiveKit.connect(connection: RTMPConnection(baseUrl: "rtmp://192.168.1.100:1935"))
ffLiveKit.addSource(camera: cameraSource, microphone: microphoneSource)
cameraSource.startPreview(previewView: self.view)
ffLiveKit.prepare(delegate: self)
if !isRecording {
    try? ffLiveKit.publish(name: "mystream")
} else {
    ffLiveKit.stop()
}

Demo

out.mp4

Performance compared to HaishinKit 🤔🤔

FFmpeg

(performance screenshot)


HaishinKit

(performance screenshot)

TODO

  • CPU Optimization