
[Feature Request] RTP streaming playback support #180

Open
itlancer opened this issue Dec 15, 2019 · 8 comments

Comments

@itlancer

Feature Description

Many TV systems, IP cameras, and SIP/VoIP providers use the RTP streaming protocol to deliver video and audio content. There is currently no way to play it back with AIR. It would be a nice feature to extend AIR's video streaming capabilities to support it.

Related issue (not the same): #171

Known Workarounds

None, or write a native extension for it.

@ajwfrost
Collaborator

Hi

We did RTP/RTSP support for a Flash app a few years ago, for a customer, using Alchemy (FlasCC), pushing bytes into a NetStream object. It worked well enough, although we weren't dealing with HD video streams. Anyway, that's an aside in case it's of interest as an alternative workaround.
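For reference, the first step in that sort of workaround is pulling the media payload out of the RTP packets. The fixed RTP header defined in RFC 3550 is straightforward to parse; here is a minimal Python sketch (purely illustrative, unrelated to the FlasCC code mentioned above):

```python
import struct

def parse_rtp_header(packet: bytes) -> dict:
    """Parse the fixed 12-byte RTP header (RFC 3550) and return its fields."""
    if len(packet) < 12:
        raise ValueError("packet shorter than the 12-byte fixed RTP header")
    b0, b1, seq, timestamp, ssrc = struct.unpack("!BBHII", packet[:12])
    header = {
        "version": b0 >> 6,            # should be 2
        "padding": bool(b0 & 0x20),
        "extension": bool(b0 & 0x10),
        "csrc_count": b0 & 0x0F,
        "marker": bool(b1 & 0x80),
        "payload_type": b1 & 0x7F,     # e.g. 96+ for dynamic video payloads
        "sequence": seq,               # used to detect loss/reordering
        "timestamp": timestamp,        # media clock, e.g. 90 kHz for video
        "ssrc": ssrc,
    }
    # Payload starts after the fixed header plus any CSRC identifiers.
    header["payload"] = packet[12 + 4 * header["csrc_count"]:]
    return header

# Sample packet: version 2, marker set, payload type 96, seq 7, ts 1000
sample = struct.pack("!BBHII", 0x80, 0x80 | 96, 7, 1000, 0xDEADBEEF) + b"\x01\x02"
h = parse_rtp_header(sample)
print(h["version"], h["payload_type"], h["sequence"], h["payload"])  # 2 96 7 b'\x01\x02'
```

A real client would also need the RTSP session setup and depacketization per payload format (e.g. reassembling H.264 NAL units) before anything could be fed to a decoder.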

What I'd quite like to do is create a new set of media-related classes within the AIR runtime that are based around a media pipeline approach (think gstreamer or directshow). So we could create media 'source' objects including file-based, http, rtmp, rtp, dash, etc.; the 'sink' objects would be the output devices; and in between there are the demuxing, decoding, and other optional processing stages.
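To make the source/sink idea concrete, here is a tiny Python sketch of the pipeline pattern being described. Every class and method name here is a hypothetical illustration of the gstreamer-style architecture, not a proposed AIR API:

```python
# Minimal sketch of a source -> demux -> decode -> sink pipeline.

class Stage:
    """A pipeline stage: receives buffers and pushes results downstream."""
    def __init__(self):
        self.downstream = None
    def link(self, other):
        self.downstream = other
        return other  # allow chaining: src.link(a).link(b)
    def push(self, buf):
        raise NotImplementedError

class FileSource(Stage):
    def __init__(self, data):
        super().__init__()
        self.data = data
    def run(self):
        for chunk in self.data:          # e.g. container packets
            self.downstream.push(chunk)

class Demuxer(Stage):
    def push(self, buf):
        # A real demuxer would split container packets into elementary
        # streams; here we just tag each buffer with a stream id.
        self.downstream.push(("video", buf))

class Decoder(Stage):
    def push(self, tagged):
        stream_id, buf = tagged
        # Stand-in for decoding: transform the buffer.
        self.downstream.push((stream_id, buf.upper()))

class Sink(Stage):
    """Terminal stage, e.g. a display or audio device."""
    def __init__(self):
        super().__init__()
        self.frames = []
    def push(self, tagged):
        self.frames.append(tagged)

src = FileSource(["pkt1", "pkt2"])
sink = Sink()
src.link(Demuxer()).link(Decoder()).link(sink)
src.run()
print(sink.frames)  # [('video', 'PKT1'), ('video', 'PKT2')]
```

The appeal of this shape is exactly what's described above: new sources (rtp, dash) or processing stages can be slotted in without touching the rest of the pipeline.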

I'd be interested in your feedback on this sort of approach. I hope it would have the benefit of being extensible by the ecosystem (e.g. there's no specific reason why the RTP/RTSP/RTCP or even DASH components would need to be written in anything other than ActionScript), and it may also match the platform/OS capabilities and APIs a little more closely, which would make it easier for us to implement.

Let me know what you think!

thanks

@itlancer
Author

Hi
AIR needs to completely renew its media subsystem (media-related classes) and provide a more flexible way to extend video/audio/streaming playback features.
NetStream::appendBytes() with ffmpeg, native extensions, or something like that is very slow right now, and the quality of the result does not live up to expectations (performance lags, FPS drops, low resolution, etc.).
The gstreamer or directshow approach looks cool and architecturally the right decision, I think.
It doesn't matter how it is implemented under the hood or what languages are used, and maybe there is no reason to embed support for all such protocols (RTP/RTSP/UDP/HLS/RTMP/WebRTC/HDS, ...), sources (file, HTTP, Socket, ByteArray, SMB, FTP, ...), and codecs/formats (H.264, H.263, H.265/HEVC, WebM, MOV, MKV, AVI, WMV, ...) in the AIR core (only modern and widely used ones). But there needs to be a way for developers to extend these capabilities. Maybe such extension should optionally require native extensions in some cases (to boost performance, for example).

In my experience, these are the criteria that should be satisfied (for modern purposes and market requirements); of course, some items on this list are constrained by platform or hardware restrictions:

  • usage with Video/VideoTexture (or whatever replaces them); maybe StageVideo too, though I'm not sure it's needed if Video/VideoTexture work fine
  • at least 8K resolution support for video
  • no explicit limit on the number of simultaneously playing videos
  • at least 120 FPS support
  • superior performance (stable framerate and smooth playback without lags, frame drops, etc.)
  • cross-platform (Windows, Android, iOS, macOS, tvOS, and hopefully Linux/web targets too)
  • optional alpha channel support (FLV, some MOV, etc.)
  • multichannel audio support (not just stereo) inside video
  • HDR support
  • metadata and subtitles support
  • synchronization of multiple videos (outputs) (frame by frame or just by keyframes); maybe achievable by controlling the processing stage
  • controllability: seek, pause, preload, screenshot (draw to BitmapData), volume/pan, etc.
  • support for native codecs (decoders) embedded in or installed on the OS/firmware
  • adaptive streaming support
  • (maybe related) IP camera support, and not just YUV color space support for web (and embedded) cameras

@ajwfrost
Collaborator

Great, thanks for that, sounds good - useful feedback!

@itlancer
Author

One more thing: all operations with video/audio should be non-blocking (async). Starting/loading, seeking, etc. should not stall other processes and the render pipeline, causing lags and frame drops.

@crooksy88

Are there any plans to add support for playback of H.265/HEVC videos in AIR?

@amorganiv

Hello Andrew...

Would it be possible to update the NetStream appendBytes method to support MP4 files? Currently, it only works with FLV files.

This would allow the MP4 to be streamed and cached to the device. Otherwise, in order to save the file, the MP4 has to download fully.

@ajwfrost
Collaborator

Hi @amorganiv - if the MP4 stream contains H.264 and AAC content, then it's possible to adjust the container (i.e. take the packets out of the MP4 and put them into FLV) and then push this in. We did that for a customer many years ago; sadly, we don't own that code! But in terms of natively supporting MP4 - I can check, but we may find it's better to use a new mechanism rather than appendBytes.
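To illustrate the container adjustment being described: H.264 samples stored in an MP4 are already in the length-prefixed (AVCC) layout that FLV's AVC video packets expect, so remuxing is mostly a matter of wrapping each sample in an FLV tag. A rough Python sketch (the function name is illustrative; a real remuxer would also emit the FLV file header, the AVC sequence header with packet type 0, and audio tags):

```python
import struct

def flv_video_tag(avcc_sample: bytes, timestamp_ms: int,
                  keyframe: bool, avc_packet_type: int = 1) -> bytes:
    """Wrap one AVCC-formatted H.264 sample (as stored in an MP4 'mdat')
    in an FLV video tag, followed by the PreviousTagSize field."""
    frame_type = 1 if keyframe else 2              # 1 = keyframe, 2 = inter frame
    body = bytes([frame_type << 4 | 7,             # CodecID 7 = AVC
                  avc_packet_type])                # 0 = seq header, 1 = NALU
    body += (0).to_bytes(3, "big")                 # composition time offset
    body += avcc_sample                            # length-prefixed NALUs, as in MP4
    tag = bytes([9])                               # tag type 9 = video
    tag += len(body).to_bytes(3, "big")            # data size
    tag += (timestamp_ms & 0xFFFFFF).to_bytes(3, "big")
    tag += bytes([(timestamp_ms >> 24) & 0xFF])    # extended timestamp byte
    tag += (0).to_bytes(3, "big")                  # stream id, always 0
    tag += body
    tag += struct.pack(">I", len(tag))             # PreviousTagSize
    return tag

sample = b"\x00\x00\x00\x02\x09\xf0"               # one tiny length-prefixed NALU
tag = flv_video_tag(sample, timestamp_ms=40, keyframe=True)
print(tag[0], len(tag))  # 9 26
```

The resulting tags could then be fed into NetStream.appendBytes() after an FLV header and the AAC/AVC sequence headers, which is essentially the approach Andrew mentions above.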

thanks

@amorganiv

Thanks Andrew... it would be a nice feature to be able to stream & cache MP4 files.
