gstreamer encoder #3706
Comments
Works pretty well!
hard-coded for vp8 for now - just a PoC
The commits above allow us to add
The trick for decoding raw h264 NALs was found here: https://discourse.gnome.org/t/gstreamer-how-to-decode-h264-from-appsrc/10020/7
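For reference, a minimal Python sketch of that approach (not xpra code; it assumes the byte-stream caps plus `h264parse` trick from the linked thread, with stock GStreamer element names):

```python
# Minimal sketch (assumption: the trick from the linked thread, i.e. declare
# byte-stream caps on appsrc and let h264parse repackage the NALs for the decoder).
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
pipeline = Gst.parse_launch(
    "appsrc name=src is-live=true do-timestamp=true format=time "
    'caps="video/x-h264,stream-format=byte-stream,alignment=au" '
    "! h264parse ! avdec_h264 ! videoconvert ! autovideosink"
)
src = pipeline.get_by_name("src")
pipeline.set_state(Gst.State.PLAYING)

def push_nal_unit(data: bytes) -> None:
    # each call pushes one Annex-B access unit into the decoder
    src.emit("push-buffer", Gst.Buffer.new_wrapped(data))
```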
This is likely to sway the decision of how to deal with
Similar to #2764: segfault in OsLookupColor. OsLookupColor is just a default function the backtracer seems to pick up when it craps out.
doesn't seem to work, even with 'nvidia-vaapi-driver' installed
as it would be checked using isinstance by the window video source
we can't specify the input image's rowstride, so this is the best we can do
Interesting, it may soon be possible to use
but do provide an environment variable to override this, which is also useful on Linux for disabling a potentially crashy element
Do we need to set the GstVideoColorimetry to full range? ffmpeg does it. For x264enc, we should also handle bandwidth limits by switching to [cbr mode](https://gstreamer.freedesktop.org/documentation/x264/index.html?gi-language=c#GstX264EncPass) while still setting qp-max?
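A small sketch of the x264enc side of that, assuming the `pass`, `bitrate` and `qp-max` properties documented for the element (the colorimetry question is left open):

```python
# Sketch: build the x264enc description, switching to constant bitrate when a
# bandwidth limit is known, while keeping qp-max as a cap on the quantizer.
# (Property names are the documented x264enc ones; the values are illustrative.)
def x264enc_description(bandwidth_kbps: int = 0) -> str:
    if bandwidth_kbps > 0:
        # bandwidth limit known: cbr mode, bitrate is in kbit/s
        return f"x264enc tune=zerolatency pass=cbr bitrate={bandwidth_kbps} qp-max=51"
    # no limit: stay in the default quantizer-based mode
    return "x264enc tune=zerolatency qp-max=51"
```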
as it has context limit issues that can trip up the tests or unsuspecting users
so we can re-use the same code for screen capture and bus messages
We can also do: xpra/xpra/codecs/gstreamer/encoder.py, line 30 in 4d62441
Because when we hit the artificial context limit on consumer cards, the gstreamer pipeline will just time out instead of erroring out quickly, which is problematic. The error can be seen at higher debug logging levels.
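One way around that (a sketch, not the xpra code): bound the state change and poll the bus, so a broken element shows up as an error within a few seconds instead of hanging:

```python
# Sketch: fail fast if the pipeline cannot reach PLAYING (e.g. NVENC context
# limit reached), instead of letting it sit there until some longer timeout.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

def start_or_fail(pipeline: Gst.Pipeline, timeout_s: int = 5) -> None:
    pipeline.set_state(Gst.State.PLAYING)
    # wait at most timeout_s for the state change to complete
    ret = pipeline.get_state(timeout_s * Gst.SECOND)[0]
    if ret in (Gst.StateChangeReturn.FAILURE, Gst.StateChangeReturn.ASYNC):
        # grab any pending error message off the bus for a useful log line
        msg = pipeline.get_bus().timed_pop_filtered(0, Gst.MessageType.ERROR)
        detail = msg.parse_error()[0].message if msg else "timed out"
        pipeline.set_state(Gst.State.NULL)
        raise RuntimeError("pipeline failed to reach PLAYING: " + detail)
```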
despite showing it in the caps options for vaapih264enc
Testing the same version of the code that works perfectly on an old Skylake laptop, this time on a brand new machine:

    $ gst-launch-1.0 videotestsrc ! videoconvert ! vaapih264enc ! fakesink
    ERROR: from element /GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0: Internal data stream error.

Judging by my searches, it could be anything and that's a common problem with vaapi.
I'd like to check the current status of Xpra encoding and decoding with Intel QuickSync. I could not find any information on the internet as this is a rapidly moving project. Does it work via vaapi through gstreamer? Apologies if this is the wrong place for this, but I've looked everywhere I can.
@algeorge the gstreamer encoder in this ticket is included in all v5 official builds (currently in the beta area only) but it is disabled by default:
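For anyone wanting to try it, the switch is along these lines; the exact option name and encoder identifier should be verified against the conf file shipped with the build, this is purely illustrative:

```
# illustrative only - check the option name against your xpra.conf:
video-encoders = gstreamer, vpx, x264
```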
In the future, I would like to add utilities to make it easier for end users to test and enable the pipeline combinations that work best on their specific hardware. For now, this is a manual process.
Manual is great! Because it even works! As far as settings to pass through, are there any options for gstreamer anywhere? To modify colorspace, bit-depth, quality, bitrate, etc.? Or is that handled as part of the regular Xpra profile as written in the referenced conf file? I'd like to know if "profiles" can be made for certain connections using Gstreamer passthrough.
I would like to add that this depends on the encoder, which is why I am so interested in QuickSync. MJPEG has sub-1ms encoding latency at a fairly high bitrate. MPEG2 is in the same ballpark for latency because the codec is so simple.
We used to have MPEG1 and MPEG2 support via the ffmpeg encoder, but this was removed recently.
Will follow up in #3964.
Worth a try.
We already have `appsink` and `appsrc` working code, so it should be possible to feed `BGRX` or `YUV420P` to a video encoder (initially `x264`) and get the stream as output.
This can be used for testing hardware encoders and seeing what the difference is with `libva` (#451): perhaps this one won't lock up the system quite so easily?
If that pans out, we can try to feed the pixel data to the pipeline without copying it.
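A minimal Python sketch of that idea, using stock element names (illustrative only, not the xpra encoder):

```python
# Sketch: raw BGRx frames in via appsrc, H264 out via appsink.
# ("BGRx" is GStreamer's name for the 32-bit BGRX layout.)
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
W, H = 640, 480
pipeline = Gst.parse_launch(
    "appsrc name=src is-live=true do-timestamp=true format=time "
    f'caps="video/x-raw,format=BGRx,width={W},height={H},framerate=30/1" '
    "! videoconvert ! x264enc tune=zerolatency speed-preset=ultrafast "
    "! appsink name=sink"
)
src = pipeline.get_by_name("src")
sink = pipeline.get_by_name("sink")
pipeline.set_state(Gst.State.PLAYING)

# wrap one frame of raw pixels (4 bytes per BGRx pixel) in a GstBuffer and push it
src.emit("push-buffer", Gst.Buffer.new_wrapped(b"\x00" * (W * H * 4)))

# pull the corresponding compressed sample (blocks until the encoder produces it)
sample = sink.emit("pull-sample")
if sample:
    buf = sample.get_buffer()
    h264_data = buf.extract_dup(0, buf.get_size())
```

Swapping `x264enc` for a hardware element (e.g. `vaapih264enc`) in the same pipeline string is then a one-line change, which is what makes this useful for comparing encoders.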
It should be possible to implement this as an element in Python:
- Implementing an audio plotter
- How to use Gstreamer AppSrc in Python
As for the pipeline option, there are some examples of gstreamer syntax that match what we already do with `x264` in Cython code:
- Realtime/zero-latency video stream: what codec parameters to use?
Element references:
Some information on EOS handling: How to wait for x264enc to encode buffered frames on end-of-stream
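The usual pattern there (a sketch, assuming the standard appsrc and bus API): signal end-of-stream on the appsrc, then block on the bus until the EOS message comes back out, at which point x264enc has flushed its queued frames:

```python
# Sketch: make sure x264enc's buffered frames are emitted before tearing down.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

def drain_and_stop(pipeline: Gst.Pipeline, appsrc) -> None:
    appsrc.emit("end-of-stream")   # no more input frames
    bus = pipeline.get_bus()
    # blocks until EOS has travelled through the encoder and the sink,
    # or until something errors out along the way
    msg = bus.timed_pop_filtered(
        Gst.CLOCK_TIME_NONE,
        Gst.MessageType.EOS | Gst.MessageType.ERROR,
    )
    if msg and msg.type == Gst.MessageType.ERROR:
        raise RuntimeError(msg.parse_error()[0].message)
    pipeline.set_state(Gst.State.NULL)
```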