Add support for higher precision pixel formats #1528
Conversation
This looks like a good start, and doesn't architecturally conflict with what I had in mind.
It is rather minimal in what producers/consumers are supported, but I don't see that as a problem for now.
The only bit I don't like is the comment I added in array.h
@@ -502,7 +502,11 @@ struct Filter
    AV_PIX_FMT_ABGR,
    AV_PIX_FMT_YUV444P,
    AV_PIX_FMT_YUV422P,
    AV_PIX_FMT_YUV422P10,
Perhaps this could include more formats?
In my testing, some codecs preferred to decode to RGB48/RGBA64, which should be trivial to handle.
And I was able to add the following YUV formats, which I think only needed handling in the mapping into pixel_format:
AV_PIX_FMT_YUV444P10,
AV_PIX_FMT_YUV422P10,
AV_PIX_FMT_YUV420P10,
AV_PIX_FMT_YUV444P16,
AV_PIX_FMT_YUV422P16,
AV_PIX_FMT_YUV420P16,
AV_PIX_FMT_YUVA444P10,
AV_PIX_FMT_YUVA422P10,
AV_PIX_FMT_YUVA420P10,
AV_PIX_FMT_YUVA444P16,
AV_PIX_FMT_YUVA422P16,
AV_PIX_FMT_YUVA420P16,
This isn't necessary for merging this PR though.
});
}

#ifdef WIN32
std::future<std::shared_ptr<texture>> copy_async(GLuint source, int width, int height, int stride)
/* Unused? */
yes, I forgot to remove this when removing the shared-texture flow for CEF. Feel free to delete it
{
    configuration config = parse_xml_config(ptree, format_repository);

    config.hdr = (depth != common::bit_depth::bit8);

    if (config.hdr && config.primary.has_subregion_geometry()) {
This is a shame, but I can appreciate that making all of these features behave correctly with 16bit will be hard and is not being asked for yet.
Unfortunately this means that it won't be possible to benefit from this work if a key output is needed, so this will only be usable in fullscreen channels, rather than keyed overlay channels. (This is not a problem, just writing out this limitation for when other users are curious)
I'm glad that you approve. This is just the first iteration, to get something out there that can be tested by the client. How "complete" the support will be will ultimately be up to them.
Hi, This is super interesting work and would have been very useful for Eurovision Song Contest 2024 in Malmö, where we're incorporating a lot of color gradients in the graphics. All the graphics for ESC will be done in CasparCG. Keep up the good work, Niklas and Julian!
Nice work. Can this also be used for "custom resolution consumers"?
@@ -225,6 +225,8 @@ struct HDRMetadata
const auto REC_709  = ChromaticityCoordinates{0.640, 0.330, 0.300, 0.600, 0.150, 0.060, 0.3127, 0.3290};
const auto REC_2020 = ChromaticityCoordinates{0.708, 0.292, 0.170, 0.797, 0.131, 0.046, 0.3127, 0.3290};

typedef const char* BSTR;
There's already a typedef called String inside decklink_api.h which handles this abstraction. So replace the uses of BSTR with String and it should be happy.
Ah, that's great! Thank you
Force-pushed from e323e81 to e8d8e88
* Remove the 'magic numbers' related to YCbCr->RGB conversion for SD / HD
* Replace with standard coefficients and matrices for bt.601 and bt.709 respectively

* Needed for providing necessary metadata to consumers about the frame's color_space
* color_space is deduced from the color-depth channel config. A separate config could be added later
* The fragment shader does not perform any color space conversions
@@ -255,15 +257,27 @@ struct server::impl

auto format_desc_str = xml_channel.second.get(L"video-mode", L"PAL");
auto format_desc     = video_format_repository_.find(format_desc_str);
auto color_depth     = xml_channel.second.get<unsigned char>(L"color-depth", 8);
I'm not following what this color_depth value is actually used for.
As far as I can tell, it is being passed around a lot to then only be used by https://github.com/niklaspandersson/casparcg-server/blob/aac7d44da5e25613efd1a0ffcb53892a45062d8a/src/core/mixer/mixer.cpp#L77-L87, but does that need/want a color_space, as at that point it is RGB?
It is used to determine whether to use one or two bytes per color channel in mixer-generated frames. It's needed to maintain the extra precision throughout the pipeline.
sorry, I meant to write this on the color-space parsing, not color-depth
Frames created by the mixer need to be tagged with an explicit color space for consumers such as decklink to be able to attach the correct metadata on their output.
It is also needed to perform correct color space transformations in the mixer when combining material from different color spaces (not implemented yet though)
> It is also needed to perform correct color space transformations in the mixer when combining material from different color spaces (not implemented yet though)
I wouldn't expect that to be necessary, seeing how the output of the shader is rgb. So while frames may enter the mixer as different formats in varying color spaces, the output should always be rgb and surely agnostic of whatever the output color space will be?
> Frames created by the mixer need to be tagged with an explicit color space for consumers such as decklink to be able to attach the correct metadata on their output.
I would expect this to be a property of the consumer, not the whole channel.
I haven't done much research into how HDR and its metadata works, but from what I have read I would expect that the only changes needed are to:
- composite in 16bit rgb
- use the correct algorithm for converting that to 10bit yuv using the chosen hdr mode and metadata values
so unless the mixer is going to be generating frames in yuv, does it need to know what the consumers are doing?
RGB is not color space agnostic. Max green, ie. vec3(0, 1.0, 0), represents different shades of green depending on which color space you're using.
Oh right. Having looked into it some more, you are right.
Sometimes referred to as wide and narrow gamut.
I'm still not entirely keen on this being done on the channel like this, but I don't have any better ideas right now, especially as there isn't a convenient/cheap place to do a per-consumer color conversion.
Yes, absolutely. For ffmpeg I think it would just work without any modifications. Do you have a specific consumer in mind?
When we use custom resolutions it will be the screen consumer.
@niklaspandersson I don't see any comments left that need addressing on this. Let me know if you are happy for it to be merged (or go ahead and merge it yourself)
@Julusian I'm happy if you are happy. Feel free to merge
Great work, thank you! Question: channel bit depth goes from 8 to 16 directly, right? I tried to set it to 10 or 12 but it crashed. I just wanted to quickly test it on a laptop without a decklink to compare image outputs with Photoshop, but it seems that is not possible at the moment.
@Sidonai-1 correct, it is limited to 8 or 16bit
I guess when the screen consumer gets upgraded, @MauriceVeraart, you will be able to feed a large VW at 10 bit. Do you use Windows? Does it handle the screen consumer properly and in sync with the audio? I never used it for production because the frame-time was all over the place, so it was not very smooth, especially compared to Decklink. I wanted to see if a 16bit channel would work as-is with custom resolutions and decklink ports, but I got an error:
I wonder if "Decklink consumer does not support hdr in combination with secondary ports." comes from an SDK limitation or it's just not fully implemented yet. Hopefully it's the latter 😁 It was a bit strange that the log was complaining about HDR. The channel was set to 16bit in the config, with no color space or HDR settings specified. I thought bit depth and HDR on/off would be 'independent' in some way. Are they completely intertwined? Thanks.
This branch introduces:
Only minimal testing has been performed on Windows.