Is your feature request related to a problem? Please describe.
This request is related to the following issue: #87 (apologies for necroposting)
ITU-R BS.1770-4 prescribes 400ms (momentary) and 3s (short-term) sliding windows for measurement, and as a result ffmpeg-normalize will disregard files shorter than 3s.
Describe the solution you'd like
The ability to disable/disregard short-term loudness measurements, to enable integrated loudness measurements on short files (<3s).
Pseudo:
ffmpeg-normalize input.wav -o output.wav --integrated-only
Describe alternatives you've considered
The current solution is to pad the file with silence, normalize, then trim, but this is not a scalable solution as it adds multiple passes of processing.
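As an illustration of this workaround, here is a minimal sketch that builds the three command lines (pad, normalize, trim) without running them. The helper name `build_workaround_commands`, the intermediate file names, and the use of ffmpeg's `apad`/`atrim` filters are assumptions for illustration; the original clip duration must be known so the padding can be trimmed back off.

```python
def build_workaround_commands(src, dst, duration, pad_seconds=3.0):
    """Return the three command lines for the pad -> normalize -> trim
    workaround. `duration` is the original clip length in seconds
    (needed to trim the padding back off); names are illustrative."""
    padded = src + ".padded.wav"
    normalized = src + ".normalized.wav"
    return [
        # 1) Append silence so the file exceeds the 3 s short-term window.
        ["ffmpeg", "-y", "-i", src,
         "-af", f"apad=pad_dur={pad_seconds}", padded],
        # 2) Normalize the padded file (-o output, -f force overwrite).
        ["ffmpeg-normalize", padded, "-o", normalized, "-f"],
        # 3) Trim back to the original duration.
        ["ffmpeg", "-y", "-i", normalized,
         "-af", f"atrim=end={duration}", dst],
    ]
```

Each command list could then be passed to `subprocess.run`. The sketch makes the scalability complaint concrete: every file requires two extra ffmpeg passes on top of the normalization itself.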
No need to apologize; perhaps the conclusion wasn't so clear. As you can see in #87, this is not an issue with this tool but with the respective loudnorm/ebur128 filter in FFmpeg. So it's an upstream bug, or a missing feature, depending on how you interpret it. As long as the filter does not support it, all we can do is print a warning.
I suggest you post this on https://trac.ffmpeg.org/ as a possible enhancement for the loudnorm/ebur128 filter.