
Questions about Profile settings (for documentation) #3427

Closed
MBB232 opened this issue May 1, 2020 · 10 comments
Labels
docs Bad or missing help texts / documentation libopenshot Issues or PRs that involve the libopenshot C++ backend media-handling Issues related to video/audio file processing & playback question A general question about OpenShot or how to use it. Or visit www.openshot.org/forum/ stale This issue has not had any activity in 60 days :(

Comments

@MBB232

MBB232 commented May 1, 2020

I am working on the documentation for Profiles, but there are a few things I do not understand myself. Could someone clear them up before I give wrong info?
https://github.com/MBB232/openshot-qt/blob/develop/doc/profiles.rst

A
progressive=0
0=no = interlaced,
1= yes = progressive
Correct?

B sample_aspect_num=1 & sample_aspect_den=1
This too seems to be a fraction, but in OpenShot both are set to 1, whereas the MLT framework uses different, higher values for them. https://www.mltframework.org/docs/profiles/
Are there cases where these are not the same that need to be covered? Or is it just trivia that should be linked to? Is there an important difference between sample and display aspect ratio?
Is it for cases like this? #3117

C: colorspace: I have added the line for Colorspace as it was not included yet.
I found that it is about YUV colorspace.
The current profiles only use 601 and 709. Does Openshot also support the new 2020 standard used for UHD video?

D: If a source video uses different sizes for "Video Resolution" and "Buffer dimensions", which resolution should be recommended for the profile?

@MBB232 MBB232 added the question A general question about OpenShot or how to use it. Or visit www.openshot.org/forum/ label May 1, 2020
@SuslikV
Contributor

SuslikV commented May 2, 2020

C. ...support the new 2020 standard used for UHD video?

  1. Absolutely not. There is no way to support this any time soon.
  2. File containers are not flagged (yet; see Set decoding and encoding color details libopenshot#495), even for 709 export. Import of RGB images uses the wrong colorspace.
  3. Color space information from the profiles is never used in OpenShot, and it is unknown when it will be.

@ferdnyc ferdnyc added docs Bad or missing help texts / documentation libopenshot Issues or PRs that involve the libopenshot C++ backend media-handling Issues related to video/audio file processing & playback labels May 2, 2020
@ferdnyc
Contributor

ferdnyc commented May 3, 2020

Interlacing

progressive=0
0=no = interlaced,
1= yes = progressive
Correct?

Yes, correct. progressive= is a boolean (true/false) value, and when it's false (0) the format uses interlaced frames.

When a format is not progressive there's a second parameter needed to fully describe how frame images are stored. A second true/false value, top_field_first, represents the ordering of the frame-pairs containing a complete frame's interlaced lines. When top_field_first is 1 (true), the frame holding the odd-numbered lines comes before the one containing even-numbered lines; when it's false, they're in the opposite order.

In terms of input metadata, libopenshot always shows a value for top_field_first when describing any video regardless of format, but it's meaningless and ignored when progressive=1. (In practice it's always 1 for progressive formats, which I suppose is technically correct since when all of the fields are stored in a single frame, the top row is also the first row. But that's tautological, since by that definition progressive=1, top_field_first=0 is an impossible combination.)
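A minimal sketch of how these two flags combine to describe a format's scan mode (the function name is hypothetical, not OpenShot's actual API):

```python
def describe_scan(progressive: int, top_field_first: int = 1) -> str:
    """Interpret a profile's scan-related flags.

    progressive: 1 = progressive, 0 = interlaced.
    top_field_first: only meaningful when progressive == 0;
    1 = the field with odd-numbered lines is stored first,
    0 = the field with even-numbered lines is stored first.
    """
    if progressive:
        return "progressive"  # top_field_first is ignored here
    order = "top field first" if top_field_first else "bottom field first"
    return f"interlaced ({order})"
```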

Sample aspect / pixel aspect

B sample_aspect_num=1 & sample_aspect_den=1
This too seems to be a fraction.

Any _num, _den value pair is explicitly a fraction, yes.

, but in openshot they are both set to 1

Not always, depends on the profile. 1/1 is most common in formats intended to be shown on a digital display like a computer monitor, though. Typically those screens, and image data encoded for them, will have the same density/resolution/DPI in both dimensions. Each pixel horizontally or vertically in the data maps directly to a single pixel location in the output image.

But broadcast formats, in particular analog ones, may be capable of very different resolution horizontally vs. vertically. The data is encoded in "grids" of pixel values with dimensions that don't match the actual shape of how the frames should be drawn, and that's where the pixel ratio comes into play. Using a non-uniform (non-1/1) ratio lets frames be encoded more efficiently by using less resolution in one dimension and more in the other, without the resulting image being distorted (changing its aspect ratio).

Other times, pixel aspect is important because it's used to encode as much resolution as possible into a format with a different aspect ratio. Anamorphic DVD is the classic example. DVD frames are formatted for 4:3 (non-widescreen) TVs and have a resolution of ~ 720×480 for NTSC. (That resolution would actually be 3:2 with square pixels, but even 4:3 NTSC has a pixel aspect of 10:11.) Anamorphic DVDs pump that pixel aspect up to 40:33, making 720×480 pixels fit in a widescreen 16:9 aspect instead of the original 4:3 aspect. So, exact same resolution, same bandwidth required to store a frame, same pixel count per frame... but the shape of the image as displayed depends on the pixel aspect ratio.
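The DVD arithmetic above can be checked directly. One caveat: to get exact ratios the calculation has to assume the 704-pixel active picture width (the nominal 720-pixel line includes some horizontal padding), which is an assumption layered on top of the "~720×480" figure in the text:

```python
from fractions import Fraction

def display_aspect(width: int, height: int, pixel_aspect: Fraction) -> Fraction:
    # display aspect ratio = storage aspect ratio * pixel (sample) aspect ratio
    return Fraction(width, height) * pixel_aspect

# The exact same stored grid, two different display shapes:
assert display_aspect(704, 480, Fraction(10, 11)) == Fraction(4, 3)   # standard NTSC DVD
assert display_aspect(704, 480, Fraction(40, 33)) == Fraction(16, 9)  # anamorphic NTSC DVD
```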

All of this remains very confusing no matter how many times you've encountered it, in my experience. I assume at some point it must become second-nature and obvious, but I have yet to reach that point and don't expect to. Unless a person is dealing with this stuff professionally on a daily basis, I figure their chances aren't good.

Typically the ratio is stored as a reduced fraction. The 59/54 used as an example in that MLT documentation was probably chosen because it's a fractional value with unusually high components despite being fully reduced: There's no way to express the value 59/54 using any smaller values, as 59 and 54 have no divisors in common other than 1. But some other commonly-encountered pixel ratios are less extreme fractions like 11/10, 8/9, etc.
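Reducing an aspect fraction is just a greatest-common-divisor step; a quick sketch showing why 59/54 can't shrink any further:

```python
from math import gcd

def reduce_aspect(num: int, den: int) -> tuple[int, int]:
    # Divide both components by their greatest common divisor.
    g = gcd(num, den)
    return num // g, den // g

assert reduce_aspect(59, 54) == (59, 54)   # already fully reduced: gcd(59, 54) == 1
assert reduce_aspect(720, 480) == (3, 2)   # the square-pixel shape of an NTSC DVD frame
```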

MLT profiles

Are there cases where these are not the same that need to be covered? Or just a trivia that should be linked to?

It's probably best not to link to the MLT documentation or make any reference to it at all, in the OpenShot documentation. In fact, if there are any existing references/links we should think about removing them to prevent any confusion.

Even though the libopenshot profile data format was originally based on MLT's (a piece of trivia with no practical relevance), the implementation isn't. Because there are major differences in how the file is interpreted, the MLT profile docs don't accurately describe libopenshot profiles even when they have identical data.

Handling of sample/pixel aspect values

Is there an important difference between sample and display aspect ratio?
Is it for cases like this? #3117

It's necessary for output videos created with libopenshot to be encoded into a format like the one in #3117, yes. Or, it would be, if that was working currently.

But not only do we not correctly handle input media with a non-uniform pixel ratio (as #3117 documents), I later discovered (OpenShot/libopenshot#489) that we don't appear to be handling those cases properly in output media either.

Unless I misinterpreted the code, libopenshot's current FFmpegWriter implementation doesn't seem to be applying the profile's sample_aspect_ratio, so it always encodes the output video using square pixels regardless what the profile says.

Profile colorspace

C: colorspace: I have added the line for Colorspace as it was not included yet.
I found that it is about YUV colorspace.
The current profiles only use 601 and 709. Does Openshot also support the new 2020 standard used for UHD video?

As @SuslikV said, not only is 2020 not supported, but whatever the colorspace value is in the profile, it appears to be pretty much ignored in libopenshot's current implementation.

Profiles with a sample aspect that isn't 1.0

D: If a source video uses different sizes for "Video Resolution" and "Buffer dimensions", which resolution should be recommended for the profile?

Not sure I can answer that question without more info; for starters, how are we defining those terms in the context of OpenShot?

Profiles define output formatting, so the recommended profile is whatever format they want their output to be. If we're talking about creating profile definitions to match a given input format, basically the profile should use the values displayed by the "Video Details" section of the "File Properties" dialog in OpenShot. Scanning it real quick, I believe every profile parameter is represented there except for colorspace which isn't used anyway.

Documenting the parts that don't work right, or at all

However, due to the various unresolved issues around profile handling, in certain circumstances the output encoding may not be a perfect match even with a profile that specifies the exact same parameters. Which leads us to (IMHO) the trickiest questions you'll face when writing documentation for OpenShot, for which I have no good answers: How, or do, you document the bugs? Should missing features be addressed, and in what way?

Because colorspace isn't used, does it make sense to document its meaning from a theoretical standpoint based on what we hope it will mean in the future, only to undercut that definition with a note disclaiming that, sorry, right now it doesn't actually work that way? Or is it better to just gloss over it as "an unsupported parameter" and leave it at that, not going into details that aren't relevant now, and may not even apply to some eventual future implementation? Do we just not mention it at all, letting it remain an "undocumented" parameter in the literal sense?

And for things like the output pixel aspect ratio, should the documentation go into the question of non-uniform pixel aspect ratio in output files being a thing OpenShot cannot currently do? (Which seems contradictory to the user guide's purpose of documenting the things you can do in/with OpenShot, not what you can't do.)

Like I said, I have no ready answers to any of those questions. So if anyone feels they can make a case for there being only one correct answer, and why, then you've got my attention.

@ferdnyc
Contributor

ferdnyc commented May 3, 2020

All of this remains very confusing no matter how many times you've encountered it, in my experience. I assume at some point it must become second-nature and obvious, but I have yet to reach that point and don't expect to. Unless a person is dealing with this stuff professionally on a daily basis, I figure their chances aren't good.

Or it could be that it's not actually very confusing, it should be old hat for me by now, and the reason I continue to find it confusing is because I'm a dummy. I have to at least acknowledge that possibility as well.

(Tangentially / for background...)

DVD frames are formatted for 4:3 (non-widescreen) TVs and have a resolution of ~ 720×480 for NTSC. (That resolution would actually be 3:2 with square pixels, but even 4:3 NTSC has a pixel aspect of 10:11.) Anamorphic DVDs pump that pixel aspect up to 40:33, making 720×480 pixels fit in a widescreen 16:9 aspect instead of the original 4:3 aspect. So, exact same resolution, same bandwidth required to store a frame, same pixel count per frame... but the shape of the image as displayed depends on the pixel aspect ratio.

That last part is critical, and gets to the "whys" of pixel aspect. For most encoded media formats, the amount of information they can encode will be limited by some sort of bandwidth capacity — whether it's the maximum transfer rate of a cable, the read speed of a physical media device like a DVD player or a hard drive interface, etc., at some point you reach the maximum available bandwidth and can't get any more data any faster than that.

Whatever and wherever that limit is, in the context of the encoded video it can typically be expressed as a pixels-per-second value. The pipeline your data stream travels down doesn't care what size or shape you define for your two-dimensional video frame. All that matters is, in a given second you're only able to receive a certain number of pixels. Divide that by your frame rate, and you've got the total pixel count available to represent each frame.

The pixel count may be a fixed number, but there are a multitude of different ways those pixels can be allocated in the frame. You're free to chop them up into rows and columns of any size and shape just as long as multiplying the dimensions results in the same total pixel count.

So, when you want to encode 16:9 aspect frames and 4:3 aspect frames at the same bandwidth / data rate, one option (the one used by the anamorphic DVD format) is to set a fixed number of rows and columns regardless of the aspect ratio, and then use a different pixel aspect ratio to stretch that data onto a frame with the correct shape.
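The budget arithmetic in the last few paragraphs, sketched out (the throughput figure is made up for illustration):

```python
def pixels_per_frame(pixels_per_second: int, fps: int) -> int:
    # Divide the pipeline's pixel throughput by the frame rate to get
    # the pixel budget available to represent each frame.
    return pixels_per_second // fps

budget = pixels_per_frame(10_368_000, 30)   # a hypothetical 30 fps pipeline
assert budget == 345_600

# The same budget can be chopped into differently shaped grids...
assert budget == 720 * 480 == 864 * 400
# ...or kept at a fixed 720x480 grid whose *display* shape is then
# controlled by the pixel aspect ratio, as anamorphic DVD does.
```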

@MBB232
Author

MBB232 commented May 18, 2020

@ferdnyc

When a format is not progressive there's a second parameter needed to fully describe how frame images are stored. A second true/false value, top_field_first, represents the o

While this was very educational for me, are you sure OpenShot supports those lines? If not, it should not be mentioned in the manual. (It may fit as trivia for the FAQ)
Because current interlaced profiles like cvd_ntsc do not have those lines.

Perhaps more importantly, does it matter for the output if your source video is interlaced or not?

Speaking of source video, can OpenShot work with source videos if they have a different frame rate? If so, what profile should the user choose?
If not, are there workarounds? I believe there are (AI?) frame-rate conversion tools that try to calculate the missing frames. Are there any recommended? Can they be linked to, like OpenShot does with Blender?

I have replaced both images with new PNGs, and I think that is all I can do. The rest is up to someone with more technical experience. If you want video export settings covered too, I can take screenshots and set up a paragraph. But again, I lack knowledge of how to actually use it.
https://github.com/MBB232/openshot-qt/blob/MBB232-profiles/doc/profiles.rst

@ferdnyc
Contributor

ferdnyc commented May 18, 2020

@ferdnyc

When a format is not progressive there's a second parameter needed to fully describe how frame images are stored. A second true/false value, top_field_first, represents the o

While this was very educational for me, are you sure OpenShot supports those lines? If not, it should not be mentioned in the manual. (It may fit as trivia for the FAQ)
Because current interlaced profiles like cvd_ntsc do not have those lines.

Sorry, I shouldn't have said "is needed". top_field_first defaults to true, so it would only need to be specified in a profile that needs to output bottom field first.

I'm 99% sure the profile format supports the parameter (I will try to remember to make that 100% when I'm back at my computer), so it's probably worth documenting even though I'm not sure any such interlaced FORMATS actually exist, so it may be a useless parameter in practice.

Perhaps more importantly, does it matter for the output if your source video is interlaced or not?

For the output, not at all. The reader will deinterlace all interlaced input videos — everything's treated as progressive internally. And when writing output video using an interlaced profile, the entire stream will be interlaced regardless what format the source video(s) had.

Speaking of source video, can OpenShot work with source videos if they have a different frame rate?

Absolutely, both different from the output frame rate AND potentially different from each other.

If so, what profile should the user choose?

Whatever they want to export with.

IOW, we often get requests to have OpenShot preserve the parameters of "the source video", because tools like HandBrake do that automatically. But HandBrake is a transcoder; its only job is to convert one input video into one output video. So it makes sense to match the input as a starting point.

Anyone who's using OpenShot with just a single input video and no other media is probably using it wrong. The point of an editor like OpenShot is to create videos that combine multiple media files — there's almost never "the" input file, there are multiple input files in different formats. So we don't make assumptions about output format.

The best results will usually come from picking a profile that's the same as or lesser than the input files. Meaning, creating a video from higher-quality sources is better than trying to create a high-quality video from lower-quality sources.

But with frame rate, specifically, either matching the input or using a multiple/divisor of the input is best. If you have a 60fps source video, pick either 60, 30, or 15 fps for the output video. If you pick 50, then instead of every other frame getting skipped (@ 30fps), you'll have an output video that drops every 6th frame, and that's going to look a bit jerky.
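The "every 6th frame" claim can be verified with a naive nearest-frame mapping (a sketch for illustration, not how libopenshot's frame mapping is actually implemented):

```python
def kept_source_frames(src_fps: int, dst_fps: int) -> list[int]:
    """Which source frames a naive nearest-frame rate conversion keeps,
    over one second of video."""
    return [round(n * src_fps / dst_fps) for n in range(dst_fps)]

# 60 fps -> 50 fps: every 6th source frame goes missing.
kept = kept_source_frames(60, 50)
dropped = sorted(set(range(60)) - set(kept))
assert dropped == [3, 9, 15, 21, 27, 33, 39, 45, 51, 57]

# By contrast, a clean divisor skips perfectly evenly:
assert kept_source_frames(60, 30) == list(range(0, 60, 2))
```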

If you have a 29.97fps source video and a 60fps source video in the same project, you have to pick your poison and deal with the consequences of combining those two in the same project. (30fps may be a good choice, as you probably won't notice the occasional doubled frame on the lower-rate video. But it's really up to the user to see what works for them.)

These are the kind of topics where there are no easy, clear-cut rules because it's very dependent on the user's situation and their goals. In terms of documenting OpenShot, the project profile should be set to whatever target format they intend to export their video in, independent of the properties of the input file(s)... though choosing a format similar to a "primary" video, if any, is usually a good idea.

I have replaced both images with new PNGs, and I think that is all I can do. The rest is up to someone with more technical experience. If you want video export settings covered too, I can take screenshots and set up a paragraph. But again, I lack knowledge of how to actually use it.
https://github.com/MBB232/openshot-qt/blob/MBB232-profiles/doc/profiles.rst

I'll take a look, thanks!

@ferdnyc
Contributor

ferdnyc commented May 19, 2020

I should also mention, regarding the interlace info above -- I believe that at least some of that is, as I think I mentioned, quite broken. With no real timeframe on when or if it might be fixed, because there really isn't very much demand at all for interlaced video anymore. And the people who do want it are mostly users in professional/broadcast settings who aren't really OpenShot's target audience.

However, the support is ostensibly there, and present in the UI. And I suspect any attempts to remove it entirely would be met with resistance. So maybe it's better to just studiously talk around it (rather than about it), in the manual. *shrug*

@MBB232
Author

MBB232 commented May 19, 2020

IOW, we often get requests to have OpenShot preserve the parameters of "the source video", because tools like HandBrake and etc. do that automatically. But HandBrake is a transcoder, its only job is to convert one input video into one output video. So it makes sense to match the input as a starting point.
Anyone who's using OpenShot with just a single input video and no other media is probably using it wrong. The point of an editor like OpenShot is to create videos that combine multiple media files — there's almost never "the" input file, there are multiple input files in different formats. So we don't make assumptions about output format.

Actually I think there are 3 reasons for that:
1 The target audience.
The interface is simple enough to be accessible to new/casual users. That's one of the reasons I chose it over Shotcut, Blender and KDEnlive, alongside the really cool idea of integrating third-party tools like Inkscape (which I wanted to learn a bit anyway, so that's efficient).
And the website explicitly states:

We designed OpenShot Video Editor to be an easy to use, quick to learn, and surprisingly powerful video editor.

When cutting up and/or annotating a holiday video, the footage will probably all be from the same camera with the same frame rate.

2 As you explain underneath, it does matter, unless you interpolate and render the missing frames. (For going from 50 to 30 frames, you need to calculate 2 missing frames for each to get 150, of which you then take every 5th frame.)

3 I am pretty sure I have seen it advised as important to set the profile to the footage frame rate, either here on GitHub or on Reddit, as a response to people whose videos came out too short because they had fewer frames per minute. (That advice may have been wrong, but it's how I learned about the existence of profiles in the first place.)
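The interpolation arithmetic in point 2 generalizes: upsample to the least common multiple of the two rates, then decimate evenly. A small sketch (requires Python 3.9+ for math.lcm):

```python
from math import lcm

def conversion_plan(src_fps: int, dst_fps: int) -> tuple[int, int, int]:
    common = lcm(src_fps, dst_fps)    # intermediate frame rate
    inserted = common // src_fps - 1  # frames to interpolate after each source frame
    keep_every = common // dst_fps    # decimation step at the common rate
    return common, inserted, keep_every

# 50 fps -> 30 fps: interpolate 2 frames per source frame (150 total),
# then keep every 5th frame.
assert conversion_plan(50, 30) == (150, 2, 5)
```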

The best results will usually come from picking a profile that's the same as or lesser than the input files. Meaning, creating a video from higher-quality sources is better than trying to create a high-quality video from lower-quality sources.
But with frame rate, specifically, either matching the input or using a multiple/divisor of the input is best. If you have a 60fps source video, pick either 60, 30, or 15 fps for the output video. If you pick 50, then instead of every other frame getting skipped (@ 30fps), you'll have an output video that drops every 6th frame, and that's going to look a bit jerky.

I think that is good advice to give new users. (Experienced users will know how to adjust things to their liking.)

@MBB232
Author

MBB232 commented May 20, 2020

As for interlaced, I came across this issue from last year where you practically stated that interlaced did not work, may never have worked, and is unlikely to ever work again:

#2922

@ferdnyc wrote

Looping in @jonoomph and @eisneinechse, because after a quick grep through the libopenshot code, I am not finding any code that would seem to handle interlaced output in FFmpegWriter.

PS: As for choosing the same export profile as your project profile (and possibly matching your import video?), this is where I read that advice:
#3470 (comment)

@SuslikV wrote
When you import a video that doesn't match the current Preview Profile of OpenShot, the application sets the Time property to map each frame of the imported movie to the preview. This is done in frame numbers (not in seconds!), thus when you export with a different profile, every timing changes...
To work around the current behavior you can:
set the export Video Profile to match the Preview Profile (24 fps in your case).
select a new Preview Profile (the one you will export to) and redo all cuts again.
select a new Preview Profile (the one that matches the FPS of the video source), redo all cuts again, and export with the same FPS profile. But because it's 29.970 fps for your file, OpenShot may miss something during edits. Better to use 25/30/50/60 fps import/export until OpenShot handles other formats better.
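The frame-number-vs-seconds issue SuslikV describes is easy to see with a little arithmetic: a cut stored as a frame number lands at a different moment when the frame rate changes. (The numbers below are illustrative, not from any real project.)

```python
def cut_time_seconds(frame_number: int, fps: int) -> float:
    # A cut stored by frame number only maps to a time relative to some fps.
    return frame_number / fps

cut_frame = 240                                  # cut placed while previewing at 24 fps
assert cut_time_seconds(cut_frame, 24) == 10.0   # intended position: 10 seconds in
assert cut_time_seconds(cut_frame, 30) == 8.0    # exported at 30 fps, the cut drifts to 8 s
```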

@ferdnyc
Contributor

ferdnyc commented May 22, 2020

As for interlaced, I came across this issue from last year where you practically stated that interlaced did not work, may never have worked, and is unlikely to ever work again:

Pretty much. Because, honestly, I still haven't encountered more than one or two users since I posted that who've even noticed it doesn't work. And I don't think one of those two really cared that it didn't.

PS: As for chosing the same export as your profile (and possibly your import video?) there I read that advise
[...]

Yes, well, the key point there is what I've been saying: The project profile needs to be set to the one you're planning to use when exporting the project. That's by far the most important thing.

The users who encounter serious difficulties with their projects (aside from the ones who just genuinely are attempting complex things that are somewhat beyond OpenShot's capabilities) are usually the ones who do all of their work with the project profile set to the default, often giving no thought to final product at all (maybe not even knowing that there is a project profile). Then they want to set the export profile to whatever they choose and have it "just work", which it never will.

In part that's because most OpenShot users aren't working with uncompressed, high-bandwidth video streams. With video in those formats, like you get off pro equipment, you can edit, adjust, and rescale everything to your heart's content, and it's not a problem. But those are the formats where file size is measured in gigabytes per minute — that flexibility has a cost.

Most OpenShot users are working with video encoded in a highly compressed, very efficient format (like H.264 or H.265) which doesn't lend itself to on-the-fly modification at all. And when working in those formats (especially if the video was captured and encoded on the fly, meaning it might not be properly indexed the way a non-realtime encoder would index it) a little planning ahead can greatly improve results.

@stale

stale bot commented Nov 18, 2020

Thank you so much for submitting an issue to help improve OpenShot Video Editor. We are sorry about this, but this particular issue has gone unnoticed for quite some time. To help keep the OpenShot GitHub Issue Tracker organized and focused, we must ensure that every issue is correctly labelled and triaged, to get the proper attention.

This issue will be closed, as it meets the following criteria:

  • No activity in the past 180 days
  • No one is assigned to this issue

We'd like to ask you to help us out and determine whether this issue should be reopened.

  • If this issue is reporting a bug, please can you attempt to reproduce on the latest daily build to help us to understand whether the bug still needs our attention.
  • If this issue is proposing a new feature, please can you verify whether the feature proposal is still relevant.

Thanks again for your help!

@stale stale bot added the stale This issue has not had any activity in 60 days :( label Nov 18, 2020
@stale stale bot closed this as completed Nov 28, 2020