Populate `space_unit` attribute upon dataset loading/creation? #386
Labels: question (further information is requested)
Proposal
PR #384 introduces an attribute, provisionally named `space_unit`, that specifies the spatial unit (e.g. "mm", "pixels", etc.). This attribute is optionally added by transformation functions such as `scale`. I wonder whether it might be beneficial to create this attribute right from the start, when loading a dataset via `from_file()` or `from_numpy()`, for both "poses" and "bboxes" datasets.
This would mirror our handling of the `time_unit` attribute, which is set to "frames" if `fps=None` and to "seconds" if a valid `fps` value is supplied. Making this change would likely require updates to `load_poses.py`, `load_bboxes.py`, and possibly `validators.datasets.py`. For all of our supported formats, except potentially "Anipose", the spatial unit would be "pixels".
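As a rough illustration of the proposal, here is a minimal sketch (the helper name `_add_unit_attrs` is hypothetical; the real logic would live in the loaders):

```python
import xarray as xr

def _add_unit_attrs(ds: xr.Dataset, fps: float | None = None) -> xr.Dataset:
    """Hypothetical helper setting unit attributes at dataset creation."""
    # Existing behaviour for time_unit, as described above.
    ds.attrs["time_unit"] = "frames" if fps is None else "seconds"
    # Proposed addition: coordinates from all supported formats
    # (except, potentially, "Anipose") are in pixels.
    ds.attrs["space_unit"] = "pixels"
    return ds
```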
Caveats
- `time_unit` is currently a `Dataset`-level attribute, because both the "position" and "confidence" arrays have a `time` dimension.
- `space_unit` in the transforms module is added at the `DataArray` level, which makes sense since those transforms operate on individual data arrays. As a result, transforms only modify the attribute on a per-array basis.
- We would need to decide at which level `space_unit` should be added when creating a "poses" or "bboxes" dataset. Logically, it might belong at the `DataArray` level, because `confidence` does not have a `space` dimension. For a "poses" dataset, `space_unit` would then be an attribute of `position`, whereas for a "bboxes" dataset it would be an attribute of both `position` and `shape`.
- If we keep `time_unit` at the `Dataset` level but introduce `space_unit` at the `DataArray` level, it becomes more difficult to unify them into a common `units={dim_name: dim_unit}` dictionary. Perhaps all dimension units should be defined at the array level, as sketched below.
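For concreteness, a toy example (dimension names and shapes simplified) contrasting the two placements, including the hypothetical per-array `units` dictionary:

```python
import numpy as np
import xarray as xr

# Toy "poses"-like dataset with simplified dims and shapes.
ds = xr.Dataset(
    {
        "position": (("time", "space"), np.zeros((5, 2))),
        "confidence": (("time",), np.ones(5)),
    }
)

# Dataset-level attribute: time is shared by every data array.
ds.attrs["time_unit"] = "frames"

# DataArray-level attribute: only arrays with a space dimension get it.
ds["position"].attrs["space_unit"] = "pixels"
# ds["confidence"] has no space dimension, so it carries no space_unit.

# If all units lived at the array level instead, each array could carry
# a single units={dim_name: dim_unit} dictionary:
ds["position"].attrs["units"] = {"time": "frames", "space": "pixels"}
ds["confidence"].attrs["units"] = {"time": "frames"}
```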
Alternative

We could leave things as they are: `space_unit` would not exist by default and would only be added by transformations, at which point the user must specify or decide upon the spatial unit.
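Under that status quo, usage would look roughly like this (my reading of the `scale` API from PR #384; the exact signature may differ):

```python
import numpy as np
import xarray as xr
from movement.transforms import scale  # introduced in PR #384

position = xr.DataArray(np.zeros((5, 2)), dims=("time", "space"))

# space_unit only appears once the user applies a transform and
# explicitly declares the resulting unit.
position_mm = scale(position, factor=0.1, space_unit="mm")
print(position_mm.attrs.get("space_unit"))  # "mm"
```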
See Also

`time` and `frame` #117