Is your feature request related to a problem?
Currently, part of the static type of a DataArray or Dataset is a Mapping[Hashable, DataArray].
I'm quite sure that 99% of users will actually use str key values (i.e. variable names), while some exotic users (me included) want to use e.g. Enums for their keys.
Currently, we allow anything hashable as a key, but once the DataArray/Dataset is created, the type information of the keys is lost.
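A minimal stand-in sketch of the current situation (not xarray's actual classes; a plain `Mapping[Hashable, ...]` is used to mimic how the key type is exposed today):

```python
# Sketch: keys are typed as Hashable, so the str-ness of variable names is
# lost statically, even when every key actually is a str.
from typing import Hashable, Mapping


def variable_names(variables: Mapping[Hashable, object]) -> list[str]:
    names = []
    for name in variables:
        # `name` is only Hashable here; without this assert a type checker
        # rejects the .upper() call below.
        assert isinstance(name, str)
        names.append(name.upper())
    return names


print(variable_names({"temperature": 1, "pressure": 2}))
```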
Wouldn't it be nice if this actually returned str, so you don't have to cast or assert it every time?
This could be solved by making these classes generic.
Another related issue is the underlying data.
This could be introduced as a Generic type as well.
Probably, this should reach some common ground across all the wrapping array libraries out there: each one should use a generic Array class that keeps track of the type of the wrapped array, e.g. dask.array.core.Array[np.ndarray].
In return, we could do DataArray[np.ndarray], or even DataArray[dask.array.core.Array[np.ndarray]].
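As a rough sketch of what such generics could look like (all names, and the constructor signature, are illustrative placeholders, not xarray's real API or a concrete proposal):

```python
# Hedged sketch: a DataArray-like class generic over both the key type and
# the wrapped data type.
from typing import Generic, Hashable, Mapping, TypeVar

import numpy as np

KeyT = TypeVar("KeyT", bound=Hashable)
DataT = TypeVar("DataT")


class GenericDataArray(Generic[KeyT, DataT]):
    def __init__(self, data: DataT, coords: Mapping[KeyT, object]) -> None:
        self.data = data
        self.coords = dict(coords)

    def dims(self) -> list[KeyT]:
        # returns KeyT (e.g. str) instead of a bare Hashable
        return list(self.coords)


da = GenericDataArray(np.arange(10), {"t": np.arange(10)})
# statically inferred as GenericDataArray[str, np.ndarray]
print(da.dims())  # ['t']
```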
Describe the solution you'd like
The implementation would be something along the lines of:
Now you could create a "classical" `DataArray`:

```python
da = DataArray(np.arange(10), {"t": np.arange(10)}, dims=["t"])
# will be of type
# DataArray[str, np.ndarray]
```
while you could also create something more fancy:

```python
da2 = DataArray(dask.array.array([1, 2, 3]), {}, dims=[("tup1", "tup2")])
# will be of type
# DataArray[tuple[str, str], dask.array.core.Array]
```
And whenever you access the dimensions / coord names / underlying data, you will get the correct type.
For now I only see three major problems:

1. Non-array types (like lists or anything iterable) will get cast to a np.ndarray, and I have no idea how to tell the type checker that `DataArray([1, 2, 3], {}, "a")` should be `DataArray[str, np.ndarray]` and not `DataArray[str, list[int]]`. Depending on the Protocol in the bound TypeVar, this might even fail static type analysis or require tons of special casing and overloads.
2. How does the type checker extract the dimension type for Datasets? This is quite convoluted, and I am not sure it can be typed correctly...
3. The parallel compute workflows are quite dynamic, and I am not sure static type checking can keep track of the underlying datatype... What does `DataArray([1, 2, 3], dims="a").chunk({"a": 2})` return? Is it `DataArray[str, dask.array.core.Array]`? But what about other chunking frameworks?
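For the first problem, one of the special-casing approaches mentioned above would be `@overload`s on the constructor or a factory function. A hedged sketch (all names here are illustrative stand-ins, not xarray code):

```python
# Sketch: an overload that tells the checker a plain list input yields an
# np.ndarray-typed array, mirroring the runtime coercion.
from __future__ import annotations

from typing import Generic, TypeVar, overload

import numpy as np

DataT = TypeVar("DataT")


class SketchDataArray(Generic[DataT]):
    def __init__(self, data: DataT) -> None:
        self.data = data


@overload
def as_dataarray(data: list) -> SketchDataArray[np.ndarray]: ...
@overload
def as_dataarray(data: DataT) -> SketchDataArray[DataT]: ...
def as_dataarray(data):
    # runtime coercion of plain sequences, as xarray does for list input
    if isinstance(data, list):
        data = np.asarray(data)
    return SketchDataArray(data)


arr = as_dataarray([1, 2, 3])
# statically SketchDataArray[np.ndarray], matching the runtime coercion
```

Every input kind that gets coerced would need such an overload, which is where the "tons of special casing" comes from.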
Describe alternatives you've considered
One could even extend this and add more Generic types.
Different types for dimensions and variable names would be a first (and probably quite a nice) feature addition.
One could even go so far as to type the keys and values of variables and coords (for Datasets) differently.
This came up e.g. in #3967.
However, this would create a ridiculous number of Generic types and is probably more confusing than helpful.
Additional context
Probably this feature should be done in consecutive PRs, each implementing one Generic; otherwise this will be a giant task!
> The parallel compute workflows are quite dynamic and I am not sure if static type checking can keep track of the underlying datatype... What does `DataArray([1, 2, 3], dims="a").chunk({"a": 2})` return? Is it `DataArray[str, dask.array.core.Array]`? But what about other chunking frameworks?
It's handled by the from_array method of the ChunkManagerEntrypoint ABC. The implementation of the ABC that gets used depends in general on the value of chunked_array_type passed to .chunk and on what libraries are installed and available. By default it will use the dask implementation, in which case the return type of .chunk would be DataArray[str, dask.array.core.Array]. In general it will return DataArray[str, T_ChunkedArray], but that type is just a placeholder for now.
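How the return type could follow the entrypoint might be sketched like this (the classes below are simplified, hypothetical stand-ins for xarray's ChunkManagerEntrypoint machinery, not its actual implementation):

```python
# Sketch: threading the chunked-array type through a manager ABC, so the
# return type of a chunk() operation follows the manager that is used.
from __future__ import annotations

from abc import ABC, abstractmethod
from typing import Generic, TypeVar

T_ChunkedArray = TypeVar("T_ChunkedArray")


class ChunkManager(ABC, Generic[T_ChunkedArray]):
    @abstractmethod
    def from_array(self, data: list[int], chunks: int) -> T_ChunkedArray: ...


class FakeChunkedArray:
    """Stand-in for e.g. dask.array.core.Array."""

    def __init__(self, blocks: list[list[int]]) -> None:
        self.blocks = blocks


class FakeChunkManager(ChunkManager[FakeChunkedArray]):
    def from_array(self, data: list[int], chunks: int) -> FakeChunkedArray:
        blocks = [data[i:i + chunks] for i in range(0, len(data), chunks)]
        return FakeChunkedArray(blocks)


def chunk(data: list[int], size: int,
          manager: ChunkManager[T_ChunkedArray]) -> T_ChunkedArray:
    # the inferred type follows the manager: a dask-backed manager would
    # yield its Array type, another framework its own
    return manager.from_array(data, size)


chunked = chunk([1, 2, 3], 2, FakeChunkManager())
print(chunked.blocks)  # [[1, 2], [3]]
```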