sparse upstream-dev test failures #4146

Closed
dcherian opened this issue Jun 11, 2020 · 4 comments

@dcherian
Contributor

Full log here: https://dev.azure.com/xarray/xarray/_build/results?buildId=3023&view=logs&jobId=2280efed-fda1-53bd-9213-1fa8ec9b4fa8&j=2280efed-fda1-53bd-9213-1fa8ec9b4fa8&t=175181ee-1928-5a6b-f537-168f7a8b7c2d

Here are three of the errors:

 /usr/share/miniconda/envs/xarray-tests/lib/python3.8/site-packages/sparse/_coo/umath.py:739: SystemError
_ test_variable_method[obj.where(*(), **{'cond': <xarray.Variable (x: 10, y: 5)>\n<COO: shape=(10, 5), dtype=bool, nnz=3, fill_value=False>})-True] _
TypeError: expected dtype object, got 'numpy.dtype[uint64]'
    def _match_coo(*args, **kwargs):
        """
        Matches the coordinates for any number of input :obj:`COO` arrays.
        Equivalent to "sparse" broadcasting for all arrays.
    
        Parameters
        ----------
        args : Tuple[COO]
            The input :obj:`COO` arrays.
        return_midx : bool
            Whether to return matched indices or matched arrays. Matching
            only supported for two arrays. ``False`` by default.
        cache : dict
            Cache of things already matched. No cache by default.
    
        Returns
        -------
        matched_idx : List[ndarray]
            The indices of matched elements in the original arrays. Only returned if
            ``return_midx`` is ``True``.
        matched_arrays : List[COO]
            The expanded, matched :obj:`COO` objects. Only returned if
            ``return_midx`` is ``False``.
        """
        from .core import COO
        from .common import linear_loc
    
        cache = kwargs.pop("cache", None)
        return_midx = kwargs.pop("return_midx", False)
        broadcast_shape = kwargs.pop("broadcast_shape", None)
    
        if kwargs:
            linear = [idx[s] for idx, s in zip(linear, sorted_idx)]
>           matched_idx = _match_arrays(*linear)
E           SystemError: CPUDispatcher(<function _match_arrays at 0x7f66b6272af0>) returned a result with an error set

_______________________________ test_dask_token ________________________________

    @requires_dask
    def test_dask_token():
        import dask
    
        s = sparse.COO.from_numpy(np.array([0, 0, 1, 2]))
    
        # https://github.com/pydata/sparse/issues/300
        s.__dask_tokenize__ = lambda: dask.base.normalize_token(s.__dict__)
    
        a = DataArray(s)
        t1 = dask.base.tokenize(a)
        t2 = dask.base.tokenize(a)
        t3 = dask.base.tokenize(a + 1)
        assert t1 == t2
        assert t3 != t2
        assert isinstance(a.data, sparse.COO)
    
        ac = a.chunk(2)
        t4 = dask.base.tokenize(ac)
        t5 = dask.base.tokenize(ac + 1)
        assert t4 != t5
>       assert isinstance(ac.data._meta, sparse.COO)
E       AssertionError: assert False
E        +  where False = isinstance(array([], dtype=int64), <class 'sparse._coo.core.COO'>)
E        +    where array([], dtype=int64) = dask.array<xarray-<this-array>, shape=(4,), dtype=int64, chunksize=(2,), chunktype=numpy.ndarray>._meta
E        +      where dask.array<xarray-<this-array>, shape=(4,), dtype=int64, chunksize=(2,), chunktype=numpy.ndarray> = <xarray.DataArray (dim_0: 4)>\ndask.array<xarray-<this-array>, shape=(4,), dtype=int64, chunksize=(2,), chunktype=numpy.ndarray>\nDimensions without coordinates: dim_0.data
E        +    and   <class 'sparse._coo.core.COO'> = sparse.COO
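For reference, a minimal sketch (assuming recent dask and sparse behaviour, not taken from the log above) of what that last assertion checks: a dask array's _meta attribute records the type of the wrapped chunks, and chunking a sparse array directly through dask keeps sparse.COO there, which is what the test expects from xarray's .chunk() as well.

    import dask.array as da
    import numpy as np
    import sparse

    s = sparse.COO.from_numpy(np.array([0, 0, 1, 2]))

    # Chunking the COO array directly with dask keeps the sparse type in _meta;
    # the failing assertion expects xarray's .chunk() to preserve it the same way
    # instead of falling back to a numpy.ndarray meta.
    d = da.from_array(s, chunks=2)
    print(type(d._meta))  # expected: <class 'sparse._coo.core.COO'>
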
@shoyer
Member

shoyer commented Jun 16, 2020

It looks like all the sparse tests were passing on June 1, but are now failing.

Nothing remotely suspicious has been merged into pydata/sparse in the past 10 days, and I cannot reproduce these issues on my local machine.

But I did notice that the installed version of NumPy switched from 1.19.0rc2 to 1.20.0.dev. It seems quite plausible that this has resulted in some sort of binary incompatibility between NumPy and Numba.

For now, I'm simply going to remove sparse from our upstream-dev tests. This isn't something we can fix on the xarray side, so I'm going to let the Numba or sparse devs figure it out.
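A hedged sketch of how removing sparse from the environment keeps the test run green: pytest's importorskip skips an entire module's tests when the package isn't importable. The example test below is hypothetical, not taken from xarray's test suite.

    import numpy as np
    import pytest

    # Skip the whole module when sparse isn't installed, e.g. after it has been
    # dropped from the upstream-dev environment.
    sparse = pytest.importorskip("sparse")

    def test_coo_roundtrip():
        # Hypothetical example test; it only runs when sparse is importable.
        arr = np.array([0, 0, 1, 2])
        coo = sparse.COO.from_numpy(arr)
        assert (coo.todense() == arr).all()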

@keewis
Collaborator

keewis commented Jun 16, 2020

we don't even install a development version of sparse (maybe we should?)

shoyer closed this as completed Jun 16, 2020
@shoyer
Member

shoyer commented Jun 16, 2020

That's a good point, we should consider that! But to make that feasible we would really need a build of Numba that is compatible with the NumPy dev version we're using. For now I'm happy dropping sparse.

@keewis
Collaborator

keewis commented Mar 17, 2021

a small update on sparse: right now, enabling the install of sparse from GitHub makes the upstream-dev CI fail on:

xarray/tests/test_sparse.py::test_variable_method[obj.where(*(), **{'cond': <xarray.Variable (x: 10, y: 5)>\n<COO: shape=(10, 5), dtype=bool, nnz=3, fill_value=False>})-True]

with a NEP 18 dispatch error (numpy.allclose is not implemented for [<class 'numpy.ndarray'>, <class 'sparse._coo.core.COO'>])
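
A minimal sketch (not from the CI log) of the kind of mixed-type call that triggers this: passing a plain ndarray together with a COO sends the call through NEP 18 dispatch, and sparse has no allclose implementation registered for that combination.

    import numpy as np
    import sparse

    dense = np.zeros((10, 5))
    coo = sparse.COO.from_numpy(np.zeros((10, 5), dtype=bool))

    # NEP 18 dispatches this to sparse's __array_function__, which (as of the
    # failure above) does not handle an (ndarray, COO) pair, so it raises.
    try:
        np.allclose(dense, coo)
    except TypeError as exc:
        print(exc)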
