Avoid dict normalization in __dask_tokenize__ (#15187)
There are currently [CI failures](https://github.com/rapidsai/cudf/actions/runs/8089269486/job/22105880070?pr=15181#step:7:1050) that appear to be caused by non-deterministic `dict` normalization in `Frame.__dask_tokenize__`. This PR avoids normalizing that dictionary by tokenizing its string representation instead.

Authors:
  - Richard (Rick) Zamora (https://github.com/rjzamora)

Approvers:
  - Bradley Dice (https://github.com/bdice)

URL: #15187
rjzamora authored Feb 29, 2024
1 parent 8507b3d commit b670af6
Showing 1 changed file with 1 addition and 1 deletion.
python/cudf/cudf/core/frame.py (1 addition, 1 deletion)

@@ -1958,7 +1958,7 @@ def __dask_tokenize__(self):

        return [
            type(self),
-           normalize_token(self._dtypes),
+           str(self._dtypes),
            normalize_token(self.to_pandas()),
        ]
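
As a quick sanity check (not part of this commit), here is a minimal sketch of how tokenization determinism could be exercised with `dask.base.tokenize`, which dispatches to `Frame.__dask_tokenize__` for cudf objects. It assumes `cudf` and `dask` are installed; the DataFrame contents are arbitrary.

```python
# Minimal sketch (assumes cudf and dask are installed): check that tokenizing
# a cudf DataFrame is deterministic, i.e. equivalent frames yield the same key.
# dask.base.tokenize ultimately calls Frame.__dask_tokenize__ for cudf objects.
import cudf
from dask.base import tokenize

df = cudf.DataFrame({"a": [1, 2, 3], "b": ["x", "y", "z"]})

# The same data should always produce the same token, both for the original
# object and for an equivalent deep copy.
assert tokenize(df) == tokenize(df)
assert tokenize(df) == tokenize(df.copy(deep=True))
```

Note that the CI failures described above involved tokens differing across separate runs, so a single-process check like this is only a partial guard against that kind of non-determinism.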
