[REVIEW] Unpin dask & distributed for development #10623
Conversation
Codecov Report
@@ Coverage Diff @@
## branch-22.06 #10623 +/- ##
================================================
+ Coverage 86.33% 86.36% +0.02%
================================================
Files 140 140
Lines 22289 22289
================================================
+ Hits 19244 19250 +6
+ Misses 3045 3039 -6
Continue to review full report at Codecov.
gpuci_logger "gpuci_mamba_retry install conda-forge::dask>=2022.03.0 conda-forge::distributed>=2022.03.0 conda-forge::dask-core>=2022.03.0 --force-reinstall"
gpuci_mamba_retry install conda-forge::dask>=2022.03.0 conda-forge::distributed>=2022.03.0 conda-forge::dask-core>=2022.03.0 --force-reinstall
I've been wondering if a less cumbersome way of achieving this would be to manually remove dask/label/dev from our list of channels here. Maybe something like:
- gpuci_logger "gpuci_mamba_retry install conda-forge::dask>=2022.03.0 conda-forge::distributed>=2022.03.0 conda-forge::dask-core>=2022.03.0 --force-reinstall"
- gpuci_mamba_retry install conda-forge::dask>=2022.03.0 conda-forge::distributed>=2022.03.0 conda-forge::dask-core>=2022.03.0 --force-reinstall
+ gpuci_logger "gpuci_mamba_retry install dask>=2022.03.0 --force-reinstall"
+ gpuci_conda_retry config --remove channels dask/label/dev
+ gpuci_mamba_retry install dask>=2022.03.0 --force-reinstall
Not sure if removing the channel would persist across other runs, though. cc @ajschmidt8 @jakirkham: does this solution make sense?
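To illustrate the idea (a minimal sketch; the channel names below are illustrative assumptions, not the actual CI configuration): `conda config --remove channels dask/label/dev` simply drops one entry from the channel list stored in `~/.condarc`, which is simulated here with a plain Python list.

```python
# Minimal sketch of what removing the dev channel amounts to. The channel
# names below are illustrative assumptions, not the actual CI configuration.
channels = ["dask/label/dev", "rapidsai-nightly", "conda-forge"]

# Dropping dask/label/dev means the solver can only pick dask from the
# remaining (release) channels, so `dask>=2022.03.0` resolves to the
# latest published release rather than a nightly.
channels = [c for c in channels if c != "dask/label/dev"]

print(channels)  # ['rapidsai-nightly', 'conda-forge']
```

Whether such a removal persists between CI runs depends on whether `~/.condarc` is part of a fresh image or carried over, which is exactly the open question above.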
I remember trying something similar previously, but it didn't seem to work. Testing it again in CI now.
Yeah, agreed. I recall Prem tried several things, but I don't recall if this was one of them (and if so, whether we ran into any issues with it).
If this doesn't work, I'm happy to merge as is, since the current setup is working.
Will this actually install Dask main or just the latest release?
We use nightlies from the Dask channel in some cases. It depends on whether the dask/label/dev channel is used.
And do nightlies really mean "once every night", or is a new package published after every merge?
Every merge. They are equivalent to using main, but also include Dask's dependencies.
Sounds good, thanks for confirming.
@charlesbluca It didn't work the way we would expect, just like last time; log here: https://gpuci.gpuopenanalytics.com/job/rapidsai/job/gpuci-22-06/job/cudf/job/prb/job/cudf-gpu-test/CUDA=11.0,GPU_LABEL=driver-450,LINUX_VER=centos7,PYTHON=3.8/150/consoleText. Hence reverting these temporary changes.
Changes to be in-line with: rapidsai/cudf#10623

Authors:
- GALI PREM SAGAR (https://github.com/galipremsagar)

Approvers:
- Sevag Hanssian (https://github.com/sevagh)
- https://github.com/jakirkham
- Peter Andreas Entschev (https://github.com/pentschev)

URL: #892
@@ -294,6 +294,6 @@ def sort_values(
     df4 = df3.map_partitions(sort_function, **sort_kwargs)
     if not isinstance(divisions, gd.DataFrame) and set_divisions:
         # Can't have multi-column divisions elsewhere in dask (yet)
-        df4.divisions = methods.tolist(divisions)
+        df4.divisions = tuple(methods.tolist(divisions))
This fix is required due to this upstream change: dask/dask#8806
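As a minimal pure-Python sketch of why the tuple conversion matters (this is not the actual dask or dask-cudf code): after the upstream change, dask expects `divisions` to be a tuple, so a list such as the one produced by `methods.tolist` has to be converted before assignment.

```python
# Pure-Python sketch (not the actual dask-cudf code): after the upstream
# change, assigning a list as divisions would fail dask's tuple check,
# so the list is converted first.
divisions_list = [0, 5, 10]        # stand-in for what methods.tolist(...) returns
divisions = tuple(divisions_list)  # mirrors: df4.divisions = tuple(...)

assert isinstance(divisions, tuple)
print(divisions)  # (0, 5, 10)
```

The values `[0, 5, 10]` are made up for illustration; real divisions are the partition boundary values of the sorted column.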
LGTM; one small question that can be addressed outside of this PR:
@gpucibot merge
This PR unpins dask & distributed for development.