
Add ASV benchmark CI workflow #139

Merged: 13 commits into xarray-contrib:main on Aug 17, 2022
Conversation

@Illviljan (Contributor) commented Aug 17, 2022:

Add an ASV workflow for easy benchmarking. Adapted from xarray.

@Illviljan (Contributor, Author):

@dcherian, could you add the run-benchmark label?

@dcherian (Collaborator):

You have a pending invite to xarray-contrib; if you accept, I think you'll have the rights to do that. (I'd like to make sure you can do it, so I'm not doing it right now.)

@Illviljan Illviljan added the run-benchmark Run the ASV benchmark workflow label Aug 17, 2022
@Illviljan (Contributor, Author):

Thanks, it works!

@Illviljan (Contributor, Author):

The workflow appears to work!

But the benchmark is probably broken?

 Error:                Traceback (most recent call last):
                 File "/home/runner/.local/lib/python3.8/site-packages/asv/benchmark.py", line 1293, in main_run_server
                   main_run(run_args)
                 File "/home/runner/.local/lib/python3.8/site-packages/asv/benchmark.py", line 1167, in main_run
                   result = benchmark.do_run()
                 File "/home/runner/.local/lib/python3.8/site-packages/asv/benchmark.py", line 573, in do_run
                   return self.run(*self._current_params)
                 File "/home/runner/.local/lib/python3.8/site-packages/asv/benchmark.py", line 669, in run
                   samples, number = self.benchmark_timing(timer, min_repeat, max_repeat,
                 File "/home/runner/.local/lib/python3.8/site-packages/asv/benchmark.py", line 705, in benchmark_timing
                   timing = timer.timeit(number)
                 File "/home/runner/work/flox/flox/asv_bench/.asv/env/b75ad52f6dea6a19ad824a655424eb23/lib/python3.9/timeit.py", line 177, in timeit
                   timing = self.inner(it, self.timer)
                 File "<timeit-src>", line 6, in inner
                 File "/home/runner/.local/lib/python3.8/site-packages/asv/benchmark.py", line 627, in <lambda>
                   func = lambda: self.func(*param)
                 File "/home/runner/work/flox/flox/asv_bench/benchmarks/combine.py", line 16, in time_combine
                   flox.core._npg_combine(
               AttributeError: module 'flox.core' has no attribute '_npg_combine'
               asv: benchmark failed (exit status 1)

Maybe we should merge this and make a new PR fixing the benchmarks?
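The traceback above fails because the benchmark still imports the removed `flox.core._npg_combine`. For orientation, a minimal ASV timing benchmark has the following shape; note that `combine_stub` and its signature are hypothetical placeholders standing in for whichever current flox function replaced `_npg_combine`, not the real flox API:

```python
# Sketch of an ASV timing benchmark. `combine_stub` is a hypothetical
# placeholder for the function the benchmark should actually time; swap
# in the real flox call when fixing asv_bench/benchmarks/combine.py.
import numpy as np


def combine_stub(arrays):
    # Placeholder "combine" step: concatenate the chunks and reduce.
    return np.concatenate(arrays).sum()


class Combine:
    # ASV discovers attributes named `time_*`, `mem_*`, `peakmem_*`, etc.
    # `setup` runs before each benchmark repetition's timing loop.
    def setup(self):
        self.arrays = [np.ones(1000) for _ in range(10)]

    def time_combine(self):
        combine_stub(self.arrays)
```

Keeping the data construction in `setup` ensures ASV times only the combine step, not the array allocation.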

@Illviljan Illviljan marked this pull request as ready for review August 17, 2022 18:07
@dcherian (Collaborator):

Could just fix it here.

Is it necessary to change these paths? If so, I think you need to also add a change deleting the existing file.

@Illviljan (Contributor, Author):

Do you see what needs to be changed? I'm not familiar enough with this code, and I can't find anything that resembles the old flox.core._npg_combine.

The paths were changed to align with xarray, and I'd rather not fiddle with them; I still have nightmares from getting the paths right when adding this to xarray.

From what I can tell, it seems to recognize that the file was moved, no?
[screenshot omitted]

@dcherian (Collaborator) left a comment:

I mean that the benchmarks folder has been moved, but the diff doesn't show this. It looks like asv_bench/benchmarks/combine.py was added, but the earlier benchmarks/combine.py was not removed.

Two review comments on asv_bench/benchmarks/combine.py (outdated; resolved)
Comment on lines 23 to 27
- name: Setup Miniconda
  uses: conda-incubator/setup-miniconda@v2
  with:
    # installer-url: https://github.com/conda-forge/miniforge/releases/latest/download/Mambaforge-Linux-x86_64.sh
    installer-url: https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-Linux-x86_64.sh
@dcherian (Collaborator):

Could use micromamba here like in the other workflows, and get some caching.

@Illviljan (Contributor, Author) commented Aug 17, 2022:
I copied from the other workflows and it seems to be working. Does this make sense to you?
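For reference, the micromamba variant suggested above might look like the sketch below. The environment-file path is an assumption about this repo's layout, and the inputs follow the mamba-org/provision-with-micromamba action that the other workflows in this repo use:

```yaml
# Hedged sketch of a micromamba-based setup with caching; the
# environment-file path is an assumption, adjust to the repo's layout.
- name: Setup micromamba
  uses: mamba-org/provision-with-micromamba@v13
  with:
    environment-file: ci/environment.yml
    cache-env: true
    cache-downloads: true
```

Caching both the resolved environment and the package downloads is what saves time on repeat CI runs.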

@dcherian dcherian changed the title Add ASV workflow Add ASV benchmark CI workflow Aug 17, 2022
@dcherian (Collaborator) left a comment:

Looks great, thanks @Illviljan! Feel free to merge when you're done.

@Illviljan Illviljan merged commit fbc2af8 into xarray-contrib:main Aug 17, 2022
dcherian added a commit that referenced this pull request Oct 9, 2022
* main:
  Update ci-additional.yaml (#167)
  Refactor before redoing cohorts (#164)
  Fix mypy errors in core.py (#150)
  Add link to numpy_groupies (#160)
  Bump codecov/codecov-action from 3.1.0 to 3.1.1 (#159)
  Use math.prod instead of np.prod (#157)
  Remove None output from _get_expected_groups (#152)
  Fix mypy errors in xarray.py, xrutils.py, cache.py (#144)
  Raise error if multiple by's are used with Ellipsis (#149)
  pre-commit autoupdate (#148)
  Add mypy ignores (#146)
  Get pre commit bot to update (#145)
  Remove duplicate examples headers (#147)
  Add ci additional (#143)
  Bump mamba-org/provision-with-micromamba from 12 to 13 (#141)
  Add ASV benchmark CI workflow (#139)
  Fix func count for dtype O with numpy and numba (#138)