Larger CI Runners to Prevent MIRI OOMing and Improve CI Times #1833

Closed
tustvold opened this issue Jun 10, 2022 · 6 comments
Labels: development-process (Related to development process of arrow-rs), enhancement (Any new improvement worthy of an entry in the changelog), question (Further information is requested)

Comments

@tustvold (Contributor) commented Jun 10, 2022

Is your feature request related to a problem or challenge? Please describe what you are trying to do.

Since updating MIRI in #1828, it has been periodically OOMing - https://github.com/apache/arrow-rs/actions/workflows/miri.yaml

(screenshot: failed MIRI workflow runs)

https://github.com/apache/arrow-rs/actions/runs/2473012537

Describe the solution you'd like

I'm not entirely sure what the best course of action is here. Rolling back to a 6-month-old MIRI is not ideal and would require backing out the changes in #1822, but then neither is having CI randomly fail.

It has been a long-time annoyance of mine that CI currently takes ~40 minutes to chug through, despite significant caching. This is largely because the runners are rather piddly - https://docs.github.com/en/actions/using-github-hosted-runners/about-github-hosted-runners#supported-runners-and-hardware-resources. This also precludes automatically running any meaningful benchmarks (#1274). Perhaps we should invest some time into a more powerful CI system such as Buildkite, which is used by other Arrow projects...

Describe alternatives you've considered

None

@tustvold added the question, enhancement, and development-process labels on Jun 10, 2022
@alamb (Contributor) commented Jun 10, 2022

Other parts of the Arrow project (e.g. C++) were talking about this -- there may be beefier infrastructure we could take advantage of there.

@jhorstmann (Contributor) commented

I recently noticed some of the fuzzing tests for filters running for a long time. Watching htop also showed memory usage increasing while running fuzz_filter. Memory usage was already at ~8 GB before that test, though.

Maybe some of those tests could be excluded from running under the miri cfg.
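
One standard Rust pattern for this, which may or may not fit these particular tests, is to mark expensive tests as ignored only when compiled under Miri via `#[cfg_attr(miri, ignore)]`. A minimal sketch (the test name and body here are illustrative placeholders, not the real fuzz_filter test):

```rust
// Minimal sketch: opt an expensive test out of `cargo miri test`
// while keeping it in regular `cargo test` runs.

#[test]
#[cfg_attr(miri, ignore)] // marked ignored only when built under Miri
fn fuzz_filter_memory_heavy() {
    // Stand-in for a fuzz-style test that allocates a lot of memory;
    // under Miri, tests like this are what push the runner towards OOM.
    let data: Vec<u64> = (0..1_000_000).collect();
    let even: Vec<u64> = data.into_iter().filter(|v| v % 2 == 0).collect();
    assert_eq!(even.len(), 500_000);
}
```

An alternative with a similar effect is to scale down iteration counts or skip the expensive part with a `cfg!(miri)` check inside the test body.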

@tustvold (Contributor, Author) commented Jul 7, 2022

Another dimension where this may provide an improvement is disk space, with the 14 GB proving insufficient for some use cases (#2004).

@viirya (Member) commented Jul 7, 2022

Recently I have seen CI run out of disk space many times, although it seems better now.

@alamb (Contributor) commented Jul 23, 2022

I got inspired this afternoon while waiting for some other PRs to finish up CI and started a few improvements using the GitHub runners via #2149.

@tustvold (Contributor, Author) commented

The improvements to split up the CI have largely addressed this, so closing.
