Speedup CI #1578
Comments
This repo has examples of how to cache Go package downloads in CI: https://github.com/mvdan/github-actions-golang It also includes the caching needed for incremental builds, which would likely cut some of this CI time as well. The caching for incremental Go builds may also help with the Docker image build time.
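A minimal sketch of such a cache step, adapted from the pattern in mvdan/github-actions-golang. The two paths are the default Linux locations of the Go build cache (`GOCACHE`) and module cache (`GOMODCACHE`); treat the key scheme as an assumption, not the repo's final workflow:

```yaml
# Sketch: cache Go build artifacts and downloaded modules between CI runs.
# Keyed on go.sum so the cache is invalidated when dependencies change.
- uses: actions/cache@v3
  with:
    path: |
      ~/.cache/go-build
      ~/go/pkg/mod
    key: ${{ runner.os }}-go-${{ hashFiles('**/go.sum') }}
    restore-keys: |
      ${{ runner.os }}-go-
```

The `restore-keys` fallback lets a run reuse the most recent cache even when `go.sum` changed, so incremental builds still benefit.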
We can also run the e2e upgrade tests only on merge to main: osmosis/tests/e2e/e2e_setup_test.go, line 155 in c8ac95c
Upgrade testing takes a long time, and it doesn't add much value to run it on every PR or commit to a PR. As long as we know that upgrade tests pass on main, that should be sufficient. We should still run the e2e tests on every PR; just the upgrade part can be skipped.
Oh, that's a great idea. Perhaps we can also make that variable trigger a run if there's any change in the upgrades subfolder?
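One way to express that in GitHub Actions is a path filter on the workflow trigger; a sketch, assuming the upgrade handlers live under `app/upgrades/` (the path is an assumption about the repo layout):

```yaml
# Sketch: run the upgrade e2e workflow on every merge to main, but on PRs
# only when something under the upgrades folder changed.
on:
  push:
    branches: [main]
  pull_request:
    paths:
      - 'app/upgrades/**'   # assumed location of upgrade handlers
```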
For CodeQL, here are their docs on speeding it up: https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/troubleshooting-the-codeql-workflow#the-build-takes-too-long TL;DR:
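Per those docs, one of the main levers is shrinking the amount of code CodeQL analyzes via a config file; a sketch (the ignored paths are assumptions):

```yaml
# Sketch of .github/codeql/codeql-config.yml limiting what gets scanned.
name: "CodeQL config"
paths-ignore:
  - 'vendor/**'
  - 'tests/**'
```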
But to be honest, I don't really see much point in keeping it. I'd rather just use gosec, which is AST-based code scanning.
## What is the purpose of the change

Try adding a cache for Go build data to help speed up CI. Copied from mvdan here: https://github.com/mvdan/github-actions-golang/blob/master/.github/workflows/test.yml

cref #1578

## Brief Changelog

- Adds a Go cache to CI

## Testing and Verifying

We will need to keep an eye on whether there are edge cases that make this not work, give incorrect results, or fail to speed things up. This is a bit hard to know before it is in place, though.

## Documentation and Release Note

- Does this pull request introduce a new feature or user-facing behavior changes? Yes
- Is a relevant changelog entry added to the `Unreleased` section in `CHANGELOG.md`? No (I don't think CI is in scope)
- How is the feature or change documented? We need to figure out a strategy for if/how we want to document CI.
For docker build times, I think we can replace the cc @nikever |
## What is the purpose of the change

Removes CodeQL. It takes forever, and I'm unconvinced it's adding anything of value. We should just get Informal's gosec fixed and use that, imo. If folks feel CodeQL is useful, we can investigate speedup strategies in #1578, but at the moment I'm not sure it's helping us any.

## Documentation and Release Note

- Does this pull request introduce a new feature or user-facing behavior changes? No
- Is a relevant changelog entry added to the `Unreleased` section in `CHANGELOG.md`? No, it's CI
- How is the feature or change documented? Not applicable
Added the cache option in #1602. There is some redundant work between
We've completed everything that's not test-e2e related! Roman did a great job summarizing a path for tackling that in #1646; nothing else comes to mind for further speedup steps beyond it. So I'm going to close this in favor of #1646 for now!
Background
We should invest time in speeding up CI. Lots of time is lost just waiting for CI to finish executing. If there's any reason we cannot do all of the above in GitHub Actions, imo we should investigate moving to a paid CI service, perhaps Circle, where they can all be done. Getting the feedback loop right for rapidly merging multiple related PRs is important.
First, where we stand on latencies. As of the time of issue creation, we are blocked on:
Furthermore, many GitHub Actions jobs are stuck in 'queued' and take a while before they begin execution, further increasing this dead-time cost.
Components
Acceptance Criteria