
build: use same roachtest parallelism and cpuquota for AWS and GCE #99369

Merged
merged 1 commit on Mar 23, 2023

Conversation

renatolabs
Contributor

The parallelism and cpuquota values passed to AWS are much lower than those for GCE when invoking roachtest nightly builds. The AWS values were imported from TeamCity ~3 years ago [1] and haven't changed since then. However, we are now seeing Roachtest Nightly builds time out on AWS [2] as teams write more roachtests that run on AWS.

This commit removes the custom `PARALLELISM` and `CPUQUOTA` settings we had in place for AWS, making it consistent with GCE.

[1] see 8219a7f
[2] https://teamcity.cockroachdb.com/viewLog.html?buildId=9182737&buildTypeId=Cockroach_Nightlies_RoachtestNightlyAwsBazel
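The nightly build scripts feed these values to roachtest via environment variables. A minimal sketch of the pattern, assuming the `PARALLELISM`/`CPUQUOTA` names from this PR and roachtest's `--parallelism`/`--cpu-quota` flags; the default values below are illustrative, not the actual GCE settings:

```shell
# Sketch only: defaults here are illustrative placeholders.
# Before this change, the AWS build set lower values for these variables;
# now both clouds fall through to the same shared defaults.
PARALLELISM="${PARALLELISM:-16}"
CPUQUOTA="${CPUQUOTA:-1024}"

echo "roachtest run --parallelism=${PARALLELISM} --cpu-quota=${CPUQUOTA}"
```

With the AWS-specific overrides removed, the same default expansion applies on both clouds.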

Epic: none

Release note: None

@renatolabs renatolabs requested a review from a team as a code owner March 23, 2023 13:48
@cockroach-teamcity
Member

This change is Reviewable

@renatolabs
Contributor Author

@cockroachdb/dev-inf for your consideration. Not sure if you have the rationale for the lower AWS settings back in 2020 (other than we had fewer tests running there at the time).

@rickystewart
Collaborator


Not sure if you have the rationale for the lower AWS settings back in 2020 (other than we had fewer tests running there at the time)

The decision predates me at CRL, but that does seem to be the case. The change looks fine to me.

@renatolabs renatolabs added the backport-23.1.x Flags PRs that need to be backported to 23.1 label Mar 23, 2023
@renatolabs
Contributor Author

TFTR!

bors r=rickystewart

@craig
Contributor

craig bot commented Mar 23, 2023

Build succeeded:
