
Create additional source sets for different test types #11564

Closed
Tracked by #14407
nathanklick opened this issue Feb 15, 2024 · 2 comments · Fixed by #15009
nathanklick commented Feb 15, 2024

Description

To efficiently classify, manage, and execute different types of tests, we need to separate the tests into individual source sets by type.

Proposed Source Sets

  • test (executed on every PR or default branch commit)
    • These are the standard unit tests which should run quickly on every pull request and commit merged into the develop or release branches.
    • These tests should not be annotated with type tags and form the Minimally Accepted Test Suite (MATS).
  • hammer (executed on a schedule or default branch commit)
    • These are resource-intensive and potentially time-intensive tests designed to put significant stress on individual system components, such as parts of the VirtualMap system.
    • These tests should only be executed on a schedule or when merging into the develop or release branch.
  • timingSensitive (executed on every PR or default branch commit)
    • These are unit tests which must be run serially; together they can form a long-running suite.
  • timeConsuming (executed on a schedule)
    • These are long-running tests which may be resource intensive and may test integrations between system components.
    • These tests should only be executed as a scheduled task.
  • jmh (executed on merge to develop or a release branch)
    • These are performance benchmarks which may be long-running and resource intensive.
    • These tests should be executed on a schedule or when merging into develop and release branches.
  • hapiTest (executed on every PR)
    • These tests cover the modular code and replace the existing eet and itest source sets. They are a form of end-to-end testing which executes system transactions against a locally running network.
  • itest (executed on every PR)
    • Existing source set.
    • To be removed once the modularization project has been completed.
  • eet (executed on every PR)
    • Existing source set.
    • To be removed once the modularization project has been completed.
  • xtest (not executed)
    • Existing source set.
    • Slated for removal; can be removed now.
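The source sets above can be modeled with Gradle's built-in `jvm-test-suite` plugin, which creates one source set per suite (e.g. `src/hammer/java`). This is a minimal sketch, assuming a plain Java project; the actual repository uses its own convention plugins, and only the suite names below come from this issue.

```kotlin
// build.gradle.kts — sketch only; suite names taken from this issue,
// everything else is a plain jvm-test-suite example.
plugins {
    java
    `jvm-test-suite`
}

testing {
    suites {
        // The default `test` suite already exists and forms the MATS.
        val test by getting(JvmTestSuite::class) {
            useJUnitJupiter()
        }

        // Each registered suite gets its own source set, e.g. src/hammer/java.
        register<JvmTestSuite>("hammer") {
            useJUnitJupiter()
            dependencies { implementation(project()) }
        }

        register<JvmTestSuite>("timingSensitive") {
            useJUnitJupiter()
            dependencies { implementation(project()) }
            targets.all {
                testTask.configure {
                    // Timing-sensitive tests must run serially.
                    maxParallelForks = 1
                }
            }
        }

        register<JvmTestSuite>("timeConsuming") {
            useJUnitJupiter()
            dependencies { implementation(project()) }
        }
    }
}
```

Each suite produces its own test task (`./gradlew hammer`, `./gradlew timingSensitive`, …), so CI can pick which suites run on PRs versus on a schedule.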
nathanklick added the P1 label ("High priority issue, which must be completed in the milestone otherwise the release is at risk.") on Feb 19, 2024
netopyr commented Feb 22, 2024

We are using xtest and plan to expand them. Please do not remove them.

jjohannes commented May 15, 2024

I am observing some things on PR builds that we need to investigate/change.

  • The results of test tasks are not taken from the cache often enough. I observed cases where I did not change any code (just build config) in a PR and still all tests re-ran. Locally, I cannot observe this, so it could be a "caching between multiple machines" issue. Jars on the classpath are changing that should not: https://scans.gradle.com/s/cu5tdwwbyow2a/timeline?cacheability=cacheable&hide-timeline&kind=task&outcome=success,failed&sort=longest
    The problem is most likely writeGitProperties – that file should be ignored on the classpath.
  • Some of the timingSensitive tasks take too long (10m+) – e.g. Task :swirlds-merkle:timingSensitive. We should consider not running those on every PR.
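Gradle can be told to ignore such a volatile generated file when computing the runtime classpath input for test tasks, so it no longer invalidates the build cache key. A minimal sketch, assuming the file written by writeGitProperties is named `git.properties`:

```kotlin
// build.gradle.kts — runtime classpath normalization sketch.
// Assumption: the volatile file produced by writeGitProperties is
// named "git.properties"; adjust the pattern to the real file name.
normalization {
    runtimeClasspath {
        // Changes to this file will no longer cause cache misses
        // for test tasks that have it on their runtime classpath.
        ignore("git.properties")
    }
}
```

With this in place, a build that differs only in the generated git metadata should resolve test tasks from the cache instead of re-running them.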
