
Tests suggestion: meta level to exercise permutations of tunables #9652

Open
gdevenyi opened this issue Nov 29, 2019 · 2 comments
Labels
Component: Test Suite Indicates an issue with the test framework or a test case Type: Feature Feature request or new feature

Comments

gdevenyi (Contributor) commented Nov 29, 2019

I noticed at #9648 (comment) that the bug was only exposed when a tunable was flipped.

That got me thinking that there may be bugs lurking in all the possible combinations of tunables, or, as in the case of the referenced bug, in tunables being changed at runtime (where allowable). This would likely require an immense amount of tooling, but perhaps the test suite should have the ability to exercise the tunables before/during runs to shake out any bugs. I realize this would likely mean an exponential increase in the number of tests, but it might be worth doing once in a while to see what shakes out.
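To make the "exponential increase" concrete, here is a minimal sketch of what exhaustively enumerating tunable combinations would look like. The tunable names and values are hypothetical stand-ins, not actual ZFS module parameters:

```python
import itertools

# Hypothetical boolean tunables -- illustrative names, not real ZFS parameters.
tunables = {
    "tunable_a": [0, 1],
    "tunable_b": [0, 1],
    "tunable_c": [0, 1],
    "tunable_d": [0, 1],
}

# Full cross product: one dict per combination of settings.
combos = [dict(zip(tunables, values))
          for values in itertools.product(*tunables.values())]

# With n boolean tunables there are 2**n combinations, so running the
# whole suite under each one quickly becomes infeasible as n grows.
print(len(combos))  # 2**4 = 16
```

With only 4 boolean tunables that is already a 16x multiplier on the full suite, which is why sampling (as discussed below) is the usual compromise.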

PrivatePuffin (Contributor) commented Nov 30, 2019

This is how it is currently done in the compression/rsend suites: a limited number of loops is run, each with its own set of random options.

Say one runs 4 loops; it's actually 4 * num_distros, which is a reasonable number of tests.

Add to this the number of test runs per day, and over time you basically end up covering your exponential number of permutations, without having to run them all in every test run :)
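The random-loops approach described above can be sketched roughly as follows. This is an illustrative model only; the tunable names and values are assumptions, and the actual suites are shell-based:

```python
import random

# Hypothetical tunables and candidate values -- illustrative only.
tunables = {
    "tunable_a": [0, 1],
    "tunable_b": [0, 1],
    "compression": ["off", "lz4", "gzip", "zstd"],
}

def random_combo(rng):
    """Pick one random value for each tunable."""
    return {name: rng.choice(values) for name, values in tunables.items()}

# Each run exercises only a small number of random combinations; across
# many daily runs, coverage of the full cross product accumulates.
rng = random.Random()  # seed this for a reproducible run if desired
loops = 4
for combo in (random_combo(rng) for _ in range(loops)):
    print(combo)  # here the suite would run with these settings applied
```

Seeding the RNG (and logging the seed) matters in practice: a failure found under a random combination is only useful if the combination can be reproduced.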

@behlendorf behlendorf added Type: Feature Feature request or new feature Component: Test Suite Indicates an issue with the test framework or a test case labels Dec 6, 2019
behlendorf (Contributor) commented

We do a little of this kind of testing today with ztest, which modifies some default values. If ztest were rock solid today, I could imagine having it randomly toggle some of these settings to stress-test less used code paths.


3 participants