Refine metrics CI invocations #909
Conversation
Marked as WIP, as I realised I need to update the README.md. I've tweaked the Jenkins metrics CI machines so they should be compatible with these changes - I guess we'll see...
kubernetes qa-failed 👎
Force-pushed from fa627d6 to 36945c8
A quick note on the README update - yes, the two spaces added to the end of a line in the bullet list are deliberate - they force a line break, and hence a new line, within that bullet.
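To illustrate the trick (the bullet text here is made up, and `·` stands in for the otherwise-invisible trailing spaces):

```
- First line of the bullet item.··
  This continuation renders on a new line, but still inside the same bullet.
```

Without the two trailing spaces, Markdown would join both lines into a single flowed sentence.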
kubernetes qa-passed 👍
lgtm
@@ -125,18 +125,8 @@ pushd "$CURRENTDIR/../metrics"
# ops/second
bash network/network-nginx-ab-benchmark.sh

# ping latency
bash network/network-latency.sh
I see that we still run these tests in metrics/run_all_metrics.sh, but I'm now wondering if we shouldn't just have a single parameterised script to handle all metrics tests?
as in a single script where we can say 'script --all' or 'script --kpis' for instance, to remove the duplication between the two scripts?
We could do that. In my mind we'd refactor run_all_metrics (and change its name). We'd also have to add the KSM handling and some other trickery from the run_metrics_ci script, like the results-storing options.
How about I open an Issue to note we could do with a refactor of the scripts?
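For illustration only, a minimal sketch of the single parameterised runner being discussed. The script name, the flag names (`--all`, `--kpis`), and the contents of each subset are assumptions, not the actual clearcontainers scripts:

```shell
#!/bin/bash
# Illustrative sketch: one parameterised metrics runner instead of two scripts.

run_kpis() {
    # The stable KPI subset that CI gates on, e.g.:
    #   bash network/network-nginx-ab-benchmark.sh
    echo "running KPI subset"
}

run_all() {
    # Everything, including the noisier tests, e.g.:
    #   bash network/network-latency.sh
    run_kpis
    echo "running full metrics suite"
}

main() {
    case "${1:-}" in
        --all)  run_all ;;
        --kpis) run_kpis ;;
        *) echo "usage: metrics-runner.sh [--all|--kpis]" >&2; return 1 ;;
    esac
}

# Demo invocation: run just the CI KPI subset.
main --kpis
```

The KSM handling and results-storing options mentioned above would then hang off further flags on the same `main` dispatcher, removing the duplication between the two current scripts.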
Thanks - Sounds good to me.
See #912.
lgtm @klynnrif - could you take a look please?
A couple small changes to keep an active voice. Thanks!
metrics/network/README.md
Outdated
@@ -35,7 +35,9 @@ and packet-per-second throughput using iperf3 on single threaded connections.
The bandwidth test shows the speed of the data transfer. On the other hand,
the jitter test measures the variation in the delay of received packets.
The packet-per-second tests show the maximum number of (smallest sized) packets
we can get through the transports.
allowed through the transports.
metrics/network/README.md
Outdated
@@ -35,7 +35,9 @@ and packet-per-second throughput using iperf3 on single threaded connections.
The bandwidth test shows the speed of the data transfer. On the other hand,
the jitter test measures the variation in the delay of received packets.
The packet-per-second tests show the maximum number of (smallest sized) packets
we can get through the transports.
Command line options are used to choose which tests to run, for how long, and
Command-line options choose what tests to run, for how long, and...
Force-pushed from 36945c8 to 19dbee5
@klynnrif updated, thanks.
kubernetes qa-passed 👍
hmm, looks like something went wonky with my rebase - let me fix and push that again..
The iperf3 test always ran all the tests, with a fixed iteration (1) and duration (5s). Add a command-line parser to allow us to choose:
- which tests to run
- how many times to run them
- for how long to run each one

This will help us:
- reduce running tests we don't actually need
- improve stability by defining test duration and iterations

Fixes: clearcontainers#901
Signed-off-by: Graham whaley <[email protected]>
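As a rough sketch of the kind of parse this commit describes - the option letters, variable names, and defaults below are my assumptions, not the actual network-metrics-iperf3.sh interface:

```shell
#!/bin/bash
# Hypothetical sketch of a command-line parse choosing which iperf3 tests
# run, how many iterations, and for how long. Names are illustrative only.

parse_args() {
    TESTS=""        # which iperf3 tests to run (empty means all)
    ITERATIONS=1    # how many times to run each chosen test
    DURATION=5      # how long, in seconds, each run lasts
    OPTIND=1
    while getopts "bjpi:t:" opt; do
        case "$opt" in
            b) TESTS="$TESTS bandwidth" ;;   # add the bandwidth test
            j) TESTS="$TESTS jitter" ;;      # add the jitter test
            p) TESTS="$TESTS pps" ;;         # add the packet-per-second test
            i) ITERATIONS="$OPTARG" ;;
            t) DURATION="$OPTARG" ;;
        esac
    done
}

# Demo: run only the bandwidth test, 3 iterations of 10s each.
parse_args -b -i 3 -t 10
echo "tests:${TESTS} iterations:${ITERATIONS} duration:${DURATION}s"
# prints: tests: bandwidth iterations:3 duration:10s
```

Running one selected test several times, rather than every test once, is what lets the CI invocation below trade breadth for reduced result variation.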
Rather than run all the iperf3 tests, and then only use one result, refine the invocation with the new cmdline options to run just the one test, but multiple times, to reduce variation in the results.

Signed-off-by: Graham whaley <[email protected]>
We are running some tests, but not currently using their results as part of the CI check, as they are somewhat noisy and need some study and refinement. Remove them from the run for now.

Signed-off-by: Graham whaley <[email protected]>
Force-pushed from 19dbee5 to d54b008
fixed
kubernetes qa-passed 👍
related to https://github.com/kubernetes-sigs/cri-o/pull/1910/files#r233832755
related issue here: clearcontainers#907
Fixes clearcontainers#909
Signed-off-by: Antonio Murdaca <[email protected]>
Refine our metrics CI invocations somewhat: