APIv2: /info is getting slower #8076

Closed · edsantiago opened this issue on Oct 20, 2020 · 5 comments

Labels: flakes, HTTP API, kind/bug, locked - please file new issue/PR, stale-issue

edsantiago (Member) wrote:
The "ten /info requests" timing test is getting slower:

# Timing: make sure server stays responsive
t0=$SECONDS
for i in $(seq 1 10); do
    # FIXME: someday: refactor t(), separate out the 'curl' logic so we
    # can call it directly. Then we won't get ten annoying 'ok' lines.
    t GET info 200
done
t1=$SECONDS
delta_t=$((t1 - t0))
# Desired number of seconds in which we expect to run.
want=7
if [ $delta_t -le $want ]; then
    _show_ok 1 "Time for ten /info requests ($delta_t seconds) <= ${want}s"
else
    _show_ok 0 "Time for ten /info requests" "<= $want seconds" "$delta_t seconds"
fi

The limit was bumped from 5 to 7 seconds last week, and is about to be bumped to 10 (#8065) because the test keeps flaking in CI.

This issue is a placeholder so the slowdown can be investigated and, perhaps, addressed.
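
For anyone picking this up, here is a minimal sketch of how the timed loop could be reproduced outside the test harness with plain curl against the Podman service socket. The socket path and URL below are assumptions (the usual rootless default is $XDG_RUNTIME_DIR/podman/podman.sock and the compat /info endpoint); adjust both for the system under test.

#!/usr/bin/env bash
# Rough reproduction of the "ten /info requests" check, outside test-apiv2.
# Assumptions: `podman system service` is already running and listening on
# the default rootless socket. Adjust SOCK and URL as needed.
SOCK=${PODMAN_SOCK:-$XDG_RUNTIME_DIR/podman/podman.sock}
URL=http://d/v1.40/info

# Prime the possibly-expensive first /info call outside the timed loop.
curl -s --unix-socket "$SOCK" "$URL" >/dev/null

t0=$SECONDS
for i in $(seq 1 10); do
    curl -s --unix-socket "$SOCK" "$URL" >/dev/null
done
echo "ten /info requests took $((SECONDS - t0)) seconds"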

mheon added the HTTP API, kind/bug, and flakes labels on Oct 20, 2020.
edsantiago added a commit to edsantiago/libpod that referenced this issue on Oct 20, 2020:
- apiv2 - the 'ten /info requests' test is flaking often,
  taking ~8 seconds (our limit is 7, up from 5 a few weeks
  ago). Brent suggested that the first /info call might be
  expensive, because it needs to access storage. So, let's
  prime it by running one /info outside the timing loop.
  And, because even that continues to fail, bump it up
  to 10 seconds and file containers#8076 to track the slowdown.

- toolbox test - WaitForReady() has timed out, even on one
  occasion causing a run failure because it failed 3 times.
  Solution: bump up timeout from 2s to 5s. Not really great,
  but CI systems are underpowered, and it's not unreasonable
  that 2s might be too low.

- sdnotify test - add a 'podman wait' between stop & rm.
  This may prevent a "cannot rm container as it is running"
  race condition.

While working on this, Brent and I noticed a few ways that
test-apiv2 logging can be improved:

- test name: when request is POST, display the jsonified
  parameters, not the original input ones. This should
  make it much easier to reproduce failures.

- use curl's "--write-out" option to capture http code,
  content type, and request time. We were getting the
  first two via grep from logged headers; this is cleaner.
  And there was no other way to get timing. We now include
  the timing as X-Response-Time in the log file.

- abort on *any* curl error, not just 7 (cannot connect).
  Any error at all from curl is bad news.

Signed-off-by: Ed Santiago <[email protected]>
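
As a rough illustration of the --write-out change described in the commit message above (reusing SOCK and URL from the earlier sketch; the actual format string used by test-apiv2 is not shown here, so treat this as a sketch rather than the real implementation):

# Illustrative only: one curl call reports HTTP code, total request time,
# and content type via --write-out, instead of grepping logged headers.
read -r code rtime ctype < <(
    curl -s -o /dev/null --unix-socket "$SOCK" "$URL" \
         --write-out '%{http_code} %{time_total} %{content_type}\n'
)
echo "X-Response-Time: ${rtime}s (status $code, $ctype)"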
github-actions bot commented:

A friendly reminder that this issue had no activity for 30 days.

zhangguanzhang (Collaborator) commented:

Maybe we could close this?

TomSweeneyRedHat (Member) commented:

@edsantiago PTAL

github-actions bot commented:

A friendly reminder that this issue had no activity for 30 days.

rhatdan (Member) commented on Dec 24, 2020:

Closing due to lack of activity.

rhatdan closed this as completed on Dec 24, 2020.
github-actions bot added the "locked - please file new issue/PR" label on Sep 22, 2023.
github-actions bot locked as resolved and limited conversation to collaborators on Sep 22, 2023.