
Event stream api leaks file handles #8864

Closed
towe75 opened this issue Jan 2, 2021 · 4 comments · Fixed by #8873
Assignees: baude
Labels: In Progress · kind/bug · locked - please file new issue/PR

Comments

@towe75
Contributor

towe75 commented Jan 2, 2021

Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)

/kind bug

Description

Repetitive event stream API requests leak file handles.

Steps to reproduce the issue:

  1. Start a long-running stats stream in a terminal. This prevents systemd from restarting the podman socket service between subsequent requests.

curl -s --unix-socket /run/podman/podman.sock http://localhost/v1/libpod/containers/stats

  2. Watch the number of open file handles in a second terminal.

watch "lsof -p $(pgrep -f 'podman.*system service') | wc -l"

  3. Subscribe to the event stream and cancel it after a second. Repeat this step several times (a Go sketch that automates this loop follows the list).

curl -m 1 -s --unix-socket /run/podman/podman.sock http://localhost/v1/libpod/events
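
For reference, a minimal Go sketch that automates step 3: it repeatedly subscribes to the events endpoint over the unix socket and cancels each request after one second, mirroring curl -m 1. Only the socket path and endpoint are taken from the commands above; the loop count and program structure are illustrative and not part of podman.

package main

import (
	"context"
	"fmt"
	"io"
	"net"
	"net/http"
	"time"
)

func main() {
	// HTTP client that talks to the podman API over its unix socket.
	client := &http.Client{
		Transport: &http.Transport{
			DialContext: func(ctx context.Context, _, _ string) (net.Conn, error) {
				return (&net.Dialer{}).DialContext(ctx, "unix", "/run/podman/podman.sock")
			},
		},
	}

	for i := 0; i < 20; i++ {
		// Cancel each events subscription after one second, like curl -m 1.
		ctx, cancel := context.WithTimeout(context.Background(), time.Second)
		req, err := http.NewRequestWithContext(ctx, http.MethodGet,
			"http://localhost/v1/libpod/events", nil)
		if err != nil {
			cancel()
			continue
		}
		resp, err := client.Do(req)
		if err == nil {
			io.Copy(io.Discard, resp.Body) // drain until the timeout cancels the request
			resp.Body.Close()
		}
		cancel()
		fmt.Printf("events subscription %d cancelled\n", i+1)
	}
}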

Describe the results you received:

The number of file handles grows with each subsequent "events" subscription until the podman system service is restarted.

The problem can be observed on an idle system without starting or stopping containers.

Additional information you deem important (e.g. issue happens only occasionally):

I found this while working on the nomad podman driver, where we leverage the event and stats streams to improve performance.
Running our unit tests exhausted the open file limit, and I started to hunt down the problem.

Output of podman version:

Version:      2.2.1
API Version:  2.1.0
Go Version:   go1.14.10
Built:        Tue Dec  8 15:37:43 2020
OS/Arch:      linux/amd64
@openshift-ci-robot openshift-ci-robot added the kind/bug Categorizes issue or PR as related to a bug. label Jan 2, 2021
@Luap99
Member

Luap99 commented Jan 2, 2021

What eventLogger backend do you use? (podman info | grep event)

@towe75
Contributor Author

towe75 commented Jan 2, 2021

It's a standard Fedora setup with the journald backend.

podman info | grep event
  eventLogger: journald

@baude
Member

baude commented Jan 4, 2021

Nice report ... I assume we are talking about these ...

podman  7565 root   52r      REG              253,0  16777216 3942679 /var/log/journal/4804cba2dc9947b7b73488ba5c085ce9/[email protected]~
podman  7565 root   53r      REG              253,0   8388608 3944305 /var/log/journal/4804cba2dc9947b7b73488ba5c085ce9/user-1000@fa82a97192354f88b90977b068a0e6c9-00000000000b73ba-0005b5fa735cc913.journal
podman  7565 root   54r      REG              253,0  92274688 3941208 /var/log/journal/4804cba2dc9947b7b73488ba5c085ce9/system@4ae9c9bc979142ab87569286f52f245c-0000000000030da6-0005b5f9b3b52f6f.journal
podman  7565 root   55r      REG              253,0  92274688 3934590 /var/log/journal/4804cba2dc9947b7b73488ba5c085ce9/system@4ae9c9bc979142ab87569286f52f245c-0000000000000001-0005b5f5bbd57fd5.journal
podman  7565 root   56r      REG              253,0  16777216 3942645 /var/log/journal/4804cba2dc9947b7b73488ba5c085ce9/system.journal
podman  7565 root   57r      REG              253,0  92274688 3944302 /var/log/journal/4804cba2dc9947b7b73488ba5c085ce9/system@4ae9c9bc979142ab87569286f52f245c-00000000000a23fd-0005b5fa554d791d.journal
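
As a side note, the same count can be taken without lsof by walking /proc/<pid>/fd. The sketch below is not part of podman and is Linux-only (run it as root for a root-owned service); it counts descriptors of a given PID that point into /var/log/journal, i.e. exactly the entries shown above.

package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

func main() {
	if len(os.Args) != 2 {
		fmt.Fprintln(os.Stderr, "usage: journalfds <pid of podman system service>")
		os.Exit(1)
	}
	fdDir := filepath.Join("/proc", os.Args[1], "fd")
	entries, err := os.ReadDir(fdDir)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	count := 0
	for _, entry := range entries {
		// Each entry is a symlink to the file backing the descriptor.
		target, err := os.Readlink(filepath.Join(fdDir, entry.Name()))
		if err == nil && strings.HasPrefix(target, "/var/log/journal/") {
			count++
		}
	}
	fmt.Printf("open journal file descriptors: %d\n", count)
}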

@baude baude added the In Progress This issue is actively being worked by the assignee, please do not work on this at this time. label Jan 4, 2021
@baude baude self-assigned this Jan 4, 2021
@towe75
Contributor Author

towe75 commented Jan 4, 2021

@baude yes, lots of journal descriptors. Thx for having a look! 🧯

baude added a commit to baude/podman that referenced this issue Jan 4, 2021
when reading from journald, we need to close the journal handler for
events and logging.

Fixes: containers#8864

Signed-off-by: baude <[email protected]>
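
To make the fix concrete, here is a minimal sketch of the pattern the commit message describes, not the actual code from PR #8873: a per-request journald reader opened with go-systemd's sdjournal must be closed when the event stream ends, otherwise the /var/log/journal descriptors shown above accumulate. The streamEvents function and its structure are hypothetical; only the need for Close() is taken from the commit.

package main

import (
	"context"
	"fmt"
	"time"

	"github.com/coreos/go-systemd/v22/sdjournal"
)

// streamEvents is a hypothetical stand-in for an events handler: it tails
// the journal until the client's context is cancelled.
func streamEvents(ctx context.Context) error {
	j, err := sdjournal.NewJournal()
	if err != nil {
		return err
	}
	// The essence of the fix: release the journal handle (and the open
	// /var/log/journal/... files behind it) when the stream ends.
	defer j.Close()

	if err := j.SeekTail(); err != nil {
		return err
	}
	for {
		select {
		case <-ctx.Done():
			return ctx.Err() // client cancelled; the deferred Close runs here
		default:
		}
		n, err := j.Next()
		if err != nil {
			return err
		}
		if n == 0 {
			j.Wait(time.Second) // nothing new yet; block briefly and retry
			continue
		}
		entry, err := j.GetEntry()
		if err != nil {
			return err
		}
		fmt.Println(entry.Fields["MESSAGE"])
	}
}

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 3*time.Second)
	defer cancel()
	if err := streamEvents(ctx); err != nil {
		fmt.Println("stream ended:", err)
	}
}
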
@github-actions github-actions bot added the locked - please file new issue/PR Assist humans wanting to comment on an old issue or PR with locked comments. label Sep 22, 2023
@github-actions github-actions bot locked as resolved and limited conversation to collaborators Sep 22, 2023