
Handle podman-remote run --rm #7224

Merged 1 commit into containers:master on Aug 5, 2020

Conversation

@rhatdan (Member) commented Aug 4, 2020

We need to remove the container after it has exited for
podman-remote run --rm commands. If we don't remove this
container at this step, we open ourselves up to race conditions.

Signed-off-by: Daniel J Walsh [email protected]

@openshift-ci-robot (Collaborator)

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: rhatdan

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@openshift-ci-robot added the approved label on Aug 4, 2020
@rhatdan (Member, Author) commented Aug 4, 2020

@edsantiago @mheon PTAL

@edsantiago (Member)

Is this intended to fix #7119? If so, could you grep 7119 test/system/*.bats and remove all matching lines?

@mheon (Member) commented Aug 4, 2020

LGTM

@edsantiago (Member)

Flake in special_testing_rootless:

 $ podman-remote --url unix:/tmp/podman.Hu25Ai run --rm quay.io/libpod/alpine_labels:latest cat -v /proc/self/attr/current
[+0224s] # read unixpacket @->/run/user/14096/libpod/tmp/socket/32f534034d29cb66f3d3d341a7ddbec1bb13244ee426497a2ffed12579e4f1fb/attach: read: connection reset by peer

I saw that on one CI run in #7111 but was unable to reproduce it (despite a lot of effort), hence did not file an issue.

The socket in question is a conmon one, not the podman-remote socket. I suspect there is a troublesome interaction between conmon and podman-remote, but I don't know how to even file an issue without a reproducer.

@edsantiago (Member)

LGTM, but (if this really is intended as a fix for #7119) I'd really like those tests enabled.

@edsantiago (Member)

FWIW I pulled the PR, removed the 7119 skips from bats tests, and let them run. Root and rootless, seven runs each over an hour or so, no failures. That inspires confidence.

In case it helps:

$ perl -i -ne 'print unless /7119/' test/system/*bats

@rhatdan (Member, Author) commented Aug 5, 2020

Fixes: #7119

@TomSweeneyRedHat (Member)

LGTM, assuming happy tests.

@edsantiago (Member)

CI passed all in one go, no flakes.

/lgtm

@openshift-ci-robot added the lgtm label on Aug 5, 2020
@openshift-merge-robot merged commit 6260677 into containers:master on Aug 5, 2020
@github-actions bot locked the conversation as resolved and limited it to collaborators on Sep 24, 2023