The podman remote API service doesn't report a failing image build #12666
Comments
I'll check this, thanks.
@thommarko This is fixed upstream here: #12405. Could you please try with the latest?
Closing on the assumption that this was #12405. Reopen if it turns out to be a different issue.
/reopen @flouthoc I'm sorry, but #12405 did not fix the issue. Maven output:
Podman version and service output:
@thommarko: Reopened this issue. In response to this:
Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.
A friendly reminder that this issue had no activity for 30 days.
@thommarko I think your podman server is still running an older version. Could you make sure both your client and server are on the latest version?
A friendly reminder that this issue had no activity for 30 days.
Since we got no response to the request for information, I assume the problem is fixed. Closing; reopen if I am mistaken.
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind bug
Description
In our setup we cannot use local Podman installations, so we use the remote Podman API with the fabric8 docker-maven-plugin in our Maven builds.
Our problem: when an image build fails, the Podman remote API only writes a debug message to its own log output but does not fail the Maven build.
The Maven output just shows the following:
...
cif DOCKER> [nexus.server:5011/cif:1.0.0-SNAPSHOT]: Built image null
...
------------------------------------------------------------------------
BUILD SUCCESS
------------------------------------------------------------------------
Total time: 33.711 s
Finished at: 2021-12-17T10:52:46+01:00
------------------------------------------------------------------------
"Built image null" instead of the correct "Built image sha256:xxxxx" is the only hint that something went wrong.
It gets even worse because we do not want to bump the image version for every build during development: there may be an "old" version of the image (with the same name and tag) in the remote Podman cache. In that case the build reports "Built image sha256:xxxxx" and pushes the "old" image even though the latest image build failed.
The only way to verify a build is to grep the Podman API service log for errors that may have occurred.
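A rough sketch of that grep workaround is below. This is our own wrapper scripting, not something the plugin or Podman provides; the log file name, the pre-build removal of the cached tag and the grep pattern are assumptions, and the exact error wording depends on the Podman/Buildah version.

```shell
# Start the remote API service and keep its debug log in a file we pick ourselves
podman system service tcp:hostname:2375 --log-level=debug --time=3600 &> podman-service.log &

# Optionally drop the previously cached tag so a failed build cannot push a stale image
podman --remote --url tcp://hostname:2375 rmi nexus.server:5011/cif:1.0.0-SNAPSHOT || true

# Run the build, then fail the pipeline if the service logged a build error
mvn clean install -Ddocker.host=http://hostname:2375
if grep -iqE 'error building at step|level=error' podman-service.log; then
    echo "remote image build failed, see podman-service.log" >&2
    exit 1
fi
```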
Steps to reproduce the issue:
1. Set up a remote Podman API: "podman system service tcp:hostname:2375 --log-level=debug --time=3600"
2. Set up a Maven project with the fabric8 plugin, consisting of a Dockerfile and a pom.xml (a hypothetical minimal pair is sketched below).
3. Place the two files in the same folder and call "mvn clean install -Ddocker.host=http://hostname:2375"

This should result in a debug-level error in Podman's service output ... and a BUILD SUCCESS on the Maven side.
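For illustration, a hypothetical minimal pair of files that triggers a failing remote build could look like the following. The original attachments are not shown above, so the base image, the plugin version and the deliberately failing RUN step are assumptions rather than the files we actually used; the image name matches the Maven output quoted earlier.

Dockerfile:

```dockerfile
# Hypothetical Dockerfile: the RUN step fails on purpose so the remote build errors out
FROM registry.access.redhat.com/ubi8/ubi-minimal
RUN exit 1
```

pom.xml (plugin section only):

```xml
<!-- Hypothetical fabric8 docker-maven-plugin configuration; the plugin version is an assumption -->
<plugin>
  <groupId>io.fabric8</groupId>
  <artifactId>docker-maven-plugin</artifactId>
  <version>0.38.1</version>
  <configuration>
    <images>
      <image>
        <name>nexus.server:5011/cif:1.0.0-SNAPSHOT</name>
        <build>
          <contextDir>${project.basedir}</contextDir>
        </build>
      </image>
    </images>
  </configuration>
  <executions>
    <execution>
      <id>build-image</id>
      <phase>install</phase>
      <goals>
        <goal>build</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With a Dockerfile like this, step 3 should reliably produce the failed remote build while Maven still reports BUILD SUCCESS, as described above.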
Describe the results you received:
Maven/fabric8 do not notice a failing image build.
Describe the results you expected:
A failing image builds should lead to a failing maven build.
Additional information you deem important (e.g. issue happens only occasionally):
We use a rootless Podman setup, even for the remote API.
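For context, "rootless even for the remote API" means the service from the reproduction steps runs under an unprivileged user rather than root. A sketch of how such a setup can be kept running follows; the user name is a placeholder, and enabling lingering is only needed if the service should survive logout.

```shell
# Allow the unprivileged user's processes to keep running after logout (user name is a placeholder)
loginctl enable-linger builduser

# Start the API service as that user, same command as in the reproduction steps
podman system service tcp:hostname:2375 --log-level=debug --time=3600
```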
Output of "podman version":

Output of "podman info --debug":

Package info (e.g. output of "rpm -q podman" or "apt list podman"):

Have you tested with the latest version of Podman and have you checked the Podman Troubleshooting Guide? (https://github.com/containers/podman/blob/master/troubleshooting.md)
Yes

Additional environment details (AWS, VirtualBox, physical, etc.):
OpenStack VMs based on RHEL8