[Failing Test]: Some Python integration test runs result in environment mismatch. #28653
Tentatively adding as a blocker until confirmed it's not affecting the release branch.
09:38:53 RuntimeError: Pipeline construction environment and pipeline runtime environment are not compatible. If you use a custom container image, check that the Python interpreter minor version and the Apache Beam version in your image match the versions used at pipeline construction time. Submission environment: beam:version:sdk_base:apache/beam_python3.11_sdk:2.52.0.dev. Runtime environment: beam:version:sdk_base:apache/beam_python3.11_sdk:2.51.0.dev.
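For anyone who hits this error with a custom container image outside of Beam's own test infrastructure, a minimal sketch of the check the message suggests, assuming apache_beam is installed in the submission environment:

```python
# Hypothetical sanity check at pipeline construction time: the interpreter
# minor version and Beam version printed here should match the custom SDK
# container image the job runs with (e.g. an image built from
# apache/beam_python3.11_sdk:<matching Beam version>).
import sys

import apache_beam as beam

print(f"Python interpreter: {sys.version_info.major}.{sys.version_info.minor}")
print(f"Apache Beam version: {beam.__version__}")
```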
Likely this will not affect the release branch, but something is misconfigured.
Seeing this in one job:
...
I believe we currently don't stage tarballs in tests, and somehow the provided wheel is either not compatible or got corrupted during retrieval: #28605
I think this is caused by #28605.
I am not totally following whether this could impact the release. Would we expect to be seeing red tests on the release branch? We did manage to get green Python tests today.
I don't attribute this issue to a regression in 2.51.0, but there may be flakiness in streaming test pipelines until this issue is fixed or the Dataflow runner rolls out a release (tentative ETA end of this week). Longer story: Python integration tests are supposed to pass --sdk_location. Due to a race during installation, some workers fail to install the SDK and become incorrectly initialized. This would not happen to workers using the so-called sibling SDK container protocol. Users on a released Beam SDK don't stage the SDK at job submission, so they wouldn't see this particular failure mode. I will remove this issue from the 2.51.0 blocker list for now.
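For reference, a minimal sketch of how a test pipeline would pass --sdk_location so that Dataflow workers install a locally built SDK instead of the released one; the project, bucket, and tarball path below are hypothetical placeholders, not the values used by Beam's test infrastructure:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# All resource names and paths below are hypothetical placeholders.
options = PipelineOptions([
    "--runner=DataflowRunner",
    "--project=my-gcp-project",
    "--region=us-central1",
    "--temp_location=gs://my-bucket/tmp",
    # Workers download and install this locally built SDK artifact
    # instead of the released apache-beam package.
    "--sdk_location=dist/apache-beam-2.52.0.dev0.tar.gz",
])

with beam.Pipeline(options=options) as p:
    _ = p | beam.Create([1, 2, 3]) | beam.Map(print)
```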
All Dataflow Python pipelines use the sibling protocol now.
What happened?
See failing runs on:
https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/11337/
Issue Failure
Failure: Test is flaky
Issue Priority
Priority: 1 (unhealthy code / failing or flaky postcommit so we cannot be sure the product is healthy)
Issue Components