CSI volume getting mounted as EmptyDir #4325
Comments
This is really strange. The KFP backend makes only minimal modifications to the pipeline before handing it to Argo. I wonder if it could be caused by using outdated Go clients for Kubernetes and Argo. Maybe the
Can you please check the Workflow yaml submitted for execution?
That only contains the name:
Is it running in Azure? Help wanted from community.
Yes, it's running in AKS.
Then the most likely cause is the one I listed: the outdated Go clients for Kubernetes and Argo lose fields when constructing the Workflow object from YAML. @rmgogogo I really think we should improve the backend build story so that we can upgrade the modules. They're pretty old now.
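The field-loss mechanism described above can be sketched in a few lines. This is an illustrative Python sketch, not the actual Go client code: a client schema that predates CSI volume sources silently drops the `csi` key when it round-trips the Workflow, leaving a volume with only a name.

```python
# Illustrative sketch of the suspected failure mode (hypothetical names):
# a deserializer that only knows pre-CSI volume source fields drops `csi`
# when the Workflow YAML is round-tripped through it.

# Volume source fields known to the outdated client schema (no "csi").
KNOWN_VOLUME_FIELDS = {"name", "emptyDir", "hostPath", "secret", "configMap"}

def round_trip_volume(volume: dict) -> dict:
    """Mimic deserializing into a struct that predates CSI volume sources:
    any field the schema does not know is silently discarded."""
    return {k: v for k, v in volume.items() if k in KNOWN_VOLUME_FIELDS}

# A CSI volume as the compiled pipeline would submit it (values illustrative).
submitted = {
    "name": "secrets-store-inline",
    "csi": {
        "driver": "secrets-store.csi.k8s.io",
        "readOnly": True,
        "volumeAttributes": {"secretProviderClass": "azure-kv"},
    },
}

survived = round_trip_volume(submitted)
print(survived)  # only the name survives the round trip
```

A volume that arrives at the cluster with only a name and no source is consistent with the EmptyDir mount observed in this issue.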
Then this should be resolved by #3770.
/assign @jingzhang36
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
/cc @Bobgy
I think this should be fixed by upgrading the Kubernetes client; partially related to #4553.
@Bobgy what's missing for this issue to be resolved?
@Shaked the Argo upstream issue is targeted for release as part of Argo v3 in January: argoproj/argo-workflows#4426 (comment). We'll upgrade Argo to v3 and resolve this after Argo's release.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
@Bobgy any update about this issue?
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
We have upgraded to the latest Argo, so the issue is most likely fixed. Please let me know if that's not the case.
Hi @Bobgy, I am still facing this issue with Kubeflow 1.4 and AWS EKS 1.21. Any inputs on how to get past this?
What steps did you take:
I'm trying to mount a CSI volume in my kubeflow pipeline using the Azure Key Vault Provider for Secrets Store CSI Driver.
I'm using python to create the pipeline and mount the volume:
What happened:
The pipeline starts, but the volume gets mounted to the container as EmptyDir.
What did you expect to happen:
The volume is mounted as a CSI volume.
Environment:
How did you deploy Kubeflow Pipelines (KFP)?
https://github.com/kaizentm/kubemlops/blob/master/setup/kfp/kubeflow-install.sh
KFP version: 1.0.0
KFP SDK version:
kfp 1.0.0
kfp-server-api 1.0.0
Anything else you would like to add:
I tried unzipping the compiled pipeline and submitting it using the argo cli. This worked: the volume was mounted successfully.
This is what the volume looks like in the yaml file:
This is what it looks like in the pod created by the kubeflow pipeline:
This is what it looks like in the pod created by submitting directly to argo:
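The three yaml snippets referenced above were lost in this copy of the report. Based on the surrounding description, the contrast would look roughly like the following reconstruction; the volume name and `secretProviderClass` are illustrative assumptions.

```yaml
# Hedged reconstruction (names illustrative), not the reporter's actual yaml.

# In the compiled pipeline yaml, and in the pod created via the argo cli,
# the volume carries its CSI source:
volumes:
  - name: secrets-store-inline
    csi:
      driver: secrets-store.csi.k8s.io
      readOnly: true
      volumeAttributes:
        secretProviderClass: azure-kv

# In the pod created through the KFP backend, the source is gone and only
# an emptyDir remains:
volumes:
  - name: secrets-store-inline
    emptyDir: {}
```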
The csi driver logs contain no record of any attempts to create the volume.
/kind bug
/area backend