Use ExternalID in NodeStageVolume RPC #7754

Merged (2 commits, Apr 20, 2020)
Conversation

@angrycub (Contributor)

This resolves an issue with the gcp-compute-persistent-disk-csi-driver where a volume fails to mount with the following error:

2020-04-20T13:25:29Z  Setup Failure  failed to setup alloc: pre-run hook "csi_hook" failed: rpc error: code = InvalidArgument desc = NodeStageVolume Volume ID is invalid: failed to get id components. Expected projects/{project}/zones/{zone}/disks/{name}. Got: mysql
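(For illustration: the driver expects a fully qualified GCE disk path rather than Nomad's volume ID. Below is a minimal Go sketch of the kind of validation behind that error message; this is hypothetical code, not the driver's actual implementation.)

package main

import (
	"fmt"
	"strings"
)

// parseDiskID is a hypothetical sketch of the validation implied by the
// error above, not the gcp-compute-persistent-disk-csi-driver's code.
// It accepts only the form projects/{project}/zones/{zone}/disks/{name}.
func parseDiskID(id string) (project, zone, name string, err error) {
	parts := strings.Split(id, "/")
	if len(parts) != 6 || parts[0] != "projects" || parts[2] != "zones" || parts[4] != "disks" {
		return "", "", "", fmt.Errorf("failed to get id components. Expected projects/{project}/zones/{zone}/disks/{name}. Got: %v", id)
	}
	return parts[1], parts[3], parts[5], nil
}

func main() {
	// Nomad was sending the volume ID ("mysql"), which fails this check;
	// the ExternalID ("projects/.../zones/.../disks/...") would pass it.
	if _, _, _, err := parseDiskID("mysql"); err != nil {
		fmt.Println(err)
	}
}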

@angrycub (Author)

In my test, the volume was defined as follows:

type = "csi"
id = "mysql"
name = "mysql"
# selfLink from the JSON
external_id = "projects/my_project/zones/us-east1-b/disks/cv-disk-1"
access_mode = "single-node-writer"
attachment_mode = "file-system"
plugin_id = "gcepd"

@tgross (Member) left a comment:

This code change mostly LGTM. The only thing that worries me is that we didn't see this problem with the EBS plugin. According to https://github.com/kubernetes-sigs/aws-ebs-csi-driver#features, the EBS plugin does implement NODE_STAGE_VOLUME, so I would have expected the problem to appear there as well.

Have we tested this branch with EBS to make sure we don't have a regression? (I'm hoping this isn't a case of inconsistent implementation of the spec, but it wouldn't surprise me either.)

@@ -166,7 +166,7 @@ func (v *volumeManager) stageVolume(ctx context.Context, vol *structs.CSIVolume,
// CSI NodeStageVolume errors for timeout, codes.Unavailable and
// codes.ResourceExhausted are retried; all other errors are fatal.
return v.plugin.NodeStageVolume(ctx,
vol.ID,
vol.ExternalID,
@tgross (Member) commented on the diff:

There's a getter method vol.RemoteID() for this. Let's use that here for consistency with the v.plugin.NodePublishVolume call.
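(For context, a minimal sketch of the likely shape of that getter; check structs.CSIVolume in the Nomad source for the actual implementation.)

// Sketch only; field names match the diff above, other fields elided.
type CSIVolume struct {
	ID         string // Nomad's own volume ID
	ExternalID string // storage provider's ID, optional at registration
}

// RemoteID prefers the storage provider's external ID and falls back
// to Nomad's volume ID when none was registered.
func (v *CSIVolume) RemoteID() string {
	if v.ExternalID != "" {
		return v.ExternalID
	}
	return v.ID
}

Routing both NodeStageVolume and NodePublishVolume through RemoteID() keeps the two RPCs consistent about which ID the plugin receives.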

@angrycub (Author)

Manually tested AWS EBS with the original version of the fix (using vol.ExternalID) and it does not regress. Perhaps the AWS EBS volume driver doesn't actually cause staging to happen?

@tgross (Member) commented Apr 20, 2020:

> Perhaps the AWS EBS volume driver doesn't actually cause staging to happen?

Yeah, entirely possible that it supports the RPC but only actually does the staging work under specific circumstances.

@tgross (Member) commented Apr 20, 2020:

> Yeah, entirely possible that it supports the RPC but only actually does the staging work under specific circumstances.

Was just digging through the code, and it looks like this might also help explain the NVMe issue you were reporting in chat: https://github.com/kubernetes-sigs/aws-ebs-csi-driver/blob/04f0e8a8c70a8b4ba0fa76b24f2aef7bc26a18cc/pkg/driver/node.go#L479 (assuming the Docker folks fixed the problem with the actual mounting step).

@tgross (Member) left a comment:

LGTM

@github-actions (bot) commented Jan 9, 2023:

I'm going to lock this pull request because it has been closed for 120 days ⏳. This helps our maintainers find and focus on the active contributions.
If you have found a problem that seems related to this change, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions bot locked as resolved and limited conversation to collaborators Jan 9, 2023