OCI Image Updater use-case #33285

Closed
weeshal opened this issue Jan 4, 2023 · 8 comments
Labels: Client · Container Registry · customer-reported · needs-team-attention · question

@weeshal

weeshal commented Jan 4, 2023

Library name

Azure.Containers.ContainerRegistry

Please describe the feature.

We would like to use the Azure SDK to automate our container deployment process. At a specified cadence, an agent running on an edge machine would pull the latest artifacts from an Azure Container Registry and load that image onto the machine. The registry is not configured for anonymous access; our current workflow uses a pull token to authenticate the download. Ideally, we would retrieve the token password through an Azure Key Vault SecretClient and use it to pull the desired image. Specifically, we would like to call DownloadBlobAsync.
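
For the Key Vault step specifically, this is roughly what we have in mind (a sketch only; the vault URL and secret name are placeholders):

```csharp
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;

// The agent authenticates to Key Vault (here via DefaultAzureCredential) and
// reads the ACR pull-token password stored as a secret (placeholder names).
var secretClient = new SecretClient(
    new Uri("https://<your-key-vault>.vault.azure.net"),
    new DefaultAzureCredential());

KeyVaultSecret pullTokenSecret = await secretClient.GetSecretAsync("<acr-pull-token-secret-name>");
string pullTokenPassword = pullTokenSecret.Value;
```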

Based on my understanding, the current SDK only supports anonymous access operations, but I would like to inquire about any potential plans to support authentication through a token/password.

@ghost ghost added needs-triage Workflow: This is a new issue that needs to be triaged to the appropriate team. customer-reported Issues that are reported by GitHub users external to the Azure organization. question The issue doesn't require a change to the product in order to be resolved. Most issues start as that labels Jan 4, 2023
@jsquire jsquire added Container Registry Client This issue points to a problem in the data-plane of the library. needs-team-attention Workflow: This issue needs attention from Azure service team or SDK team and removed needs-triage Workflow: This is a new issue that needs to be triaged to the appropriate team. labels Jan 4, 2023
@jsquire
Member

jsquire commented Jan 4, 2023

Thank you for your feedback. Tagging and routing to the team member best able to assist.

@annelo-msft annelo-msft added this to the 2023-02 milestone Jan 4, 2023
@annelo-msft
Member

Thanks for your question, @weeshal!

> Based on my understanding, the current SDK only supports anonymous access operations

The library does support anonymous access. It also supports authentication via AAD, as described here.
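
For illustration, a minimal sketch of the AAD path (the registry endpoint, repository, and digest are placeholders; adjust the constructor and options to the beta version you are using):

```csharp
using Azure.Containers.ContainerRegistry;
using Azure.Containers.ContainerRegistry.Specialized;
using Azure.Identity;

// Placeholder registry/repository; the identity running this needs AcrPull on the registry.
var blobClient = new ContainerRegistryBlobClient(
    new Uri("https://<your-registry>.azurecr.io"),
    new DefaultAzureCredential(),
    "<repository-name>",
    new ContainerRegistryClientOptions
    {
        // The audience may not need to be set on newer versions; shown here to match the beta samples.
        Audience = ContainerRegistryAudience.AzureResourceManagerPublicCloud
    });

// Download a layer blob by digest (placeholder digest).
var blobResult = await blobClient.DownloadBlobAsync("sha256:<layer-digest>");
```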

Would this be sufficient for your purposes?

@annelo-msft annelo-msft added needs-author-feedback Workflow: More information is needed from author to address the issue. and removed needs-team-attention Workflow: This issue needs attention from Azure service team or SDK team labels Jan 4, 2023
@weeshal
Author

weeshal commented Jan 5, 2023

Hey Anne, yes, that was sufficient in terms of authentication. However, I wanted to follow up on downloading larger blobs: the single DownloadBlobAsync call fails when downloading a larger artifact (~5 GB). Here is the error I am facing:
```
System.AggregateException: Retry failed after 4 tries. Retry settings can be adjusted in ClientOptions.Retry. (Stream was too long.) (Stream was too long.) (Stream was too long.) (Stream was too long.)
 ---> System.IO.IOException: Stream was too long.
   at System.IO.MemoryStream.Write(Byte[] buffer, Int32 offset, Int32 count)
   at System.IO.MemoryStream.WriteAsync(ReadOnlyMemory`1 buffer, CancellationToken cancellationToken)
--- End of stack trace from previous location ---
   at Azure.Core.Pipeline.ResponseBodyPolicy.CopyToAsync(Stream source, Stream destination, CancellationTokenSource cancellationTokenSource)
   at Azure.Core.Pipeline.ResponseBodyPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline, Boolean async)
   at Azure.Core.Pipeline.BearerTokenAuthenticationPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline, Boolean async)
   at Azure.Core.Pipeline.RedirectPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline, Boolean async)
   at Azure.Core.Pipeline.RetryPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline, Boolean async)
   --- End of inner exception stack trace ---
   at Azure.Core.Pipeline.RetryPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline, Boolean async)
   at Azure.Containers.ContainerRegistry.ContainerRegistryBlobRestClient.GetBlobAsync(String name, String digest, CancellationToken cancellationToken)
   at Azure.Containers.ContainerRegistry.Specialized.ContainerRegistryBlobClient.DownloadBlobAsync(String digest, CancellationToken cancellationToken)
   at AzcRobotAgent.ContainerUpdateWorker.ExecuteAsync(CancellationToken stoppingToken) in C:\Users\vishalgi\Azure-Compute-Robotics\src\dotnet\AzcRobotAgent\ContainerUpdateWorker.cs:line 145
 ---> (Inner Exception #1) System.IO.IOException: Stream was too long. <---
 ---> (Inner Exception #2) System.IO.IOException: Stream was too long. <---
 ---> (Inner Exception #3) System.IO.IOException: Stream was too long. <---
```

(Inner exceptions #1–#3 carry the same MemoryStream/ResponseBodyPolicy stack trace as the exception above; the repeated traces are omitted here.)

Any suggestions on this?

@ghost ghost added needs-team-attention Workflow: This issue needs attention from Azure service team or SDK team and removed needs-author-feedback Workflow: More information is needed from author to address the issue. labels Jan 5, 2023
@weeshal weeshal changed the title [FEATURE REQ] Token-authentication for downloading artifacts OCI Image Updater use-case Jan 5, 2023
@annelo-msft
Member

Hi @weeshal, I'm glad to hear you were able to authenticate and call DownloadBlob!

This is a known issue with large blobs, and the resolution is being tracked by #32414. The fix will not ship in beta 5, but we intend it to come in one of the following betas.

If this is your only remaining issue, may I close this one and have you follow that one instead?

@annelo-msft annelo-msft added needs-author-feedback Workflow: More information is needed from author to address the issue. and removed needs-team-attention Workflow: This issue needs attention from Azure service team or SDK team labels Jan 6, 2023
@weeshal
Author

weeshal commented Jan 6, 2023

Sounds good to me. Thank you!

@ghost ghost added needs-team-attention Workflow: This issue needs attention from Azure service team or SDK team and removed needs-author-feedback Workflow: More information is needed from author to address the issue. labels Jan 6, 2023
@weeshal
Author

weeshal commented Jan 6, 2023

While on the subject, I noticed that the digest received directly from the manifest was not the one that refers to the actual image itself. It required drilling into the first (and in our case, only) layer and using the digest within the OciBlobDescriptor. I'm not sure if this is already tracked, or intended, but it would be great to be able to use the root-level digest to download the entire image; it seems the config referenced at the root of the manifest is of type application/vnd.unknown.config.v1+json instead of application/vnd.oci.image.config.v1+json. See more info here: https://oras.land/cli/3_manifest_config/.
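
For concreteness, here is roughly the drill-down we ended up doing, shown against a trimmed example manifest rather than the SDK types (digests, sizes, and media types in the sample JSON are placeholders):

```csharp
using System.Text.Json;

// Trimmed example of the manifest returned for the tag (placeholder digests/sizes).
string manifestJson = """
{
  "schemaVersion": 2,
  "config": {
    "mediaType": "application/vnd.unknown.config.v1+json",
    "digest": "sha256:<config-digest>",
    "size": 2
  },
  "layers": [
    {
      "mediaType": "application/vnd.oci.image.layer.v1.tar+gzip",
      "digest": "sha256:<layer-digest>",
      "size": 5368709120
    }
  ]
}
""";

using JsonDocument manifest = JsonDocument.Parse(manifestJson);

// The digest at the root/config level is not the blob we actually need...
string configDigest = manifest.RootElement
    .GetProperty("config").GetProperty("digest").GetString()!;

// ...the image content comes from the first (and in our case only) layer's digest,
// which is what we end up passing to DownloadBlobAsync.
string layerDigest = manifest.RootElement
    .GetProperty("layers")[0].GetProperty("digest").GetString()!;
```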

@annelo-msft
Member

Thanks @weeshal!

> Sounds good to me. Thank you!

Great, I'm closing this issue accordingly.

> While on the subject, I noticed that the digest received directly from the manifest was not the one that refers to the actual image itself.

Interesting! I'm curious to learn more and help solve this. Would you be willing to file a new issue (so the title reflects the issue) with a repro description?

Many thanks!

@annelo-msft
Member

@weeshal, I am wondering if this PR will address the issue you've described in this comment.

@github-actions github-actions bot locked and limited conversation to collaborators Apr 25, 2023