terraform providers lock should use the plugin cache #33837
Comments
Thanks for this feature request! If you are viewing this issue and would like to indicate your interest, please use the 👍 reaction on the issue description to upvote this issue. We also welcome additional use case descriptions. Thanks again!
Yesterday running So for each of
Related: #27811
I would love this improvement; we also take a long time to regenerate the lock files using the same command as @jsyrjala. I would like to run the
This is only possible for the architecture of the local system, since the cache only holds one architecture. Closes #33837
I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.
Terraform Version
Use Cases
Provider maintenance: Regular updates.
We have many workspaces and we use multiple architectures. Updating the `.terraform.lock.hcl` of a workspace via `terraform providers lock -platform=darwin_arm64 -platform=darwin_amd64 -platform=linux_amd64 -platform=windows_amd64` takes very long. For all of the workspaces, the hashes (even of providers that were not modified) are retrieved over and over from the registry, so this "penalty" multiplies across workspaces (a sketch of the kind of loop we run is shown below).
Recently, the repeated downloads even fail; downloading the same file with a browser is no problem.
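To give a concrete picture, here is a minimal sketch of the kind of loop we run; the `workspaces/*` layout is an illustrative assumption, and the `providers lock` invocation is the one quoted above.

```sh
#!/usr/bin/env sh
# Sketch only: regenerate the lock file of every workspace.
# The workspaces/* layout is an assumed example, not a real repository structure.
for dir in workspaces/*; do
  terraform -chdir="$dir" providers lock \
    -platform=darwin_arm64 \
    -platform=darwin_amd64 \
    -platform=linux_amd64 \
    -platform=windows_amd64
done
# Every iteration downloads and hashes the same provider packages from the
# registry again, so the total runtime grows with the number of workspaces.
```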
Attempted Solutions
I configured the plugin cache, but it is not used by `terraform providers lock`.
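For reference, the cache is configured roughly like this (the directory path is only an example); `terraform init` populates and reuses it, but `terraform providers lock` does not.

```sh
# Example plugin cache configuration; the directory path is arbitrary.
export TF_PLUGIN_CACHE_DIR="$HOME/.terraform.d/plugin-cache"
mkdir -p "$TF_PLUGIN_CACHE_DIR"

# The same can be set in the CLI configuration file (~/.terraformrc):
#   plugin_cache_dir = "$HOME/.terraform.d/plugin-cache"

# `terraform init` fills and reuses this cache, but `terraform providers lock`
# still fetches every provider from the registry.
```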
Proposal
Let `terraform providers lock` use the plugin cache (read & write). That includes caching plugins of all architectures.
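A hypothetical workflow if this were supported; the workspace paths are made up, and (as noted in one of the comments above) the cache could only help directly for the architecture it was populated for.

```sh
export TF_PLUGIN_CACHE_DIR="$HOME/.terraform.d/plugin-cache"

# First workspace: provider packages are downloaded once and written to the cache.
terraform -chdir=workspaces/app-a providers lock \
  -platform=darwin_arm64 -platform=darwin_amd64 \
  -platform=linux_amd64 -platform=windows_amd64

# Later workspaces: packages already in the cache would be reused instead of
# being downloaded from the registry again (at least for the local architecture,
# since the cache only holds one architecture per provider version).
terraform -chdir=workspaces/app-b providers lock \
  -platform=darwin_arm64 -platform=darwin_amd64 \
  -platform=linux_amd64 -platform=windows_amd64
```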
References
`terraform init`: #29958 is very similar.