Error: Unauthorized on .terraform/modules/eks/aws_auth.tf line 65, in resource "kubernetes_config_map" "aws_auth": #1287
Comments
If you run the aws eks command from the command line, what happens? Also, in your kubernetes provider block, try using a token (also see #1280).
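For context, a minimal sketch of the token-based kubernetes provider configuration that comments like this one (and #1280) typically point to; the data sources and the module.eks.cluster_id output are assumptions about the poster's setup:

```hcl
# Sketch only: authenticate the kubernetes provider against the EKS cluster
# with a short-lived token instead of relying on a possibly stale kubeconfig.
data "aws_eks_cluster" "cluster" {
  name = module.eks.cluster_id
}

data "aws_eks_cluster_auth" "cluster" {
  name = module.eks.cluster_id
}

provider "kubernetes" {
  host                   = data.aws_eks_cluster.cluster.endpoint
  cluster_ca_certificate = base64decode(data.aws_eks_cluster.cluster.certificate_authority[0].data)
  token                  = data.aws_eks_cluster_auth.cluster.token
}
```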
I tried adding that token to the kubernetes {} provider block, but I get the following error:
Adding this to my kubernetes.tf provider file works. Thank you.
Hi. I am not having the same luck as @kaykhancheckpoint. I just ran the same Terraform in a different account: there it worked, but in the second account it fails. I even erased the contents of .kube/config in case there was something stale, and tried enabling and disabling tokens.
You know what, I think it was a capacity issue with us-east-1. Sorry to bother y'all.
Same here.
I had this same issue with eu-west-2 (London). The same code worked when I deployed it last Thursday, but when I tried it again today it failed, without any code changes. It seems to have been a problem with the Terraform state file and resources from my last apply: the IAM role for my cluster didn't get created properly. When I looked at the AWS EKS console, I saw a message linking to https://docs.aws.amazon.com/eks/latest/userguide/troubleshooting_iam.html#security-iam-troubleshoot-cannot-view-nodes-or-workloads, so basically there was no role for the cluster.
I had the same issue and solved it by adding proper AWS credentials to the environment. The credentials I added before didn't have enough permissions, but after switching to a set with admin permissions, it worked.
I am definitely new to AWS, EKS and Terraform, but I think I know what the problem is, because I ran into the same issue. The Kubernetes cluster I was trying to work on was initially created by a federated Administrator user. I then tried to work on the cluster as another user who also had the Administrator role, but, as the AWS documentation notes, by default only the IAM entity that created the cluster has access to it. To keep things simple, I had to destroy and re-create the cluster as the new user for things to work properly.
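For completeness: rather than re-creating the cluster, the terraform-aws-eks module (in versions that expose it) also has a map_users input that adds extra IAM identities to the aws-auth config map. A hedged sketch, with the account ID, user name and the other module arguments as placeholders:

```hcl
# Sketch only: grant an additional IAM user cluster-admin access through the
# aws-auth config map instead of destroying and re-creating the cluster.
module "eks" {
  source       = "terraform-aws-modules/eks/aws"
  cluster_name = "my-cluster"   # placeholder
  subnets      = var.subnets    # placeholder
  vpc_id       = var.vpc_id     # placeholder

  map_users = [
    {
      userarn  = "arn:aws:iam::111122223333:user/other-admin"  # placeholder ARN
      username = "other-admin"
      groups   = ["system:masters"]
    }
  ]
}
```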
This is the solution; you then update your $HOME/.aws/credentials file with the access key, secret key, and session token:
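A minimal sketch of what that credentials entry looks like (the profile name and all values are placeholders):

```ini
# ~/.aws/credentials -- placeholder values only
[prod-admin]
aws_access_key_id     = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
aws_session_token     = IQoJb3JpZ2luX2VjE...
```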
I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues. If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.
Description
Fails to create module.eks.kubernetes_config_map.aws_auth because of an Unauthorized error.
Versions
Terraform v0.14.8
Reproduction
Steps to reproduce the behavior:
Are you using workspaces? Yes
Have you cleared the local cache (see Notice section above)? Yes
List steps in order that led up to the issue you encountered:
terraform apply -var-file=prod.tfvars
Code Snippet to Reproduce
eks-cluster.tf
kubernetes.tf
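The actual file contents were not captured in this thread. A hedged sketch of the kind of kubernetes.tf that reproduces this setup, using exec-based auth through the AWS CLI so a fresh EKS token is fetched on every run (the data source name and the module.eks.cluster_id output are assumptions):

```hcl
# Sketch only: kubernetes provider for the cluster created by module.eks,
# authenticating via `aws eks get-token` at plan/apply time.
data "aws_eks_cluster" "cluster" {
  name = module.eks.cluster_id
}

provider "kubernetes" {
  host                   = data.aws_eks_cluster.cluster.endpoint
  cluster_ca_certificate = base64decode(data.aws_eks_cluster.cluster.certificate_authority[0].data)

  exec {
    api_version = "client.authentication.k8s.io/v1alpha1"
    command     = "aws"
    args        = ["eks", "get-token", "--cluster-name", module.eks.cluster_id]
  }
}
```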
Expected behavior
Create module.eks.kubernetes_config_map.aws_auth[0]
Actual behavior
Fails to create; the apply stops with Error: Unauthorized.
Terminal Output Screenshot(s)
Additional context
The aws provider is pointing to a profile that has AdministratorAccess.
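For reference, a sketch of the kind of aws provider block described here (region and profile name are placeholders):

```hcl
# Sketch only: aws provider pinned to a named CLI profile whose IAM identity
# has the AdministratorAccess policy attached.
provider "aws" {
  region  = "eu-west-2"    # placeholder region
  profile = "prod-admin"   # placeholder profile name
}
```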