Support for Local Account disabled for AKS #13248

Closed
2 tasks done
aristosvo opened this issue Sep 7, 2021 · 4 comments · Fixed by #13260

@aristosvo (Collaborator) commented Sep 7, 2021

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment

Description

To significantly improve AKS security, we would like the option to disable local accounts to be implemented.

Once this is implemented, it is no longer possible to obtain a working kubeconfig without AAD login, which removes the threat of misuse of an administrator's local kubeconfig file.

Configuring the Kubernetes provider, kubectl provider or Helm provider would then be done as shown below.

The configuration shown below does not work when managed AAD is enabled, so a small fix is required.

New or Affected Resource(s)

  • azurerm_kubernetes_cluster (resource and data source)

Potential Terraform Configuration

# infra/main.tf
resource "azurerm_resource_group" "example" {
  name     = "example-resources"
  location = "West Europe"
}

resource "azurerm_kubernetes_cluster" "example" {
  name                   = "example-aks1"
  location               = azurerm_resource_group.example.location
  resource_group_name    = azurerm_resource_group.example.name
  dns_prefix             = "exampleaks1"
  local_account_disabled = true

  default_node_pool {
    name       = "default"
    node_count = 1
    vm_size    = "Standard_D2_v2"
  }

  role_based_access_control {
    enabled                 = true

    azure_active_directory {
      managed                = true
      admin_group_object_ids = var.cluster_admin_groups
    }
  }

  identity {
    type = "SystemAssigned"
  }

  tags = {
    Environment = "Production"
  }
}
# application/main.tf
data "azurerm_kubernetes_cluster" "default" {
  name                = "myakscluster"
  resource_group_name = "my-example-resource-group"
}


data "azurerm_kubernetes_cluster" "default" {
  name                = "${var.prefix}-aks"
  resource_group_name = azurerm_resource_group.example.name
}

data "azurerm_client_config" "current" {}

provider "kubernetes" {
  host                   = data.azurerm_kubernetes_cluster.default.kube_config.0.host
  cluster_ca_certificate = base64decode(data.azurerm_kubernetes_cluster.default.kube_config.0.cluster_ca_certificate)

  exec {
    api_version = "client.authentication.k8s.io/v1beta1"
    command     = "kubelogin"
    args = [
      "get-token",
      "--login",
      "azurecli", # spn if you want to use service principal, but requires use of env vars AAD_SERVICE_PRINCIPAL_CLIENT_ID and AAD_SERVICE_PRINCIPAL_CLIENT_SECRET
      "--environment",
      "AzurePublicCloud",
      "--tenant-id",
      data.azurerm_client_config.current.tenant_id,
      "--server-id",
      "6dae42f8-4368-4678-94ff-3960e28e3630",
      "|",
      "jq",
      ".status.token"
    ]
  }
}

provider "helm" {
  kubernetes {
    host                   = data.azurerm_kubernetes_cluster.default.kube_config.0.host
    cluster_ca_certificate = base64decode(data.azurerm_kubernetes_cluster.default.kube_config.0.cluster_ca_certificate)

    exec {
      api_version = "client.authentication.k8s.io/v1beta1"
      command     = "kubelogin"
      args = [
        "get-token",
        "--login",
        "azurecli", # spn if you want to use service principal, but requires use of env vars AAD_SERVICE_PRINCIPAL_CLIENT_ID and AAD_SERVICE_PRINCIPAL_CLIENT_SECRET
        "--environment",
        "AzurePublicCloud",
        "--tenant-id",
        data.azurerm_client_config.current.tenant_id,
        "--server-id",
        "6dae42f8-4368-4678-94ff-3960e28e3630",
        "|",
        "jq",
        ".status.token"
      ]
    }
  }
}
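
The kubectl provider mentioned in the description can be wired up the same way. A minimal sketch, assuming the gavinbunney/kubectl provider and that its exec authentication block mirrors the Kubernetes provider's:

provider "kubectl" {
  host                   = data.azurerm_kubernetes_cluster.default.kube_config.0.host
  cluster_ca_certificate = base64decode(data.azurerm_kubernetes_cluster.default.kube_config.0.cluster_ca_certificate)
  load_config_file       = false

  exec {
    api_version = "client.authentication.k8s.io/v1beta1"
    command     = "kubelogin"
    args = [
      "get-token",
      "--login",
      "azurecli",
      "--environment",
      "AzurePublicCloud",
      "--tenant-id",
      data.azurerm_client_config.current.tenant_id,
      "--server-id",
      "6dae42f8-4368-4678-94ff-3960e28e3630", # AAD server application ID of AKS
    ]
  }
}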


@aristosvo (Collaborator, Author)

Fixed in #12386, sorry for the confusion :)

@aristosvo (Collaborator, Author) commented Sep 7, 2021

Reopened because the data source azurerm_kubernetes_cluster errors out if role_based_access_control.azure_active_directory.managed = true and local_account_disabled = true:

╷
│ Error: retrieving Admin Access Profile for Managed Kubernetes Cluster "arc-met-ingress-aks" (Resource Group "arc-met-ingress-aks-resources"): containerservice.ManagedClustersClient#GetAccessProfile: Failure responding to request: StatusCode=400 -- Original Error: autorest/azure: Service returned an error. Status=400 Code="BadRequest" Message="Getting static credential is not allowed because this cluster is set to disable local accounts."
│ 
│   with data.azurerm_kubernetes_cluster.default,
│   on main.tf line 91, in data "azurerm_kubernetes_cluster" "default":
│   91: data "azurerm_kubernetes_cluster" "default" {
│ 
╵
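
For reference, a minimal sketch of the combination that triggers this, reusing the names from the configuration above: the cluster is created with local_account_disabled = true and role_based_access_control.azure_active_directory.managed = true, and reading it back makes the provider call GetAccessProfile, which the API rejects with the 400 above.

# application/main.tf - reading a cluster that has local accounts disabled
data "azurerm_kubernetes_cluster" "default" {
  name                = "example-aks1"
  resource_group_name = "example-resources"
}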

@github-actions (bot)

This functionality has been released in v2.76.0 of the Terraform Provider. Please see the Terraform documentation on provider versioning or reach out if you need any assistance upgrading.
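
A minimal sketch of picking up the release that contains this fix, using a standard required_providers constraint (the exact constraint is up to you):

terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      # v2.76.0 is the first release containing this functionality
      version = ">= 2.76.0"
    }
  }
}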

For further feature requests or bug reports with this functionality, please create a new GitHub issue following the template. Thank you!

@github-actions (bot)

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.
If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Oct 10, 2021