
Compatibility with kubernetes >= 32.0.0 #868

Open
sebhoss opened this issue Jan 28, 2025 · 17 comments
Labels
needs_info Needs additional information from original reporter

Comments

@sebhoss

sebhoss commented Jan 28, 2025

SUMMARY

After upgrading the Python kubernetes library to version 32+, we can no longer use the kubernetes.core Ansible module.

ISSUE TYPE
  • Bug Report
COMPONENT NAME

k8s_info, but I think it affects all components

ANSIBLE VERSION
uv run -- ansible --version
ansible [core 2.18.1]
  config file = /var/home/seb/git/git.infra.run/infra.run/infrastructure/helm-deployments/main/ansible.cfg
  configured module search path = ['/home/seb/.config/ansible/home/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /var/home/seb/git/git.infra.run/infra.run/infrastructure/helm-deployments/main/.venv/lib/python3.13/site-packages/ansible
  ansible collection location = /var/home/seb/git/git.infra.run/infra.run/infrastructure/helm-deployments/main/.ansible
  executable location = /var/home/seb/git/git.infra.run/infra.run/infrastructure/helm-deployments/main/.venv/bin/ansible
  python version = 3.13.0 (main, Oct  8 2024, 01:04:00) [Clang 18.1.8 ] (/var/home/seb/git/git.infra.run/infra.run/infrastructure/helm-deployments/main/.venv/bin/python3)
  jinja version = 3.1.5
  libyaml = True
COLLECTION VERSION
uv run -- ansible-galaxy collection list kubernetes.core

# /var/home/seb/git/git.infra.run/infra.run/infrastructure/helm-deployments/main/.ansible/ansible_collections
Collection      Version
--------------- -------
kubernetes.core 5.1.0  

# /var/home/seb/git/git.infra.run/infra.run/infrastructure/helm-deployments/main/.venv/lib/python3.13/site-packages/ansible_collections
Collection      Version
--------------- -------
kubernetes.core 5.1.0  
CONFIGURATION
ANSIBLE_HOME(env: ANSIBLE_HOME) = /home/seb/.config/ansible/home
COLLECTIONS_PATHS(/var/home/seb/git/git.infra.run/infra.run/infrastructure/helm-deployments/main/ansible.cfg) = ['/var/home/seb/git/git.infra.run/infra.run/infrastructure/helm-deployments/main/.ansible']
CONFIG_FILE() = /var/home/seb/git/git.infra.run/infra.run/infrastructure/helm-deployments/main/ansible.cfg
DEFAULT_VAULT_PASSWORD_FILE(/var/home/seb/git/git.infra.run/infra.run/infrastructure/helm-deployments/main/ansible.cfg) = /var/home/seb/git/git.infra.run/infra.run/infrastructure/helm-deployments/main/.VAULT_PASSWORD
EDITOR(env: EDITOR) = /var/home/seb/.local/bin/hx
GALAXY_CACHE_DIR(env: ANSIBLE_GALAXY_CACHE_DIR) = /home/seb/.cache/ansible/galaxy
INTERPRETER_PYTHON(/var/home/seb/git/git.infra.run/infra.run/infrastructure/helm-deployments/main/ansible.cfg) = auto_silent
MAX_FILE_SIZE_FOR_DIFF(env: ANSIBLE_MAX_DIFF_SIZE) = 104857600
PAGER(env: PAGER) = less

GALAXY_SERVERS:
OS / ENVIRONMENT

Fedora 41

STEPS TO REPRODUCE
- name: Check whether namespace exists
  delegate_to: localhost
  kubernetes.core.k8s_info:
    api_version: v1
    kind: Namespace
    name: "{{ k8s_namespace_name }}"
    context: "{{ kubectl_context }}"
  register: k8s_namespace
EXPECTED RESULTS

We want to see successful results from k8s modules

ACTUAL RESULTS

We are seeing stack traces like this:

          File "/home/haggl/repositories/infra.run/infrastructure/helm-deployments/.venv/lib/python3.13/site-packages/kubernetes/dynamic/client.py", line 273, in request
            api_response = self.client.call_api(
                path,
            ...<11 lines>...
                _request_timeout=params.get('_request_timeout')
            )

          File "/home/haggl/repositories/infra.run/infrastructure/helm-deployments/.venv/lib/python3.13/site-packages/kubernetes/client/api_client.py", line 348, in call_api
            return self.__call_api(resource_path, method,
                   ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^
                                   path_params, query_params, header_params,
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
            ...<2 lines>...
                                   _return_http_data_only, collection_formats,
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                                   _preload_content, _request_timeout, _host)
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

          File "/home/haggl/repositories/infra.run/infrastructure/helm-deployments/.venv/lib/python3.13/site-packages/kubernetes/client/api_client.py", line 180, in __call_api
            response_data = self.request(
                method, url, query_params=query_params, headers=header_params,
                post_params=post_params, body=body,
                _preload_content=_preload_content,
                _request_timeout=_request_timeout)

          File "/home/haggl/repositories/infra.run/infrastructure/helm-deployments/.venv/lib/python3.13/site-packages/kubernetes/client/api_client.py", line 373, in request
            return self.rest_client.GET(url,
                   ~~~~~~~~~~~~~~~~~~~~^^^^^
                                        query_params=query_params,
                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^
                                        _preload_content=_preload_content,
                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                                        _request_timeout=_request_timeout,
                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                                        headers=headers)
                                        ^^^^^^^^^^^^^^^^

          File "/home/haggl/repositories/infra.run/infrastructure/helm-deployments/.venv/lib/python3.13/site-packages/kubernetes/client/rest.py", line 244, in GET
            return self.request("GET", url,
                   ~~~~~~~~~~~~^^^^^^^^^^^^
                                headers=headers,
                                ^^^^^^^^^^^^^^^^
                                _preload_content=_preload_content,
                                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                                _request_timeout=_request_timeout,
                                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                                query_params=query_params)
                                ^^^^^^^^^^^^^^^^^^^^^^^^^^

          File "/home/haggl/repositories/infra.run/infrastructure/helm-deployments/.venv/lib/python3.13/site-packages/kubernetes/client/rest.py", line 238, in request
            raise ApiException(http_resp=r)
    module_stdout: ''
    msg: |-
        MODULE FAILURE: No start of json char found
        See stdout/stderr for the exact error

Downgrading kubernetes to the latest 31.X version fixes this for us.

@gravesm
Member

gravesm commented Jan 28, 2025

I'm unable to reproduce this. I tested using:

  • kubernetes.core==5.1.0
  • kubernetes==32.0.0
  • ansible==2.18.1
  • python==3.13.1
  • k8s server==1.32.0

Both the k8s and k8s_info modules worked. Other than your k8s server not returning a response, there's nothing in your stack trace that indicates what's wrong.

@gravesm gravesm added the needs_info Needs additional information from original reporter label Jan 28, 2025
@yurnov
Contributor

yurnov commented Jan 28, 2025

I didn't see any breaking changes in the kubernetes Python client changelog.

I will check in a close-to-production scenario with kubernetes==32.0.0 and ansible-core==2.16.14 (I still have some playbooks incompatible with 2.17+) and kubernetes.core version 5.1.0.

@sebhoss
Author

sebhoss commented Jan 28, 2025

We are still running a 1.31.x Kubernetes cluster - maybe that's the issue? I think there were some auth-related changes in 1.32, and our stack trace complains that we are identified as system:anonymous even though we are correctly logged in to our cluster, so maybe this is related? Stack trace is here:

...
          File "/var/home/seb/git/git.infra.run/infra.run/infrastructure/helm-deployments/main/.venv/lib/python3.13/site-packages/kubernetes/client/rest.py", line 244, in GET
            return self.request("GET", url,
                   ~~~~~~~~~~~~^^^^^^^^^^^^
                                headers=headers,
                                ^^^^^^^^^^^^^^^^
                                _preload_content=_preload_content,
                                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                                _request_timeout=_request_timeout,
                                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                                query_params=query_params)
                                ^^^^^^^^^^^^^^^^^^^^^^^^^^
          File "/var/home/seb/git/git.infra.run/infra.run/infrastructure/helm-deployments/main/.venv/lib/python3.13/site-packages/kubernetes/client/rest.py", line 238, in request
            raise ApiException(http_resp=r)
        kubernetes.client.exceptions.ApiException: (403)
        Reason: Forbidden
        HTTP response headers: HTTPHeaderDict({'Audit-Id': '982e85b5-6f62-4363-a720-d6a020fba96c', 'Cache-Control': 'no-cache, private', 'Content-Type': 'application/json', 'X-Content-Type-Options': 'nosniff', 'X-Kubernetes-Pf-Flowschema-Uid': '6e7575f0-a901-484a-a9f8-2bdda56efb96', 'X-Kubernetes-Pf-Prioritylevel-Uid': '35c758f2-7858-4284-810a-da32819bcd82', 'Date': 'Tue, 28 Jan 2025 12:55:42 GMT', 'Content-Length': '189'})
        HTTP response body: b'{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \\"system:anonymous\\" cannot get path \\"/apis\\"","reason":"Forbidden","details":{},"code":403}\n'

        During handling of the above exception, another exception occurred:

        Traceback (most recent call last):
          File "/home/seb/.ansible/tmp/ansible-tmp-1738068942.0578573-89745-158054276595492/AnsiballZ_k8s_info.py", line 107, in <module>
            _ansiballz_main()
            ~~~~~~~~~~~~~~~^^
          File "/home/seb/.ansible/tmp/ansible-tmp-1738068942.0578573-89745-158054276595492/AnsiballZ_k8s_info.py", line 99, in _ansiballz_main
            invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
            ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
          File "/home/seb/.ansible/tmp/ansible-tmp-1738068942.0578573-89745-158054276595492/AnsiballZ_k8s_info.py", line 47, in invoke_module
            runpy.run_module(mod_name='ansible_collections.kubernetes.core.plugins.modules.k8s_info', init_globals=dict(_module_fqn='ansible_collections.kubernetes.core.plugins.modules.k8s_info', _modlib_path=modlib_path),
            ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                             run_name='__main__', alter_sys=True)
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
          File "<frozen runpy>", line 226, in run_module
          File "<frozen runpy>", line 98, in _run_module_code
          File "<frozen runpy>", line 88, in _run_code
          File "/tmp/ansible_kubernetes.core.k8s_info_payload_ke0qgg1o/ansible_kubernetes.core.k8s_info_payload.zip/ansible_collections/kubernetes/core/plugins/modules/k8s_info.py", line 229, in <module>
          File "/tmp/ansible_kubernetes.core.k8s_info_payload_ke0qgg1o/ansible_kubernetes.core.k8s_info_payload.zip/ansible_collections/kubernetes/core/plugins/modules/k8s_info.py", line 221, in main
          File "/tmp/ansible_kubernetes.core.k8s_info_payload_ke0qgg1o/ansible_kubernetes.core.k8s_info_payload.zip/ansible_collections/kubernetes/core/plugins/module_utils/k8s/client.py", line 352, in get_api_client
          File "/tmp/ansible_kubernetes.core.k8s_info_payload_ke0qgg1o/ansible_kubernetes.core.k8s_info_payload.zip/ansible_collections/kubernetes/core/plugins/module_utils/k8s/client.py", line 246, in wrapper
          File "/tmp/ansible_kubernetes.core.k8s_info_payload_ke0qgg1o/ansible_kubernetes.core.k8s_info_payload.zip/ansible_collections/kubernetes/core/plugins/module_utils/k8s/client.py", line 259, in create_api_client
          File "/var/home/seb/git/git.infra.run/infra.run/infrastructure/helm-deployments/main/.venv/lib/python3.13/site-packages/kubernetes/dynamic/client.py", line 84, in __init__
            self.__discoverer = discoverer(self, cache_file)
                                ~~~~~~~~~~^^^^^^^^^^^^^^^^^^
          File "/tmp/ansible_kubernetes.core.k8s_info_payload_ke0qgg1o/ansible_kubernetes.core.k8s_info_payload.zip/ansible_collections/kubernetes/core/plugins/module_utils/client/discovery.py", line 190, in __init__
          File "/tmp/ansible_kubernetes.core.k8s_info_payload_ke0qgg1o/ansible_kubernetes.core.k8s_info_payload.zip/ansible_collections/kubernetes/core/plugins/module_utils/client/discovery.py", line 45, in __init__
          File "/tmp/ansible_kubernetes.core.k8s_info_payload_ke0qgg1o/ansible_kubernetes.core.k8s_info_payload.zip/ansible_collections/kubernetes/core/plugins/module_utils/client/discovery.py", line 93, in __init_cache
          File "/var/home/seb/git/git.infra.run/infra.run/infrastructure/helm-deployments/main/.venv/lib/python3.13/site-packages/kubernetes/dynamic/discovery.py", line 232, in discover
            self.__resources = self.parse_api_groups(request_resources=False)
                               ~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^
          File "/var/home/seb/git/git.infra.run/infra.run/infrastructure/helm-deployments/main/.venv/lib/python3.13/site-packages/kubernetes/dynamic/discovery.py", line 118, in parse_api_groups
            groups_response = self.client.request('GET', '/{}'.format(DISCOVERY_PREFIX)).groups
                              ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
          File "/var/home/seb/git/git.infra.run/infra.run/infrastructure/helm-deployments/main/.venv/lib/python3.13/site-packages/kubernetes/dynamic/client.py", line 57, in inner
            raise api_exception(e)
...

We can reproduce this across multiple machines running different OSes and everyone can work again once we downgrade to kubernetes 31.x.y
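The 403 body shown in the trace is a standard Kubernetes Status object. Decoding it (a stdlib-only sketch, with the body copied verbatim from the traceback above) confirms the discovery request reached the API server without any credentials attached, which is why the server identifies the caller as system:anonymous:

```python
import json

# HTTP response body from the 403 in the traceback above
body = b'{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \\"system:anonymous\\" cannot get path \\"/apis\\"","reason":"Forbidden","details":{},"code":403}\n'

status = json.loads(body)
print(status["code"])     # 403
print(status["message"])  # forbidden: User "system:anonymous" cannot get path "/apis"
```

In other words, the token the exec credential plugin should have produced was never attached to the request.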

@yurnov
Contributor

yurnov commented Jan 28, 2025

Hi @sebhoss,

could you try something like:

- name: Check whether namespace exists
  delegate_to: localhost
  kubernetes.core.k8s_info:
    api_version: v1
    kind: Namespace
    name: "{{ k8s_namespace_name }}"
    kubeconfig: "{{ path_to_your_kubeconfig }}"
  register: k8s_namespace

with kubeconfig that has only a single context?

And in general, how do you pass credentials to the module?
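For reference, a minimal single-context kubeconfig of the kind this suggestion assumes could look like the following sketch; all names and the server URL are placeholders, and the exec block mirrors the plugin-based authentication style under discussion:

```yaml
apiVersion: v1
kind: Config
current-context: example
clusters:
- name: example
  cluster:
    server: https://kube.example:6443
contexts:
- name: example
  context:
    cluster: example
    user: example
users:
- name: example
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1beta1
      command: kubectl
      args: [oidc-login, get-token]
```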

@gravesm
Member

gravesm commented Jan 28, 2025

Just tested against a 1.31.4 server and it is working with kubernetes==32.0.0. How are you authenticating to the cluster? There are a few open issues around authentication with the 32.0.0 release that people seem to be hitting, especially with GKE and EKS.

@sebhoss
Author

sebhoss commented Jan 28, 2025

Thanks for the quick feedback here!

No change when using the kubeconfig field set to a single context. We are using an OIDC-based login against our bare-metal clusters and use https://github.com/int128/kubelogin to get tokens. The users section of a kubeconfig looks like this:

users:
- name: kube.internal
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1beta1
      args:
      - oidc-login
      - get-token
      - --token-cache-dir=~/.cache/kubelogin
      - --oidc-issuer-url=https://login.internal/auth/realms/kube.internal
      - --oidc-client-id=kube.internal
      command: kubectl
      env: null
      interactiveMode: IfAvailable
      provideClusterInfo: false

@yurnov
Contributor

yurnov commented Jan 28, 2025

No change when using the kubeconfig field set to a single context. We are using an OIDC based login against our bare metal clusters and use https://github.com/int128/kubelogin to get tokens. The users section of a kubeconfig looks like this:

users:
- name: kube.internal
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1beta1
      args:
      - oidc-login
      - get-token
      - --token-cache-dir=~/.cache/kubelogin
      - --oidc-issuer-url=https://login.internal/auth/realms/kube.internal
      - --oidc-client-id=kube.internal
      command: kubectl
      env: null
      interactiveMode: IfAvailable
      provideClusterInfo: false

This should not be an issue: plugin/exec-based authentication is supported for 'normal' modules, and since PR #698 also for the kubectl connection plugin.
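For context on what that exec-based flow produces: the plugin (kubelogin here) prints an ExecCredential object on stdout, and the client reads the bearer token from its status field. A stdlib-only sketch of that handshake, with a placeholder token value:

```python
import json

# Shape of the ExecCredential object an exec credential plugin (such as
# kubelogin) prints on stdout. The client runs the plugin and takes the
# bearer token from status.token. The token here is a placeholder, not a
# real credential.
raw = json.dumps({
    "apiVersion": "client.authentication.k8s.io/v1",
    "kind": "ExecCredential",
    "status": {"token": "example-token"},
})

cred = json.loads(raw)
token = cred["status"]["token"]
print(token)  # example-token
```

If this handshake breaks inside the kubernetes Python client, every request goes out without a bearer token, which matches the system:anonymous 403 seen above.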

@sebhoss
Author

sebhoss commented Jan 28, 2025

I realized that the apiVersion could be updated to client.authentication.k8s.io/v1, but that did not fix our issue.

@gravesm
Member

gravesm commented Jan 28, 2025

It sounds like kubernetes-client/python#2333 (comment) may be your issue. I think this is a kubernetes problem, not a kubernetes.core problem, but we'll leave this open until kubernetes hopefully ships a bugfix and we can see if that sorts things out.

@sebhoss
Author

sebhoss commented Jan 28, 2025

Thanks!

@yurnov
Contributor

yurnov commented Jan 28, 2025

especially with GKE and EKS.

I will test the combination of kubernetes==32.0.0 and ansible-core==2.16.18 with kubernetes.core==5.1.0 against K8s 1.31.3 deployed as AWS EKS; I already have this combination running (except for the kubernetes version).

I'll be back soon.

@yurnov
Contributor

yurnov commented Jan 28, 2025

Hi @sebhoss,

I'm back with my test results. @gravesm was right: it's an issue with kubernetes Python client authentication via credential plugins (client.authentication.k8s.io/v1). A traditional kubeconfig with a user client certificate works fine, but a kubeconfig whose user entry uses an exec plugin (client.authentication.k8s.io/v1) with an external executable does not.

In my particular case the collection works fine with kubernetes==32.0.0 against our own cluster, but I see the same issue as in your case with AWS EKS.

For history:
ansible-core==2.16.14
kubernetes.core 5.1.0
Kubernetes v1.31.4

@sebhoss
Author

sebhoss commented Jan 28, 2025

Thanks @yurnov !

@elcfd

elcfd commented Jan 29, 2025

FYI, I came across this issue when trying to run against Digital Ocean K8s clusters. The current workaround is to install kubernetes 31.0.0.

REF

@elcfd

elcfd commented Feb 4, 2025

Update on this: a fix has been merged, so we are just waiting on a new release.

I have built the main branch of the kubernetes Python client and verified that it sorts out my Digital Ocean issue; I am sure it will sort out other providers as well.

@josegal

josegal commented Feb 5, 2025

Hi -- I found the same issue using a credential manager; will there be a patch release containing the fix from kubernetes-client/python#2338 ? Thank you!

@yurnov
Contributor

yurnov commented Feb 5, 2025

Hi -- I found the same issue using a credential manager; will there be a patch release containing the fix from kubernetes-client/python#2338? Thank you!

Just pin kubernetes>=28.0.0,!=32.0.0 in your requirements.txt or wherever you run pip install.
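That pin keeps the broken release out while still allowing a future 32.0.1+ bugfix in. A small stdlib-only sketch of what the specifier admits for plain X.Y.Z versions (the helper function is hypothetical; pip itself evaluates the real specifier):

```python
def allowed(version: str) -> bool:
    """Mirror the pin kubernetes>=28.0.0,!=32.0.0 for plain X.Y.Z versions."""
    parts = tuple(int(p) for p in version.split("."))
    return parts >= (28, 0, 0) and parts != (32, 0, 0)

print(allowed("31.0.0"))  # True  - workaround version stays installable
print(allowed("32.0.0"))  # False - broken release is excluded
print(allowed("32.0.1"))  # True  - a future bugfix release is admitted
```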
