
Shell unmatched syntax error during the [kubernetes_sigs.kubespray.etcd : Gen_certs | run cert generation script for etcd and kube control plane nodes] task. #10609

Closed
adamhgriffith-uofu opened this issue Nov 9, 2023 · 2 comments · Fixed by #10610
Labels
kind/bug Categorizes issue or PR as related to a bug.

Comments

@adamhgriffith-uofu

Environment:

  • Cloud provider or hardware configuration: Three identical on-prem VMs.

    $ sudo hostnamectl
    Static hostname: node1
    Icon name: computer-vm
    Chassis: vm 🖴
    Machine ID: abcd123
    Boot ID: abcd123
    Virtualization: xen
    Operating System: Rocky Linux 9.2 (Blue Onyx)       
    CPE OS Name: cpe:/o:rocky:rocky:9::baseos
    Kernel: Linux 5.14.0-284.30.1.el9_2.x86_64
    Architecture: x86-64
    Hardware Vendor: Xen
    Hardware Model: HVM domU
    Firmware Version: 4.11.4-pre
  • OS (printf "$(uname -srm)\n$(cat /etc/os-release)\n"):

    • Kernel version: Linux 5.14.0-284.30.1.el9_2.x86_64 x86_64
    • OS Release Info:
      NAME="Rocky Linux"
      VERSION="9.2 (Blue Onyx)"
      ID="rocky"
      ID_LIKE="rhel centos fedora"
      VERSION_ID="9.2"
      PLATFORM_ID="platform:el9"
      PRETTY_NAME="Rocky Linux 9.2 (Blue Onyx)"
      ANSI_COLOR="0;32"
      LOGO="fedora-logo-icon"
      CPE_NAME="cpe:/o:rocky:rocky:9::baseos"
      HOME_URL="https://rockylinux.org/"
      BUG_REPORT_URL="https://bugs.rockylinux.org/"
      SUPPORT_END="2032-05-31"
      ROCKY_SUPPORT_PRODUCT="Rocky-Linux-9"
      ROCKY_SUPPORT_PRODUCT_VERSION="9.2"
      REDHAT_SUPPORT_PRODUCT="Rocky Linux"
      REDHAT_SUPPORT_PRODUCT_VERSION="9.2"
      
  • Version of Ansible (ansible --version):

    ansible [core 2.14.11]
      config file = /home/sampleuser2/repos/chpc-ansible/git-kubespray/ansible.cfg
      configured module search path = ['/home/sampleuser2/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
      ansible python module location = /home/sampleuser2/.conda/envs/git-kubespray/lib/python3.9/site-packages/ansible
      ansible collection location = /home/sampleuser2/.ansible/collections:/usr/share/ansible/collections
      executable location = /home/sampleuser2/.conda/envs/git-kubespray/bin/ansible
      python version = 3.9.16 (main, May 15 2023, 23:46:34) [GCC 11.2.0] (/home/sampleuser2/.conda/envs/git-kubespray/bin/python)
      jinja version = 3.1.2
      libyaml = True
    
  • Version of Python (python --version):

    Python 3.9.16

Kubespray version (commit) (git rev-parse --short HEAD): cca7615

Network plugin used: calico

Full inventory with variables (ansible -i inventory/sample/inventory.ini all -m debug -a "var=hostvars[inventory_hostname]"):

Variables: (all defaults)

Inventory file:

[all]
node1 ansible_host=node1.my.domain etcd_member_name=etcd1
node2 ansible_host=node2.my.domain etcd_member_name=etcd2
node3 ansible_host=node3.my.domain etcd_member_name=etcd3

[all:vars]
ansible_ssh_private_key_file=/path/to/id_ed25519
ansible_user=sampleuser

[kube_control_plane]
node1

[etcd]
node1
node2
node3

[kube_node]
node2
node3

[calico_rr]

[k8s_cluster:children]
kube_control_plane
kube_node
calico_rr

Command used to invoke ansible:

Load Kubespray as an Ansible collection:

ansible-galaxy install -r ./requirements.yml

where ./requirements.yml is:

collections:
  - ansible.posix
  - ansible.netcommon
  - community.general
  - kubernetes.core
  - name: https://github.com/kubernetes-sigs/kubespray
    type: git
    version: master

and create the cluster from the inventory file described above:

ansible-playbook -i inventory/inventory.ini --become --become-user=root ./playbook.yml

where ./playbook.yml is:

- name: Install Kubernetes
  ansible.builtin.import_playbook: kubernetes_sigs.kubespray.cluster

Output of ansible run:

(venv) $ ansible-playbook -i inventory/inventory.ini --become --become-user=root ./playbook.yml
...
...
TASK [kubernetes_sigs.kubespray.etcd : Gen_certs | run cert generation script for etcd and kube control plane nodes] **************************************************************************************
...
...
<node1.my.domain> SSH: EXEC ssh -o ControlMaster=auto -o ControlPersist=5m -o ConnectionAttempts=10 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -o Port=22 -o 'IdentityFile="/path/to/id_ed25519"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="sampleuser"' -o ConnectTimeout=10 -o 'ControlPath="/home/sampleuser2/.ansible/cp/21453df1a7"' -tt node1.my.domain '/bin/sh -c '"'"'sudo -H -S -n  -u root /bin/sh -c '"'"'"'"'"'"'"'"'echo BECOME-SUCCESS-svuwpgsgkrygesjzsszhcsnfccgmgzym ; ALL_PROXY='"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"''"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"' FTP_PROXY='"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"''"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"' HTTPS_PROXY='"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"''"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"' HTTP_PROXY='"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"''"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"' NO_PROXY='"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"''"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"' all_proxy='"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"''"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"' ftp_proxy='"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"''"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"' http_proxy='"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"''"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"' https_proxy='"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"''"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"' no_proxy='"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"''"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"' MASTERS='"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'      node1
        node2
        node3
  '"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"' HOSTS='"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'      node1
  '"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"' /usr/bin/python3 /users/sampleuser/.ansible/tmp/ansible-tmp-1699561410.2820885-185792-272278171487851/AnsiballZ_command.py'"'"'"'"'"'"'"'"' && sleep 0'"'"''
<node1.my.domain> (1, b'Unmatched \'\'\'.\r\nnode2: Command not found.\r\nnode3: Command not found.\r\n"\'"\'"\' HOSTS=\'"\'"\'": Command not found.\r\nUnmatched \'\'\'.\r\n', b'Shared connection to node1.my.domain closed.\r\n')
<node1.my.domain> Failed to connect to the host via ssh: Shared connection to node1.my.domain closed.
<node1.my.domain> ESTABLISH SSH CONNECTION FOR USER: sampleuser
<node1.my.domain> SSH: EXEC ssh -o ControlMaster=auto -o ControlPersist=5m -o ConnectionAttempts=10 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -o Port=22 -o 'IdentityFile="/home/sampleuser2/.cloudlab/ssh/id_ed25519"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="sampleuser"' -o ConnectTimeout=10 -o 'ControlPath="/home/sampleuser2/.ansible/cp/21453df1a7"' node1.my.domain '/bin/sh -c '"'"'rm -f -r /users/sampleuser/.ansible/tmp/ansible-tmp-1699561410.2820885-185792-272278171487851/ > /dev/null 2>&1 && sleep 0'"'"''
<node1.my.domain> (0, b'', b'')

fatal: [node1]: FAILED! => {
    "changed": false,
    "module_stderr": "Shared connection to node1.my.domain closed.\r\n",
    "module_stdout": "Unmatched '''.\r\nnode2: Command not found.\r\nnode3: Command not found.\r\n\"'\"'\"' HOSTS='\"'\"'\": Command not found.\r\nUnmatched '''.\r\n",
    "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error",
    "rc": 1
}
NO MORE HOSTS LEFT ***********************************************************************************************************************************************************************************************************************************

PLAY RECAP *******************************************************************************************************************************************************************************************************************************************
localhost                  : ok=3    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   
node1                      : ok=492  changed=3    unreachable=0    failed=1    skipped=582  rescued=0    ignored=0   
node2                      : ok=425  changed=2    unreachable=0    failed=0    skipped=493  rescued=0    ignored=0   
node3                      : ok=425  changed=2    unreachable=0    failed=0    skipped=492  rescued=0    ignored=0 
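For anyone hitting a similar symptom: the `Unmatched '''` / `node2: Command not found` output above is what a POSIX shell produces when a multi-line value is substituted into a command line without quoting, as happened here with the `MASTERS`/`HOSTS` environment variables. A minimal standalone sketch of the failure mode (hypothetical, not Kubespray's actual templating; assumes GNU `printenv` is available):

```python
import shlex
import subprocess

# Hypothetical multi-line value, like the MASTERS variable in the
# failing task's remote shell command.
masters = "node1\nnode2\nnode3"

# Unquoted substitution: the newlines split the command, so the shell
# treats "node2" and "node3" as command names -- the same
# "node2: Command not found" symptom seen in the output above.
naive = "MASTERS=" + masters + " printenv MASTERS"
r1 = subprocess.run(["/bin/sh", "-c", naive],
                    capture_output=True, text=True)
# r1.returncode is non-zero; stderr reports the bogus commands.

# Quoting the value keeps it a single assignment, so the multi-line
# value survives intact into the command's environment.
safe = "MASTERS=" + shlex.quote(masters) + " printenv MASTERS"
r2 = subprocess.run(["/bin/sh", "-c", safe],
                    capture_output=True, text=True)
# r2.stdout now contains all three node names, one per line.
```

The linked PR addresses this on the templating side; the sketch only illustrates why the unquoted newlines break the generated shell command.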

Anything else we need to know:

@adamhgriffith-uofu added the kind/bug label on Nov 9, 2023
@VannTen
Contributor

VannTen commented Nov 10, 2023

Could you check whether the linked PR fixes it (once CI passes...)? It's possible that the Jinja templating caused some issues for the shell...

@adamhgriffith-uofu
Author

Thank you for the quick turnaround. Using your feature branch did fix this issue.
