This repository has been archived by the owner on Jul 11, 2021. It is now read-only.

Agent errors (interface is nil) #6

Open
adamsmesher opened this issue Apr 9, 2019 · 14 comments
Labels: bug (Something isn't working), enhancement (New feature or request)

@adamsmesher

The agent errors on Windows 10 x64 and Windows 7 x64 are the same:

```
panic: interface conversion: interface {} is nil, not []interface {}

goroutine 19 [running]:
main.StartPulse()
	/reternal-agent/agent/src/corebeacon.go:73 +0x284
created by main.StartBeacon
	/reternal-agent/agent/src/corebeacon.go:65 +0x3e
```
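For context, this panic is Go's classic unchecked type assertion: the decoded check-in payload is nil (e.g. nothing usable came back from the C2), and asserting nil to `[]interface{}` panics. A minimal sketch of the pattern and the comma-ok fix (hypothetical names, not the actual corebeacon.go code):

```go
package main

import "fmt"

// parseTasks stands in for the beacon decoding its check-in response.
// payload is nil when the C2 returned nothing (or was never reached).
func parseTasks(payload interface{}) ([]interface{}, error) {
	// An unchecked payload.([]interface{}) would panic here with
	// "interface conversion: interface {} is nil, not []interface {}".
	tasks, ok := payload.([]interface{})
	if !ok {
		return nil, fmt.Errorf("unexpected task payload: %#v", payload)
	}
	return tasks, nil
}

func main() {
	if _, err := parseTasks(nil); err != nil {
		fmt.Println("handled:", err) // no panic
	}
}
```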

d3vzer0 added the bug and enhancement labels on Apr 9, 2019
d3vzer0 (Owner) commented Apr 9, 2019

I haven't fully implemented proper error handling on the Go networking component of the agent yet, so it's difficult to tell. However, I suspect this error occurs when the agent was unable to reach the destination IP/URL for the C2 channel.

Changing/setting the destination host via the UI is something I have planned for this Thursday. You can validate which destination IP the agent uses by checking the compiler logs. You can view these as follows:

  • run: docker ps | grep compiler
  • note the CONTAINER ID that belongs to the reternal-quickstart_compiler container
  • run: docker logs <container_id>

The output should show all compilation attempts with their variables. For example:

```
2019-04-08 10:19:31,142 - reternal-agent - INFO - Building: windows-386 http://localhost:9000/api/v1/ping
[2019-04-08 10:19:31,142: INFO/ForkPoolWorker-1] Building: windows-386 http://localhost:9000/api/v1/ping
```

The last piece specifies the URL the agent will attempt to reach. Does this match the host your C2 container is running on? :)
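Until proper error handling lands in the agent, the failure mode is easy to see: if the compiled-in URL is wrong, the HTTP call fails and a nil response gets fed into the task parser. A hedged sketch of what surfacing the transport error could look like (hypothetical function, not the actual agent code):

```go
package main

import (
	"fmt"
	"net/http"
	"strings"
	"time"
)

// ping is a hypothetical stand-in for the beacon's check-in call. The point:
// return the transport error instead of assuming a response body exists.
func ping(url string) error {
	client := &http.Client{Timeout: 5 * time.Second}
	resp, err := client.Post(url, "application/json", strings.NewReader(`{}`))
	if err != nil {
		// e.g. a wrong destination IP baked in at compile time
		return fmt.Errorf("C2 unreachable: %w", err)
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		return fmt.Errorf("C2 returned %s", resp.Status)
	}
	return nil
}

func main() {
	if err := ping("http://localhost:9000/api/v1/ping"); err != nil {
		fmt.Println(err)
	}
}
```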

d3vzer0 (Owner) commented Apr 10, 2019

@devnullz The latest commit is still a concept, but you can change the destination at the top right whenever you want. Don't forget to compile the payloads again via the UI :) The adjusted C2 IP is still temporary and stored in your session until I think of a more suitable solution.

[Screenshot 2019-04-10 at 00 24 40]

d3vzer0 (Owner) commented May 15, 2019

Closing the issue for now, since the latest release implements payload creation in an updated manner. If the issue still occurs, please re-open a ticket ^^

d3vzer0 closed this as completed on May 15, 2019
uchakin commented May 16, 2019

@d3vzer0

I have the same issue after starting the payload:

[Screenshot: agent error on Windows 10 x64]

Error output in the compiler log after a request from the payload:

```
10.0.10.4 - - [16/May/2019 15:32:56] "POST /api/v1/ping HTTP/1.1" 500 -
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 2309, in __call__
    return self.wsgi_app(environ, start_response)
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 2295, in wsgi_app
    response = self.handle_exception(e)
  File "/usr/local/lib/python3.6/site-packages/flask_restful/__init__.py", line 269, in error_router
    return original_handler(e)
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1741, in handle_exception
    reraise(exc_type, exc_value, tb)
  File "/usr/local/lib/python3.6/site-packages/flask/_compat.py", line 34, in reraise
    raise value.with_traceback(tb)
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 2292, in wsgi_app
    response = self.full_dispatch_request()
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1815, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/usr/local/lib/python3.6/site-packages/flask_restful/__init__.py", line 269, in error_router
    return original_handler(e)
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1718, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/usr/local/lib/python3.6/site-packages/flask/_compat.py", line 34, in reraise
    raise value.with_traceback(tb)
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1813, in full_dispatch_request
    rv = self.dispatch_request()
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1799, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/usr/local/lib/python3.6/site-packages/flask_restful/__init__.py", line 458, in wrapper
    resp = resource(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/flask/views.py", line 88, in view
    return self.dispatch_request(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/flask_restful/__init__.py", line 573, in dispatch_request
    resp = meth(*args, **kwargs)
  File "/reternal-c2/app/api_pulse.py", line 43, in post
    result = get_tasks.get()
  File "/usr/local/lib/python3.6/site-packages/celery/result.py", line 226, in get
    on_message=on_message,
  File "/usr/local/lib/python3.6/site-packages/celery/backends/asynchronous.py", line 188, in wait_for_pending
    for _ in self._wait_for_pending(result, **kwargs):
  File "/usr/local/lib/python3.6/site-packages/celery/backends/asynchronous.py", line 255, in _wait_for_pending
    on_interval=on_interval):
  File "/usr/local/lib/python3.6/site-packages/celery/backends/asynchronous.py", line 56, in drain_events_until
    yield self.wait_for(p, wait, timeout=1)
  File "/usr/local/lib/python3.6/site-packages/celery/backends/asynchronous.py", line 65, in wait_for
    wait(timeout=timeout)
  File "/usr/local/lib/python3.6/site-packages/celery/backends/redis.py", line 127, in drain_events
    message = self._pubsub.get_message(timeout=timeout)
  File "/usr/local/lib/python3.6/site-packages/redis/client.py", line 3135, in get_message
    response = self.parse_response(block=False, timeout=timeout)
  File "/usr/local/lib/python3.6/site-packages/redis/client.py", line 3034, in parse_response
    if not block and not connection.can_read(timeout=timeout):
  File "/usr/local/lib/python3.6/site-packages/redis/connection.py", line 628, in can_read
    return self._parser.can_read() or self._selector.can_read(timeout)
  File "/usr/local/lib/python3.6/site-packages/redis/selector.py", line 28, in can_read
    return self.check_can_read(timeout)
  File "/usr/local/lib/python3.6/site-packages/redis/selector.py", line 156, in check_can_read
    events = self.read_poller.poll(timeout)
RuntimeError: concurrent poll() invocation
```

Reternal IP: 10.0.10.5 (I am running on Ubuntu 18.04; all Docker containers are up.)
Windows 10 x64 IP: 10.0.10.4

Is anything configured incorrectly there, or did I miss something?

d3vzer0 reopened this on May 16, 2019
d3vzer0 (Owner) commented May 16, 2019

@400notOK

Hmm, I think networking from the reternal-c2 container towards Redis or MongoDB has a time-out (based on the concurrent poll() error). Did you run the Ansible install or manual compose? :)

uchakin commented May 16, 2019

@d3vzer0

I ran it manually via docker-compose.

My docker-compose.yml:

```yaml
services:
  mongodb:
    image: mongo
    restart: unless-stopped
    networks:
    - rtn-backend
    ports:
    - 27017:27017
    volumes:
    - mongodb-data:/data/db
  redis-service:
    image: redis
    restart: unless-stopped
    networks:
    - rtn-backend
    ports:
    - 6379:6379
  api:
    build: ./backend
    depends_on:
    - mongodb
    - redis-service
    networks:
    - rtn-frontend
    - rtn-backend
    ports:
    - 5000:5000
    environment:
      JWT_SECRET:
      FLASK_SECRET:
      CORS_DOMAIN: http://10.0.10.5
      C2_DEST: http://10.0.10.5:9000/api/v1/ping
  api-socket:
    build: ./backend
    depends_on:
    - redis-service
    networks:
    - rtn-backend
    environment:
      JWT_SECRET:
      FLASK_SECRET:
    command:
    - celery
    - -A
    - app.tasks.listener.celery
    - worker
    - -Q
    - api
  compiler:
    build: ./agent
    depends_on:
    - redis-service
    networks:
    - rtn-backend
  ui:
    build:
      context: ./ui
      args:
        VUE_APP_BASEAPI: http://10.0.10.5:5000/api/v1
        VUE_APP_SOCKETHOST: http://10.0.10.5:5000
    depends_on:
    - api
    networks:
    - rtn-frontend
    volumes:
    - ./navigator-reternal.json:/usr/share/nginx/html/mitre-navigator/assets/config.json
    ports:
    - 10.0.10.5:80:80
  c2:
    build: ./c2
    depends_on:
    - redis-service
    - mongodb
    environment:
      C2_SECRET:
      C2_PORT: 9000
    networks:
    - rtn-backend
    ports:
    - 10.0.10.5:9000:9000
volumes:
  mongodb-data:
    driver: local
networks:
  rtn-frontend:
    driver: bridge
  rtn-backend:
    driver: bridge
```

d3vzer0 (Owner) commented May 16, 2019

@400notOK

Hmm, difficult to tell. I will be pushing an update this evening to (likely) fix most of the issues I'm currently aware of, plus implement content encryption with a key exchange between the agent and the C2 server. I would also suggest trying the Ansible installer, even if it's on localhost. Ansible will properly configure networking and provides easier configuration with relays :)

https://github.com/d3vzer0/reternal-quickstart/wiki/1.A-Ansible-Install-Guide

uchakin commented May 17, 2019

@d3vzer0

When I tried the Ansible way 🙈, I got:

```
tester@tester-VirtualBox:~/reternal-quickstart/ansible$ sudo -E ansible-playbook reternal.yml -i inventories/development/hosts.ini --ask-become-pass --become-method=sudo --ask-vault-pass --user root
SU password: 
Vault password: 

PLAY [all] *************************************************************************************************************************************************************************************************

TASK [Gathering Facts] *************************************************************************************************************************************************************************************
ok: [rtn-dev-01]

TASK [generic : include distribution specific install] *****************************************************************************************************************************************************
fatal: [rtn-dev-01]: FAILED! => {"reason": "Unable to retrieve file contents\nCould not find or access '/home/tester/reternal-quickstart/ansible/Ubuntu.yml' on the Ansible Controller.\nIf you are using a module and expect the file to exist on the remote, see the remote_src option"}
	to retry, use: --limit @/home/tester/reternal-quickstart/ansible/reternal.retry

PLAY RECAP *************************************************************************************************************************************************************************************************
rtn-dev-01                 : ok=1    changed=0    unreachable=0    failed=1
```

d3vzer0 (Owner) commented May 17, 2019

Strange, there should be a proper Ubuntu template available for the 'generic' Ansible role. Do you mind running and sharing the following details? :) This should help with finding out what's going on / why it can't find the Ubuntu template:

  • Ansible version: ansible --version
  • Debug output: add -vvv to the end of your Ansible command when running the playbook
  • Contents of the host config: i.e. the rtn-dev-01.yml file

uchakin commented May 17, 2019

No problem :)

Ansible version:

```
ansible 2.7.10
  config file = None
  configured module search path = ['/home/tester/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.6/dist-packages/ansible
  executable location = /usr/local/bin/ansible
  python version = 3.6.7 (default, Oct 22 2018, 11:32:17) [GCC 8.2.0]
```

Debug output:

tester@tester-VirtualBox:~/reternal-quickstart/ansible$ sudo -E ansible-playbook reternal.yml -i inventories/development/hosts.ini --ask-become-pass --become-method=sudo --ask-vault-pass --user root -vvv
ansible-playbook 2.7.10
  config file = None
  configured module search path = ['/home/tester/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.6/dist-packages/ansible
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.6.7 (default, Oct 22 2018, 11:32:17) [GCC 8.2.0]
No config file found; using defaults
SUDO password: 
Vault password: 
/home/tester/reternal-quickstart/ansible/inventories/development/hosts.ini did not meet host_list requirements, check plugin documentation if this is unexpected
/home/tester/reternal-quickstart/ansible/inventories/development/hosts.ini did not meet script requirements, check plugin documentation if this is unexpected
/home/tester/reternal-quickstart/ansible/inventories/development/hosts.ini did not meet yaml requirements, check plugin documentation if this is unexpected
Parsed /home/tester/reternal-quickstart/ansible/inventories/development/hosts.ini inventory source with ini plugin

PLAYBOOK: reternal.yml *************************************************************************************************************************************************************************************
7 plays in reternal.yml

PLAY [all] *************************************************************************************************************************************************************************************************
<10.0.10.5> ESTABLISH SSH CONNECTION FOR USER: root
<10.0.10.5> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o ControlPath=/home/tester/.ansible/cp/f3e606edbb 10.0.10.5 '/bin/sh -c '"'"'echo ~root && sleep 0'"'"''
<10.0.10.5> (0, b'/root\n', b'')
<10.0.10.5> ESTABLISH SSH CONNECTION FOR USER: root
<10.0.10.5> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o ControlPath=/home/tester/.ansible/cp/f3e606edbb 10.0.10.5 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1558090712.4602225-71520360526200 `" && echo ansible-tmp-1558090712.4602225-71520360526200="` echo /root/.ansible/tmp/ansible-tmp-1558090712.4602225-71520360526200 `" ) && sleep 0'"'"''
<10.0.10.5> (0, b'ansible-tmp-1558090712.4602225-71520360526200=/root/.ansible/tmp/ansible-tmp-1558090712.4602225-71520360526200\n', b'')
Using module file /usr/local/lib/python3.6/dist-packages/ansible/modules/system/setup.py
<10.0.10.5> PUT /home/tester/.ansible/tmp/ansible-local-963wzyjaqsm/tmp_brvtppw TO /root/.ansible/tmp/ansible-tmp-1558090712.4602225-71520360526200/AnsiballZ_setup.py
<10.0.10.5> SSH: EXEC sftp -b - -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o ControlPath=/home/tester/.ansible/cp/f3e606edbb '[10.0.10.5]'
<10.0.10.5> (0, b'sftp> put /home/tester/.ansible/tmp/ansible-local-963wzyjaqsm/tmp_brvtppw /root/.ansible/tmp/ansible-tmp-1558090712.4602225-71520360526200/AnsiballZ_setup.py\n', b'')
<10.0.10.5> ESTABLISH SSH CONNECTION FOR USER: root
<10.0.10.5> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o ControlPath=/home/tester/.ansible/cp/f3e606edbb 10.0.10.5 '/bin/sh -c '"'"'chmod u+x /root/.ansible/tmp/ansible-tmp-1558090712.4602225-71520360526200/ /root/.ansible/tmp/ansible-tmp-1558090712.4602225-71520360526200/AnsiballZ_setup.py && sleep 0'"'"''
<10.0.10.5> (0, b'', b'')
<10.0.10.5> ESTABLISH SSH CONNECTION FOR USER: root
<10.0.10.5> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o ControlPath=/home/tester/.ansible/cp/f3e606edbb -tt 10.0.10.5 '/bin/sh -c '"'"'/usr/bin/python3 /root/.ansible/tmp/ansible-tmp-1558090712.4602225-71520360526200/AnsiballZ_setup.py && sleep 0'"'"''
<10.0.10.5> (0, b'\r\n{"ansible_facts": {"ansible_local": {}, "ansible_distribution": "Ubuntu", "ansible_distribution_release": "bionic", "ansible_distribution_version": "18.04", "ansible_distribution_major_version": "18", "ansible_distribution_file_path": "/etc/os-release", "ansible_distribution_file_variety": "Debian", "ansible_distribution_file_parsed": true, "ansible_os_family": "Debian", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABAQDJdmZXeLzgPMuPh1RndayS4kPaCFC/TtHVzvAJbTJ7H26PRD68d4x8KnfTJDgRiCReoNLiiJDnjc/nHdMFOIc66yx04ExqRPcWSNYRYUIaYddbghyOQ8MVZkrXetQ246UMObDnPhymkV/kK+avNaljbr0Vkyezojka+aZk4a28D9QZ13zPYfYqbjl4g0HPInJE/cu1yOpYiOEzZToSGYZfY69gZOQWMQkiPdBDethHgUrbqvtXrOtkiyo1xd5huifFBgJy7vEei2Opw4nLt1o33h0Om3ZdefT9Z8wHzDSi2xLBq0XoDYt0O7xh55FYA2uU0E+GWyUJ7aUfqoypggSv", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK7m+mN0dVZH1cmO68orc42JhGORcGTM2JbTozZa7PWYmsOJWm5agT6cnQ0NyKk/UEuaZd2o0cNylaFdYutBb9U=", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIF8a6Q6Ndtkk6IHHDQ1Hq0tYPR6qP5Takdv9H0IMCqs8", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux": {"status": "Missing selinux Python library"}, "ansible_selinux_python_present": false, "ansible_python": {"version": {"major": 3, "minor": 6, "micro": 7, "releaselevel": "final", "serial": 0}, "version_info": [3, 6, 7, "final", 0], "executable": "/usr/bin/python3", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "/boot/vmlinuz-4.18.0-20-generic", "root": "UUID=e916035c-3d5d-46a9-8289-8b8aae82f3bd", "ro": true, "quiet": true, "splash": true}, "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "4.18.0-20-generic", "ansible_machine": 
"x86_64", "ansible_python_version": "3.6.7", "ansible_fqdn": "tester-VirtualBox", "ansible_hostname": "tester-VirtualBox", "ansible_nodename": "tester-VirtualBox", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "33c0b47939a74853a4b733d4a4e27dec", "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true}, "search": ["avp.ru"]}, "ansible_system_capabilities_enforced": "True", "ansible_system_capabilities": ["cap_chown", "cap_dac_override", "cap_dac_read_search", "cap_fowner", "cap_fsetid", "cap_kill", "cap_setgid", "cap_setuid", "cap_setpcap", "cap_linux_immutable", "cap_net_bind_service", "cap_net_broadcast", "cap_net_admin", "cap_net_raw", "cap_ipc_lock", "cap_ipc_owner", "cap_sys_module", "cap_sys_rawio", "cap_sys_chroot", "cap_sys_ptrace", "cap_sys_pacct", "cap_sys_admin", "cap_sys_boot", "cap_sys_nice", "cap_sys_resource", "cap_sys_time", "cap_sys_tty_config", "cap_mknod", "cap_lease", "cap_audit_write", "cap_audit_control", "cap_setfcap", "cap_mac_override", "cap_mac_admin", "cap_syslog", "cap_wake_alarm", "cap_block_suspend", "cap_audit_read+ep"], "ansible_is_chroot": false, "ansible_virtualization_type": "virtualbox", "ansible_virtualization_role": "guest", "ansible_pkg_mgr": "apt", "ansible_env": {"MAIL": "/var/mail/root", "USER": "root", "SSH_CLIENT": "10.0.10.5 43128 22", "LC_TIME": "ru_RU.UTF-8", "SHLVL": "1", "HOME": "/root", "SSH_TTY": "/dev/pts/3", "LC_MONETARY": "ru_RU.UTF-8", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "LOGNAME": "root", "_": "/bin/sh", "XDG_SESSION_ID": "37", "TERM": "xterm-256color", "PATH": "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games", "LC_ADDRESS": "ru_RU.UTF-8", "XDG_RUNTIME_DIR": "/run/user/0", "LANG": "en_US.UTF-8", "LC_TELEPHONE": "ru_RU.UTF-8", "LC_NAME": "ru_RU.UTF-8", "SHELL": "/bin/bash", "LC_MEASUREMENT": "ru_RU.UTF-8", 
"LC_IDENTIFICATION": "ru_RU.UTF-8", "PWD": "/root", "SSH_CONNECTION": "10.0.10.5 43128 10.0.10.5 22", "LC_NUMERIC": "ru_RU.UTF-8", "LC_PAPER": "ru_RU.UTF-8"}, "ansible_date_time": {"year": "2019", "month": "05", "weekday": "\\u041f\\u044f\\u0442\\u043d\\u0438\\u0446\\u0430", "weekday_number": "5", "weeknumber": "19", "day": "17", "hour": "13", "minute": "58", "second": "34", "epoch": "1558090714", "date": "2019-05-17", "time": "13:58:34", "iso8601_micro": "2019-05-17T10:58:34.168331Z", "iso8601": "2019-05-17T10:58:34Z", "iso8601_basic": "20190517T135834168241", "iso8601_basic_short": "20190517T135834", "tz": "MSK", "tz_offset": "+0300"}, "ansible_apparmor": {"status": "enabled"}, "ansible_lsb": {"id": "Ubuntu", "description": "Ubuntu 18.04.2 LTS", "release": "18.04", "codename": "bionic", "major_release": "18"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Core(TM) i5-6500 CPU @ 3.20GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 1, "ansible_processor_vcpus": 1, "ansible_memtotal_mb": 2989, "ansible_memfree_mb": 101, "ansible_swaptotal_mb": 2047, "ansible_swapfree_mb": 2021, "ansible_memory_mb": {"real": {"total": 2989, "used": 2888, "free": 101}, "nocache": {"free": 1597, "used": 1392}, "swap": {"total": 2047, "free": 2021, "used": 26, "cached": 1}}, "ansible_bios_date": "12/01/2006", "ansible_bios_version": "VirtualBox", "ansible_form_factor": "Other", "ansible_product_name": "VirtualBox", "ansible_product_serial": "0", "ansible_product_uuid": "0c8fe6d6-f3fc-4d3f-b3b7-59f724e0e714", "ansible_product_version": "1.2", "ansible_system_vendor": "innotek GmbH", "ansible_devices": {"loop1": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "1", "scheduler_mode": "none", "sectors": "4600", "sectorsize": "512", "size": "2.25 
MB", "host": "", "holders": []}, "loop6": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "1", "scheduler_mode": "none", "sectors": "29704", "sectorsize": "512", "size": "14.50 MB", "host": "", "holders": []}, "loop4": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "1", "scheduler_mode": "none", "sectors": "288080", "sectorsize": "512", "size": "140.66 MB", "host": "", "holders": []}, "sr0": {"virtual": 1, "links": {"ids": ["ata-VBOX_CD-ROM_VB2-01700376"], "uuids": ["2019-01-25-18-43-09-09"], "labels": ["VBox_GAs_6.0.4"], "masters": []}, "vendor": "VBOX", "model": "CD-ROM", "sas_address": null, "sas_device_handle": null, "removable": "1", "support_discard": "0", "partitions": {}, "rotational": "1", "scheduler_mode": "cfq", "sectors": "167808", "sectorsize": "2048", "size": "81.94 MB", "host": "IDE interface: Intel Corporation 82371AB/EB/MB PIIX4 IDE (rev 01)", "holders": []}, "loop2": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "1", "scheduler_mode": "none", "sectors": "26600", "sectorsize": "512", "size": "12.99 MB", "host": "", "holders": []}, "loop0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "1", "scheduler_mode": "none", "sectors": "70736", "sectorsize": "512", "size": "34.54 MB", "host": "", "holders": []}, 
"loop7": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "1", "scheduler_mode": "none", "sectors": "0", "sectorsize": "512", "size": "0.00 Bytes", "host": "", "holders": []}, "sda": {"virtual": 1, "links": {"ids": ["ata-VBOX_HARDDISK_VB3594a016-b627669b"], "uuids": [], "labels": [], "masters": []}, "vendor": "ATA", "model": "VBOX HARDDISK", "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "0", "partitions": {"sda1": {"links": {"ids": ["ata-VBOX_HARDDISK_VB3594a016-b627669b-part1"], "uuids": ["e916035c-3d5d-46a9-8289-8b8aae82f3bd"], "labels": [], "masters": []}, "start": "2048", "sectors": "128636928", "sectorsize": 512, "size": "61.34 GB", "uuid": "e916035c-3d5d-46a9-8289-8b8aae82f3bd", "holders": []}}, "rotational": "1", "scheduler_mode": "cfq", "sectors": "128639808", "sectorsize": "512", "size": "61.34 GB", "host": "SATA controller: Intel Corporation 82801HM/HEM (ICH8M/ICH8M-E) SATA Controller [AHCI mode] (rev 02)", "holders": []}, "loop5": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "1", "scheduler_mode": "none", "sectors": "7576", "sectorsize": "512", "size": "3.70 MB", "host": "", "holders": []}, "loop3": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "1", "scheduler_mode": "none", "sectors": "186344", "sectorsize": "512", "size": "90.99 MB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {"sda1": ["ata-VBOX_HARDDISK_VB3594a016-b627669b-part1"], "sda": 
["ata-VBOX_HARDDISK_VB3594a016-b627669b"], "sr0": ["ata-VBOX_CD-ROM_VB2-01700376"]}, "uuids": {"sr0": ["2019-01-25-18-43-09-09"], "sda1": ["e916035c-3d5d-46a9-8289-8b8aae82f3bd"]}, "labels": {"sr0": ["VBox_GAs_6.0.4"]}, "masters": {}}, "ansible_uptime_seconds": 22449, "ansible_mounts": [{"mount": "/", "device": "/dev/sda1", "fstype": "ext4", "options": "rw,relatime,errors=remount-ro", "uuid": "e916035c-3d5d-46a9-8289-8b8aae82f3bd", "size_total": 64559468544, "size_available": 48224399360, "block_size": 4096, "block_total": 15761589, "block_available": 11773535, "block_used": 3988054, "inode_total": 4022272, "inode_available": 3600226, "inode_used": 422046}, {"mount": "/snap/gtk-common-themes/818", "device": "/dev/loop0", "fstype": "squashfs", "options": "ro,nodev,relatime", "uuid": "N/A", "size_total": 36306944, "size_available": 0, "block_size": 131072, "block_total": 277, "block_available": 0, "block_used": 277, "inode_total": 27345, "inode_available": 0, "inode_used": 27345}, {"mount": "/snap/gnome-system-monitor/57", "device": "/dev/loop5", "fstype": "squashfs", "options": "ro,nodev,relatime", "uuid": "N/A", "size_total": 3932160, "size_available": 0, "block_size": 131072, "block_total": 30, "block_available": 0, "block_used": 30, "inode_total": 747, "inode_available": 0, "inode_used": 747}, {"mount": "/snap/gnome-characters/139", "device": "/dev/loop2", "fstype": "squashfs", "options": "ro,nodev,relatime", "uuid": "N/A", "size_total": 13631488, "size_available": 0, "block_size": 131072, "block_total": 104, "block_available": 0, "block_used": 104, "inode_total": 1598, "inode_available": 0, "inode_used": 1598}, {"mount": "/snap/gnome-logs/45", "device": "/dev/loop6", "fstype": "squashfs", "options": "ro,nodev,relatime", "uuid": "N/A", "size_total": 15335424, "size_available": 0, "block_size": 131072, "block_total": 117, "block_available": 0, "block_used": 117, "inode_total": 1720, "inode_available": 0, "inode_used": 1720}, {"mount": "/snap/gnome-calculator/260", 
"device": "/dev/loop1", "fstype": "squashfs", "options": "ro,nodev,relatime", "uuid": "N/A", "size_total": 2359296, "size_available": 0, "block_size": 131072, "block_total": 18, "block_available": 0, "block_used": 18, "inode_total": 1269, "inode_available": 0, "inode_used": 1269}, {"mount": "/snap/core/6350", "device": "/dev/loop3", "fstype": "squashfs", "options": "ro,nodev,relatime", "uuid": "N/A", "size_total": 95420416, "size_available": 0, "block_size": 131072, "block_total": 728, "block_available": 0, "block_used": 728, "inode_total": 12816, "inode_available": 0, "inode_used": 12816}, {"mount": "/snap/gnome-3-26-1604/74", "device": "/dev/loop4", "fstype": "squashfs", "options": "ro,nodev,relatime", "uuid": "N/A", "size_total": 147587072, "size_available": 0, "block_size": 131072, "block_total": 1126, "block_available": 0, "block_used": 1126, "inode_total": 27631, "inode_available": 0, "inode_used": 27631}, {"mount": "/media/tester/VBox_GAs_6.0.4", "device": "/dev/sr0", "fstype": "iso9660", "options": "ro,nosuid,nodev,relatime,nojoliet,check=s,map=n,blocksize=2048,uid=1000,gid=1000,dmode=500,fmode=400", "uuid": "2019-01-25-18-43-09-09", "size_total": 85917696, "size_available": 0, "block_size": 2048, "block_total": 41952, "block_available": 0, "block_used": 41952, "inode_total": 0, "inode_available": 0, "inode_used": 0}], "ansible_interfaces": ["br-277be8281ff8", "br-13b976b2b409", "enp0s3", "lo", "docker0"], "ansible_enp0s3": {"device": "enp0s3", "macaddress": "08:00:27:32:af:67", "mtu": 1500, "active": true, "module": "e1000", "type": "ether", "pciid": "0000:00:03.0", "speed": 1000, "promisc": false, "ipv4": {"address": "10.0.10.5", "broadcast": "10.0.10.255", "netmask": "255.255.255.0", "network": "10.0.10.0"}, "ipv6": [{"address": "fe80::72d4:1151:bff9:a00d", "prefix": "64", "scope": "link"}]}, "ansible_br_277be8281ff8": {"device": "br-277be8281ff8", "macaddress": "02:42:e7:41:a7:94", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": 
"8000.0242e741a794", "stp": false, "promisc": false, "ipv4": {"address": "172.19.0.1", "broadcast": "172.19.255.255", "netmask": "255.255.0.0", "network": "172.19.0.0"}, "ipv6": [{"address": "fe80::42:e7ff:fe41:a794", "prefix": "64", "scope": "link"}]}, "ansible_br_13b976b2b409": {"device": "br-13b976b2b409", "macaddress": "02:42:c0:29:22:c1", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.0242c02922c1", "stp": false, "promisc": false, "ipv4": {"address": "172.18.0.1", "broadcast": "172.18.255.255", "netmask": "255.255.0.0", "network": "172.18.0.0"}, "ipv6": [{"address": "fe80::42:c0ff:fe29:22c1", "prefix": "64", "scope": "link"}]}, "ansible_docker0": {"device": "docker0", "macaddress": "02:42:e1:6f:f3:82", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.0242e16ff382", "stp": false, "promisc": false, "ipv4": {"address": "172.17.0.1", "broadcast": "172.17.255.255", "netmask": "255.255.0.0", "network": "172.17.0.0"}, "ipv6": [{"address": "fe80::42:e1ff:fe6f:f382", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "host", "netmask": "255.0.0.0", "network": "127.0.0.0"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.0.10.1", "interface": "enp0s3", "address": "10.0.10.5", "broadcast": "10.0.10.255", "netmask": "255.255.255.0", "network": "10.0.10.0", "macaddress": "08:00:27:32:af:67", "mtu": 1500, "type": "ether", "alias": "enp0s3"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.0.10.5", "172.19.0.1", "172.18.0.1", "172.17.0.1"], "ansible_all_ipv6_addresses": ["fe80::72d4:1151:bff9:a00d", "fe80::42:e7ff:fe41:a794", "fe80::42:c0ff:fe29:22c1", "fe80::42:e1ff:fe6f:f382"], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], 
"gather_timeout": 10, "filter": "*", "fact_path": "/etc/ansible/facts.d"}}}\r\n', b'Shared connection to 10.0.10.5 closed.\r\n')
<10.0.10.5> ESTABLISH SSH CONNECTION FOR USER: root
<10.0.10.5> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o ControlPath=/home/tester/.ansible/cp/f3e606edbb 10.0.10.5 '/bin/sh -c '"'"'rm -f -r /root/.ansible/tmp/ansible-tmp-1558090712.4602225-71520360526200/ > /dev/null 2>&1 && sleep 0'"'"''
<10.0.10.5> (0, b'', b'')

TASK [Gathering Facts] *************************************************************************************************************************************************************************************
task path: /home/tester/reternal-quickstart/ansible/reternal.yml:1
ok: [rtn-dev-01]
META: ran handlers

TASK [generic : include distribution specific install] *****************************************************************************************************************************************************
task path: /home/tester/reternal-quickstart/ansible/roles/generic/tasks/main.yml:1
fatal: [rtn-dev-01]: FAILED! => {
    "reason": "Unable to retrieve file contents\nCould not find or access '/home/tester/reternal-quickstart/ansible/Ubuntu.yml' on the Ansible Controller.\nIf you are using a module and expect the file to exist on the remote, see the remote_src option"
}
	to retry, use: --limit @/home/tester/reternal-quickstart/ansible/reternal.retry

PLAY RECAP *************************************************************************************************************************************************************************************************
rtn-dev-01                 : ok=1    changed=0    unreachable=0    failed=1

Contents of the host config:

```yaml
hostname: rtn-dev-01
ansible_host: &mgt_addr 10.0.10.5 # Example of IP of TestVM, replace this with target host IP

firewall:
  defaults:
    - direction: incoming
      policy: allow
  rules:
    - rule: allow
      to_port: 22
      protocol: tcp
    - rule: allow
      to_port: 80
      protocol: tcp
```
d3vzer0 (Owner) commented May 17, 2019

I couldn't reproduce the same error yet, but I made a minor change in the playbook. Try pulling the latest quickstart repo and running the playbook again :) E.g. my output/command:

[Screenshot 2019-05-17 at 16 02 03]

uchakin commented May 20, 2019

Did you test the Ansible way behind a proxy?

d3vzer0 (Owner) commented May 23, 2019

> Did you test the Ansible way behind a proxy?

Nope, but I'm going to test that now :) It seems a deployment with Vagrant is having similar issues. I'll test it out tomorrow and come up with a fix ^^

d3vzer0 (Owner) commented May 27, 2019

Found the issue :) Ansible couldn't find the template file because the identified distro is sometimes reported as Ubuntu (capital U) and sometimes as ubuntu (lowercase). In the first case Ansible couldn't find the template, since the file is named ubuntu.yml. Fixed by lowercasing the distro name before loading the template.
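The fix presumably amounts to normalizing the gathered fact before the include. A hypothetical sketch of what the 'generic' role's include task could look like after the change (the actual file layout in the repo may differ):

```yaml
# roles/generic/tasks/main.yml (hypothetical sketch)
# ansible_distribution can come back as "Ubuntu" or "ubuntu" depending on how
# facts are gathered; lowercasing it makes the lookup always hit ubuntu.yml.
- name: include distribution specific install
  include_tasks: "{{ ansible_distribution | lower }}.yml"
```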
