
Reviewing


Reviewing issues and PRs

So when we ask people to review issues and PRs, what do we actually mean?

Reviewing issues

Users often report issues with Ansible or Ansible modules. The first step is to verify the reported behaviour.

  • Check whether the user made an obvious mistake in the example code. We often see user errors reported as bugs.
  • Sometimes the user expected a different behaviour, and sometimes the documentation is unclear. Clarify.
  • If there is no minimal reproducer, ask the reporter to reduce the complexity; this helps pinpoint the issue (a sketch follows this list).
  • If it looks like a real bug, does the behaviour still exist in the most recent release, or the devel branch?
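A minimal reproducer is usually a short, self-contained playbook that triggers the behaviour with no roles, no inventory variables and no unrelated tasks. A sketch, using the file module and a path chosen purely for illustration:

```yaml
# Hypothetical minimal reproducer: one short play containing only
# the task that triggers the reported behaviour.
- hosts: localhost
  gather_facts: no
  tasks:
    - name: Reproduce the reported behaviour
      file:
        path: /tmp/reproducer
        state: touch
      register: result

    - name: Show what the module returned
      debug:
        var: result
```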

If you can reproduce the reported issue, understanding what happens in the code and eventually fixing it becomes feasible. A lot of issues never get past a proper reproducer, which may indicate the issue lies in the user's environment.

If you require more information from the reporter, you can comment with needs_info; @ansibot will then label the issue and notify the reporter. If the reporter does not answer questions or report back, the issue can be closed in due time.

Reviewing PRs

When new modules are contributed, new functionality is added, or bugs are fixed, we need to test whether the new code works as expected. For this it helps to have integration tests for every module; these integration tests describe the expected behaviour of a module.

Sanity tests

Before you start reviewing new code, rest assured that we have already implemented a very extensive set of sanity tests covering documentation, coding style, Python gotchas, consistency checks and a lot of known error conditions.

The CI framework will flag any issues it finds, and @ansibot will report those issues and label the PR accordingly. Depending on the state of the PR, you can already review the code and comment on specific code segments, or on the structure, flow and implementation of the module.

Integration tests

In Ansible, integration tests are really just playbooks that exercise every aspect of a module under different conditions. So every possible use of the module should be tested for:

  • Idempotency (Does rerunning a task report no changes?)
  • Check-mode (Does dry-running a task behave the same as a real run? Does it avoid making any changes?)
  • Return values (Does the module return values consistently under different conditions?)

So when constructing integration tests, ask yourself: what are the different ways to use this module, and how can I test every possible incarnation of it?
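As a sketch of what these checks can look like in a test playbook, assuming a hypothetical my_object module that returns a name value:

```yaml
# Sketch only: my_object and its return values are hypothetical.
- name: Create the object
  my_object:
    name: test-object
    state: present
  register: create_result

- name: Verify the change report and the return values
  assert:
    that:
      - create_result is changed
      - create_result.name == 'test-object'
```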

So if a typical module is used to manage (create, change, remove) an object, a typical playbook consists of the following stages (a playbook sketch follows the list):

  1. Prepare the supporting environment
  2. Test actions
     • Create the object
     • Change the object (in different ways)
     • Remove the object
  3. Failure conditions
  4. Clean up the supporting environment
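As a sketch, the test target's tasks/main.yml could simply chain these stages (the file names are hypothetical):

```yaml
# Hypothetical tasks/main.yml of an integration test target,
# mirroring the stages above.
- include_tasks: prepare.yml    # 1. Prepare the supporting environment
- include_tasks: actions.yml    # 2. Create/change/remove test actions
- include_tasks: failures.yml   # 3. Failure conditions
- include_tasks: cleanup.yml    # 4. Clean up the supporting environment
```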

Each of these test actions will have to be performed four times:

  • Perform action in check-mode (this should indicate a change)
  • Perform action for real (this should indicate a change)
  • Perform action again in check-mode (this should indicate no change)
  • Perform action again for real (this should indicate no change)
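A sketch of this four-step pattern for the create action, again assuming a hypothetical my_object module (the YAML anchor merely avoids repeating the parameters):

```yaml
# Sketch only: my_object is hypothetical; the same pattern applies
# to the change and remove actions.
- name: Create the object (check-mode)
  my_object: &create_params
    name: test-object
    state: present
  check_mode: yes
  register: cm_create

- name: Create the object (for real)
  my_object: *create_params
  register: create

- name: Create the object again (check-mode)
  my_object: *create_params
  check_mode: yes
  register: cm_create_idem

- name: Create the object again (for real)
  my_object: *create_params
  register: create_idem

- name: Verify the expected change reporting
  assert:
    that:
      - cm_create is changed
      - create is changed
      - cm_create_idem is not changed
      - create_idem is not changed
```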

Each of these actions should be tested for whether or not there was a change, and whether the return values are as expected. For the failure cases, we test whether the action failed (we use ignore_errors: yes to continue the play) and whether the returned message is as expected.
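A failure-condition test can follow the same register-and-assert pattern; a sketch, with a hypothetical module and error message:

```yaml
# Sketch only: the module, parameters and error message are hypothetical.
- name: Try to create an object with an invalid name
  my_object:
    name: ''
    state: present
  ignore_errors: yes
  register: invalid_name

- name: Verify the task failed with the expected message
  assert:
    that:
      - invalid_name is failed
      - "'name must not be empty' in invalid_name.msg"
```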

Test environment

Since our CI framework may not have access to the infrastructure required for testing (e.g. HP-UX infrastructure), it is essential that we have these integration tests available so other users and reviewers can verify the proper behaviour in their own environment. (In this case we label the integration test as unsupported so it is not run by Shippable.)
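With ansible-test, marking a target this way is typically done through its aliases file; a sketch, assuming a hypothetical target name:

```
# test/integration/targets/my_module/aliases
unsupported
```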

Reporting back

If you found issues, want to discuss architecture, found corner cases, etc., we usually use the GitHub "Review changes" functionality to Request changes, Comment or Approve the pull request. For general concerns you can simply add a comment to the PR itself.

Approving a PR

When you are satisfied with a PR and have reviewed and tested the changes, you can approve it through the "Review changes" interface; if you are a member of the Working Group, you can also add a "shipit" in a comment to add your support. @ansibot takes those shipits into account and may merge the PR once sufficient shipits have been added.

Please do not add shipits if you were not able to test the changes (i.e. by running the integration tests or making your own tests). You can still approve the PR, but do not pretend you tested code you did not actually test.

@ansibot will never merge new modules; new modules always require a core contributor to merge the PR. If a new module is ready to be merged, sufficient shipits will trigger a core contributor to review it. If in doubt, you can have the reviewed module put on the core meeting agenda to be evaluated.
