
NIPA - Netdev Infrastructure for Patch Automation

This project is a simple CI/build bot for patchwork.

Patchwork is a web interface for patches posted to mailing lists, and can also handle test results being reported against said patches.

Currently this project only includes simple checks and build testing, all Linux kernel-centric. Patches are not tested against existing kernel selftests.

Goals

The main goal of NIPA is to minimize the amount of time netdev and BPF maintainers have to spend validating patches.

As soon as patches hit the mailing list, NIPA needs to validate them and report errors to patchwork. If a patch is deemed bad, maintainers can simply discard it from patchwork.

Because of the load generated on the mailing list and the test systems, results are not reported directly to patch authors; we don't want to facilitate "post just to be tested" scenarios.

The system needs to be easily run by individual developers. The intention is to package it as a container in due course. Having everyone test their patches locally allows for better scaling (no need for big central infrastructure) and hopefully creates an incentive for contributing.

Structure

The project is split into multiple programs with different uses.

pw_poller.py fetches emails from patchwork and runs tests in worker threads. There is one worker thread for each tree, enabling testing multiple series at a time (although admittedly the concurrency is limited because pw_poller.py itself also needs the trees to de-mux patches). The poller creates a directory with results for each series, and sub-dirs for each patch.
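The thread-per-tree dispatch can be sketched roughly like this (all names here are hypothetical; the real pw_poller.py also de-muxes incoming patches onto trees and writes result directories):

```python
# Minimal sketch of a worker-thread-per-tree dispatcher. Each tree gets
# its own thread, so different trees test in parallel while series for
# the same tree stay serialized.
import queue
import threading

def run_tests(tree, series):
    # Placeholder for applying the series to the tree and running tests.
    return f"{tree}: tested {series}"

def worker(tree, work_q, results):
    while True:
        series = work_q.get()
        if series is None:          # sentinel: shut the worker down
            break
        results.append(run_tests(tree, series))
        work_q.task_done()

trees = {t: queue.Queue() for t in ("net", "net-next")}
results = []
threads = [threading.Thread(target=worker, args=(t, q, results))
           for t, q in trees.items()]
for th in threads:
    th.start()

trees["net"].put("series-1")
trees["net-next"].put("series-2")
for q in trees.values():
    q.put(None)                     # stop the workers
for th in threads:
    th.join()
```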

Once tests are done, another daemon, pw_upload.py, uploads the results to patchwork as checks.

ingest_mdir.py is meant for testing patches locally: it can be pointed at a directory and will run all the checks on the patches that directory contains (patches are expected to be generated by git format-patch). ingest_mdir.py has not been tested in a while, so it is probably broken.

Configuration

Configuration is read from INI files in the main project directory.

There is a main config file called nipa.config, but each script also accepts script-specific settings (see sources).
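Since the files are plain INI, Python's standard configparser reads them directly. The section and option names below are purely illustrative; consult the sources for the settings each script actually honors:

```python
# Illustrative only: the [dirs]/[log] sections and their keys are made
# up for this example, not taken from the real nipa.config.
import configparser

cfg = configparser.ConfigParser()
cfg.read_string("""
[dirs]
results = /tmp/nipa-results

[log]
format = org
""")

results_dir = cfg.get("dirs", "results")
log_format = cfg.get("log", "format", fallback="org")
```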

Logging

NIPA supports the org mode file format for easy reading in Emacs, as well as XML-based output.

Tests

Tests can either be written in Python, in which case they are passed the Series / Patch objects, or written as scripts, which return 0 on success, 250 on warning, or any other value on error.

Tests also return (or print to a special file descriptor) the info that will be displayed in patchwork's short summary.
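The exit-status convention for script tests can be sketched as follows (the runner function and its shape are hypothetical; only the 0/250/other code mapping comes from the text above):

```python
# Sketch of mapping a script test's exit status to a verdict:
# 0 = success, 250 = warning, anything else = fail.
import subprocess
import sys

def run_script_test(argv):
    proc = subprocess.run(argv, capture_output=True, text=True)
    if proc.returncode == 0:
        verdict = "success"
    elif proc.returncode == 250:
        verdict = "warning"
    else:
        verdict = "fail"
    # A real runner would also collect the short summary the test
    # prints (the text mentions a dedicated file descriptor for it).
    return verdict, proc.stdout

# Simulate a test script that exits with the warning code.
verdict, _ = run_script_test([sys.executable, "-c", "raise SystemExit(250)"])
```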

Series tests

Series tests are run once on the entire series. pw_upload.py duplicates their results onto each patch, since patchwork does not support "series checks".

subject_prefix

Check if the subject prefix contains the tree name.
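A hedged sketch of such a check: look for the tree name among the tokens of the bracketed subject prefix (the real matching rules may well differ):

```python
# Does the bracketed prefix of the subject mention the target tree?
# Token-wise match so "net" does not falsely match "net-next".
import re

def subject_has_tree(subject, tree):
    m = re.match(r"\[(.*?)\]", subject)
    return bool(m) and tree in m.group(1).split()

ok = subject_has_tree("[PATCH net-next v2 1/3] foo: add bar", "net-next")
bad = subject_has_tree("[PATCH 1/3] foo: add bar", "net")
```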

patch_count

Check that the number of patches in the series is not larger than 15.

cover_letter

Check if the series has a cover letter (one is required only if there are more than two patches; otherwise the series is considered trivial).

fixes_present

Check if any of the patches in the series contains a Fixes tag.

If the tree name does not contain "next", assume the patches target the current release cycle and are therefore fixes.
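The rule above can be sketched like so (the tag regex and function shape are assumptions, not the project's actual code):

```python
# fixes_present sketch: for non-"next" trees the series is assumed to
# carry fixes, so at least one patch should have a Fixes tag.
import re

def fixes_present(tree, patch_bodies):
    if "next" in tree:
        return True                 # -next trees carry new features
    return any(re.search(r"^Fixes: [0-9a-f]{8,}", body, re.M)
               for body in patch_bodies)

ok = fixes_present("net", ['Fixes: 0123456789ab ("foo: bar")\n'])
missing = fixes_present("net", ["no tag here\n"])
```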

Patch tests

verify_signedoff

Check that the Signed-off-by tag matches the From field. This test was taken from GregKH's repo, but a number of versions of this check are circulating.

The check may be a little looser than some expect: it is satisfied if either the author's name or the email address matches between From and Signed-off-by, not necessarily both.

The original test also validates that the committer signed off on the commit, as well as the author, which is obviously meaningless when the test infrastructure applies the patches to the tree itself.
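The loose matching described above can be sketched as follows (parsing is simplified and the function is illustrative, not the actual check):

```python
# Pass if either the name or the email address matches between the
# From header and the Signed-off-by line.
from email.utils import parseaddr

def signoff_matches(from_hdr, signoff_line):
    f_name, f_addr = parseaddr(from_hdr)
    s_name, s_addr = parseaddr(signoff_line.removeprefix("Signed-off-by:"))
    return f_name.lower() == s_name.lower() or f_addr.lower() == s_addr.lower()

ok = signoff_matches("Jane Doe <jane@example.org>",
                     "Signed-off-by: Jane Doe <jd@corp.example>")
bad = signoff_matches("Jane Doe <jane@example.org>",
                      "Signed-off-by: John Roe <john@example.org>")
```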

verify_fixes

Check that the Fixes tag is correct. This test was taken from GregKH's repo, but a number of versions of this check are circulating.

The hash is expected to be present in the tree to which the patch is being applied. This is a slight departure from GregKH's original, where the hash is checked against Linus's tree.
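A sketch of the two halves of such a check: parse the tag, then ask git whether the hash exists in the target tree. The regex, function names, and git invocation are assumptions for illustration (the git call is defined but not exercised here):

```python
# Parse a Fixes tag and (optionally) verify the hash exists in a tree.
import re
import subprocess

FIXES_RE = re.compile(r'^Fixes:\s+([0-9a-f]{8,40})\s+\("(.+)"\)', re.M)

def parse_fixes(body):
    m = FIXES_RE.search(body)
    return (m.group(1), m.group(2)) if m else None

def hash_in_tree(tree_dir, sha):
    # Per the departure noted above: check against the tree the patch
    # is applied to, not Linus's tree.
    res = subprocess.run(["git", "-C", tree_dir, "cat-file", "-e",
                          sha + "^{commit}"], capture_output=True)
    return res.returncode == 0

tag = parse_fixes('Fixes: 0123456789ab ("foo: fix the bar")\n')
```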

source_inline

Check if there are any inline keywords in the C source files.

header_inline

Try to catch static functions without the inline keyword in headers.

checkpatch

Run selected tests of the kernel's scripts/checkpatch.pl on the patches.

build_allmodconfig_warn

Check if an allmodconfig-configured kernel builds with the patch applied. Catch new errors and warnings with the W=1 C=1 flags.

For now the comparison is by warning count only, so a warning may get silently replaced by a different one.
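The count-only comparison, and its blind spot, can be illustrated in a few lines (hypothetical helper; the real test works on actual build logs):

```python
# Build logs before and after the patch are compared by number of
# warning lines, so a new warning can hide behind a fixed one.
def warning_count(build_log):
    return sum(1 for line in build_log.splitlines()
               if "warning:" in line)

# One warning disappears, a different one appears: same count,
# so no regression is reported.
before = "foo.c:1:1: warning: unused variable 'a'\n"
after = "bar.c:9:2: warning: implicit declaration\n"
regressed = warning_count(after) > warning_count(before)
```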

build_32bit

Check if an allmodconfig-configured kernel builds for 32-bit platforms.

cc_maintainers

Check if the addresses reported by get_maintainer.pl are included in the To/Cc of the mails.

Warn if not all of them are included; error if nobody is included, or if the author of a change blamed by a Fixes tag is not.
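The severity logic above can be sketched as a small decision function (a hypothetical structure; the real test parses get_maintainer.pl output and mail headers):

```python
# Decide pass/warn/fail from required addresses, the author blamed by
# a Fixes tag (or None), and the mail's actual recipients.
def cc_verdict(required, fixes_author, recipients):
    recipients = set(recipients)
    if not recipients & set(required):
        return "fail"               # nobody relevant was CCed
    if fixes_author and fixes_author not in recipients:
        return "fail"               # the blamed author must be CCed
    if not set(required) <= recipients:
        return "warn"               # some addresses missing
    return "pass"

v1 = cc_verdict(["a@x", "b@x"], None, ["a@x"])
v2 = cc_verdict(["a@x"], "c@x", ["a@x", "c@x"])
```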

kdoc

Run kernel-doc and check for warnings/errors. Similarly to the build tests, only the number of errors is compared for now.

maintainers

Run get_maintainer.pl --self-test.

Currently disabled because it's extremely slow.

module_param

Warn if the patch adds module parameters.

stable

Warn if the patch explicitly CCs the stable tree, which is against netdev policy.

signed

Check for patch attestation (as generated by [patatt](https://github.com/mricon/patatt)). Warn when there is no signature or if the key for a signature isn't available. Fail if the signature doesn't match the attestation.

To Do

  • build one-by-one for a PR
  • add tree aliases (bpf, bpf-next, ipsec, ipsec-next, etc.)
  • run coccicheck
  • rev xmas tree
  • make a better MAINTAINERS check than checkpatch
  • add a marker for patches with replies from buildbot
  • split the apply try from the test tree
  • on a pull fixes may point to the commits in the pull
  • series ID injection
  • misspell-fixer
  • make htmldocs
  • split out uploader to separate user
  • add async tests
