Update docs site for version 2.20.0a0 (pantsbuild#153)
WorkerPants and kaos authored Feb 16, 2024
1 parent d458a03 commit 7a10b38
Showing 166 changed files with 62,425 additions and 20,961 deletions.
11 changes: 0 additions & 11 deletions docs/_README.md

This file was deleted.

@@ -11,7 +11,7 @@ The `adhoc_tool` target allows you to execute "runnable" targets inside the Pant

`adhoc_tool` provides you with the building blocks needed to put together a custom build process without needing to develop and maintain a plugin. The level of initial effort involved in using `adhoc_tool` is significantly lower than that of [writing a plugin](../writing-plugins/overview.mdx), so it's well-suited to consuming one-off scripts, or for rapidly prototyping a process before actually writing a plugin. The tradeoff is that there is more manual work involved in defining build processes that reflect your codebase's structure, and that the targets that define the tools you consume are less easy to reuse.

-The `antlr` demo in the [`example-adhoc` respository](https://github.com/pantsbuild/example-adhoc) shows an example of running a JVM-based tool to transparently generate Python code that can be used in another language:
+The `antlr` demo in the [`example-adhoc` repository](https://github.com/pantsbuild/example-adhoc) shows an example of running a JVM-based tool to transparently generate Python code that can be used in another language:

```
adhoc_tool(
    ...
)
```
@@ -121,6 +121,6 @@ Dumping thread stacks:
- Run: `gdb /path/to/python/binary PROCESS_ID`
3. Enable logging to write the thread dump to `gdb.txt`: `set logging on`
4. Dump all thread backtraces: `thread apply all bt`
-5. If you use pyenv to mange your Python install, a gdb script will exist in the same directory as the Python binary. Source it into gdb:
+5. If you use pyenv to manage your Python install, a gdb script will exist in the same directory as the Python binary. Source it into gdb:
- `source ~/.pyenv/versions/3.8.5/bin/python3.8-gdb.py` (if using version 3.8.5)
6. Dump all Python stacks: `thread apply all py-bt`
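
Putting the steps together, a full gdb session might look like the following sketch (the Python binary path, the pyenv version `3.8.5`, and the process id `12345` are placeholder values to substitute with your own):

```
$ gdb /path/to/python/binary 12345
(gdb) set logging on
(gdb) thread apply all bt
(gdb) source ~/.pyenv/versions/3.8.5/bin/python3.8-gdb.py
(gdb) thread apply all py-bt
```

The combined thread dump is written to `gdb.txt` in the working directory.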
4 changes: 2 additions & 2 deletions docs/docs/contributions/development/developing-rust.mdx
@@ -10,11 +10,11 @@ Hacking on the Pants engine in Rust.
We welcome contributions to Rust! We use Rust to implement the Pants engine in a performant, safe, and ergonomic way.

:::note Still learning Rust? Ask to get added to reviews
-We'd be happy to ping you on Rust changes we make for you to see how Rust is used in the wild. Please message us on the #engine channel in [Slack](/community/members) to let us know your interest.
+We'd be happy to ping you on Rust changes we make for you to see how Rust is used in the wild. Please message us on the #development channel in [Slack](/community/members) to let us know your interest.
:::

:::caution Recommendation: share your plan first
-Because changes to Rust deeply impact how Pants runs, it is especially helpful to share any plans to work on Rust before making changes. Please message us on [Slack](/community/members) in the #engine channel or open a [GitHub issue](https://github.com/pantsbuild/pants/issues).
+Because changes to Rust deeply impact how Pants runs, it is especially helpful to share any plans to work on Rust before making changes. Please message us on [Slack](/community/members) in the #development channel or open a [GitHub issue](https://github.com/pantsbuild/pants/issues).
:::

## Code organization
6 changes: 3 additions & 3 deletions docs/docs/contributions/development/internal-architecture.mdx
@@ -66,9 +66,9 @@ But both of the goals are important because together they allow for an API that
There are a few constraints that decide which `Rule`s are able to provide dependencies for one another:

- `param_consumption` - When a `Rule` directly uses a `Param` as a positional argument, that `Param` is removed from scope for any of that `Rule`'s dependencies.
-  - For example, for a `Rule` `y` with a positional argument `A` and a `Get(B, C)`: if there is a `Param` `A` in scope at `y` and it is used to satisfy the positional argument, it cannot also be used to (transitively) to satisfy the `Get(B, C)` (i.e., a hyptothetical rule that consumes both `A` and `C` would not be eligible in that position).
+  - For example, for a `Rule` `y` with a positional argument `A` and a `Get(B, C)`: if there is a `Param` `A` in scope at `y` and it is used to satisfy the positional argument, it cannot also be used (transitively) to satisfy the `Get(B, C)` (i.e., a hypothetical rule that consumes both `A` and `C` would not be eligible in that position).
- On the other hand, for a `Rule` `w` with `Get(B, C)` and `Get(D, E)`, if there is a `Param` `A` in scope at `w`, two dependency `Rule`s that consume `A` (transitively) _can_ be used to satisfy those `Get`s. Only consuming a `Param` as a positional argument removes it from scope.
-- `provided_params` - When deciding whether one `Rule` can use another `Rule` to provide the output type of a `Get`, a constraint is applied that the candidate depedency must (transitively) consume the `Param` that is provided by the `Get`.
+- `provided_params` - When deciding whether one `Rule` can use another `Rule` to provide the output type of a `Get`, a constraint is applied that the candidate dependency must (transitively) consume the `Param` that is provided by the `Get`.
- For example: if a `Rule` `z` has a `Get(A, B)`, only `Rule`s that compute an `A` and (transitively) consume a `B` are eligible to be used. This also means that a `Param` `A` which is already in scope for `Rule` `z` is not eligible to be used, because it would trivially not consume `B`.
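
To make the two constraints concrete, here is a toy sketch in plain Python (the names and data shapes are invented for illustration only; the engine implements this in Rust):

```python
# Toy model of rule-graph Param scoping -- illustrative, not the engine's
# real types or algorithm.

def params_in_scope_for_get(scope: frozenset, positional_args: frozenset) -> frozenset:
    """param_consumption: Params consumed as positional arguments are removed
    from scope for the rule's Get dependencies."""
    return scope - positional_args

def eligible_providers(rules, output_type, provided_param):
    """provided_params: a candidate dependency for Get(output_type, provided_param)
    must compute output_type and (transitively) consume the provided Param."""
    return [
        r for r in rules
        if r["returns"] == output_type and provided_param in r["consumes"]
    ]

# Rule y consumes Param A positionally, so A is out of scope for its Gets:
assert params_in_scope_for_get(frozenset({"A"}), frozenset({"A"})) == frozenset()

# For Get(B, C), only rules that compute B *and* consume C are eligible;
# a rule that computes B from A alone is not.
rules = [
    {"name": "b_from_c", "returns": "B", "consumes": {"C"}},
    {"name": "b_from_a", "returns": "B", "consumes": {"A"}},
]
assert [r["name"] for r in eligible_providers(rules, "B", "C")] == ["b_from_c"]
```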

### Implementation
@@ -83,7 +83,7 @@ The construction algorithm is broken up into phases:
- If we were to stop `RuleGraph` construction at this phase, it would be necessary to do a form of [dynamic dispatch](https://en.wikipedia.org/wiki/Dynamic_dispatch) at runtime to decide which source of a dependency to use based on the `Param`s that were currently in scope. And the sets of `Param`s used in the memoization key for each `Rule` would still be overly large, causing excess invalidation.
3. [monomorphize](https://github.com/pantsbuild/pants/blob/3a188a1e06d8c27ff86d8c311ff1b2bdea0d39ff/src/rust/engine/rule_graph/src/builder.rs#L325-L353) - "Monomorphize" the polymorphic graph by using the out-set of available `Param`s (initialized during `initial_polymorphic`) and the in-set of consumed `Param`s (computed during `live_param_labeled`) to partition nodes (and their dependents) for each valid combination of their dependencies. Combinations of dependencies that would be invalid (see the Constraints section) are not generated, which causes some pruning of the graph to happen during this phase.
- Continuing the example from above: the goal of monomorphize is to create one copy of `Rule` `x` per legal combination of its `DependencyKey`. Assuming that both of `x`'s dependencies remain legal (i.e. that all of `{A,B,C}` are still in scope in the dependents of `x`, etc), then two copies of `x` will be created: one that uses the first dependency and has an in-set of `{A,B}`, and another that uses the second dependency and has an in-set of `{B,C}`.
-4. [prune_edges](https://github.com/pantsbuild/pants/blob/3a188a1e06d8c27ff86d8c311ff1b2bdea0d39ff/src/rust/engine/rule_graph/src/builder.rs#L836-L845) - Once the monomorphic graph has [converged](https://en.wikipedia.org/wiki/Data-flow_analysis#Convergence), each node in the graph will ideally have exactly one source of each `DependencyKey` (with the exception of `Query`s, which are not monomorphized). This phase validates that, and chooses the smallest input `Param` set to use for each `Query`. In cases where a node has more that one dependency per `DependencyKey`, it is because given a particular set of input `Params` there was more than one valid way to compute a dependency. This can happen either because there were too many `Param`s in scope, or because there were multiple `Rule`s with the same `Param` requirements.
+4. [prune_edges](https://github.com/pantsbuild/pants/blob/3a188a1e06d8c27ff86d8c311ff1b2bdea0d39ff/src/rust/engine/rule_graph/src/builder.rs#L836-L845) - Once the monomorphic graph has [converged](https://en.wikipedia.org/wiki/Data-flow_analysis#Convergence), each node in the graph will ideally have exactly one source of each `DependencyKey` (except for `Query`s, which are not monomorphized). This phase validates that, and chooses the smallest input `Param` set to use for each `Query`. In cases where a node has more than one dependency per `DependencyKey`, it is because given a particular set of input `Params` there was more than one valid way to compute a dependency. This can happen either because there were too many `Param`s in scope, or because there were multiple `Rule`s with the same `Param` requirements.
- This phase is the only phase that renders errors: all of the other phases mark nodes and edges "deleted" for particular reasons, and this phase consumes that record. A node that has been deleted indicates that that node is unsatisfiable for some reason, while an edge that has been deleted indicates that the source node was not able to consume the target node for some reason.
- If a node has too many sources of a `DependencyKey`, this phase will recurse to attempt to locate the node in the `Rule` graph where the ambiguity was introduced. Likewise, if a node has no source of a `DependencyKey`, this phase will recurse on deleted nodes (which are preserved by the other phases) to attempt to locate the bottom-most `Rule` that was missing a `DependencyKey`.
5. [finalize](https://github.com/pantsbuild/pants/blob/3a188a1e06d8c27ff86d8c311ff1b2bdea0d39ff/src/rust/engine/rule_graph/src/builder.rs#L1064-L1068) - After `prune_edges` the graph is known to be valid, and this phase generates the final static `RuleGraph` for all `Rule`s reachable from `Query`s.
16 changes: 3 additions & 13 deletions docs/docs/contributions/development/style-guide.mdx
@@ -97,7 +97,7 @@ class OrderedSet:

### TODOs

-When creating a TODO, first [create an issue](https://github.com/pantsbuild/pants/issues/new) in GitHub. Then, link to the issue # in parantheses and add a brief description.
+When creating a TODO, first [create an issue](https://github.com/pantsbuild/pants/issues/new) in GitHub. Then, link to the issue # in parentheses and add a brief description.

For example:

@@ -418,17 +418,7 @@ It may be helpful to consider the following:
- if you experience `mypy` typing issues using the `softwrap` for documenting subclasses of `Field` and `Target` classes, consider using the `help_text` convenience function
- text inside the angle brackets (e.g. `<value>`) will be ignored when rendered if not wrapped in backticks
- to create a numbered or bullet list, use 2 space indentation (or use the `bullet_list` convenience function)
-- to create a codeblock, use 4 space indentation (no need for triple backticks) and add one empty line between the code block and the text
+- to create a codeblock, never use indentation. Only ever use triple-backtick blocks.
- make sure to use backticks to highlight config sections, command-line arguments, target names, and inline code examples.
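
For example, a help string following these conventions might look like the sketch below (plain Python; the `my-formatter` tool and its options are invented for illustration, and the `FENCE` indirection exists only to avoid literal triple backticks inside this example):

```python
# Sketch of a help string following the formatting conventions above.
# The real `softwrap`/`help_text` helpers live in Pants itself.
FENCE = "`" * 3  # triple backticks, built indirectly for this example

HELP = f"""\
Format Python files with `my-formatter` (a hypothetical tool).

Common options:

  - `--line-length`: maximum line length.
  - `--check`: report diffs without writing files.

Example `pants.toml` section:

{FENCE}toml
[my-formatter]
args = ["--line-length=100"]
{FENCE}
"""

# Code blocks use triple-backtick fences, never indentation:
assert FENCE + "toml" in HELP
# Bullet lists use 2-space indentation:
assert "  - `--check`" in HELP
```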

-It may be difficult to confirm the accuracy of text formatting in plain Python, so you may want to generate the relevant Markdown files to be able to preview them to confirm your help strings are rendered as expected. They can be converted into Markdown files for visual inspection using a custom build script.
-
-You can run these commands to convert help strings to Markdown files for the `main` and your local feature branches to identify the changed files and then preview only the relevant files to confirm the rendering makes sense.
-
-```text
-$ git checkout main
-$ pants run build-support/bin/generate_docs.py -- --no-prompt --output main-docs
-$ git checkout <your-branch>
-$ pants run build-support/bin/generate_docs.py -- --no-prompt --output branch-docs
-$ diff -qr main-docs branch-docs
-```
+It may be difficult to confirm the accuracy of text formatting in plain Python, so you may want to run `pants help` on the relevant target/subsystem to see the resulting string.
32 changes: 3 additions & 29 deletions docs/docs/contributions/releases/release-process.mdx
@@ -97,33 +97,7 @@ $ git checkout -b 2.9.x
```
$ git push upstream 2.9.x
```

-## Step 2: Update this docs site
-
-Note that this step can currently only be performed by a subset of maintainers due to a paid maximum number of seats. If you do not have a readme.com account, contact someone in the `#maintainers-confidential` channel in Slack to help out.
-
-### `dev0` - set up the new version
-
-Go to the [documentation dashboard](https://dash.readme.com/). In the top left dropdown, where it says the current version, click "Manage versions". Click "Add new version" and use a "v" with the minor release number, e.g. "v2.9". Fork from the prior release. Mark this new version as public by clicking on "Is public?"
-
-### Sync the `docs/` content
-
-See the `docs/NOTES.md` for instructions setting up the necessary Node tooling your first time.
-You'll need to 1st login as outlined there via some variant of `npx rdme login --2fa --project pants ...`.
-On the relevant release branch, run `npx rdme docs docs/markdown --version v<pants major>.<pants minor>`; e.g: `npx rdme docs docs/markdown --version v2.8`.
-
-### Regenerate the reference docs
-
-Still on the relevant release branch, run `./pants run build-support/bin/generate_docs.py -- --sync --api-key <key>` with your key from [https://dash.readme.com/project/pants/v2.8/api-key](https://dash.readme.com/project/pants/v2.8/api-key).
-
-### `stable` releases - Update the default docsite
-
-The first stable release of a branch should update the "default" version of the docsite. For example: when releasing the stable `2.9.0`, the docsite would be changed to pointing from `v2.8` to pointing to `v2.9` by default.
-
-:::caution Don't have edit access?
-Ping someone in the `#maintainers-confidential` channel in Slack to be added. Alternatively, you can "Suggest edits" in the top right corner.
-:::
-
-## Step 3: Tag the release to trigger publishing
+## Step 2: Tag the release to trigger publishing

Once you have merged the `VERSION` bump — which will be on `main` for `dev` and `a0` releases, and on the release branch for release candidates — tag the release commit to trigger wheel building and publishing.

@@ -141,7 +141,7 @@ Then, run:

This will tag the release with your PGP key, and push the tag to origin, which will kick off a [`Release` job](https://github.com/pantsbuild/pants/actions/workflows/release.yaml) to build the wheels and publish them to PyPI.

-## Step 4: Test the release
+## Step 3: Test the release

Run this script as a basic smoke test:

@@ -151,7 +151,7 @@

You should also check [GitHub Releases](https://github.com/pantsbuild/pants/releases) to ensure everything looks good. Find the version you released, then click it and confirm that the "Assets" list includes PEXes for macOS and Linux.

-## Step 5: Run release testing on public repositories
+## Step 4: Run release testing on public repositories

Manually trigger a run of the [public repositories testing workflow](https://github.com/pantsbuild/pants/actions/workflows/public_repos.yaml), specifying the version just published as the "Pants version".

30 changes: 13 additions & 17 deletions docs/docs/docker/index.mdx
@@ -182,38 +182,32 @@ Secrets should not be checked into version control. Use absolute paths to refere
See the example for the [`secrets`](../../reference/targets/docker_image.mdx#secrets) field.
:::

-### External cache storage backends
+### Buildx Support

-BuildKit supports exporting build cache to an external location, making it possible to import in future builds. Cache backends can be configured using the [`cache_to`](../../reference/targets/docker_image.mdx#cache_to) and [`cache_from`](../../reference/targets/docker_image.mdx#cache_from) fields.
+Buildx (using BuildKit) supports exporting build cache to an external location, making it possible to import in future builds. Cache backends can be configured using the [`cache_to`](../../reference/targets/docker_image.mdx#cache_to) and [`cache_from`](../../reference/targets/docker_image.mdx#cache_from) fields.

-Create a builder using a [build driver](https://docs.docker.com/build/drivers/) that is compatible with the cache backend:
-
-```
-❯ docker buildx create --name container --driver=docker-container container
-```
-
-Use the builder:
-
-```
-❯ export BUILDX_BUILDER=container
-```
+To use BuildKit with Pants, enable the [Containerd Image Store](https://docs.docker.com/desktop/containerd/), either via [Docker Desktop settings](https://docs.docker.com/storage/containerd/) or by [setting daemon config](https://docs.docker.com/storage/containerd/#enable-containerd-image-store-on-docker-engine):
+```json
+{
+  "features": {
+    "containerd-snapshotter": true
+  }
+}
+```

-Optionally, validate a build with the Docker CLI directly:
+Optionally, run a build with the Docker CLI directly to validate buildx support on your system:

```
❯ docker buildx build -t pants-cache-test:latest \
--cache-to type=local,dest=/tmp/docker/pants-test-cache \
--cache-from type=local,src=/tmp/docker/pants-test-cache .
```

-Configure Pants to use buildx and pass the BUILDX_BUILDER environment variable:
+Configure Pants to use buildx:

```toml tab={"label":"pants.toml"}
[docker]
use_buildx = true
-env_vars = [
-  "BUILDX_BUILDER"
-]
```

@@ -230,6 +224,8 @@
```python tab={"label":"example/BUILD"}
docker_image(
    ...
)
```

+For working examples, including multi-platform builds with GitHub Actions, refer to the [example-docker](https://github.com/pantsbuild/example-docker) repository.

### Build Docker image example

This example copies both a `file` and `pex_binary`. The file is specified as an explicit dependency in the `BUILD` file, whereas the `pex_binary` dependency is inferred from the path in the `Dockerfile`.
Expand Down
11 changes: 6 additions & 5 deletions docs/docs/docker/tagging-docker-images.mdx
@@ -130,9 +130,10 @@ docker_image(

The default placeholders are:

-- `{directory}`: The directory the docker_image's BUILD file is in.
-- `{parent_directory}`: The parent directory of `{directory}`.
-- `{name}`: The name of the docker_image target.
+- `{name}`: The name of the `docker_image` target.
+- `{directory}`: The folder name of the docker_image's BUILD file.
+- `{parent_directory}`: The parent folder name of `{directory}`.
+- `{full_directory}`: The full path to the BUILD file.
- `{build_args.ARG_NAME}`: Each defined Docker build arg is available for interpolation under the `build_args.` prefix.
- `{default_repository}`: The default repository from configuration.
- `{target_repository}`: The repository on the `docker_image` if provided, otherwise the default repository.
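
As a rough illustration of how the directory-derived placeholders resolve, imagine a `docker_image(name="img")` declared in `src/app/BUILD`. The values below are hypothetical, and this sketch merely mimics the substitution with Python's `str.format`; Pants computes the real values at build time:

```python
# Hypothetical placeholder values for docker_image(name="img") in src/app/BUILD.
placeholders = {
    "name": "img",
    "directory": "app",           # folder name of the BUILD file
    "parent_directory": "src",    # parent folder name of {directory}
    "full_directory": "src/app",  # full path to the BUILD file
}

# Interpolating a repository and a tag template:
repository = "{parent_directory}/{directory}".format(**placeholders)
image_tag = "{name}-latest".format(**placeholders)

print(repository)  # → src/app
print(image_tag)   # → img-latest
```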
@@ -148,7 +148,7 @@ See [String interpolation using placeholder values](./tagging-docker-images.mdx#

When Docker builds images, it can tag them with a set of tags. Pants will apply the tags listed in
the `image_tags` field of `docker_image`, and any additional tags if defined from the registry
-configuration (see [Configuring registries](./tagging-docker-images.mdx#configuring-registries).
+configuration (see [Configuring registries](./tagging-docker-images.mdx#configuring-registries)).

(Note that the field is named `image_tags` and not just `tags`, because Pants has [its own tags
concept](doc:reference-target#tags), which is unrelated.)
@@ -307,7 +308,7 @@ See [Setting a repository name](./tagging-docker-images.mdx#setting-a-repository
The calculated hash value _may_ change between stable versions of Pants for the otherwise same input sources.
:::

-## Retrieving the tags of an packaged image
+## Retrieving the tags of a packaged image

When a docker image is packaged, metadata about the resulting image is output to a JSON file artefact. This includes the image ID, as well as the full names that the image was tagged with. This file is written in the same manner as outputs of other packageable targets and available for later steps (for example, a test with `runtime_package_dependencies` including the docker image target) or in `dist/` after `pants package`. By default, this is available at `path.to.target/target_name.docker-info.json`.
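
As a sketch of how a later step might consume that artefact, the snippet below parses an inline sample. The field names in the sample are illustrative assumptions, not the exact schema Pants emits; inspect a real `*.docker-info.json` under `dist/` to confirm the keys:

```python
import json

# Hypothetical example of the metadata file's contents -- field names are
# assumptions for illustration only.
sample = """
{
  "image_id": "sha256:0123abc",
  "tags": ["registry.example.com/app:latest", "registry.example.com/app:1.2.3"]
}
"""

info = json.loads(sample)
print(info["image_id"])  # → sha256:0123abc
print(info["tags"][0])   # → registry.example.com/app:latest
```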
