Installation documentation updates. #525

Merged · 3 commits · Nov 12, 2020
Changes from all commits
README.md (5 changes: 4 additions & 1 deletion)
@@ -72,11 +72,14 @@
just-in-time (JIT) using [torch's JIT C++ extension loader that relies on
ninja](https://pytorch.org/docs/stable/cpp_extension.html) to build and
dynamically link them at runtime.

+**Note:** [PyTorch](https://pytorch.org/) must be installed _before_ installing
+DeepSpeed.

```bash
pip install deepspeed
```

-After installation you can validate your install and see which extensions/ops
+After installation, you can validate your install and see which extensions/ops
your machine is compatible with via the DeepSpeed environment report.

```bash
ds_report
```
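As a quick sanity check tied to the note added above (a sketch, not part of this PR): confirm PyTorch is importable before installing, then validate the result with the report the README describes.

```bash
# Check that PyTorch is importable before installing (the note above requires
# it); this command fails if torch is missing.
python -c "import torch; print(torch.__version__)"
pip install deepspeed
# Then validate the install with the environment report described above.
ds_report
```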
docs/_tutorials/advanced-install.md (70 changes: 42 additions & 28 deletions)
@@ -11,11 +11,15 @@
just-in-time (JIT) using [torch's JIT C++ extension loader that relies on
ninja](https://pytorch.org/docs/stable/cpp_extension.html) to build and
dynamically link them at runtime.

+**Note:** [PyTorch](https://pytorch.org/) must be installed _before_ installing
+DeepSpeed.
+{: .notice--info}

```bash
pip install deepspeed
```

-After installation you can validate your install and see which ops your machine
+After installation, you can validate your install and see which ops your machine
is compatible with via the DeepSpeed environment report with `ds_report` or
`python -m deepspeed.env_report`. We've found this report useful when debugging
DeepSpeed install or compatibility issues.
@@ -24,23 +28,6 @@
```bash
ds_report
```
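As an aside (not from the diff): the module form quoted in the paragraph above produces the same report, which can help if the `ds_report` console script is not on your `PATH` (an assumption about why you'd prefer it).

```bash
# Module form of the environment report, as quoted in the paragraph above;
# equivalent to running ds_report.
python -m deepspeed.env_report
```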

-## Install DeepSpeed from source
-
-After cloning the DeepSpeed repo from github you can install DeepSpeed in
-JIT mode via pip (see below). This install should complete
-quickly since it is not compiling any C++/CUDA source files.
-
-```bash
-pip install .
-```
-
-For installs spanning multiple nodes we find it useful to install DeepSpeed
-using the
-[install.sh](https://github.com/microsoft/DeepSpeed/blob/master/install.sh)
-script in the repo. This will build a python wheel locally and copy it to all
-the nodes listed in your hostfile (either given via --hostfile, or defaults to
-/job/hostfile).

## Pre-install DeepSpeed Ops

Sometimes we have found it useful to pre-install either some or all DeepSpeed
@@ -53,23 +40,50 @@
want to attempt to install all of our ops by setting the `DS_BUILD_OPS`
environment variable to 1, for example:

```bash
-DS_BUILD_OPS=1 pip install .
+DS_BUILD_OPS=1 pip install deepspeed
```

+DeepSpeed will only install any ops that are compatible with your machine.
+For more details on which ops are compatible with your system please try our
+`ds_report` tool described above.

+If you want to install only a specific op (e.g., FusedLamb), you can toggle
+with `DS_BUILD` environment variables at installation time. For example, to
+install DeepSpeed with only the FusedLamb op use:

+```bash
+DS_BUILD_FUSED_LAMB=1 pip install deepspeed
+```

-We will only install any ops that are compatible with your machine, for more
-details on which ops are compatible with your system please try our `ds_report`
-tool described above.
+Available `DS_BUILD` options include:
+* `DS_BUILD_OPS` toggles all ops
+* `DS_BUILD_CPU_ADAM` builds the CPUAdam op
+* `DS_BUILD_FUSED_ADAM` builds the FusedAdam op (from [apex](https://github.com/NVIDIA/apex))
+* `DS_BUILD_FUSED_LAMB` builds the FusedLamb op
+* `DS_BUILD_SPARSE_ATTN` builds the sparse attention op
+* `DS_BUILD_TRANSFORMER` builds the transformer op
+* `DS_BUILD_STOCHASTIC_TRANSFORMER` builds the stochastic transformer op
+* `DS_BUILD_UTILS` builds various optimized utilities
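
Since each flag above is an independent environment variable, several ops can in principle be pre-built in one install by setting multiple variables together. A sketch assuming the flags compose this way (the combination itself is not shown in the diff):

```bash
# Hypothetical combination (not shown in the diff): pre-build two specific ops
# in one install by setting several DS_BUILD_* flags from the list above.
DS_BUILD_FUSED_ADAM=1 DS_BUILD_FUSED_LAMB=1 pip install deepspeed
```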


-If you want to install only a specific op (e.g., FusedLamb) you can view the op
-specific build environment variable (set as `BUILD_VAR`) in the corresponding
-op builder class in the
-[op\_builder](https://github.com/microsoft/DeepSpeed/tree/master/op_builder)
-directory. For example to install only the Fused Lamb op you would install via:
+## Install DeepSpeed from source

+After cloning the DeepSpeed repo from GitHub, you can install DeepSpeed in
+JIT mode via pip (see below). This install should complete
+quickly since it is not compiling any C++/CUDA source files.

```bash
-DS_BUILD_FUSED_LAMB=1 pip install .
+pip install .
```
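The clone step mentioned in the added paragraph has no accompanying command in the diff; a minimal sketch, with the repo URL as linked elsewhere on this page:

```bash
# Clone the repo, then install in JIT mode; no C++/CUDA sources are compiled
# here, so this should finish quickly (ops build lazily at runtime).
git clone https://github.com/microsoft/DeepSpeed.git
cd DeepSpeed
pip install .
```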

+For installs spanning multiple nodes we find it useful to install DeepSpeed
+using the
+[install.sh](https://github.com/microsoft/DeepSpeed/blob/master/install.sh)
+script in the repo. This will build a python wheel locally and copy it to all
+the nodes listed in your hostfile (either given via --hostfile, or defaults to
+/job/hostfile).
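
For the multi-node path, a hedged sketch of what the hostfile and the install.sh invocation might look like: only the `--hostfile` flag and the `/job/hostfile` default are stated above, and the `hostname slots=N` layout is DeepSpeed's usual hostfile convention rather than something this diff specifies.

```bash
# Hypothetical two-node hostfile using DeepSpeed's usual "hostname slots=N"
# layout; only the --hostfile flag and the /job/hostfile default are stated
# in the paragraph above.
cat > /job/hostfile <<'EOF'
worker-1 slots=8
worker-2 slots=8
EOF
./install.sh --hostfile /job/hostfile
```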


## Feature specific dependencies

Some DeepSpeed features require specific dependencies outside of the general
dependencies of DeepSpeed.