
Commit: v0.2.2 release + changelogs

Summary: v0.2.2

Reviewed By: tierex

Differential Revision: D26502264

fbshipit-source-id: c47fde9239eae76fbe69b864243de70872e1f282
Kiuk Chung authored and facebook-github-bot committed Feb 18, 2021
1 parent dae09a6 commit 3aa8fb8
Showing 1 changed file with 19 additions and 0 deletions.
19 changes: 19 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,24 @@
# CHANGELOG

## 0.2.2 (Feb 18, 2021)

> **_NOTE:_** This is the last release for torchelastic! We are upstreaming TorchElastic into
> pytorch. See [pytorch issue-50621](https://github.com/pytorch/pytorch/issues/50621).

### PyTorch Elastic

* (new) `torchelastic.multiprocessing`, a drop-in replacement for `torch.multiprocessing` that supports (see the usage sketch below):
  * both function and binary launches
  * inter-process exception propagation
  * piping worker stdout/stderr to separate log files
  * tailing worker log files to the main console with a `{role}_{rank}:` prefix on each line
* Improvements to `torchelastic.events`
* `NCCL_ASYNC_ERROR_HANDLING` set by default in the torchelastic agent
* Implemented a shutdown barrier on the agent to reduce exit time variance
* Minor cosmetic improvements to rendezvous configuration
* Non-functional refactoring of `EtcdRendezvous`
* TSM API improvements
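
To make the `torchelastic.multiprocessing` entry concrete, here is a minimal usage sketch. It assumes the package exposes a `start_processes` function and a `Std` redirect enum shaped like the API that was later upstreamed into `torch.distributed.elastic.multiprocessing`; the specific names and parameters shown (`redirects`, `tee`, `log_dir`, the `trainer` function) are illustrative assumptions, not taken from this changelog.

```python
# Minimal usage sketch (not from this commit): assumes torchelastic.multiprocessing
# exposes a start_processes/Std API equivalent to the one later upstreamed as
# torch.distributed.elastic.multiprocessing.
import tempfile

from torchelastic.multiprocessing import Std, start_processes


def trainer(msg: str) -> None:
    # Function entrypoint; passing a string path (e.g. "/usr/bin/echo") instead
    # would launch a binary with the same args/envs plumbing.
    print(f"hello from {msg}")


if __name__ == "__main__":
    ctx = start_processes(
        name="trainer",
        entrypoint=trainer,
        args={0: ("worker0",), 1: ("worker1",)},                # per-local-rank args
        envs={0: {"LOCAL_RANK": "0"}, 1: {"LOCAL_RANK": "1"}},  # per-local-rank envs
        log_dir=tempfile.mkdtemp(prefix="trainer_logs_"),       # worker log files land here
        redirects=Std.ALL,  # pipe each worker's stdout/stderr to its own log file
        tee=Std.ERR,        # also tail stderr to the main console, prefixed per rank
    )
    try:
        result = ctx.wait()          # block until all workers exit
        print(result.return_values)  # per-rank return values; failures are reported too
    finally:
        ctx.close()                  # terminate any workers that are still running
```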

## 0.2.1 (October 05, 2020)

### PyTorch Elastic
