Fixing bug in Megatron BERT when loss mask is all zeros (#5424)
* Fixing bug when loss mask is fully zero
  Signed-off-by: Shanmugam Ramasamy <[email protected]>
* [pre-commit.ci] auto fixes from pre-commit.com hooks
  For more information, see https://pre-commit.ci
* Update megatron_bert_model.py
  Signed-off-by: Shanmugam Ramasamy <[email protected]>
* Update dataset_utils.py
  Signed-off-by: Shanmugam Ramasamy <[email protected]>
* [pre-commit.ci] auto fixes from pre-commit.com hooks
  For more information, see https://pre-commit.ci
* Update dataset_utils.py
  Signed-off-by: Shanmugam Ramasamy <[email protected]>
* Update dataset_utils.py
  Signed-off-by: Shanmugam Ramasamy <[email protected]>

Signed-off-by: Shanmugam Ramasamy <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Sandeep Subramanian <[email protected]>
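The commit title points at the usual failure mode for a masked language-model loss: the loss is normally averaged as sum(loss * mask) / sum(mask), so a batch whose loss mask is entirely zero divides by zero and yields NaN. The sketch below is only an illustration of that kind of guard; the function and variable names are hypothetical and not taken from the NeMo sources, which should be consulted for the actual change in megatron_bert_model.py and dataset_utils.py.

import torch

def masked_lm_loss(per_token_loss: torch.Tensor, loss_mask: torch.Tensor) -> torch.Tensor:
    """Average the per-token loss over masked positions.

    Guard sketch: a plain sum(loss * mask) / sum(mask) produces NaN when
    the mask has no non-zero entries, so the denominator case is handled
    explicitly instead.
    """
    loss_mask = loss_mask.float().view(-1)
    per_token_loss = per_token_loss.view(-1)
    denom = loss_mask.sum()
    if denom == 0:
        # No tokens contribute to the loss for this batch; return a zero
        # that still participates in the autograd graph.
        return per_token_loss.sum() * 0.0
    return torch.sum(per_token_loss * loss_mask) / denom

if __name__ == "__main__":
    loss = torch.rand(4, 8)
    empty_mask = torch.zeros(4, 8)
    print(masked_lm_loss(loss, empty_mask))  # tensor(0.) rather than NaN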