Commit

Merge remote-tracking branch 'nvidia/master' into u_switch_apex_ddp_to_torch

Signed-off-by: Jason <[email protected]>
blisc committed Feb 13, 2020
2 parents 4eb90b5 + f072029 commit 9fc14d7
Showing 7 changed files with 33 additions and 44 deletions.
6 changes: 6 additions & 0 deletions docs/sources/source/nlp/bert_pretraining.rst
@@ -6,6 +6,12 @@ Make sure you have ``nemo`` and ``nemo_nlp`` installed before starting this tuto

The code used in this tutorial can be found at ``examples/nlp/language_modeling/bert_pretraining.py``.

.. tip::
Pretrained BERT models can be found at
`https://ngc.nvidia.com/catalog/models/nvidia:bertlargeuncasedfornemo <https://ngc.nvidia.com/catalog/models/nvidia:bertlargeuncasedfornemo>`__
`https://ngc.nvidia.com/catalog/models/nvidia:bertbaseuncasedfornemo <https://ngc.nvidia.com/catalog/models/nvidia:bertbaseuncasedfornemo>`__
`https://ngc.nvidia.com/catalog/models/nvidia:bertbasecasedfornemo <https://ngc.nvidia.com/catalog/models/nvidia:bertbasecasedfornemo>`__

Introduction
------------

4 changes: 4 additions & 0 deletions docs/sources/source/nlp/joint_intent_slot_filling.rst
@@ -9,6 +9,10 @@ There are four pre-trained BERT models that we can select from using the argumen
using the script for loading pre-trained models from `pytorch_transformers`. See the list of available pre-trained models
`here <https://huggingface.co/pytorch-transformers/pretrained_models.html>`__.

.. tip::

For pretraining BERT in NeMo and for pretrained model checkpoints, see `BERT pretraining <https://nvidia.github.io/NeMo/nlp/bert_pretraining.html>`__.


Preliminaries
-------------
6 changes: 6 additions & 0 deletions docs/sources/source/nlp/ner.rst
@@ -4,6 +4,12 @@ Tutorial
Make sure you have ``nemo`` and ``nemo_nlp`` installed before starting this
tutorial. See the :ref:`installation` section for more details.

.. tip::

For pretraining BERT in NeMo and for pretrained model checkpoints, see `BERT pretraining <https://nvidia.github.io/NeMo/nlp/bert_pretraining.html>`__.



Introduction
------------

2 changes: 2 additions & 0 deletions docs/sources/source/nlp/punctuation.rst
@@ -7,6 +7,8 @@ An ASR system typically generates text with no punctuation and capitalization of
.. tip::

We recommend trying this example in the Jupyter notebook examples/nlp/token_classification/PunctuationWithBERT.ipynb.
For pretraining BERT in NeMo and for pretrained model checkpoints, see `BERT pretraining <https://nvidia.github.io/NeMo/nlp/bert_pretraining.html>`__.


Task Description
----------------
6 changes: 6 additions & 0 deletions docs/sources/source/nlp/question_answering.rst
@@ -13,6 +13,12 @@ The pretrained back-bone models can be specified by `--model_type` and the speci
See the list of available pre-trained models
`here <https://huggingface.co/transformers/pretrained_models.html>`__.

.. tip::

For pretraining BERT in NeMo and for pretrained model checkpoints, see `BERT pretraining <https://nvidia.github.io/NeMo/nlp/bert_pretraining.html>`__.



Preliminaries
-------------

9 changes: 9 additions & 0 deletions examples/nlp/language_modeling/bert_pretraining.py
@@ -74,6 +74,15 @@
350000 iterations on a DGX1 with 8 V100 32GB GPUs with AMP O1 optimization
should finish under 5 days and yield an MRPC score of ACC/F1 85.05/89.35.
More information about BERT pretraining can be found at
https://nvidia.github.io/NeMo/nlp/bert_pretraining.html
Pretrained BERT models can be found at
https://ngc.nvidia.com/catalog/models/nvidia:bertlargeuncasedfornemo
https://ngc.nvidia.com/catalog/models/nvidia:bertbaseuncasedfornemo
https://ngc.nvidia.com/catalog/models/nvidia:bertbasecasedfornemo
"""
import argparse
import math
44 changes: 0 additions & 44 deletions tests/nlp/test_squad.py

This file was deleted.
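
Several of the docs touched by this commit point readers to the Hugging Face pretrained model lists when choosing a BERT model. As an illustration only, not part of the diff itself, here is a minimal sketch of loading one of those models with the ``transformers`` library; the model name ``bert-base-uncased`` is just one example entry from the linked lists, and the older ``pytorch_transformers`` package named in the intent/slot tutorial exposes the same ``from_pretrained`` interface under a different import name::

    # Minimal sketch, assuming the `transformers` package is installed.
    # "bert-base-uncased" is one example entry from the pretrained model
    # lists linked in the tutorials above.
    import torch
    from transformers import BertModel, BertTokenizer

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertModel.from_pretrained("bert-base-uncased")

    # Tokenize a sample sentence and run it through the encoder.
    input_ids = torch.tensor(
        [tokenizer.encode("BERT pretraining in NeMo.", add_special_tokens=True)]
    )
    with torch.no_grad():
        outputs = model(input_ids)

    # Token-level hidden states: (1, sequence_length, 768) for a base model.
    print(outputs[0].shape)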
