Commit c3b9c60

Wrapping pytorch autolog with rank_zero_only decorator to avoid multiple runs during multi-GPU training

Signed-off-by: Shrinath Suresh <[email protected]>
shrinath-suresh committed May 9, 2022
1 parent c577c7f commit c3b9c60
Showing 1 changed file with 2 additions and 0 deletions.
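
For context: in distributed data-parallel (DDP) training, every GPU runs its own copy of the training script, so each process would call `autolog()` and open its own MLflow run. PyTorch Lightning's `rank_zero_only` decorator executes the wrapped function only on the process with global rank 0 and is a no-op everywhere else. A simplified, hypothetical stand-in to illustrate the behavior (the real decorator lives in `pytorch_lightning.utilities` and reads the rank set by the launcher):

```python
import functools

def rank_zero_only_sketch(fn):
    """Hypothetical simplified stand-in for pytorch_lightning's
    rank_zero_only: call fn only on global rank 0, no-op elsewhere."""
    @functools.wraps(fn)
    def wrapped(*args, **kwargs):
        # The rank is set once per process by the distributed launcher;
        # default to 0 so single-process runs behave normally.
        if getattr(rank_zero_only_sketch, "rank", 0) == 0:
            return fn(*args, **kwargs)
        return None
    return wrapped
```

Applied to `autolog`, this makes the patching (and the MLflow run it creates) happen once per job rather than once per GPU.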
2 changes: 2 additions & 0 deletions mlflow/pytorch/__init__.py
@@ -54,6 +54,7 @@
 )
 from mlflow.tracking._model_registry import DEFAULT_AWAIT_MAX_SLEEP_SECONDS
 from mlflow.utils.autologging_utils import autologging_integration, safe_patch
+from pytorch_lightning.utilities import rank_zero_only
 
 FLAVOR_NAME = "pytorch"
 
@@ -878,6 +879,7 @@ def load_state_dict(state_dict_uri, **kwargs):


 @autologging_integration(FLAVOR_NAME)
+@rank_zero_only
 def autolog(
     log_every_n_epoch=1,
     log_every_n_step=None,
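
A usage sketch of the intended effect, assuming a PyTorch Lightning DDP job (the Trainer arguments here are illustrative, not part of this commit):

```python
import mlflow.pytorch
import pytorch_lightning as pl

# Each DDP process executes this script, so autolog() is called once
# per GPU. With @rank_zero_only applied, only rank 0 patches training
# and starts an MLflow run; the other ranks see a no-op.
mlflow.pytorch.autolog(log_every_n_epoch=1)

trainer = pl.Trainer(accelerator="gpu", devices=4, strategy="ddp")
# trainer.fit(model)  # a single MLflow run is recorded, not four
```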
