Commit 60fe41d

Remove AcceleratorConnector.use_dp

DuYicong515 committed Feb 26, 2022
1 parent 7e2f9fb

Showing 3 changed files with 5 additions and 6 deletions.

CHANGELOG.md (3 additions, 0 deletions)

@@ -598,6 +598,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Removed the `AcceleratorConnector.device_type` property ([#12081](https://github.com/PyTorchLightning/pytorch-lightning/pull/12081))
 
 
+- Removed `AcceleratorConnector.use_dp` property ([#12112](https://github.com/PyTorchLightning/pytorch-lightning/pull/12112))
+
+
 ### Fixed
 
 - Fixed an issue where `HorovodStrategy.teardown()` did not complete gracefully if an exception was thrown during callback setup [#11752](https://github.com/PyTorchLightning/pytorch-lightning/pull/11752)

pytorch_lightning/trainer/configuration_validator.py (2 additions, 1 deletion)

@@ -12,6 +12,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 import pytorch_lightning as pl
+from pytorch_lightning.strategies import DataParallelStrategy
 from pytorch_lightning.trainer.states import TrainerFn
 from pytorch_lightning.utilities.exceptions import MisconfigurationException
 from pytorch_lightning.utilities.model_helpers import is_overridden
@@ -208,7 +209,7 @@ def __verify_dp_batch_transfer_support(trainer: "pl.Trainer", model: "pl.LightningModule") -> None:
     batch_transfer_hooks = ("on_before_batch_transfer", "transfer_batch_to_device", "on_after_batch_transfer")
     datahook_selector = trainer._data_connector._datahook_selector
     for hook in batch_transfer_hooks:
-        if trainer._accelerator_connector.use_dp and (
+        if isinstance(trainer.strategy, DataParallelStrategy) and (
             is_overridden(hook, datahook_selector.model) or is_overridden(hook, datahook_selector.datamodule)
         ):
             raise MisconfigurationException(f"Overriding `{hook}` is not supported in DP mode.")
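
Note: the validator change above preserves the existing behavior: overriding any batch-transfer hook is rejected when the active strategy is DP. A minimal sketch of a module that would trigger the error (the class name and method body are illustrative):

import pytorch_lightning as pl

class MyModel(pl.LightningModule):
    # Overriding this hook (or `on_before_batch_transfer` /
    # `on_after_batch_transfer`) while training with the DP strategy makes
    # `__verify_dp_batch_transfer_support` raise a MisconfigurationException.
    def transfer_batch_to_device(self, batch, device, dataloader_idx):
        return batch.to(device)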

pytorch_lightning/trainer/connectors/accelerator_connector.py (0 additions, 5 deletions)

@@ -47,7 +47,6 @@
     TorchElasticEnvironment,
 )
 from pytorch_lightning.strategies import (
-    DataParallelStrategy,
     DDP2Strategy,
     DDPFullyShardedStrategy,
     DDPShardedStrategy,
@@ -859,7 +858,3 @@ def use_ipu(self) -> bool:
     @property
     def has_tpu(self) -> bool:
         return isinstance(self.accelerator, TPUAccelerator)
-
-    @property
-    def use_dp(self) -> bool:
-        return isinstance(self.strategy, DataParallelStrategy)
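
Note: downstream code that previously read the private `trainer._accelerator_connector.use_dp` can reproduce the removed property with the same `isinstance` check it contained, applied via `trainer.strategy` as the validator change above does. A minimal sketch (the `uses_dp` helper name is hypothetical):

import pytorch_lightning as pl
from pytorch_lightning.strategies import DataParallelStrategy

def uses_dp(trainer: "pl.Trainer") -> bool:
    # Same check the removed `AcceleratorConnector.use_dp` property performed:
    # isinstance(self.strategy, DataParallelStrategy)
    return isinstance(trainer.strategy, DataParallelStrategy)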
