Structured Configs with URL for torch.optim and torch.utils.data #18

Open · wants to merge 9 commits into base: main
6 changes: 5 additions & 1 deletion config/torch/optim/adadelta.py
@@ -1,4 +1,5 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
# Copyright (c) 2020, Facebook, Inc. and its affiliates. All Rights Reserved
# SPDX-License-Identifier: MIT
Contributor:

Please do not change the headers until we resolve the discussion about it.

#
# Generated by configen, do not edit.
# See https://github.com/facebookresearch/hydra/tree/master/tools/configen
@@ -13,6 +14,9 @@

@dataclass
class AdadeltaConf:
"""For more details on parameteres please refer to the original documentation:
https://pytorch.org/docs/stable/optim.html#torch.optim.Adadelta
"""
_target_: str = "torch.optim.adadelta.Adadelta"
params: Any = MISSING
lr: Any = 1.0
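
For readers unfamiliar with how these generated dataclasses are consumed: each one carries a `_target_` pointing at the real PyTorch class, so Hydra can build the object directly. A minimal sketch, assuming Hydra 1.0's `instantiate` API; the toy model is illustrative only:

```python
# Minimal usage sketch for the generated configs (illustrative only).
import torch
from hydra.utils import instantiate

from config.torch.optim.adadelta import AdadeltaConf

model = torch.nn.Linear(10, 2)

# `params` is MISSING in the dataclass, so it must be supplied at
# instantiation time; the other fields keep their generated defaults.
optimizer = instantiate(AdadeltaConf(), params=model.parameters())
print(type(optimizer))  # <class 'torch.optim.adadelta.Adadelta'>
```
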
6 changes: 5 additions & 1 deletion config/torch/optim/adagrad.py
@@ -1,4 +1,5 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
# Copyright (c) 2020, Facebook, Inc. and its affiliates. All Rights Reserved
# SPDX-License-Identifier: MIT
#
# Generated by configen, do not edit.
# See https://github.com/facebookresearch/hydra/tree/master/tools/configen
@@ -13,6 +14,9 @@

@dataclass
class AdagradConf:
"""For more details on parameteres please refer to the original documentation:
https://pytorch.org/docs/stable/optim.html#torch.optim.Adagrad
"""
Contributor, commenting on lines +17 to +19:

Let's hold on with this change until we finalize the change to configen. This is not urgent and we can do it with configen once it's ready.

_target_: str = "torch.optim.adagrad.Adagrad"
params: Any = MISSING
lr: Any = 0.01
6 changes: 5 additions & 1 deletion config/torch/optim/adam.py
@@ -1,4 +1,5 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
# Copyright (c) 2020, Facebook, Inc. and its affiliates. All Rights Reserved
# SPDX-License-Identifier: MIT
#
# Generated by configen, do not edit.
# See https://github.com/facebookresearch/hydra/tree/master/tools/configen
@@ -13,6 +14,9 @@

@dataclass
class AdamConf:
"""For more details on parameteres please refer to the original documentation:
https://pytorch.org/docs/stable/optim.html#torch.optim.Adam
"""
_target_: str = "torch.optim.adam.Adam"
params: Any = MISSING
lr: Any = 0.001
6 changes: 5 additions & 1 deletion config/torch/optim/adamax.py
@@ -1,4 +1,5 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
# Copyright (c) 2020, Facebook, Inc. and its affiliates. All Rights Reserved
# SPDX-License-Identifier: MIT
#
# Generated by configen, do not edit.
# See https://github.com/facebookresearch/hydra/tree/master/tools/configen
@@ -13,6 +14,9 @@

@dataclass
class AdamaxConf:
"""For more details on parameteres please refer to the original documentation:
https://pytorch.org/docs/stable/optim.html#torch.optim.Adamax
"""
_target_: str = "torch.optim.adamax.Adamax"
params: Any = MISSING
lr: Any = 0.002
6 changes: 5 additions & 1 deletion config/torch/optim/adamw.py
@@ -1,4 +1,5 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
# Copyright (c) 2020, Facebook, Inc. and its affiliates. All Rights Reserved
# SPDX-License-Identifier: MIT
#
# Generated by configen, do not edit.
# See https://github.com/facebookresearch/hydra/tree/master/tools/configen
@@ -13,6 +14,9 @@

@dataclass
class AdamWConf:
"""For more details on parameteres please refer to the original documentation:
https://pytorch.org/docs/stable/optim.html#torch.optim.AdamW
"""
_target_: str = "torch.optim.adamw.AdamW"
params: Any = MISSING
lr: Any = 0.001
6 changes: 5 additions & 1 deletion config/torch/optim/asgd.py
@@ -1,4 +1,5 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
# Copyright (c) 2020, Facebook, Inc. and its affiliates. All Rights Reserved
# SPDX-License-Identifier: MIT
#
# Generated by configen, do not edit.
# See https://github.com/facebookresearch/hydra/tree/master/tools/configen
@@ -13,6 +14,9 @@

@dataclass
class ASGDConf:
"""For more details on parameteres please refer to the original documentation:
https://pytorch.org/docs/stable/optim.html#torch.optim.ASGD
"""
_target_: str = "torch.optim.asgd.ASGD"
params: Any = MISSING
lr: Any = 0.01
6 changes: 5 additions & 1 deletion config/torch/optim/lbfgs.py
@@ -1,4 +1,5 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
# Copyright (c) 2020, Facebook, Inc. and its affiliates. All Rights Reserved
# SPDX-License-Identifier: MIT
#
# Generated by configen, do not edit.
# See https://github.com/facebookresearch/hydra/tree/master/tools/configen
@@ -13,6 +14,9 @@

@dataclass
class LBFGSConf:
"""For more details on parameteres please refer to the original documentation:
https://pytorch.org/docs/stable/optim.html#torch.optim.LBFGS
"""
_target_: str = "torch.optim.lbfgs.LBFGS"
params: Any = MISSING
lr: Any = 1
13 changes: 7 additions & 6 deletions config/torch/optim/lr_scheduler.py
@@ -1,4 +1,5 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
# Copyright (c) 2020, Facebook, Inc. and its affiliates. All Rights Reserved
# SPDX-License-Identifier: MIT
#
# Generated by configen, do not edit.
# See https://github.com/facebookresearch/hydra/tree/master/tools/configen
@@ -66,12 +67,12 @@ class CosineAnnealingLRConf:
class ReduceLROnPlateauConf:
_target_: str = "torch.optim.lr_scheduler.ReduceLROnPlateau"
optimizer: Any = MISSING
mode: str = 'min'
Author:

Hey @romesco, pulled the latest hydra changes and regenerated the configs (' -> ").

mode: Any = "min"
tkornuta-nvidia marked this conversation as resolved.
factor: Any = 0.1
patience: Any = 10
verbose: Any = False
threshold: Any = 0.0001
threshold_mode: str = 'rel'
threshold_mode: Any = "rel"
cooldown: Any = 0
min_lr: Any = 0
eps: Any = 1e-08
@@ -85,10 +86,10 @@ class CyclicLRConf:
max_lr: Any = MISSING
step_size_up: Any = 2000
step_size_down: Any = None
mode: str = 'triangular'
mode: Any = "triangular"
gamma: Any = 1.0
scale_fn: Any = None
scale_mode: str = 'cycle'
scale_mode: Any = "cycle"
cycle_momentum: Any = True
base_momentum: Any = 0.8
max_momentum: Any = 0.9
@@ -114,7 +115,7 @@ class OneCycleLRConf:
epochs: Any = None
steps_per_epoch: Any = None
pct_start: Any = 0.3
anneal_strategy: str = 'cos'
anneal_strategy: Any = "cos"
cycle_momentum: Any = True
base_momentum: Any = 0.85
max_momentum: Any = 0.95
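
One detail worth flagging in these hunks: the regenerated string defaults must stay quoted. A bare `min` would evaluate to the Python builtin, and a bare `rel`, `triangular`, `cycle`, or `cos` would raise a NameError when the module is imported, which is why the values are written as `"min"`, `"cos"`, and so on. A sketch of chaining these configs, under the same assumptions as the earlier example:

```python
# Sketch: wiring a scheduler config to an already-instantiated optimizer.
import torch
from hydra.utils import instantiate

from config.torch.optim.adadelta import AdadeltaConf
from config.torch.optim.lr_scheduler import ReduceLROnPlateauConf

model = torch.nn.Linear(10, 2)
optimizer = instantiate(AdadeltaConf(), params=model.parameters())

# `optimizer` is MISSING in ReduceLROnPlateauConf, so pass the live
# object at instantiation time, just like `params` above.
scheduler = instantiate(ReduceLROnPlateauConf(), optimizer=optimizer)
scheduler.step(0.5)  # mode="min": reduce lr when the metric plateaus
```
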
6 changes: 5 additions & 1 deletion config/torch/optim/rmsprop.py
@@ -1,4 +1,5 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
# Copyright (c) 2020, Facebook, Inc. and its affiliates. All Rights Reserved
# SPDX-License-Identifier: MIT
#
# Generated by configen, do not edit.
# See https://github.com/facebookresearch/hydra/tree/master/tools/configen
@@ -13,6 +14,9 @@

@dataclass
class RMSpropConf:
"""For more details on parameteres please refer to the original documentation:
https://pytorch.org/docs/stable/optim.html#torch.optim.RMSprop
"""
_target_: str = "torch.optim.rmsprop.RMSprop"
params: Any = MISSING
lr: Any = 0.01
6 changes: 5 additions & 1 deletion config/torch/optim/rprop.py
@@ -1,4 +1,5 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
# Copyright (c) 2020, Facebook, Inc. and its affiliates. All Rights Reserved
# SPDX-License-Identifier: MIT
#
# Generated by configen, do not edit.
# See https://github.com/facebookresearch/hydra/tree/master/tools/configen
@@ -13,6 +14,9 @@

@dataclass
class RpropConf:
"""For more details on parameteres please refer to the original documentation:
https://pytorch.org/docs/stable/optim.html#torch.optim.Rprop
"""
_target_: str = "torch.optim.rprop.Rprop"
params: Any = MISSING
lr: Any = 0.01
6 changes: 5 additions & 1 deletion config/torch/optim/sgd.py
@@ -1,4 +1,5 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
# Copyright (c) 2020, Facebook, Inc. and its affiliates. All Rights Reserved
# SPDX-License-Identifier: MIT
#
# Generated by configen, do not edit.
# See https://github.com/facebookresearch/hydra/tree/master/tools/configen
@@ -13,6 +14,9 @@

@dataclass
class SGDConf:
"""For more details on parameteres please refer to the original documentation:
https://pytorch.org/docs/stable/optim.html#torch.optim.SGD
"""
_target_: str = "torch.optim.sgd.SGD"
params: Any = MISSING
lr: Any = MISSING # _RequiredParameter
6 changes: 5 additions & 1 deletion config/torch/optim/sparse_adam.py
@@ -1,4 +1,5 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
# Copyright (c) 2020, Facebook, Inc. and its affiliates. All Rights Reserved
# SPDX-License-Identifier: MIT
#
# Generated by configen, do not edit.
# See https://github.com/facebookresearch/hydra/tree/master/tools/configen
@@ -13,6 +14,9 @@

@dataclass
class SparseAdamConf:
"""For more details on parameteres please refer to the original documentation:
https://pytorch.org/docs/stable/optim.html#torch.optim.SparseAdam
"""
_target_: str = "torch.optim.sparse_adam.SparseAdam"
params: Any = MISSING
lr: Any = 0.001
33 changes: 33 additions & 0 deletions config/torch/utils/data.py
@@ -0,0 +1,33 @@
# Copyright (c) 2020, Facebook, Inc. and its affiliates. All Rights Reserved
# SPDX-License-Identifier: MIT
#
# Generated by configen, do not edit.
# See https://github.com/facebookresearch/hydra/tree/master/tools/configen
# fmt: off
# isort:skip_file
# flake8: noqa

from dataclasses import dataclass, field
from omegaconf import MISSING
from typing import Any


@dataclass
class DataLoaderConf:
"""For more details on parameteres please refer to the original documentation:
https://pytorch.org/docs/stable/data.html#torch.utils.data.DataLoader
"""
_target_: str = "torch.utils.data.DataLoader"
dataset: Any = MISSING
batch_size: Any = 1
shuffle: Any = False
sampler: Any = None
batch_sampler: Any = None
num_workers: Any = 0
collate_fn: Any = None
pin_memory: Any = False
drop_last: Any = False
timeout: Any = 0
worker_init_fn: Any = None
multiprocessing_context: Any = None
generator: Any = None
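
A quick check that the new `DataLoaderConf` round-trips, under the same assumptions as above; the `TensorDataset` is just a stand-in for a real dataset:

```python
# Sketch: building a DataLoader from the generated config.
import torch
from torch.utils.data import TensorDataset
from hydra.utils import instantiate

from config.torch.utils.data import DataLoaderConf

dataset = TensorDataset(torch.randn(8, 3), torch.randint(0, 2, (8,)))

# `dataset` is the only MISSING field; the generated defaults
# (batch_size=1, shuffle=False, ...) can be overridden as kwargs.
loader = instantiate(DataLoaderConf(), dataset=dataset, batch_size=4)
for xb, yb in loader:
    print(xb.shape, yb.shape)  # torch.Size([4, 3]) torch.Size([4])
```
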
18 changes: 17 additions & 1 deletion conf/configen.yaml → sources/torch.optim/configen.yaml
@@ -1,9 +1,13 @@
# Copyright (c) 2020, Facebook, Inc. and its affiliates. All Rights Reserved
Contributor:

Regarding the change from the conf directory to the sources directory: I think this is a good idea. The term conf/config/configen is very overloaded in this setting (something I commented on a while back too), and it made knowing where to look for things overly difficult, imo, until I got used to the lay of the land. I might even argue this should be propagated to the default directory for configen.

# SPDX-License-Identifier: MIT

configen:
# output directory
output_dir: ${hydra:runtime.cwd}

header: |
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
# Copyright (c) 2020, Facebook, Inc. and its affiliates. All Rights Reserved
# SPDX-License-Identifier: MIT
#
# Generated by configen, do not edit.
# See https://github.com/facebookresearch/hydra/tree/master/tools/configen
@@ -16,61 +20,73 @@ configen:
# list of modules to generate configs for
modules:
- name: torch.optim.adadelta
url: https://pytorch.org/docs/stable/optim.html#torch.optim.Adadelta
# for each module, a list of classes
classes:
- Adadelta

- name: torch.optim.adagrad
url: https://pytorch.org/docs/stable/optim.html#torch.optim.Adagrad
# for each module, a list of classes
classes:
- Adagrad

- name: torch.optim.adam
url: https://pytorch.org/docs/stable/optim.html#torch.optim.Adam
# for each module, a list of classes
classes:
- Adam

- name: torch.optim.adamw
url: https://pytorch.org/docs/stable/optim.html#torch.optim.AdamW
# for each module, a list of classes
classes:
- AdamW

- name: torch.optim.sparse_adam
url: https://pytorch.org/docs/stable/optim.html#torch.optim.SparseAdam
# for each module, a list of classes
classes:
- SparseAdam

- name: torch.optim.adamax
url: https://pytorch.org/docs/stable/optim.html#torch.optim.Adamax
# for each module, a list of classes
classes:
- Adamax

- name: torch.optim.asgd
url: https://pytorch.org/docs/stable/optim.html#torch.optim.ASGD
# for each module, a list of classes
classes:
- ASGD

- name: torch.optim.sgd
url: https://pytorch.org/docs/stable/optim.html#torch.optim.SGD
# for each module, a list of classes
classes:
- SGD

- name: torch.optim.rprop
url: https://pytorch.org/docs/stable/optim.html#torch.optim.Rprop
# for each module, a list of classes
classes:
- Rprop

- name: torch.optim.rmsprop
url: https://pytorch.org/docs/stable/optim.html#torch.optim.RMSprop
# for each module, a list of classes
classes:
- RMSprop

- name: torch.optim.lbfgs
url: https://pytorch.org/docs/stable/optim.html#torch.optim.LBFGS
# for each module, a list of classes
classes:
- LBFGS

- name: torch.optim.lr_scheduler
url:
classes:
# Schedulers
- LambdaLR
26 changes: 26 additions & 0 deletions sources/torch.utils.data/configen.yaml
@@ -0,0 +1,26 @@
# Copyright (c) 2020, Facebook, Inc. and its affiliates. All Rights Reserved
# SPDX-License-Identifier: MIT

configen:
# output directory
output_dir: ${hydra:runtime.cwd}

header: |
# Copyright (c) 2020, Facebook, Inc. and its affiliates. All Rights Reserved
# SPDX-License-Identifier: MIT
#
# Generated by configen, do not edit.
# See https://github.com/facebookresearch/hydra/tree/master/tools/configen
# fmt: off
# isort:skip_file
# flake8: noqa

module_path_pattern: 'config/{{module_path}}.py'

# list of modules to generate configs for
modules:
- name: torch.utils.data
url: https://pytorch.org/docs/stable/data.html#torch.utils.data.DataLoader
# for each module, a list of classes
classes:
- DataLoader
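
Finally, a sketch of how the generated configs compose at runtime once registered with the ConfigStore; the group names (`optimizer`, `dataloader`) are hypothetical, and this assumes Hydra 1.0's experimental compose API:

```python
# Sketch: registering generated configs and composing overrides.
from hydra.experimental import compose, initialize
from hydra.core.config_store import ConfigStore

from config.torch.optim.adam import AdamConf
from config.torch.utils.data import DataLoaderConf

cs = ConfigStore.instance()
cs.store(group="optimizer", name="adam", node=AdamConf)
cs.store(group="dataloader", name="default", node=DataLoaderConf)

with initialize(config_path=None):
    cfg = compose(
        overrides=["+optimizer=adam", "+dataloader=default", "optimizer.lr=0.0005"]
    )
    print(cfg.optimizer["_target_"])  # torch.optim.adam.Adam
    print(cfg.optimizer.lr)           # 0.0005
```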