
[dev] Config Registration Functions in __init__.py #72

Open
romesco wants to merge 7 commits into main
Conversation

romesco (Contributor) commented Apr 22, 2021

Implements config registration for optim via __init__.py.

If this looks good, I'll broadcast the same strategy across the rest of the configs we currently have.

Partially Addressing #53
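
For context, here is a minimal sketch of how an application could consume the registered configs, following Hydra's structured-config pattern. The "optimizer" group and "adam" entry are illustrative assumptions, not necessarily the exact names this PR registers:

    from dataclasses import dataclass, field
    from typing import Any, List

    import hydra
    from hydra.core.config_store import ConfigStore
    from omegaconf import DictConfig, MISSING

    import hydra_configs.torch.optim

    # Hypothetical entry point added by this PR: populates Hydra's ConfigStore.
    hydra_configs.torch.optim.register()

    @dataclass
    class Config:
        # Defaults list selecting one of the registered optimizer configs.
        defaults: List[Any] = field(default_factory=lambda: [{"optimizer": "adam"}])
        optimizer: Any = MISSING

    ConfigStore.instance().store(name="config", node=Config)

    @hydra.main(config_name="config")
    def my_app(cfg: DictConfig) -> None:
        print(cfg.optimizer)

    if __name__ == "__main__":
        my_app()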

@facebook-github-bot added the CLA Signed label Apr 22, 2021
@romesco romesco changed the title [WIP] optim config registration function [WIP][dev] optim config registration function Apr 22, 2021
@romesco romesco changed the title [WIP][dev] optim config registration function [WIP][dev] Config Registration Functions in __init__ Apr 22, 2021
@romesco romesco changed the title [WIP][dev] Config Registration Functions in __init__ [WIP][dev] Config Registration Functions in __init__.py Apr 22, 2021
@romesco romesco requested a review from omry April 22, 2021 22:05
hydra-configs-torch/hydra_configs/torch/optim/__init__.py (outdated review thread, resolved)
examples/register_configs.py (outdated review thread, resolved)
@romesco romesco changed the title [WIP][dev] Config Registration Functions in __init__.py [dev] Config Registration Functions in __init__.py Apr 23, 2021
@romesco romesco marked this pull request as ready for review April 23, 2021 21:59
romesco (Contributor, Author) commented Apr 23, 2021

Before merging, I just want to point out that because of the way PyTorch structures its files (and __init__.py for certain modules), optimizer registration can be done like this:

hydra_configs.torch.optim.register()

however, registration for losses and data-related configs must be done like this:

hydra_configs.torch.nn.modules.register()
hydra_configs.torch.utils.data.register()

For now it's not possible to do:

hydra_configs.torch.nn.modules.loss.register()

because loss.py is an auto-generated file and we aren't putting logic in those.

Semantically, registering optimizers with optim.register() is nicer than the loss case, which has to be registered with modules.register().

I'm open to ideas for ameliorating this semantic ambiguity.
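
As a rough sketch of what such a register() in an __init__.py could look like, assuming Hydra's ConfigStore API (the "optimizer" group name, "adam" entry, and AdamConf import are illustrative, not the PR's exact contents):

    from hydra.core.config_store import ConfigStore

    from .adam import AdamConf  # hypothetical generated config class

    def register() -> None:
        cs = ConfigStore.instance()
        # Register each config under a group so it can be selected from the
        # command line or a defaults list, e.g. optimizer=adam.
        cs.store(group="optimizer", name="adam", node=AdamConf)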

@romesco romesco requested a review from omry April 23, 2021 22:16
Comment on lines +3 to +20
from .loss import BCELossConf
from .loss import BCEWithLogitsLossConf
from .loss import CosineEmbeddingLossConf
from .loss import CTCLossConf
from .loss import L1LossConf
from .loss import HingeEmbeddingLossConf
from .loss import KLDivLossConf
from .loss import MarginRankingLossConf
from .loss import MSELossConf
from .loss import MultiLabelMarginLossConf
from .loss import MultiLabelSoftMarginLossConf
from .loss import MultiMarginLossConf
from .loss import NLLLossConf
from .loss import NLLLoss2dConf
from .loss import PoissonNLLLossConf
from .loss import SmoothL1LossConf
from .loss import SoftMarginLossConf
from .loss import TripletMarginLossConf
omry (Contributor):

Feels like this would be better inside loss.py, with __init__.py just calling it from its register function.

romesco (Contributor, Author):

It makes sense there, but loss.py is a configen output. This is manual code.

omry (Contributor):

Maybe we should create a convention:

loss.py            # generated
loss_handcoded.py  # manual additions

At that point, the __init__.py file can import them both (and possibly also re-export from both, making them appear like one file).
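
As a sketch, that convention could look like this in __init__.py (the wildcard re-exports are illustrative; a curated import list would work just as well):

    # hydra_configs/torch/nn/modules/__init__.py
    from .loss import *            # configen-generated configs
    from .loss_handcoded import *  # manual additions, e.g. register()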

romesco (Contributor, Author):

Not a bad idea. I think we need a convention for handling manual configs anyway.

omry (Contributor):

Here is an even better idea: we can add support in configen for custom imports at the top and bottom of the generated configs, allowing automatic import of user-provided files if they exist.

romesco (Contributor, Author) commented Apr 26, 2021:

Could you elaborate on this?

Do you mean chaining the 'manual' file to the 'generated' one through an import that configen adds automatically if a field is passed in the configen conf yaml?

omry (Contributor) commented Apr 27, 2021:

Not exactly. I mean importing if a file with a specific name exists.

generated/foo.py:

try:
    from . import foo_header
except ImportError:
    pass

# generated code here

try:
    from . import foo_footer
except ImportError:
    pass

This way, you can add manual foo_footer.py and foo_header.py files that will get imported automatically when the user imports foo.py.
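
To make that concrete, a hypothetical foo_footer.py could host exactly the manual logic that can't live in the generated file, such as register(); all names here are illustrative:

    # foo_footer.py -- user-provided; imported automatically at the bottom
    # of the generated foo.py via the try/except above, if it exists.
    from hydra.core.config_store import ConfigStore

    def register() -> None:
        # Import inside the function: foo.py may still be mid-initialization
        # when this module is imported from its footer hook.
        from .foo import FooConf  # FooConf is a hypothetical generated config
        ConfigStore.instance().store(group="foo", name="foo", node=FooConf)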

omry (Contributor) left a review:

Overall looks okay.
