Add jit.ignore to prototype optimizers #2958

Closed · wants to merge 1 commit

Conversation

@spcyppt (Contributor) commented Aug 9, 2024

Summary:
`torch.compile` doesn't seem to raise errors if we deprecate an optimizer that is no longer used, but `torch.jit.script` does: it appears to check that every decision branch is still compilable, even branches that are never taken.

To make prototype optimizers easy to deprecate once they have been included in production, we wrap the invoker function with `torch.jit.ignore`. This means we always need to keep auto-generating `lookup_{}.py` even after the optimizers are deprecated and their backends are removed.

[simplified Bento example](https://fburl.com/anp/rbktkl08)
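
For intuition, here is a minimal sketch of the mechanism (not the FBGEMM code; the function names and the toy update are made up) showing that `torch.jit.script` accepts a branch whose callee is wrapped in `torch.jit.ignore`:

```
import torch

# A function marked with torch.jit.ignore is left as plain Python: TorchScript
# never compiles its body, so the body may reference a deprecated backend.
# It only fails if it is actually called at runtime.
@torch.jit.ignore
def invoke_deprecated_optimizer(grad: torch.Tensor) -> torch.Tensor:
    raise NotImplementedError("this prototype optimizer has been deprecated")

@torch.jit.script
def lookup(grad: torch.Tensor, use_deprecated: bool) -> torch.Tensor:
    if use_deprecated:
        # This branch still scripts because the callee's body is ignored.
        return invoke_deprecated_optimizer(grad)
    return grad * 0.1  # stand-in for a live optimizer path
```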

Differential Revision: D60943180

@facebook-github-bot (Contributor) commented

This pull request was exported from Phabricator. Differential Revision: D60943180


netlify bot commented Aug 9, 2024

Deploy Preview for pytorch-fbgemm-docs ready!

| Name | Link |
| --- | --- |
| 🔨 Latest commit | bf2ea1e |
| 🔍 Latest deploy log | https://app.netlify.com/sites/pytorch-fbgemm-docs/deploys/66b58047887bad0008ee15a2 |
| 😎 Deploy Preview | https://deploy-preview-2958--pytorch-fbgemm-docs.netlify.app |

spcyppt added a commit that referenced this pull request Aug 9, 2024
Summary:
X-link: facebookresearch/FBGEMM#58

Pull Request resolved: #2958

`torch.compile` doesn't seem to raise errors if we deprecate an optimizer that is no longer used, but `torch.jit.script` does: it appears to check that every decision branch is still compilable, even branches that are never taken. See the [simplified Bento example](https://fburl.com/anp/rbktkl08).

To make prototype optimizers easy to deprecate once they have been included in production, we wrap the invoker function with `torch.jit.ignore`. This means we always need to keep auto-generating `lookup_{}.py` even after the optimizers are deprecated and their backends are removed.

**Usage**
Add `"is_prototype_optimizer": True` to the optimizer's entry in `/codegen/genscript/optimizers.py`.
Example:
```
def ensemble_rowwise_adagrad_optimizer():
    # Marking the optimizer as a prototype makes the generated invoker get
    # wrapped with torch.jit.ignore.
    return {
        "optimizer": "ensemble_rowwise_adagrad",
        "is_prototype_optimizer": True,
    }
```
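
The intended effect is that the generated `lookup_{}.py` then wraps this optimizer's invoker in `torch.jit.ignore`; the following is only a hypothetical illustration of that shape (not the actual codegen output; the op name is made up):

```
import torch

# Hypothetical sketch only: the real generated lookup file differs. The point
# is that the invoker body no longer has to script, so the backend op behind
# it can be removed without breaking torch.jit.script callers; calling the
# deprecated optimizer would fail at runtime instead of at scripting time.
@torch.jit.ignore
def invoke(grad: torch.Tensor) -> torch.Tensor:
    return torch.ops.fbgemm.ensemble_rowwise_adagrad_function(grad)  # made-up op name
```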

Reviewed By: q10

Differential Revision: D60943180

@facebook-github-bot (Contributor) commented

This pull request has been merged in 9c0aa2a.
