Add jit.ignore to prototype optimizers (#2958)
Summary:
X-link: facebookresearch/FBGEMM#58

Pull Request resolved: #2958

`torch.compile` does not appear to raise errors when we deprecate an optimizer that is no longer used, but `torch.jit.script` does: it compiles and type-checks every code branch up front, including branches that are never taken at runtime.

To make prototype optimizers easy to deprecate once they have been included in production, we wrap the invoker function with `torch.jit.ignore`. This means we must always keep auto-generating `lookup_{}.py`, even when the optimizers are deprecated and their backends are removed.
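The mechanism can be illustrated with a minimal, self-contained sketch (the names `deprecated_invoke`, `removed_backend`, and `lookup` are hypothetical stand-ins, not the actual FBGEMM generated code): a function decorated with `torch.jit.ignore` stays callable in eager mode but is skipped by the TorchScript compiler, so scripting succeeds even if the function's body is no longer compilable.

```python
import torch

def removed_backend(x: torch.Tensor) -> torch.Tensor:
    # Hypothetical stand-in for an invoker whose backend has been deleted;
    # TorchScript could not compile a call into the missing backend.
    raise NotImplementedError("optimizer backend removed")

# With @torch.jit.ignore, torch.jit.script treats this as an opaque Python
# call instead of trying to compile its body.
@torch.jit.ignore
def deprecated_invoke(x: torch.Tensor) -> torch.Tensor:
    return removed_backend(x)

def lookup(x: torch.Tensor, use_deprecated: bool = False) -> torch.Tensor:
    if use_deprecated:
        # Dead branch in production, but torch.jit.script would still
        # try to compile it without the decorator above.
        return deprecated_invoke(x)
    return x * 2

# Scripting succeeds even though the deprecated branch is uncompilable.
scripted = torch.jit.script(lookup)
out = scripted(torch.ones(2))
```

Calling `scripted(x, use_deprecated=True)` would still fail at runtime, which is acceptable: the goal is only that scripting the generated `lookup_{}.py` keeps working after a backend is removed.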

[simplified Bento example](https://fburl.com/anp/rbktkl08)

Reviewed By: q10

Differential Revision: D60943180
spcyppt authored and facebook-github-bot committed Aug 9, 2024
1 parent 3c559ab commit 8a5c3fb
Showing 1 changed file with 6 additions and 1 deletion.
```diff
@@ -38,7 +38,12 @@ torch.ops.load_library("//deeplearning/fbgemm/fbgemm_gpu:embedding_inplace_updat
 
 {%- endif %}
 
+{%- if is_prototype_optimizer %}
+# Decorate the prototype optimizers which may be deprecated in the future with jit.ignore to avoid
+# possible errors from torch.jit.script.
+# Note that backends can be removed but the lookup invoker is still needed for backward compatibility
+@torch.jit.ignore
+{%- endif %}
 def invoke(
     common_args: CommonArgs,
     optimizer_args: OptimizerArgs,
```
