
Register optimizers in a centralized location #7157

Conversation

@mattsoulanille (Member) commented Dec 9, 2022

Optimizers are currently registered in the same file they are defined in, which is a side effect. This would make it impossible to tree-shake them in a custom bundle when `sideEffects` is removed from the package.json.

This PR moves Optimizer registration to the `register_optimizers.ts` file. The files that the Optimizers are defined in no longer have side effects. To exclude Optimizers from a custom bundle, the `registerOptimizers` function is called from `index.ts`. Custom bundles replace `index.ts` with a different file that does not call this function.
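For readers unfamiliar with the registration pattern, here is a minimal sketch of what the centralized file can look like. The import paths and the use of the serialization registry's `registerClass` helper are illustrative assumptions rather than an exact copy of this PR's diff:

```
// register_optimizers.ts (sketch). Importing an optimizer class here has no
// side effect; registration only happens when this function is called.
import {registerClass} from './serialization';
import {AdamOptimizer} from './optimizers/adam_optimizer';
import {SGDOptimizer} from './optimizers/sgd_optimizer';

export function registerOptimizers() {
  // Register each optimizer under its class name so that serialized models
  // referencing it can still be deserialized.
  registerClass(AdamOptimizer);
  registerClass(SGDOptimizer);
  // ...and so on for the remaining optimizers.
}
```

`index.ts` then calls `registerOptimizers()` once at module load; a custom bundle's replacement entry point simply omits the call, so any optimizer it never imports can be tree-shaken.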

To see the logs from the Cloud Build CI, please join either our discussion or announcement mailing list.


This change is Reviewable

@mattsoulanille force-pushed the optimizer_registration_central branch from f7ed2ce to a1dcb23 on December 9, 2022 23:05
@mattsoulanille (Member, Author) commented Dec 9, 2022

Depends on #7158
Depends on #7160

@pyu10055 (Collaborator) left a comment


Reviewed 11 of 11 files at r1, all commit messages.
Reviewable status: :shipit: complete! 1 of 1 approvals obtained (waiting on @Linchenn)

@mattsoulanille force-pushed the optimizer_registration_central branch from a1dcb23 to b68b939 on December 10, 2022 01:22
@mattsoulanille force-pushed the optimizer_registration_central branch from b68b939 to d69ec8d on December 12, 2022 18:03
@mattsoulanille merged commit c98656d into tensorflow:master on Dec 12, 2022
AdamLang96 added a commit to CodeSmithDSMLProjects/tfjs that referenced this pull request Dec 13, 2022
* started resize bicubic

* started padding algorithm for bicubic forward pass in cpu backend

* started padding algorithm for bicubic forward pass in cpu backend

* Mark all calls to 'op()' as pure (tensorflow#7155)

Mark calls to the `op()` function that creates the exported op as pure by using [`/* @__PURE__ */` annotations](https://esbuild.github.io/api/#ignore-annotations) (this also works for Rollup, but I can't find the docs). This comment instructs bundlers that the function call has no side-effects, so it can be removed if the result is not used.

This is okay for the `op` function because, although it references ENGINE, it does so [in a closure](https://github.com/tensorflow/tfjs/blob/master/tfjs-core/src/ops/operation.ts#L48-L61) that it never calls, so while the function it returns may cause side effects when invoked, the call to `op()` itself does not.

This has no immediate effect because we still maintain a list of `sideEffects` in the package.json, but it is a step towards removing that list.

Co-authored-by: Linchenn <[email protected]>
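As a concrete illustration of the annotation described in tensorflow#7155 above, an annotated op export looks roughly like this (the op name and import paths are illustrative, not taken from the diff):

```
// Sketch: `op()` wraps the kernel function and returns the public API
// function without ever calling it.
import {op} from './operation';
import {Tensor} from './tensor';

function square_(x: Tensor): Tensor {
  // ...dispatch through ENGINE happens here, but only when the returned
  // op is actually invoked.
  return x;
}

// The annotation tells esbuild/Rollup that this call has no side effects,
// so a bundle that never uses `square` can drop the whole definition.
export const square = /* @__PURE__ */ op({square_});
```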

* need to fix padding algo

* Update rules_python to 0.16.1 (tensorflow#7160)

This update includes new lock files for pypi packages that make sure their versions don't change between builds. These lock files can be generated with the update_locked_deps.sh script.

As part of this update, the PR pins flax to 0.6.2.

Additionally, python dependencies will only be fetched when a build requires them, so first time javascript-only builds should see a speedup.

* Register optimizers in a centralized location (tensorflow#7157)

Optimizers are currently registered in the same file they are defined in, which is a side effect. This would make it impossible to tree-shake them in a custom bundle when `sideEffects` is removed from the package.json.

This PR moves Optimizer registration to the `register_optimizers.ts` file. The files that the Optimizers are defined in no longer have side effects. To exclude Optimizers from a custom bundle, the `registerOptimizers` function is called from `index.ts`. Custom bundles replace `index.ts` with a different file that does not call this function.

* Simplify how Optimizers are re-exported in train.ts (tensorflow#7156)

`train.ts` exports optimizers by copying them from the `OptimizerConstructors` class onto a `train` object. This is unnecessary because the `OptimizerConstructors` class constructor is a subtype of the `train` object's type (i.e. it has all the properties that `train` has). Instead of creating a new `train` object, this PR re-exports `OptimizerConstructors` as `train`.

This has no direct effect now, but if / when we remove the `sideEffects` field from `package.json`, it helps some bundlers (esbuild) do tree-shaking.
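A before/after sketch of the change described in tensorflow#7156 (the "before" shape is simplified from the description rather than copied from the old source):

```
// train.ts (sketch)
import {OptimizerConstructors} from './optimizers/optimizer_constructors';

// Before (simplified): a new object that copied each static factory by hand.
//   export const train = {
//     sgd: OptimizerConstructors.sgd,
//     adam: OptimizerConstructors.adam,
//     // ...
//   };

// After: re-export the class itself. It already provides every property that
// `train` exposed, and the direct alias is easier for bundlers like esbuild
// to tree-shake.
export const train = OptimizerConstructors;
```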

* Use static getters to get optimizer class names (tensorflow#7168)

Each `Optimizer` lists its class name as a static property of the class so it can be serialized and deserialized. This prevents the class from being tree-shaken because bundlers will compile it like this:

```
class SomeOptimizer {
  ...
}

// The bundler can not remove this assignment because
// SomeOptimizer.className could be a setter with a side effect.
SomeOptimizer.className = 'SomeOptimizer';
```

This PR uses a static getter for the class name instead, which bundlers can tree-shake properly.
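For contrast with the compiled output shown above, the getter form the commit adopts looks like this:

```
class SomeOptimizer {
  // ...

  // The class name now lives inside the class body, so no assignment
  // statement is emitted after the declaration and an unused SomeOptimizer
  // can be tree-shaken.
  static get className() {
    return 'SomeOptimizer';
  }
}
```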

* need corners

* padding is functional

* debugging padding tool for multiple channels

Co-authored-by: Matthew Soulanille <[email protected]>
Co-authored-by: Linchenn <[email protected]>
Linchenn pushed a commit to Linchenn/tfjs that referenced this pull request Jan 9, 2023