
Add missing algorithms, add stylistic improvements, update with spec conventions #446

Merged
merged 160 commits into webmachinelearning:main from zk-conventions-integration on Aug 30, 2023

Conversation

@zolkis (Collaborator) commented Aug 10, 2023

A series of changes including:

  • Add collapsible boxes for algorithms and other stylistic improvements
  • Add missing algorithms
  • Align white space
  • Use consistent argument and variable names across the spec

Preview | Diff

zolkis and others added 30 commits May 16, 2023 18:28
Squashed from the following commits:
    Replace MLOperand.[[descriptor]] with type and dimensions
    Clarify the algorithm for only setting up the op
    Improve the clamp() algorithm, use the prose assuming the create steps for MLOperand and MLActivation
    Rework clamp with polymorphic behavior. Update for changes in MLOperand.
    Rework clamp() like constant(), polymorphic forms in separate sections, argument and return descriptions as notes.
    Fix platform related steps and reference to internal slots
    Address review, remove note
    Remove back quotes from title

Signed-off-by: Zoltan Kis <[email protected]>
Squashed from the following commits:
    Add reference to the platform operand object
    Address review comment, change note.

Signed-off-by: Zoltan Kis <[email protected]>
Add the batch normalization algorithm
Squashed from the following commits:
    Add reference to the platform operand object
    Address review comment, change note.

Signed-off-by: Zoltan Kis <[email protected]>
Squashed from the following commits:
    Replace MLOperand.[[descriptor]] with type and dimensions
    Clarify the algorithm for only setting up the op
    Improve the clamp() algorithm, use the prose assuming the create steps for MLOperand and MLActivation
    Rework clamp with polymorphic behavior. Update for changes in MLOperand.
    Rework clamp() like constant(), polymorphic forms in separate sections, argument and return descriptions as notes.
    Fix platform related steps and reference to internal slots
    Address review, remove note
    Remove back quotes from title

Signed-off-by: Zoltan Kis <[email protected]>
…, clamp, concat, batch norm

Signed-off-by: Zoltan Kis <[email protected]>
@zolkis requested a review from @huningxin on August 24, 2023 09:11
index.bs (resolved)

@lisa0314 left a comment


@zolkis Left some comments for the remaining issues. PTAL. Thanks!

index.bs (outdated, resolved)
index.bs (outdated, resolved)
index.bs (resolved)
@zolkis force-pushed the zk-conventions-integration branch from d04f364 to 6818a1c on August 25, 2023 07:47

@miaobin left a comment


Just checked again and found an issue, PTAL.

index.bs (outdated, resolved)
index.bs (outdated, resolved)
Signed-off-by: Zoltan Kis <[email protected]>

@lisa0314 left a comment


@zolkis LGTM with one nit. PTAL. Thanks!

<div class=algorithm-steps>
1. [=Assert=]: |op| is one of "reduceL1", "reduceL2", "reduceLogSum", "reduceLogSumExp", "reduceMax", "reduceMean", "reduceMin", "reduceProduct", "reduceSum", "reduceSumSquare".
1. [=Assert=]: the type of |input| is {{MLOperand}}.
1. If |options|.{{MLReduceOptions/axes}} [=map/exists=], if any of its elements is not in [=the range=] 0 to the [=rank=] of |input|, exclusive, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.


in [=the range=] 0 to N-1 where N is the [=rank=] of the input tensor.

@zolkis (Collaborator, Author) commented Aug 25, 2023


The "exclusive" word does that. :)
The full expression is

in [=the range=] 0 to the [=rank=] of |input|, exclusive
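As an aside (not part of the spec text), the range check under discussion can be sketched in Python; `validate_reduce_axes` and its `ValueError` are stand-ins for the spec's axes step and its "{{DataError}}" {{DOMException}}, and "exclusive" means the rank itself is rejected:

```python
def validate_reduce_axes(axes, input_rank):
    """Check each axis is in the range 0 to input_rank, exclusive, i.e. [0, input_rank).

    Mirrors the spec step: if any element of options.axes is not in
    the range 0 to the rank of |input|, exclusive, throw a DataError.
    ValueError here models the spec's DataError DOMException.
    """
    for axis in axes:
        if not (0 <= axis < input_rank):
            raise ValueError(
                f"DataError: axis {axis} not in range [0, {input_rank})"
            )

# A rank-3 input tensor accepts axes 0, 1, 2 but rejects 3 ("exclusive")
# and any negative axis.
validate_reduce_axes([0, 2], 3)   # passes silently
try:
    validate_reduce_axes([3], 3)  # rank itself is excluded
except ValueError as err:
    print("rejected:", err)
```

This is only an illustration of the boundary semantics, not an implementation of the reduce algorithm.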

@huningxin (Contributor) left a comment


After the final check with other reviewers, all the comments are either addressed or tracked by issues. With that, this PR is ready for merging.

Thanks all for the productive review! Special thanks to @zolkis for the great work!

@wchao1115 , you may want to take another look before merging. Thanks!

@anssiko (Member) commented Aug 28, 2023

Thank you everyone for taking this major effort to completion with special thanks to @zolkis!

I asked @zolkis to acknowledge the contributors in 18442ad -- if any names are missing (or misspelled), please let us know and we'll fix.

@wchao1115, feel free to proceed at your earliest convenience.

@anssiko (Member) commented Aug 29, 2023

@wchao1115 @huningxin thank you for your approval. As the editors, I will let you discuss among yourselves whether you would like to squash this big PR or retain the full commit history. You're free to merge once you have decided on the best path forward.

I have a slight preference for retaining the commits to help future spec archeologists. There's this idiom "how the sausage is made". :)

@huningxin (Contributor) commented

I suppose even if we squash this PR and merge it as one commit, the full commit history in this PR would still be preserved, correct?

@anssiko (Member) commented Aug 29, 2023

@huningxin, if that works it would be the best of both worlds: keep the main commit history clean while retaining the detailed commit history in the PR.

I don't remember whether we need to keep the integration branch around (not delete it) for this to work. Perhaps you want to test this in a test repo to find out. Sometimes we do a clean up of stale branches and this branch might accidentally get deleted in the future.

@zolkis (Collaborator, Author) commented Aug 29, 2023

> I suppose even if we squash this PR and merge it as one commit, the full commit history in this PR would still be preserved, correct?

I'd keep the separate commits because (1) there are different authors, and (2) a single commit that big would be impractical due to its sheer size. Up to the editors, though.

@wchao1115 (Collaborator) commented

> @wchao1115 @huningxin thank you for your approval. As the editors, I will let you discuss among yourselves whether you would like to squash this big PR or retain the full commit history. You're free to merge once you have decided on the best path forward.
>
> I have a slight preference for retaining the commits to help future spec archeologists. There's this idiom "how the sausage is made". :)

I do not have a strong preference one way or the other. Either works for me.

@huningxin (Contributor) commented

Thanks for the feedback. Let's merge this pull request with the full commit history.

@huningxin merged commit 34a4d80 into webmachinelearning:main Aug 30, 2023
github-actions bot added a commit that referenced this pull request Aug 30, 2023
SHA: 34a4d80
Reason: push, by huningxin

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>