New content: Add definition for shape broadcasting
This change introduces a new section for Algorithms, following APIs,
to collect algorithms referenced throughout the specification.

A section for Broadcasting is introduced, which defines broadcasting
shapes and gives an explicit algorithm matching WebNN implementations
of NumPy's General Broadcasting Rules. Definitions for "broadcastable"
and "unidirectionally broadcastable" are introduced. The previous
definition of "broadcast-shapes" is removed in favor of these new
algorithms.

For webmachinelearning#324, webmachinelearning#378, webmachinelearning#462, and potentially webmachinelearning#523.
inexorabletash committed Feb 7, 2024
1 parent d1a02f2 commit 0b8e499
Showing 1 changed file with 56 additions and 28 deletions: index.bs
@@ -2384,8 +2384,8 @@ partial interface MLGraphBuilder {
1. If |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dataType}} is not equal to |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dataType}}, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
1. Let |descriptor| be a new {{MLOperandDescriptor}}.
1. Set |descriptor|.{{MLOperandDescriptor/dataType}} to |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dataType}}.
1. Set |descriptor|.{{MLOperandDescriptor/dimensions}} to the result of running the [=MLGraphBuilder/broadcast-shapes=] steps given |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
1. If that [=exception/throws=] an error, re-[=exception/throw=] the error.
1. Set |descriptor|.{{MLOperandDescriptor/dimensions}} to the result of [=bidirectionally broadcasting the shapes=] |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
1. If that returns failure, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}.
1. Let |output| be the result of [=creating an MLOperand=] given [=this=] and |descriptor|.
1. Make a request to the underlying platform to:
@@ -2399,21 +2399,6 @@ partial interface MLGraphBuilder {
</div>
</details>

<details open algorithm>
<summary>
To <dfn for="MLGraphBuilder">broadcast-shapes</dfn> given [=/list=] |shape1| and [=/list=] |shape2|, run the following steps:
</summary>
<div class=algorithm-steps>
1. [=Assert=]: The type of |shape1| and |shape2| is `sequence of unsigned long`.
1. Let |output| be the result of invoking the [=implementation-defined=] shape broadcast on |shape1| and |shape2|.
1. If that fails, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
1. Return |output|.
<div class = "note">
The most common implementation is that two shapes are compatible, when each of their corresponding dimensions are equal, or one of them is 1. The output shape consists of the maximum of the corresponding dimensions.
</div>
</div>
</details>

<details open>
<summary>
The element-wise binary operation algorithms invoke the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation=] steps as follows.
@@ -2520,8 +2505,8 @@ Although operations *greaterOrEqual* and *lesserOrEqual* can each be implemented
1. If |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dataType}} is not equal to |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dataType}}, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
1. Let |descriptor| be a new {{MLOperandDescriptor}}.
1. Set |descriptor|.{{MLOperandDescriptor/dataType}} to {{MLOperandDataType/"uint8"}}.
1. Set |descriptor|.{{MLOperandDescriptor/dimensions}} to the result of running the [=MLGraphBuilder/broadcast-shapes=] steps given |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
1. If that [=exception/throws=] an error, re-[=exception/throw=] the error.
1. Set |descriptor|.{{MLOperandDescriptor/dimensions}} to the result of [=bidirectionally broadcasting the shapes=] |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
1. If that returns failure, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}.
1. Let |output| be the result of [=creating an MLOperand=] given [=this=] and |descriptor|.
1. Make a request to the underlying platform to:
@@ -2859,6 +2844,7 @@ partial interface MLGraphBuilder {
1. If the sequence length of |newShape| is not equal to the [=rank=] of |inputDesc|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
1. Let |outputDesc| be a copy of |inputDesc|.
1. [=list/For each=] |index| in [=the range=] 0 to the [=rank=] of |input|, exclusive:
Issue: Can this be replaced with [=unidirectionally broadcasting the shapes=] |inputDesc| and |newShape|?
1. Let |size| be the |input|.{{MLOperand/shape()}}[|index|].
1. If |size| is not equal to 1 and not equal to |newShape|[|index|], then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
1. If |size| is equal to 1, then let |outputDesc|.{{MLOperandDescriptor/dimensions}}[|index|] be |newShape|[|index|].
@@ -3006,7 +2992,7 @@ partial interface MLGraphBuilder {
</div>

### gemm ### {#api-mlgraphbuilder-gemm}
Calculate the [general matrix multiplication of the Basic Linear Algebra Subprograms](https://en.wikipedia.org/wiki/Basic_Linear_Algebra_Subprograms#Level_3). The calculation follows the expression `alpha * A * B + beta * C`, where `A` is a 2-D tensor with shape [M, K] or [K, M], `B` is a 2-D tensor with shape [K, N] or [N, K], and `C` is broadcastable to the shape [M, N]. `A` and `B` may optionally be transposed prior to the calculation.
Calculate the [general matrix multiplication of the Basic Linear Algebra Subprograms](https://en.wikipedia.org/wiki/Basic_Linear_Algebra_Subprograms#Level_3). The calculation follows the expression `alpha * A * B + beta * C`, where `A` is a 2-D tensor with shape [M, K] or [K, M], `B` is a 2-D tensor with shape [K, N] or [N, K], and `C` is [=unidirectionally broadcastable=] to the shape [M, N]. `A` and `B` may optionally be transposed prior to the calculation.
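
As a worked shape example (illustrative only, not part of the specification text): with M = 3, K = 4, N = 5, and no transposes, the shapes compose as in the following TypeScript sketch.

```ts
// A * B: [M, K] x [K, N] => [M, N]; here [3, 4] x [4, 5] => [3, 5].
const shapeA = [3, 4];                       // [M, K]
const shapeB = [4, 5];                       // [K, N]
const outputShape = [shapeA[0], shapeB[1]];  // [M, N] = [3, 5]

// c must be unidirectionally broadcastable to [M, N] = [3, 5]; for example,
// [5], [1, 5], [3, 1] and [3, 5] are accepted, while [2, 5] is rejected
// because 2 is neither 3 nor 1.
```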

<script type=idl>
dictionary MLGemmOptions {
@@ -3026,7 +3012,7 @@ partial interface MLGraphBuilder {
<dl dfn-type=dict-member dfn-for=MLGemmOptions>
: <dfn>c</dfn>
::
The third input tensor. It is either a scalar, or of the shape that is unidirectionally broadcastable to the shape [M, N] according to [[!numpy-broadcasting-rule]]. When it is not specified, the computation is done as if *c* is a scalar 0.0.
The third input tensor. It is either a scalar, or of the shape that is [=unidirectionally broadcastable=] to the shape [M, N]. When it is not specified, the computation is done as if *c* is a scalar 0.0.

: <dfn>alpha</dfn>
::
@@ -3066,7 +3052,7 @@ partial interface MLGraphBuilder {
1. If |options|.{{MLGemmOptions/aTranspose}} is true, then let |shapeA| be the reverse array of |shapeA|.
1. If |options|.{{MLGemmOptions/bTranspose}} is true, then let |shapeB| be the reverse array of |shapeB|.
1. If |shapeA|[1] is not equal to |shapeB|[0], then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
1. If |options|.{{MLGemmOptions/c}} [=map/exists=] and is not unidirectionally broadcastable to the shape [|shapeA|[0], |shapeB|[1]] according to the [[!numpy-broadcasting-rule]], then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
1. If |options|.{{MLGemmOptions/c}} [=map/exists=] and is not [=unidirectionally broadcastable=] to the shape [|shapeA|[0], |shapeB|[1]], then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
<div class="note">
Type compatibility between |a|, |b| and |options|.{{MLGemmOptions/c}} can also be checked.
</div>
@@ -4866,7 +4852,7 @@ partial interface MLGraphBuilder {
<div>
**Arguments:**
- *input*: an {{MLOperand}}. The input tensor.
- *slope*: an {{MLOperand}}. The slope tensor. Its shape is either the same as, or unidirectionally broadcastable to the shape of input tensor *input* according to [[!numpy-broadcasting-rule]].
- *slope*: an {{MLOperand}}. The slope tensor. Its shape is either the same as, or [=unidirectionally broadcastable=] to the shape of input tensor *input*.

**Returns:**
- an {{MLOperand}}. The output tensor of the same shape as *input*.
@@ -4880,8 +4866,8 @@ partial interface MLGraphBuilder {
<div class=algorithm-steps>
1. Let |descriptor| be a new {{MLOperandDescriptor}}.
1. Set |descriptor|.{{MLOperandDescriptor/dataType}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dataType}}.
1. Set |descriptor|.{{MLOperandDescriptor/dimensions}} to the result of running the [=MLGraphBuilder/broadcast-shapes=] steps given |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |slope|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
1. If that [=exception/throws=] an error, re-[=exception/throw=] the error.
1. Set |descriptor|.{{MLOperandDescriptor/dimensions}} to the result of [=unidirectionally broadcasting the shapes=] |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |slope|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
1. If that returns failure, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}.
1. Let |output| be the result of [=creating an MLOperand=] given [=this=] and |descriptor|.
1. Make a request to the underlying platform to:
@@ -6004,9 +5990,9 @@ partial interface MLGraphBuilder {
1. If |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dataType}} is not equal to |other|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dataType}}, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
1. Let |descriptor| be a new {{MLOperandDescriptor}}.
1. Set |descriptor|.{{MLOperandDescriptor/dataType}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dataType}}.
1. Set |descriptor|.{{MLOperandDescriptor/dimensions}} to the result of running the [=MLGraphBuilder/broadcast-shapes=] steps given |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |other|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
1. If that [=exception/throws=] an error, re-[=exception/throw=] the error.
1. If |condition| is not unidirectionally broadcastable to |descriptor|.{{MLOperandDescriptor/dimensions}} according to the [[!numpy-broadcasting-rule]], then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
1. Set |descriptor|.{{MLOperandDescriptor/dimensions}} to the result of [=bidirectionally broadcasting the shapes=] |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |other|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
1. If that returns failure, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
1. If |condition| is not [=unidirectionally broadcastable=] to |descriptor|.{{MLOperandDescriptor/dimensions}}, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}.
1. Let |output| be the result of [=creating an MLOperand=] given [=this=] and |descriptor|.
1. Make a request to the underlying platform to:
@@ -6219,6 +6205,48 @@ dictionary MLOperandDescriptor {
</div>
</details>

Algorithms {#algorithms}
=====================

## Broadcasting ## {#algorithms-broadcasting}

Broadcasting refers to how operations treat tensors with different shapes, and follows the precedent set by [[!numpy-broadcasting-rule]].

<div algorithm>
To <dfn data-lt="unidirectionally broadcasting the shapes">unidirectionally broadcast the shapes</dfn> |A| and |B|, perform the following steps. |A| and |B| are [=/lists=] of positive integers, representing the dimensions of tensors, and the steps return a new [=/list=] of positive integers, or failure.

1. Let |sizeA| be the [=list/size=] of |A|.
1. Let |sizeB| be the [=list/size=] of |B|.
1. Let |output| be a new [=/list=].
1. [=list/For each=] |index| in [=the range=] 0 to |sizeA|, exclusive:
1. Let |dimA| be |A|[|sizeA| - |index| - 1] if |index| < |sizeA|, or 1 otherwise.
1. Let |dimB| be |B|[|sizeB| - |index| - 1] if |index| < |sizeB|, or 1 otherwise.
1. If |dimA| is not equal to |dimB| and |dimA| is not equal to 1, then return failure.
1. [=list/Prepend=] |dimA| to |output|.
1. Return |output|.

</div>
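
A minimal TypeScript sketch of these steps (illustrative only; the function name and the use of plain number arrays for shapes are assumptions, not part of the specification):

```ts
// Transcribes the unidirectional broadcast steps above.
// Returns the broadcast shape, or null where the algorithm returns failure.
function unidirectionalBroadcastShapes(
    a: readonly number[], b: readonly number[]): number[] | null {
  const sizeA = a.length;
  const sizeB = b.length;
  const output: number[] = [];
  for (let index = 0; index < sizeA; index++) {
    const dimA = a[sizeA - index - 1];                      // index < sizeA always holds here
    const dimB = index < sizeB ? b[sizeB - index - 1] : 1;  // pad B with leading 1s
    if (dimA !== dimB && dimA !== 1) {
      return null;  // shapes are incompatible
    }
    output.unshift(dimA);  // "prepend dimA to output"
  }
  return output;
}
```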

<div algorithm>
|A| is <dfn>unidirectionally broadcastable</dfn> to |B| if [=unidirectionally broadcasting the shapes=] |A| and |B| does not result in failure.
</div>

<div algorithm>
To <dfn data-lt="bidirectionally broadcasting the shapes">bidirectionally broadcast the shapes</dfn> |A| and |B|, perform the following steps. |A| and |B| are [=/lists=] of positive integers, representing the dimensions of tensors, and the steps return a new [=/list=] of positive integers, or failure.

1. Let |sizeA| be the [=list/size=] of |A|.
1. Let |sizeB| be the [=list/size=] of |B|.
1. Let |outputSize| be the maximum of |sizeA| and |sizeB|.
1. Let |output| be a new [=/list=].
1. [=list/For each=] |index| in [=the range=] 0 to |outputSize|, exclusive:
1. Let |dimA| be |A|[|sizeA| - |index| - 1] if |index| < |sizeA|, or 1 otherwise.
1. Let |dimB| be |B|[|sizeB| - |index| - 1] if |index| < |sizeB|, or 1 otherwise.
1. If |dimA| is not equal to |dimB|, and |dimA| is not equal to 1, and |dimB| is not equal to 1, then return failure.
1. [=list/Prepend=] the maximum of |dimA| and |dimB| to |output|.
1. Return |output|.

</div>
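
The corresponding sketch for the bidirectional case, in the same illustrative style and with the same assumptions as the previous block:

```ts
// Transcribes the bidirectional broadcast steps above.
function bidirectionalBroadcastShapes(
    a: readonly number[], b: readonly number[]): number[] | null {
  const sizeA = a.length;
  const sizeB = b.length;
  const outputSize = Math.max(sizeA, sizeB);
  const output: number[] = [];
  for (let index = 0; index < outputSize; index++) {
    const dimA = index < sizeA ? a[sizeA - index - 1] : 1;  // pad A with leading 1s
    const dimB = index < sizeB ? b[sizeB - index - 1] : 1;  // pad B with leading 1s
    if (dimA !== dimB && dimA !== 1 && dimB !== 1) {
      return null;  // shapes are incompatible
    }
    output.unshift(Math.max(dimA, dimB));
  }
  return output;
}

// For example, [8, 1, 6, 1] and [7, 1, 5] broadcast to [8, 7, 6, 5],
// while [3, 4] and [5, 4] fail because 3 and 5 are incompatible.
bidirectionalBroadcastShapes([8, 1, 6, 1], [7, 1, 5]);  // [8, 7, 6, 5]
bidirectionalBroadcastShapes([3, 4], [5, 4]);           // null
```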

Examples {#examples}
=====================

