From 0eea13ef3145cbd90f1dcd1ebd8bb8e74aff5e97 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Tue, 24 Jan 2023 21:50:47 +0200 Subject: [PATCH 001/112] Add the batch normalization algorithm Signed-off-by: Zoltan Kis --- index.bs | 87 ++++++++++++++++++++++++++++++++++++++++++++++++++++---- 1 file changed, 81 insertions(+), 6 deletions(-) diff --git a/index.bs b/index.bs index 544514bd..efdde6cf 100644 --- a/index.bs +++ b/index.bs @@ -131,6 +131,36 @@ div.validusage { content: "Valid Usage"; } +/* Box for Informal steps. */ +div.informalsteps { + padding: .5em; + border: thin solid #88e !important; + border-radius: .5em; +} + +/* + * Stylistic labels, for clarity of presentation of these blocks. + * + * NOTE: This text is non-accessible and non-selectable; surrounding + * text must also explain the context. + */ +.informalsteps { + position: relative; +} +.informalsteps::after { + font-weight: bold; + font-style: italic; + font-size: 130%; + color: rgba(0, 0, 0, 0.15); + color: var(--watermark-text); + position: absolute; + right: .3em; + bottom: .1em; +} +.informalsteps::after { + content: "Non-normative"; +} + /* * Ensure that argumentdef blocks don't overflow algorithm section borders. This is made far harder * than it needs to be because the top-level W3C stylesheet has several @media + min-width variants @@ -1313,6 +1343,7 @@ The {{MLGraphBuilder/constant(value, type)}} steps are: ### The batchNormalization() method ### {#api-mlgraphbuilder-batchnorm} Normalize the tensor values of input features across the batch dimension using [[Batch-Normalization]]. For each input feature, the mean and variance values of that feature supplied in this calculation as parameters are previously computed across the batch dimension of the input during the model training phase of this operation. + -
+ +{{MLBatchNormalizationOptions}} has the following members: +
+ : scale + :: + An {{MLOperand}}. Specifies the 1-D tensor of the scaling values whose length is equal to the size of the input dimension denoted by {{MLBatchNormalizationOptions/axis}}. + + : bias + :: + An {{MLOperand}}. Specifies the 1-D tensor of the bias values whose length is equal to the size of the input dimension denoted by {{MLBatchNormalizationOptions/axis}}. + + : axis + :: + A {{long}} scalar. Specifies the index to the feature count dimension of the input shape for which the mean and variance values are computed. Its value must be in the range [0, N-1] where N is the rank of the input tensor. The default value is 1, corresponding to the channel (*"c"*) dimension in the *"nchw"* data layout. + + : epsilon + :: + A {{float}} scalar. Specifies a small value to prevent computational error due to divide-by-zero. + + : activation + :: + An {{MLActivation}} object. Specifies the optional activation function that immediately follows the normalization operation.
+ +
**Arguments:** - *input*: an {{MLOperand}}. The input N-D tensor. - - *mean*: an {{MLOperand}}. The 1-D tensor of the mean values of the input features across the batch whose length is equal to the size of the input dimension denoted by *options.axis*. - - *variance*: an {{MLOperand}}. The 1-D tensor of the variance values of the input features across the batch whose length is equal to the size of the input dimension denoted by *options.axis*. - - *options*: an optional {{MLBatchNormalizationOptions}}. The optional parameters of the operation. + - *mean*: an {{MLOperand}}. Specifies the 1-D tensor of the mean values of the input features across the batch whose length is equal to the size of the input dimension denoted by {{MLBatchNormalizationOptions/axis}}. + - *variance*: an {{MLOperand}}. The 1-D tensor of the variance values of the input features across the batch whose length is equal to the size of the input dimension denoted by {{MLBatchNormalizationOptions/axis}}. + - *options*: an optional {{MLBatchNormalizationOptions}}. Specifies the optional parameters of the operation. - *scale*: an {{MLOperand}}. The 1-D tensor of the scaling values whose length is equal to the size of the input dimension denoted by *options.axis*. - *bias*: an {{MLOperand}}. The 1-D tensor of the bias values whose length is equal to the size of the input dimension denoted by *options.axis*. - *axis*: an {{unsigned long}} scalar. The index to the feature count dimension of the input shape for which the mean and variance values are. Its value must be in the range [0, N-1] where N is the rank of input tensor. When it's not specified, the default value is 1. @@ -1340,10 +1395,30 @@ partial interface MLGraphBuilder { - *activation*: an {{MLActivation}}. The optional activation function that immediately follows the normalization operation. **Returns:** an {{MLOperand}}. The batch-normalized N-D tensor of the same shape as the input tensor. +
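A minimal, non-normative usage sketch of this method; the shapes, option values and the relu() activation builder are illustrative only:

    // Assumed setup: 'builder' is an MLGraphBuilder; 'input' is a 4-D "nchw"
    // operand of shape [1, 3, 224, 224]; mean, variance, scale and bias are
    // 1-D operands of length 3 (the channel count).
    const output = builder.batchNormalization(input, mean, variance, {
      scale: scale,
      bias: bias,
      axis: 1,          // the channel dimension in the "nchw" layout
      epsilon: 1e-5,
      activation: builder.relu()
    });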
- When *input* is a 4-D tensor of the *"nchw"* or *"nhwc"* layout, *options.axis* should be set to 1 or 3 respectively. The axis value designates the feature or channel count dimension of the input tensor. +
+ The {{MLGraphBuilder/batchNormalization()}} method steps are: + 1. Let |input| be the first argument. To validate |input|, run these substeps: + 1. If |input| is not an [=object=] that [=implements=] {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. Let |mean| be the second argument, representing a vector with the moving mean values for |input|. To validate |mean|, run the following substeps: + 1. If |mean| is not an [=object=] that [=implements=] {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. If |mean|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not equal to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} from which the dimension represented by |options|.axis is removed, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. Let |variance| be the third argument, representing the moving variance values of |input|. + 1. Let |options| be the fourth argument. To validate |options|, run these substeps: + 1. If |options|.axis does not [=map/exist=], let |options|.axis be 1. + 1. If |options|.axis is not a number between 0 and the rank of |input|, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. If |input| is a 4-D tensor of the *"nchw"* layout, set |options|.axis to 1. + 1. If |input| is a 4-D tensor of the *"nhwc"* layout, set |options|.axis to 3. + 1. Let |result| be an {{MLOperand}} representing the results. It may use the same underlying data as |input|. + 1. Issue a request to the underlying platform to initialize the batch normalization, given |input|, |mean|, |variance|, |options| and |result| to store the results and |options|. Wait for completion.
+ 1. If |options|.activation [=map/exists=], implementations MAY use it to optimize the operation flow. +
+ 1. Return |result|. +
-
+
The behavior of this operation when the input tensor is 4-D of the *"nchw"* layout and the activation is of operator type *relu* can be generically emulated from the usage of other operations as follows. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint.
     const shape = [1,null,1,1];

From 5973cd9fcb2b69d593aec6a09650a3ea02aa8a59 Mon Sep 17 00:00:00 2001
From: Zoltan Kis 
Date: Wed, 15 Feb 2023 21:04:24 +0200
Subject: [PATCH 002/112] Add the clamp() algorithm

Squashed from the following commits:
    Replace MLOperand.[[descriptor]] with type and dimensions
    Clarify the algorithm for only setting up the op
    Improve the clamp() algorithm, use the prose assuming the create steps for MLOperand and MLActivation
    Rework clamp with polymorphic behavior. Update for changes in MLOperand.
    Rework clamp() like constant(), polymorphic forms in separate sections, argument and return descriptions as notes.
    Fix platform related steps and reference to internal slots
    Address review, remove note
    Remove back quotes from title

Signed-off-by: Zoltan Kis 
---
 index.bs | 63 ++++++++++++++++++++++++++++++++++++++++++++------------
 1 file changed, 50 insertions(+), 13 deletions(-)

diff --git a/index.bs b/index.bs
index 14778685..63084178 100644
--- a/index.bs
+++ b/index.bs
@@ -1371,22 +1371,12 @@ dictionary MLClampOptions {
 };
 
 partial interface MLGraphBuilder {
-  MLOperand clamp(MLOperand x, optional MLClampOptions options = {});
+  MLOperand clamp(MLOperand operand, optional MLClampOptions options = {});
   MLActivation clamp(optional MLClampOptions options = {});
 };
 
-
- **Arguments:** - - *x*: an {{MLOperand}}. The input tensor. - - *options*: an optional {{MLClampOptions}}. The optional parameters of the operation. - - *minValue*: a {{float}} scalar. Specifies the minimum value of the range. When it is not specified, the clamping is not performed on the lower limit of the range. - - *maxValue*: a {{float}} scalar. Specifies the maximum value of the range. When it is not specified, the clamping is not performed on the upper limit of the range. - - **Returns:** - - an {{MLOperand}}. The output tensor of the same shape as *x*. - - an {{MLActivation}}. The activation function representing the clamp operation. -
+
The behavior of this operation can be generically emulated from the usage of other operations as follows. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the @@ -1408,7 +1398,54 @@ partial interface MLGraphBuilder { } }
-
+
+ +To check clamp options given |options|, run the following steps: + 1. If |options| is not an object that [=implements=] {{MLClampOptions}}, then return `false`. + 1. If |options|.{{MLClampOptions/minValue}} and |options|.{{MLClampOptions/maxValue}} are not of [=numeric type=], then return `false`. + 1. If |options|.{{MLClampOptions/minValue}} is greater than |options|.{{MLClampOptions/maxValue}}, then return `false`. + 1. Return `true`. + +#### The {{MLGraphBuilder/clamp(operand, options)}} method #### {#api-mlgraphbuilder-clamp-operand-options} +
+ **Arguments:** + - *operand*: an {{MLOperand}}. The input tensor. + - *options*: an optional {{MLClampOptions}}. The optional parameters of the operation. + - *minValue*: a {{float}} scalar. Specifies the minimum value of the range. When it is not specified, the clamping is not performed on the lower limit of the range. + - *maxValue*: a {{float}} scalar. Specifies the maximum value of the range. When it is not specified, the clamping is not performed on the upper limit of the range. + **Returns:** + - an {{MLOperand}}. The output tensor of the same shape as *operand*. +
+
+ The {{MLGraphBuilder/clamp(operand, options)}} method steps are: + 1. Let |operand| be the first argument. + 1. Let |options| be the second argument. + 1. If running the check clamp options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. Let |result| be the result of invoking the copy MLOperand steps given |operand|. + 1. If that throws an error, re-throw the error and abort these steps. + 1. Make a request to the underlying platform to connect |result| with the [=implementation-defined=] platform operator for clamp, and store a reference to the resulting [=implementation-defined=] platform operand object in |result|.{{MLOperand/[[operand]]}}. + 1. If that fails, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Return |result|. +
+ +#### The {{MLGraphBuilder/clamp(options)}} method #### {#api-mlgraphbuilder-clamp-options} +
+ **Arguments:** + - *options*: an optional {{MLClampOptions}}. The optional parameters of the operation. + - *minValue*: a {{float}} scalar. Specifies the minimum value of the range. When it is not specified, the clamping is not performed on the lower limit of the range. + - *maxValue*: a {{float}} scalar. Specifies the maximum value of the range. When it is not specified, the clamping is not performed on the upper limit of the range. + **Returns:** + - an {{MLActivation}}. The operator representing the clamp operation. +
+
+ The {{MLGraphBuilder/clamp(options)}} method steps are: + 1. Let |options| be the first argument. + 1. If running the check clamp options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. Let |op| be the result of invoking the create MLActivation steps with `"clamp"` and |options|. + 1. If that throws an error, re-throw the error and abort these steps. + 1. Make a request to the underlying platform to connect |op| with the [=implementation-defined=] platform operator for clamp, and store a reference to the resulting [=implementation-defined=] platform operator object in |op|.{{MLActivation/[[operator]]}}. + 1. If that fails, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Return |op|.
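A brief, non-normative usage sketch of the two clamp() forms; the operand x and the range values are illustrative only:

    // Immediate form: clamps the values of an existing operand x.
    const y = builder.clamp(x, {minValue: 0, maxValue: 6});

    // Activation form: creates an MLActivation that can later be fused into
    // another operation, e.g. passed as an activation option to conv2d().
    const relu6 = builder.clamp({minValue: 0, maxValue: 6});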
### The concat() method ### {#api-mlgraphbuilder-concat} From d68bd87cf2ceaf7cfb23a121b61d1f7fe2824cba Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Thu, 25 May 2023 19:10:35 +0300 Subject: [PATCH 003/112] Add the 'copy MLOperand' and 'create MLActivation' steps Signed-off-by: Zoltan Kis --- index.bs | 53 ++++++++++++++++++++++++++++++++++++++++++++++++++--- 1 file changed, 50 insertions(+), 3 deletions(-) diff --git a/index.bs b/index.bs index 63084178..666979e1 100644 --- a/index.bs +++ b/index.bs @@ -754,6 +754,16 @@ To create MLOperand given |builder| and |desc|, run the following ste 1. Return |operand|.
+To copy MLOperand given |operand|, run the following steps: +
+ 1. If |operand| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" and stop. + 1. Let |result| be a new [=object=]. + 1. Set |result|.{{MLOperand/[[builder]]}} to |operand|.{{MLOperand/[[builder]]}}. + 1. Set |result|.{{MLOperand/[[descriptor]]}} to |operand|.{{MLOperand/[[descriptor]]}}. + 1. Set |result|.{{MLOperand/[[name]]}} to |operand|.{{MLOperand/[[name]]}}. + 1. Return |result|. +
+ To check dimensions given |dimensions| and |type|, run the following steps:
1. If |dimensions| is not an array of positive numbers, return `false`; @@ -769,16 +779,53 @@ Objects implementing the {{MLActivation}} interface represent activation functio +
+{{MLActivation}} has the following internal slots: +
+ : \[[name]] of type [=string=] + :: + The {{MLActivation}}'s name. + : \[[builder]] of type {{MLGraphBuilder}} + :: + The graph builder object this {{MLActivation}} belongs to. + : \[[options]] of type [=object=] + :: + A dictionary containing {{MLActivation}} options. + : \[[operator]] of type [=object=] + :: + Reference to {{MLActivation}}'s corresponding [=implementation-defined=] platform operator object. +
+
+
These activation function types are used to create other operations. One such use of this interface is when an activation function is fused into another operation such as [[#api-mlgraphbuilder-conv2d]] or [[#api-mlgraphbuilder-batchnorm]] during a graph construction session. Such fused activation functions can provide a significant performance improvement when supported natively by the underlying implementation. This is intended as an optimization opportunity for implementers.
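For example, a fused activation could be expressed as follows (non-normative sketch; the conv2d() arguments and option name are illustrative):

    // The activation is created up front as an MLActivation object...
    const relu = builder.relu();
    // ...and is only connected to the graph when the operation it is fused
    // into is constructed.
    const output = builder.conv2d(input, filter, {activation: relu});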
+#### Creating {{MLActivation}} #### {#api-mlactivation-create}
-The implementation of the {{MLActivation}} interface can simply be a struct that holds a string type of the activation function along with other properties needed. The actual creation of the activation function e.g. a [[#api-mlgraphbuilder-sigmoid]] or [[#api-mlgraphbuilder-relu]] can then be deferred until when the rest of the graph is ready to connect with it such as during the construction of [[#api-mlgraphbuilder-conv2d]] for example. -
+The {{MLActivation}} objects (including the ones passed as input to methods) are created by the methods of {{MLGraphBuilder}} and are identified by their name. The |options| dictionary is defined by those methods. The actual creation of the activation function, e.g. a [[#api-mlgraphbuilder-sigmoid]] or [[#api-mlgraphbuilder-relu]], can then be deferred until the rest of the graph is ready to connect with it, such as during the construction of [[#api-mlgraphbuilder-conv2d]], for example. +
+ +
+ + To create MLActivation given |builder|, |name| and |options|, run the following steps: + +
+ 1. If |builder| is not an instance of {{MLGraphBuilder}}, throw a "{{TypeError}}" and abort these steps. + 1. If |name| is `undefined` or `null`, throw a "{{TypeError}}" and abort these steps. + 1. Let |activation| be a new [=object=]. + 1. Set |activation|.{{MLActivation/[[builder]]}} to |builder|. + 1. Set |activation|.{{MLActivation/[[name]]}} to |name|. + 1. If |options| is an [=object=], set |activation|.{{MLActivation/[[options]]}} to |options|. + 1. Make a request to the underlying platform to bind the [=implementation-defined=] platform operator for |name| to |activation|.{{MLActivation/[[operator]]}}. + 1. If that fails, throw a "{{TypeError}}" and abort these steps. + 1. Return |activation|. +
+
## The MLContext interface ## {#api-mlcontext} The {{MLContext}} interface represents a global state of neural network compute workload and execution processes. Each {{MLContext}} object has associated [=context type=], [=device type=] and [=power preference=]. From 7082ee744e9a9bfa73c24a2cf71f0eadf9ce653a Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Wed, 22 Mar 2023 22:41:52 +0200 Subject: [PATCH 004/112] Add the concat algorithm Squashed from the following commits: Add reference to the platform operand object Address review comment, change note. Signed-off-by: Zoltan Kis --- index.bs | 36 ++++++++++++++++++++++++++++++++++-- 1 file changed, 34 insertions(+), 2 deletions(-) diff --git a/index.bs b/index.bs index 14778685..44a04651 100644 --- a/index.bs +++ b/index.bs @@ -1411,14 +1411,14 @@ partial interface MLGraphBuilder { -### The concat() method ### {#api-mlgraphbuilder-concat} +### The {{MLGraphBuilder/concat()}} method ### {#api-mlgraphbuilder-concat} Concatenates the input tensors along a given axis. -
+
**Arguments:** - *inputs*: a sequence of {{MLOperand}}. All input tensors must have the same shape, except for the size of the dimension to concatenate on. @@ -1429,6 +1429,38 @@ partial interface MLGraphBuilder { that all the inputs concatenated along. The size of that dimension is computed as the sum of all the input sizes of the same dimension.
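A non-normative usage sketch; the input shapes are illustrative only:

    // a has shape [2, 3] and b has shape [2, 4]; concatenating along
    // axis 1 yields an operand of shape [2, 7].
    const c = builder.concat([a, b], 1);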
+
+ The {{MLGraphBuilder/concat(inputs, axis)}} steps are: +
+ The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps. +
+ 1. Let |inputs| be the first argument. + 1. [=Assert=]: the type of |inputs| is a sequence of {{MLOperand}} objects. + 1. [=Assert=]: the type of |axis| is `unsigned long`. + 1. [=Assert=]: the shape, i.e. {{MLOperandDescriptor/dimensions}}, of each operand in |inputs| is the same, except on the dimension given by |axis| on which they are concatenated. + 1. [=Assert=]: the {{MLOperandDescriptor/type}} of each operand in |inputs| is the same. + 1. If any of the following steps fail, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |inputs| is not an array of [=objects=], fail. + 1. If |axis| is not a non-negative integer [=number=], fail. + 1. If |axis| is greater than or equal to the rank of |inputs|[0], fail. + 1. Let |desc| be |inputs|[0].{{MLOperand/[[descriptor]]}}. + 1. Let |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] be `0`. + 1. For each |index| between 0 and the size of |inputs|: + 1. If running validate MLOperand given |inputs|[|index|] and [=this=] returns `false`, then fail. + 1. For each |dim| between 0 and the rank of |inputs|[|index|]: +
+ If the shape of each corresponding dimension and type of the operands, except for those of the dimension given by |axis|, is not the same, fail. +
+ 1. If |dim| is not equal to |axis| and if |inputs|[|index|].{{MLOperandDescriptor/dimensions}}[|dim|] is not equal to |inputs|[0].{{MLOperandDescriptor/dimensions}}[|dim|], fail. + 1. If |inputs|[|index|].{{MLOperandDescriptor/type}} is not equal to |inputs|[0].{{MLOperandDescriptor/type}}, fail. + 1. If |dim| is equal to |axis|, add to |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] the value of |inputs|[|index|].{{MLOperandDescriptor/dimensions}}[|dim|]. + 1. Let |kind| be |inputs|[0].{{MLOperand/[[kind]]}}. + 1. Let |output| be the result of invoking the create MLOperand steps given [=this=], |kind| and |desc|. + 1. If that throws an error, re-throw the error and stop. + 1. Make a request to the underlying platform to create an operator for this method with |inputs| connected as input and |output| connected as output and store a reference to the [=implementation-defined=] platform object to |operand|.{{MLOperand/[[operand]]}}. + 1. If that fails, throw a "{{DataError}}" {{DOMException}} and stop. + 1. Return |output|. +
### The conv2d() method ### {#api-mlgraphbuilder-conv2d} Compute a 2-D convolution given 4-D input and filter tensors From 1651b73e1218d631a3cd7aae33c1841450ac50fb Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Thu, 25 May 2023 19:27:06 +0300 Subject: [PATCH 005/112] Add steps for 'rank', 'validate MLOperand'. Fix the concat() steps Signed-off-by: Zoltan Kis --- index.bs | 20 +++++++++++++++++--- 1 file changed, 17 insertions(+), 3 deletions(-) diff --git a/index.bs b/index.bs index 44a04651..adae24f9 100644 --- a/index.bs +++ b/index.bs @@ -739,6 +739,11 @@ interface MLOperand {};
+To get the rank of an {{MLOperand}} |operand|, run the following steps: +
+ 1. Return the size of |operand|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. +
+ Since the {{MLOperand/[[builder]]}} object is bound by the {{MLGraphBuilder/constructor()}} constructor to an {{MLContext}} object, an {{MLOperand}} is also always bound to the same {{MLContext}} object. #### Creating {{MLOperand}} #### {#api-mloperand-create} @@ -763,6 +768,16 @@ To check dimensions given |dimensions| and |type|, run the following 1. Return `true`. +To validate MLOperand given |operand| and |builder|, run the following steps: +
+ 1. If |operand|.{{MLOperand/[[builder]]}} is not an instance of {{MLGraphBuilder}}, return `false`. + 1. If |builder| is not `undefined` and is not equal to |operand|.{{MLOperand/[[builder]]}}, return `false`. + 1. Let |desc| be |operand|.{{MLOperand/[[descriptor]]}}. + 1. If |desc| is not an [=object=] that [=implements=] {{MLOperandDescriptor}}, return `false`. + 1. If |desc|.{{MLOperandDescriptor/dimensions}} [=map/exists=] and invoking check dimensions given |desc|.{{MLOperandDescriptor/dimensions}} and |desc|.{{MLOperandDescriptor/type}} returns `false`, then return `false`. + 1. Return `true`. +
+ ### The MLActivation interface ### {#api-mlactivation} Objects implementing the {{MLActivation}} interface represent activation function types. @@ -1454,10 +1469,9 @@ partial interface MLGraphBuilder { 1. If |dim| is not equal to |axis| and if |inputs|[|index|].{{MLOperandDescriptor/dimensions}}[|dim|] is not equal to |inputs|[0].{{MLOperandDescriptor/dimensions}}[|dim|], fail. 1. If |inputs|[|dim|].{{MLOperandDescriptor/type}} is not equal to |inputs|[0].{{MLOperandDescriptor/type}}. 1. If |dim| is equal to |axis|, add to |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] the value of |inputs|[|index|].{{MLOperandDescriptor/dimensions}}[|dim|]. - 1. Let |kind| be |inputs|[0].{{MLOperand/[[kind]]}}. - 1. Let |output| be the result of invoking the create MLOperand steps given [=this=], |kind| and |desc|. + 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. 1. If that throws an error, re-throw the error and stop. - 1. Make a request to the underlying platform to create an operator for this method with |inputs| connected as input and |output| connected as output and store a reference to the [=implementation-defined=] platform object to |operand|.{{MLOperand/[[operand]]}}. + 1. Make a request to the underlying platform to create an operator for this method with |inputs| connected as input and |output| connected as output and store a reference to the [=implementation-defined=] platform object to |output|.{{MLOperand/[[operand]]}}. 1. If that fails, throw a "{{DataError}}" {{DOMException}} and stop. 1. Return |output|. From dc4d3d63d2cddb3b48ef02ac1dcb785c2910cc22 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Fri, 2 Jun 2023 17:26:56 +0300 Subject: [PATCH 006/112] clamp(): improve platform related steps Signed-off-by: Zoltan Kis --- index.bs | 16 +++++++++++----- 1 file changed, 11 insertions(+), 5 deletions(-) diff --git a/index.bs b/index.bs index 666979e1..850564bd 100644 --- a/index.bs +++ b/index.bs @@ -760,7 +760,7 @@ To copy MLOperand given |operand|, run the following steps: 1. Let |result| be a new [=object=]. 1. Set |result|.{{MLOperand/[[builder]]}} to |operand|.{{MLOperand/[[builder]]}}. 1. Set |result|.{{MLOperand/[[descriptor]]}} to |operand|.{{MLOperand/[[descriptor]]}}. - 1. Set |result|.{{MLOperand/[[name]]}} to |operand|.{{MLOperand/[[name]]}}. + 1. If |operand|.{{MLOperand/[[name]]}} [=map/exists=], then set |result|.{{MLOperand/[[name]]}} to |operand|.{{MLOperand/[[name]]}}. 1. Return |result|. @@ -1470,8 +1470,13 @@ To check clamp options given |options|, run the following steps: 1. If running the check clamp options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. 1. Let |result| be the result of invoking the copy MLOperand steps given |operand|. 1. If that throws an error, re-throw the error and abort these steps. - 1. Make a request to the underlying platform to connect |result| with the [=implementation-defined=] platform operator for clamp, and store a reference to the resulting [=implementation-defined=] platform operand object in |result|.{{MLOperand/[[operand]]}}. - 1. If that fails, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Make a request to the underlying platform to create an [=implementation-defined=] platform operand |operandImpl| given |result|.{{MLOperand/[[descriptor]]}}. + 1. 
Store a reference to |operandImpl| in |result|.{{MLOperand/[[operand]]}}. + 1. Make a request to the underlying platform to create an [=implementation-defined=] platform operator |operatorImpl| for clamp with |options|.{{MLClampOptions/minValue}} and |options|.{{MLClampOptions/maxValue}}. + 1. Register the |operand|.{{MLOperand/[[operand]]}} as an input to |operatorImpl|. + 1. Register the |result|.{{MLOperand/[[operand]]}} as output to |operatorImpl|. + 1. Store a reference to |operatorImpl| in |result|.{{MLOperand/[[operator]]}}. 1. Return |result|. @@ -1490,8 +1495,9 @@ To check clamp options given |options|, run the following steps: 1. If running the check clamp options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. 1. Let |op| be the result of invoking the create MLActivation steps with `"clamp"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. - 1. Make a request to the underlying platform to connect |op| with the [=implementation-defined=] platform operator for clamp, and store a reference to the resulting [=implementation-defined=] platform operator object in |op|.{{MLActivation/[[operator]]}}. - 1. If that fails, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Make a request to the underlying platform to connect |op| with the [=implementation-defined=] platform operator for clamp |operatorImpl|. + 1. Store a reference to |operatorImpl| in |op|.{{MLActivation/[[operator]]}}. 1. Return |op|. From bce29de5a20bde042d1f78c91a859085af41a567 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Tue, 6 Jun 2023 11:12:47 +0300 Subject: [PATCH 007/112] concat: remove style from title Signed-off-by: Zoltan Kis --- index.bs | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/index.bs b/index.bs index adae24f9..b4b2544b 100644 --- a/index.bs +++ b/index.bs @@ -1426,7 +1426,7 @@ partial interface MLGraphBuilder { -### The {{MLGraphBuilder/concat()}} method ### {#api-mlgraphbuilder-concat} +### The concat() method ### {#api-mlgraphbuilder-concat} Concatenates the input tensors along a given axis. -
+
**Arguments:** - *inputs*: a sequence of {{MLOperand}}. All input tensors must have the same shape, except for the size of the dimension to concatenate on. From 20499ba71b9650e39007f911de07fc03a26a8f35 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Mon, 19 Jun 2023 21:45:27 +0300 Subject: [PATCH 009/112] Add stylistic definitions for hiding algorithms, stylistic boxes Signed-off-by: Zoltan Kis --- index.bs | 190 +++++++++++++++++++++++++++++++++++++++++++++++++++---- 1 file changed, 176 insertions(+), 14 deletions(-) diff --git a/index.bs b/index.bs index 5ad2643c..03f6ad0d 100644 --- a/index.bs +++ b/index.bs @@ -104,19 +104,19 @@ p, ul, ol, dl { margin: 1em 0; } -/* Box for Valid Usage requirements. */ -div.validusage { - padding: .5em; - border: thin solid #88e !important; - border-radius: .5em; -} - /* * Stylistic labels, for clarity of presentation of these blocks. * * NOTE: This text is non-accessible and non-selectable; surrounding * text must also explain the context. */ + +/* Box for Valid Usage requirements. */ +div.validusage { + padding: .5em; + border: thin solid #88e !important; + border-radius: .5em; +} .validusage { position: relative; } @@ -134,19 +134,51 @@ div.validusage { content: "Valid Usage"; } -/* Box for Informal steps. */ +details { + padding: .5em; + border: thin solid #88e !important; + border-radius: .5em; +} + +summary { + font-weight: bold; + margin: -0.5em -0.5em 0; + padding: 0.5em; +} + +/* Box for algorithm steps. */ + +div.algorithm-steps { + padding: .5em; + background-color: ghostwhite; +} + +.algorithm-steps { + position: relative; + overflow: hidden; +} +.algorithm-steps::after { + font-weight: bold; + font-style: italic; + font-size: 130%; + color: rgba(0, 0, 0, 0.15); + color: var(--watermark-text); + position: absolute; + right: .3em; + bottom: .1em; +} +.algorithm-steps::after { + content: "Algorithm"; +} + +/* Informal steps */ div.informalsteps { padding: .5em; border: thin solid #88e !important; border-radius: .5em; + background-color: ghostwhite; } -/* - * Stylistic labels, for clarity of presentation of these blocks. - * - * NOTE: This text is non-accessible and non-selectable; surrounding - * text must also explain the context. - */ .informalsteps { position: relative; } @@ -164,6 +196,28 @@ div.informalsteps { content: "Non-normative"; } +/* Internal slots */ +div.internal-slots { + padding: .5em; + border: thin solid #88e !important; + border-radius: .5em; + background-color: aliceblue; +} + +.internal-slots { + position: relative; +} +.internal-slots::after { + font-weight: bold; + font-style: italic; + font-size: 130%; + color: rgba(0, 0, 0, 0.15); + color: var(--watermark-text); + position: absolute; + right: .3em; + bottom: .1em; +} + /* * Ensure that argumentdef blocks don't overflow algorithm section borders. 
This is made far harder * than it needs to be because the top-level W3C stylesheet has several @media + min-width variants @@ -262,8 +316,116 @@ th, td { } } + +/* Floating button for collapse/expand all details elements */ + +.collapse-expand-button { + position: fixed; + bottom: 40px; + right: 40px; + width: 40px; + height: 40px; + border: none; + border-radius: 50%; + background-color: green; + color: ghostwhite; + font-size: 32px; + text-align: center; + align-items:center; + justify-content:center; + cursor: pointer; +} + +.collapse-expand-button:hover { + background-color: green; +} + +.collapse-expand-button.expand { + background-color: red; +} + +.collapse-expand-button.expand::before { + content: "+"; +} + +.collapse-expand-button.collapse { + background-color: green; +} + +.collapse-expand-button.collapse::before { + content: "-"; +} + +.collapse-expand-button .tooltiptext { + visibility: hidden; + bottom: 20px; + right: 20px; + width: 120px; + background-color: ghostwhite; + color: black; + font-size: 18px; + text-align: center; + align-items:center; + justify-content:center; + padding: 5px 0; + border-radius: 5px; + + /* position */ + position: absolute; + z-index: 1; + bottom: 100%; + left: 50%; + margin-left: -60px; + /* Use half of the width (120/2 = 60), to center the tooltip */ +} + +.collapse-expand-button:hover .tooltiptext { + visibility: visible; + opacity: 0.75; +} + +/* end of floating collapse/expand button */ + + + + + Introduction {#intro} ===================== From 257bcad4aacb24a268a79834d578187304c4b67f Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Mon, 19 Jun 2023 22:45:56 +0300 Subject: [PATCH 010/112] Adapt the existing main version to new style, without indenting existing algorithms Signed-off-by: Zoltan Kis --- index.bs | 263 ++++++++++++++++++++++++++++++++++++++++--------------- 1 file changed, 190 insertions(+), 73 deletions(-) diff --git a/index.bs b/index.bs index 03f6ad0d..ab7740e6 100644 --- a/index.bs +++ b/index.bs @@ -809,7 +809,11 @@ string "webnn". Its default allowlist is 'self'. ### The {{ML/createContext()}} method ### {#api-ml-createcontext} +
+ The {{ML/createContext()}} method steps are: + +
1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, return [=a new promise=] [=rejected=] with a "{{SecurityError}}" {{DOMException}} and abort these steps. 1. Let |promise| be [=a new promise=]. 1. Return |promise| and run the following steps [=in parallel=]. @@ -826,14 +830,22 @@ The {{ML/createContext()}} method steps are: 1. If |options|["{{powerPreference}}"] [=map/exists=], then set |context|.{{[[powerPreference]]}} to |options|["{{powerPreference}}"]. Otherwise, set |context|.{{[[powerPreference]]}} to "[=power-preference-default|default=]". 1. If the validate MLContext steps given |context| return `false`, [=reject=] |promise| with a "{{NotSupportedError}}" {{DOMException}} and abort these steps. 1. [=Resolve=] |promise| with |context|. +
+
### The {{ML/createContextSync()}} method ### {#api-ml-createcontextsync} +
+ The {{ML/createContextSync()}} method steps are: + +
1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, throw a "{{SecurityError}}" {{DOMException}} and abort these steps. 1. Let |options| be the first argument. 1. Let |context| be the result of running the create context steps given |options|. 1. If the validate MLContext steps given |context| return `false`, throw a "{{NotSupportedError}}" {{DOMException}} and abort these steps. 1. Return |context|. +
+
## The MLGraph interface ## {#api-mlgraph} The {{MLGraph}} interface represents a compiled computational graph. A compiled graph once constructed is immutable and cannot be subsequently changed. @@ -843,9 +855,9 @@ The {{MLGraph}} interface represents a compiled computational graph. A compiled interface MLGraph {}; +
{{MLGraph}} has the following internal slots: - -
+
: \[[context]] of type {{MLContext}} :: The context of type {{MLContext}} associated with this {{MLGraph}}. @@ -861,7 +873,8 @@ interface MLGraph {}; : \[[implementation]] :: The underlying implementation provided by the User Agent. -
+
+
### The MLOperandDescriptor dictionary ### {#api-mloperanddescriptor} -
+
+ The byte length of an {{MLOperandDescriptor}} |desc| is the value returned by the following steps: - + +
1. Let |elementLength| be 1. 1. For each |dimension| of |desc|.{{MLOperandDescriptor/dimensions}}: 1. Set |elementLength| to |elementLength| × |dimension|. 1. Let |elementSize| be the [=element size=] of one of the {{ArrayBufferView}} types that matches |desc|.{{MLOperandDescriptor/type}} according to [this table](#appendices-mloperandtype-arraybufferview-compatibility). 1. Return |elementLength| × |elementSize|. -
+
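The computation amounts to multiplying all dimensions together and scaling by the element size of the matching typed array; a non-normative sketch, assuming an illustrative lookup table from operand types to element sizes in bytes:

    const elementSizes = {float32: 4, float16: 2, int32: 4, uint32: 4, int8: 1, uint8: 1};
    function byteLength(desc) {
      // Product of all dimensions gives the number of elements.
      const elementLength = desc.dimensions.reduce((a, d) => a * d, 1);
      return elementLength * elementSizes[desc.type];
    }
    // byteLength({type: 'float32', dimensions: [2, 2]}) === 16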
+ ### The MLOperand interface ### {#api-mloperand} @@ -909,9 +925,9 @@ For instance, an {{MLOperand}} may represent a constant feeding to an operation interface MLOperand {}; -{{MLOperand}} has the following internal slots:
-
+{{MLOperand}} has the following internal slots: +
: \[[builder]] of type {{MLGraphBuilder}} :: The {{MLOperand}}'s associated builder object. @@ -931,7 +947,7 @@ interface MLOperand {}; : \[[operator]] of type [=object=] :: Reference to {{MLOperand}}'s corresponding [=implementation-defined=] platform operator object. -
+
To get the rank of an {{MLOperand}} |operand|, run the following steps: @@ -944,44 +960,60 @@ Since the {{MLOperand/[[builder]]}} object is bound by the {{MLGraphBuilder/cons #### Creating {{MLOperand}} #### {#api-mloperand-create} The {{MLOperand}} objects are created by the methods of {{MLGraphBuilder}}, internally using the following algorithms. -To create MLOperand given |builder| and |desc|, run the following steps: -
+
+ + To create MLOperand given |builder| and |desc|, run the following steps: + +
1. If |builder| is not an instance of {{MLGraphBuilder}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |desc| is not an [=object=] that [=implements=] {{MLOperandDescriptor}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. Let |operand| be a new [=object=]. 1. Set |operand|.{{MLOperand/[[builder]]}} to |builder|. 1. Set |operand|.{{MLOperand/[[descriptor]]}} to |desc|. 1. Return |operand|. -
+
+ -To copy MLOperand given |operand|, run the following steps: -
+
+ + To copy MLOperand given |operand|, run the following steps: + +
1. If |operand| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" and stop. 1. Let |result| be a new [=object=]. 1. Set |result|.{{MLOperand/[[builder]]}} to |operand|.{{MLOperand/[[builder]]}}. 1. Set |result|.{{MLOperand/[[descriptor]]}} to |operand|.{{MLOperand/[[descriptor]]}}. 1. If |operand|.{{MLOperand/[[name]]}} [=map/exists=], then set |result|.{{MLOperand/[[name]]}} to |operand|.{{MLOperand/[[name]]}}. 1. Return |result|. -
+
+ -To check dimensions given |dimensions| and |type|, run the following steps: -
+
+ + To check dimensions given |dimensions| and |type|, run the following steps: + +
1. If |dimensions| is not an array of positive numbers, return `false`; 1. If |dimensions|.length is 0, return `false`. 1. If |dimensions|.length is too large to be supported by the implementation, return `false`. 1. If any element of |dimensions| is not a positive number, or it is too large to be supported by the implementation given |type|, return `false`. 1. Return `true`. -
+
+ -To validate MLOperand given |operand| and |builder|, run the following steps: -
+
+ + To validate MLOperand given |operand| and |builder|, run the following steps: + +
1. If |operand|.{{MLOperand/[[builder]]}} is not an instance of {{MLGraphBuilder}}, return `false`. 1. If |builder| is not `undefined` and is not equal to |operand|.{{MLOperand/[[builder]]}}, return `false`. 1. Let |desc| be |operand|.{{MLOperand/[[descriptor]]}}. 1. If |desc| is not an [=object=] that [=implements=] {{MLOperandDescriptor}}, return `false`. 1. If |desc|.{{MLOperandDescriptor/dimensions}} [=map/exists=] and invoking check dimensions given |desc|.{{MLOperandDescriptor/dimensions}} and |desc|.{{MLOperandDescriptor/type}} returns `false`, then return `false`. 1. Return `true`. -
+
+ ### The MLActivation interface ### {#api-mlactivation} @@ -1073,9 +1105,9 @@ typedef record MLNamedArrayBufferViews; interface MLContext {}; +
{{MLContext}} has the following internal slots: - -
+
: \[[contextType]] of type [=context type=] :: The {{MLContext}}'s [=context type=]. @@ -1085,19 +1117,26 @@ interface MLContext {}; : \[[powerPreference]] of type [=power preference=] :: The {{MLContext}}'s [=power preference=]. -
+
+
When the {{[[contextType]]}} is set to [=default-context|default=] with the {{MLContextOptions}}.{{deviceType}} set to [=device-type-gpu|gpu=], the user agent is responsible for creating an internal GPU device that operates within the context and is capable of ML workload submission on behalf of the calling application. In this setting however, only {{ArrayBufferView}} inputs and outputs are allowed in and out of the graph execution since the application has no way to know what type of internal GPU device is being created on their behalf. In this case, the user agent is responsible for automatic uploads and downloads of the inputs and outputs to and from the GPU memory using this said internal device.
### The {{MLContext}} validation algorithm ### {#api-mlcontext-validate} +
+ To validate {{MLContext}}, given |context|, run these steps: + +
1. If |context|.{{[[contextType]]}} is not "[=webgpu-context|webgpu=]" or "[=default-context|default=]", return `false`. 1. If |context|.{{[[deviceType]]}} is not "[=device-type-cpu|cpu=]" or "[=device-type-gpu|gpu=]", return `false`. 1. If |context|.{{[[powerPreference]]}} is not "[=power-preference-default|default=]" or "[=power-preference-high-performance|high-performance=]" or "[=power-preference-low-power|low-power=]", return `false`. 1. If the user agent cannot support |context|.{{[[contextType]]}}, |context|.{{[[deviceType]]}} and |context|.{{[[powerPreference]]}}, return `false`. 1. Return `true`. +
+
### Synchronous Execution ### {#api-mlcontext-sync-execution} Synchronously carries out the computational workload of a compiled graph {{MLGraph}} on the calling thread, which must be a worker thread, to produce results as defined by the operations in the graph. This method of execution requires an {{MLContext}} created with {{MLContextOptions}}. Otherwise, it throws an "{{OperationError}}" {{DOMException}}. @@ -1164,7 +1203,10 @@ To validate buffer with descriptor given |bufferView| and |descriptor #### Examples #### {#api-mlcontext-sync-execution-examples}
+
+ The following code showcases the synchronous computation with optional outputs in a worker. +
 const context = navigator.ml.createContextSync();
 
@@ -1196,9 +1238,14 @@ context.computeSync(graph, inputs, {'e': bufferE});
 console.log(`values: ${bufferE}`);
 
+ -### The {{MLNamedArrayBufferViews}} transfer algorithm ### {#mlnamedarraybufferviews-transfer} +### The {{MLNamedArrayBufferViews}} transfer algorithm ### {#mlnamedarraybufferviews-transfer-alg} +
+ To transfer an {{MLNamedArrayBufferViews}} |views|: + +
1. Let |transferredViews| be a new {{MLNamedArrayBufferViews}}. 1. For each |key| -> |value| of |views|: 1. Let |transferredBuffer| be the result of [=ArrayBuffer/transfer|transferring=] the [=underlying buffer=] of |value|. @@ -1207,6 +1254,8 @@ To transfer an {{MLNamedArrayBufferView 1. Let |transferredView| be [=Construct=](|constructor|, |transferredBuffer|, |value|.\[[ByteOffset]], |elementsNumber|). 1. Set |transferredViews|[|key|] to |transferredView|. 1. Return |transferredViews|. +
+
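A non-normative JavaScript approximation of these steps, assuming ArrayBuffer.prototype.transfer() is available in the host environment:

    function transferNamedViews(views) {
      const transferred = {};
      for (const [key, view] of Object.entries(views)) {
        // Detach the original buffer, moving its contents.
        const buffer = view.buffer.transfer();
        const Ctor = view.constructor;              // e.g. Float32Array
        const length = view.byteLength / Ctor.BYTES_PER_ELEMENT;
        transferred[key] = new Ctor(buffer, view.byteOffset, length);
      }
      return transferred;
    }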
### Asynchronous Execution ### {#api-mlcontext-async-execution} Asynchronously carries out the computational workload of a compiled graph {{MLGraph}} on a separate timeline, either on a worker thread for the CPU execution, or on a GPU timeline for the submission of GPU workload on the command queue. The asynchronous nature of this call avoids blocking the calling thread while the computation for result is ongoing. This method of execution requires an {{MLContext}} created with {{MLContextOptions}}. Otherwise, it throws an "{{OperationError}}" {{DOMException}}. @@ -1284,7 +1333,10 @@ partial interface MLContext { #### Examples #### {#api-mlcontext-async-execution-examples}
+
+ The following code showcases the asynchronous computation. +
 const operandType = {type: 'float32', dimensions: [2, 2]};
 const context = await navigator.ml.createContext();
@@ -1309,6 +1361,7 @@ console.log('Output value: ' + result.outputs.C);
 // Note: the result.outputs.C buffer is different from the bufferC, but it
 // shares the same backing memory allocation.
 
+
### WebGPU Interoperability ### {#api-mlcontext-webgpu-interop} @@ -1336,9 +1389,9 @@ typedef record MLNamedGPUResources; interface MLCommandEncoder {}; +
{{MLCommandEncoder}} has the following internal slots: - -
+
: \[[context]] of type {{MLContext}} :: The context of type {{MLContext}} associated with this {{MLCommandEncoder}}. @@ -1346,7 +1399,8 @@ interface MLCommandEncoder {}; : \[[implementation]] :: The underlying implementation provided by the User Agent. -
+
+
### Graph Initialization ### {#api-mlcommandencoder-graph-initialization} Record the initialization of the {{MLGraph}}. This is a necessary step for optimal performance during graph execution as it gives the platform an opportunity to prepare and optimize constant input data for the subsequent execution of the graph. This method should only be called once per graph. @@ -1364,9 +1418,16 @@ partial interface MLCommandEncoder { **Returns:** {{undefined}}.
-
-Graph initialization stage typically involves a process known as "weight preprocessing" where all the constant inputs to the graph are preprocessed and cached at the operating system level for subsequent graph execution calls. The initializing inputs are typically the constant weight data specified through the {{MLGraphBuilder/constant(descriptor, bufferView)|MLGraphBuilder/constant()}} method as constant operands during graph construction time. -
+
+ + The {{MLCommandEncoder/initializeGraph(graph)}} steps are: + +
+
+ Graph initialization stage typically involves a process known as "weight preprocessing" where all the constant inputs to the graph are preprocessed and cached at the operating system level for subsequent graph execution calls. The initializing inputs are typically the constant weight data specified through the {{MLGraphBuilder/constant(descriptor, bufferView)|MLGraphBuilder/constant()}} method as constant operands during graph construction time. +
+
+
### Dispatch Execution Commands ### {#api-mlcommandencoder-dispatch-commands} Record the {{MLGraph}} execution with the inputs {{MLNamedGPUResources}} and outputs {{MLNamedGPUResources}}. @@ -1475,22 +1536,33 @@ Both {{MLGraphBuilder}}.{{MLGraphBuilder/build()}} and {{MLGraphBuilder}}.{{MLGr ### The {{MLGraphBuilder}} constructor ### {#api-mlgraphbuilder-constructor} -The [=new=] {{MLGraphBuilder}} constructor steps are: -1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, throw a "{{SecurityError}}" {{DOMException}} and abort these steps. -1. Let |context| be the first argument. -1. If the validate MLContext steps given |context| return `false`, throw a "{{TypeError}}" and abort these steps. -1. Set {{MLGraphBuilder/[[context]]}} to |context|. +
+ + The [=new=] {{MLGraphBuilder}} constructor steps are: + +
+ 1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, throw a "{{SecurityError}}" {{DOMException}} and abort these steps. + 1. Let |context| be the first argument. + 1. If the validate MLContext steps given |context| return `false`, throw a "{{TypeError}}" and abort these steps. + 1. Set {{MLGraphBuilder/[[context]]}} to |context|. +
+
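For example (non-normative sketch; the context options are illustrative):

    const context = await navigator.ml.createContext({deviceType: 'cpu'});
    const builder = new MLGraphBuilder(context);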
### The {{MLGraphBuilder/input()}} method ### {#api-mlgraphbuilder-input} Create a named {{MLOperand}} based on a descriptor, that can be used as an input. -
+ +
**Arguments:** - *name*: a [=string=] name of the input. - *descriptor*: an {{MLOperandDescriptor}} object. **Returns:**: an {{MLOperand}} object.
-
+ +
+ The {{MLGraphBuilder/input(name, descriptor)}} steps are: + +
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
@@ -1502,25 +1574,31 @@ Create a named {{MLOperand}} based on a descriptor, that can be used as an input 1. If |descriptor|.{{MLOperandDescriptor/dimensions}} [=map/exists=]: 1. If the [=check dimensions=] steps given |descriptor|.{{MLOperandDescriptor/type}} and |descriptor|.{{MLOperandDescriptor/dimensions}} return `false`, throw a "{{DataError}}" {{DOMException}} and stop. 1. If the [=byte length=] of |descriptor| is not supported by the underlying platform, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. Let |operand| be the result of invoking the create MLOperand steps with [=this=], `"input"` and |descriptor|. + 1. Let |operand| be the result of invoking the create MLOperand steps with [=this=] and |descriptor|. 1. If that throws, re-throw the exception and stop. 1. Set |operand|.{{MLOperand/[[name]]}} to |name|. 1. Make a request to the underlying platform to register |operand| as an input and store a reference to the corresponding [=implementation-defined=] platform object in |operand|.{{MLOperand/[[operand]]}}. 1. If that fails, throw an "{{OperationError}}" {{DOMException}} and abort these steps. + 1. Return |operand|. +
+
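For example (non-normative):

    // Declares a named graph input that is bound to caller-provided data
    // at compute time.
    const x = builder.input('x', {type: 'float32', dimensions: [2, 2]});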
### The constant() method ### {#api-mlgraphbuilder-constant-method} Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. #### The {{MLGraphBuilder/constant(descriptor, bufferView)}} method #### {#api-mlgraphbuilder-constant} -
+
**Arguments:** - *descriptor*: an {{MLOperandDescriptor}} object - *bufferView*: an {{MLBufferView}} **Returns:**: an {{MLOperand}} object.
-
-The {{MLGraphBuilder/constant(descriptor, bufferView)}} steps are: +
+ + The {{MLGraphBuilder/constant(descriptor, bufferView)}} steps are: + +
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
@@ -1536,19 +1614,23 @@ The {{MLGraphBuilder/constant(descriptor, bufferView)}} steps are: 1. Make a request to the underlying platform to register |operand| as a tensor constant with |bytes| as value and store a reference to the corresponding [=implementation-defined=] object to |operand|.{{MLOperand/[[operand]]}}. 1. If that fails, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Return |operand|. -
+
+ #### The {{MLGraphBuilder/constant(value, type)}} method #### {#api-mlgraphbuilder-constant-value-type} -
+
**Arguments:** - *value*: a number - *type*: an optional {{MLOperandType}}, by default *"float32"*. **Returns:**: an {{MLOperand}} object.
-
-The {{MLGraphBuilder/constant(value, type)}} steps are: +
+ + The {{MLGraphBuilder/constant(value, type)}} steps are: + +
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
@@ -1566,7 +1648,8 @@ The {{MLGraphBuilder/constant(value, type)}} steps are: 1. Make a request to the underlying platform to register |operand| as a scalar constant with |value| as value and store a reference of the [=implementation-defined=] platform object for the corresponding (scalar or tensor constant) operand to |operand|.{{MLOperand/[[operand]]}}. 1. If that throws, re-throw the error and stop. 1. Return |operand|. -
+
+ ### The batchNormalization() method ### {#api-mlgraphbuilder-batchnorm} Normalize the tensor values of input features across the batch dimension using [[Batch-Normalization]]. For each input feature, the mean and variance values of that feature supplied in this calculation as parameters are previously computed across the batch dimension of the input during the model training phase of this operation. @@ -1609,23 +1692,21 @@ partial interface MLGraphBuilder { An {{MLActivation}} object. Specifies the optional activation function that immediately follows the normalization operation. -
+
**Arguments:** - *input*: an {{MLOperand}}. The input N-D tensor. - *mean*: an {{MLOperand}}. Specifies the 1-D tensor of the mean values of the input features across the batch whose length is equal to the size of the input dimension denoted by {{MLBatchNormalizationOptions/axis}}. - *variance*: an {{MLOperand}}. The 1-D tensor of the variance values of the input features across the batch whose length is equal to the size of the input dimension denoted by {{MLBatchNormalizationOptions/axis}}. - *options*: an optional {{MLBatchNormalizationOptions}}. Specifies the optional parameters of the operation. - - *scale*: an {{MLOperand}}. The 1-D tensor of the scaling values whose length is equal to the size of the input dimension denoted by *options.axis*. - - *bias*: an {{MLOperand}}. The 1-D tensor of the bias values whose length is equal to the size of the input dimension denoted by *options.axis*. - - *axis*: an {{unsigned long}} scalar. The index to the feature count dimension of the input shape for which the mean and variance values are. Its value must be in the range [0, N-1] where N is the rank of input tensor. When it's not specified, the default value is 1. - - *epsilon*: a {{float}} scalar. A small value to prevent computational error due to divide-by-zero. The default value is 0.00001 when not specified. - - *activation*: an {{MLActivation}}. The optional activation function that immediately follows the normalization operation. **Returns:** an {{MLOperand}}. The batch-normalized N-D tensor of the same shape as the input tensor.
-
+
+ The {{MLGraphBuilder/batchNormalization()}} method steps are: + +
1. Let |input| be the first argument. To validate |input|, run these substeps: 1. If |input| is not an [=object=] that [=implements=] {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. 1. Let |mean| be the second argument, representing a vector with the moving mean values for |input|. To validate |mean|, run the following substeps: 1. If |mean| is not an [=object=] that [=implements=] {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. 1. If |mean|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not equal to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} from which the dimension represented by |options|.axis is removed, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. 1. Let |variance| be the third argument, representing the moving variance values of |input|. 1. Let |options| be the fourth argument. To validate |options|, run these substeps: 1. If |options|.axis does not [=map/exist=], let |options|.axis be 1. 1. If |options|.axis is not a number between 0 and the rank of |input|, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. 1. If |input| is a 4-D tensor of the *"nchw"* layout, set |options|.axis to 1. 1. If |input| is a 4-D tensor of the *"nhwc"* layout, set |options|.axis to 3. 1. Let |result| be an {{MLOperand}} representing the results. It may use the same underlying data as |input|. - 1. Issue a request to the underlying platform to initialize the batch normalization, given |input|, |mean|, |variance|, |options| and |result| to store the results and |options|. Wait for completion. + 1. Issue a request to the underlying platform to initialize the batch normalization, given |result| to store the results and |options|. Wait for completion.
1. If |options|.activation [=map/exists=], implementations MAY use it to optimize the operation flow.
1. Return |result|. -
+
+
The behavior of this operation when the input tensor is 4-D of the *"nchw"* layout and the activation is of operator type *relu* can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. @@ -1679,11 +1761,14 @@ partial interface MLGraphBuilder {
+
+ The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. -
+  
+
     if (options.minValue === undefined) {
       if (options.maxValue === undefined) {
         return x;
@@ -1699,17 +1784,23 @@ partial interface MLGraphBuilder {
             builder.constant(options.maxValue));
       }
     }
-    
+
-To check clamp options given |options|, run the following steps: - 1. If |options| is not an object that [=implements=] {{MLClampOptions}}, then return `false`. +
+ + To check clamp options given |options|, run the following steps: + +
+ 1. If |options| is not an [=object=] that [=implements=] {{MLClampOptions}}, then return `false`. 1. If |options|.{{MLClampOptions/minValue}} and |options|.{{MLClampOptions/maxValue}} are not of [=numeric type=], then return `false`. 1. If |options|.{{MLClampOptions/minValue}} is greater than |options|.{{MLClampOptions/maxValue}}, then return `false`. 1. Return `true`. +
+
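A loose, non-normative JavaScript sketch of this validation logic; unlike the literal steps above, it treats absent values as valid, following the {{MLClampOptions}} member defaults:

    function checkClampOptions(options = {}) {
      const {minValue, maxValue} = options;
      if (minValue !== undefined && typeof minValue !== 'number') return false;
      if (maxValue !== undefined && typeof maxValue !== 'number') return false;
      if (minValue !== undefined && maxValue !== undefined && minValue > maxValue) return false;
      return true;
    }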
#### The {{MLGraphBuilder/clamp(operand, options)}} method #### {#api-mlgraphbuilder-clamp-operand-options} -
+
**Arguments:** - *operand*: an {{MLOperand}}. The input tensor. - *options*: an optional {{MLClampOptions}}. The optional parameters of the operation. @@ -1718,8 +1809,12 @@ To check clamp options given |options|, run the following steps: **Returns:** - an {{MLOperand}}. The output tensor of the same shape as *operand*.
-
+ +
+ The {{MLGraphBuilder/clamp(operand, options)}} method steps are: + +
1. Let |operand| be the first argument. 1. Let |options| be the second argument. 1. If running the check clamp options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. @@ -1733,10 +1828,11 @@ To check clamp options given |options|, run the following steps: 1. Register the |result|.{{MLOperand/[[operand]]}} as output to |operatorImpl|. 1. Store a reference to |operatorImpl| in |result|.{{MLOperand/[[operator]]}}. 1. Return |result|. -
+
+ #### The {{MLGraphBuilder/clamp(options)}} method #### {#api-mlgraphbuilder-clamp-options} -
+
**Arguments:** - *options*: an optional {{MLClampOptions}}. The optional parameters of the operation. - *minValue*: a {{float}} scalar. Specifies the minimum value of the range. When it is not specified, the clamping is not performed on the lower limit of the range. @@ -1744,8 +1840,12 @@ To check clamp options given |options|, run the following steps: **Returns:** - an {{MLActivation}}. The operator representing the clamp operation.
-
+ +
+ The {{MLGraphBuilder/clamp(options)}} method steps are: + +
1. Let |options| be the first argument. 1. If running the check clamp options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. 1. Let |op| be the result of invoking the create MLActivation steps with `"clamp"` and |options|. @@ -1754,7 +1854,8 @@ To check clamp options given |options|, run the following steps: 1. Make a request to the underlying platform to connect |op| with the [=implementation-defined=] platform operator for clamp |operatorImpl|. 1. Store a reference to |operatorImpl| in |op|.{{MLActivation/[[operator]]}}. 1. Return |op|. -
+
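
The activation form defers the operation so that it can be fused into another operation. A non-normative usage sketch, assuming a conv2d() option named activation as defined in its own section:

    // Create a clamp activation and fuse it into a convolution.
    const relu6 = builder.clamp({minValue: 0, maxValue: 6});
    const conv = builder.conv2d(input, filter, {activation: relu6});
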
+ ### The concat() method ### {#api-mlgraphbuilder-concat} Concatenates the input tensors along a given axis. @@ -1774,8 +1875,12 @@ partial interface MLGraphBuilder { that all the inputs concatenated along. The size of that dimension is computed as the sum of all the input sizes of the same dimension.
-
+ +
+ The {{MLGraphBuilder/concat(inputs, axis)}} steps are: + +
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
@@ -1804,7 +1909,8 @@ partial interface MLGraphBuilder { 1. Make a request to the underlying platform to create an operator for this method with |inputs| connected as input and |output| connected as output and store a reference to the [=implementation-defined=] platform object to |output|.{{MLOperand/[[operand]]}}. 1. If that fails, throw a "{{DataError}}" {{DOMException}} and stop. 1. Return |output|. -
+
+ ### The conv2d() method ### {#api-mlgraphbuilder-conv2d} Compute a 2-D convolution given 4-D input and filter tensors @@ -3339,12 +3445,15 @@ partial interface MLGraphBuilder { **Returns:** a sequence of {{MLOperand}}. The splitted output tensors. If *splits* is an {{unsigned long}}, the length of the output sequence equals to *splits*. The shape of each output tensor is the same as *input* except the dimension size of *axis* equals to the quotient of dividing the dimension size of *input* along *axis* by *splits*. If *splits* is a sequence of {{unsigned long}}, the length of the output sequence equals to the length of *splits*. The shape of the i-th output tensor is the same as as *input* except along *axis* where the dimension size is *splits[i]*. -
+
+
+ The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. -
+  
+
     // This sample shows the case that the splits parameter is an array.
     const outputs = [];
     let starts = Array(input_rank).fill(0);
@@ -3357,8 +3466,8 @@ partial interface MLGraphBuilder {
       start += size;
     }
     return outputs;
-    
-
+ +
### The squeeze() method ### {#api-mlgraphbuilder-squeeze} @@ -3441,6 +3550,8 @@ const context = await navigator.ml.createContext({powerPreference: 'low-power'})
+
+ The following code builds a graph as:
 constant1 ---+
@@ -3451,6 +3562,7 @@ constant2 ---+                                    |
              +--- Add ---> intermediateOutput2 ---+
 input2    ---+
 
+
 // Use tensors in 4 dimensions.
 const TENSOR_DIMS = [1, 2, 2, 2];
@@ -3484,6 +3596,7 @@ const intermediateOutput2 = builder.add(constant2, input2);
 // output is the output MLOperand of the Mul operation.
 const output = builder.mul(intermediateOutput1, intermediateOutput2);
 
+
@@ -3495,7 +3608,10 @@ const graph = await builder.build({'output': output});
-The following code executes the compiled graph. +
+ + The following code executes the compiled graph. +
 // Setup the input buffers with value 1.
 const inputBuffer1 = new Float32Array(TENSOR_SIZE).fill(1);
@@ -3513,6 +3629,7 @@ const result = await context.compute(graph, inputs, outputs);
 console.log('Output value: ' + result.outputs.output);
 // Output value: 2.25,2.25,2.25,2.25,2.25,2.25,2.25,2.25
 
+
# Appendices # {#appendices} From e260d9f2bbaa09068d1e0fbebac420667a3669e2 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Mon, 19 Jun 2023 23:06:33 +0300 Subject: [PATCH 011/112] Indent algorithms and fix make errors Signed-off-by: Zoltan Kis --- index.bs | 389 ++++++++++++++++++++++++++++--------------------------- 1 file changed, 198 insertions(+), 191 deletions(-) diff --git a/index.bs b/index.bs index ab7740e6..f6fafdc0 100644 --- a/index.bs +++ b/index.bs @@ -67,19 +67,19 @@ urlPrefix: https://tc39.es/proposal-float16array/; spec: float16array
 {
-	"WEBGPU": {
-		"authors": [
-			"Dzmitry Malyshau",
-			"Kai Ninomiya"
-		],
-		"href": "https://gpuweb.github.io/gpuweb/",
-		"title": "WebGPU",
-		"status": "ED",
-		"publisher": "W3C",
-		"deliveredBy": [
-			"https://www.w3.org/2020/gpu/"
-		]
-	}
+    "WEBGPU": {
+        "authors": [
+            "Dzmitry Malyshau",
+            "Kai Ninomiya"
+        ],
+        "href": "https://gpuweb.github.io/gpuweb/",
+        "title": "WebGPU",
+        "status": "ED",
+        "publisher": "W3C",
+        "deliveredBy": [
+            "https://www.w3.org/2020/gpu/"
+        ]
+    }
 }
 
@@ -811,40 +811,40 @@ Its default allowlist is 'self'. ### The {{ML/createContext()}} method ### {#api-ml-createcontext}
-The {{ML/createContext()}} method steps are: + The {{ML/createContext()}} method steps are:
-1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, return [=a new promise=] [=rejected=] with a "{{SecurityError}}" {{DOMException}} and abort these steps. -1. Let |promise| be [=a new promise=]. -1. Return |promise| and run the following steps [=in parallel=]. -1. Let |options| be the first argument. -1. Run the create context steps given |options|: - 1. Let |context| be a new {{MLContext}} object. - 1. If |options| is a {{GPUDevice}} object, - 1. Set |context|.{{[[contextType]]}} to "[=webgpu-context|webgpu=]". - 1. Set |context|.{{[[deviceType]]}} to "[=device-type-gpu|gpu=]". - 1. Set |context|.{{[[powerPreference]]}} to "[=power-preference-default|default=]". - 1. Otherwise, - 1. Set |context|.{{[[contextType]]}} to "[=default-context|default=]". - 1. If |options|["{{deviceType}}"] [=map/exists=], then set |context|.{{[[deviceType]]}} to |options|["{{deviceType}}"]. Otherwise, set |context|.{{[[deviceType]]}} to "[=device-type-cpu|cpu=]". - 1. If |options|["{{powerPreference}}"] [=map/exists=], then set |context|.{{[[powerPreference]]}} to |options|["{{powerPreference}}"]. Otherwise, set |context|.{{[[powerPreference]]}} to "[=power-preference-default|default=]". -1. If the validate MLContext steps given |context| return `false`, [=reject=] |promise| with a "{{NotSupportedError}}" {{DOMException}} and abort these steps. -1. [=Resolve=] |promise| with |context|. + 1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, return [=a new promise=] [=rejected=] with a "{{SecurityError}}" {{DOMException}} and abort these steps. + 1. Let |promise| be [=a new promise=]. + 1. Return |promise| and run the following steps [=in parallel=]. + 1. Let |options| be the first argument. + 1. Run the create context steps given |options|: + 1. Let |context| be a new {{MLContext}} object. + 1. If |options| is a {{GPUDevice}} object, + 1. Set |context|.{{[[contextType]]}} to "[=webgpu-context|webgpu=]". + 1. Set |context|.{{[[deviceType]]}} to "[=device-type-gpu|gpu=]". + 1. Set |context|.{{[[powerPreference]]}} to "[=power-preference-default|default=]". + 1. Otherwise, + 1. Set |context|.{{[[contextType]]}} to "[=default-context|default=]". + 1. If |options|["{{deviceType}}"] [=map/exists=], then set |context|.{{[[deviceType]]}} to |options|["{{deviceType}}"]. Otherwise, set |context|.{{[[deviceType]]}} to "[=device-type-cpu|cpu=]". + 1. If |options|["{{powerPreference}}"] [=map/exists=], then set |context|.{{[[powerPreference]]}} to |options|["{{powerPreference}}"]. Otherwise, set |context|.{{[[powerPreference]]}} to "[=power-preference-default|default=]". + 1. If the validate MLContext steps given |context| return `false`, [=reject=] |promise| with a "{{NotSupportedError}}" {{DOMException}} and abort these steps. + 1. [=Resolve=] |promise| with |context|.
### The {{ML/createContextSync()}} method ### {#api-ml-createcontextsync}
- -The {{ML/createContextSync()}} method steps are: - -
-1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, throw a "{{SecurityError}}" {{DOMException}} and abort these steps. -1. Let |options| be the first argument. -1. Let |context| be the result of running the create context steps given |options|. -1. If the validate MLContext steps given |context| return `false`, throw a "{{NotSupportedError}}" {{DOMException}} and abort these steps. -1. Return |context|. -
+ + The {{ML/createContextSync()}} method steps are: + +
+ 1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, throw a "{{SecurityError}}" {{DOMException}} and abort these steps. + 1. Let |options| be the first argument. + 1. Let |context| be the result of running the create context steps given |options|. + 1. If the validate MLContext steps given |context| return `false`, throw a "{{NotSupportedError}}" {{DOMException}} and abort these steps. + 1. Return |context|. +
## The MLGraph interface ## {#api-mlgraph} @@ -1126,16 +1126,16 @@ When the {{[[contextType]]}} is set to [=default-context|default=] with the {{ML ### The {{MLContext}} validation algorithm ### {#api-mlcontext-validate}
- -To validate {{MLContext}}, given |context|, run these steps: - -
-1. If |context|.{{[[contextType]]}} is not "[=webgpu-context|webgpu=]" or "[=default-context|default=], return `false`. -1. If |context|.{{[[deviceType]]}} is not "[=device-type-cpu|cpu=]" or "[=device-type-gpu|gpu=]", return `false`. -1. If |context|.{{[[powerPreference]]}} is not "[=power-preference-default|default=]" or "[=power-preference-high-performance|high-performance=]" or "[=power-preference-low-power|low-power=]", return `false`. -1. If the user agent cannot support |context|.{{[[contextType]]}}, |context|.{{[[deviceType]]}} and |context|.{{[[powerPreference]]}}, return `false`. -1. Return `true`; -
+ + To validate {{MLContext}}, given |context|, run these steps: + +
+ 1. If |context|.{{[[contextType]]}} is not "[=webgpu-context|webgpu=]" or "[=default-context|default=]", return `false`.
+ 1. If |context|.{{[[deviceType]]}} is not "[=device-type-cpu|cpu=]" or "[=device-type-gpu|gpu=]", return `false`.
+ 1. If |context|.{{[[powerPreference]]}} is not "[=power-preference-default|default=]" or "[=power-preference-high-performance|high-performance=]" or "[=power-preference-low-power|low-power=]", return `false`.
+ 1. If the user agent cannot support |context|.{{[[contextType]]}}, |context|.{{[[deviceType]]}} and |context|.{{[[powerPreference]]}}, return `false`.
+ 1. Return `true`.
+
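
A non-normative sketch of these checks; the helper name is illustrative, and the final step about what the user agent can actually support is omitted:

    // Illustrative helper mirroring the validation steps above.
    function validateMLContext({contextType, deviceType, powerPreference}) {
      return ['webgpu', 'default'].includes(contextType) &&
             ['cpu', 'gpu'].includes(deviceType) &&
             ['default', 'high-performance', 'low-power'].includes(powerPreference);
    }
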
### Synchronous Execution ### {#api-mlcontext-sync-execution} @@ -1204,57 +1204,57 @@ To validate buffer with descriptor given |bufferView| and |descriptor
- -The following code showcases the synchronous computation with optional outputs in a worker. - -
-const context = navigator.ml.createContextSync();
-
-// Build a graph with two outputs.
-const builder = new MLGraphBuilder(context);
-const descA = {type: 'float32', dimensions: [3, 4]};
-const a = builder.input('a', descA);
-const descB = {type: 'float32', dimensions: [4, 3]};
-const bufferB = new Float32Array(sizeOfShape(descB.dimensions)).fill(0.5);
-const b = builder.constant(descB, bufferB);
-const descC = {type: 'float32', dimensions: [3, 3]};
-const bufferC = new Float32Array(sizeOfShape(descC.dimensions)).fill(1);
-const c = builder.constant(descC, bufferC);
-const d = builder.matmul(a, b);
-const e = builder.add(d, c);
-const graph = builder.buildSync({'d': d, 'e': e});
-
-const bufferA = new Float32Array(sizeOfShape(descA.dimensions)).fill(0.5);
-const inputs = {'a': bufferA};
-
-// Compute d.
-const bufferD = new Float32Array(sizeOfShape([3, 3]));
-context.computeSync(graph, inputs, {'d': bufferD});
-console.log(`values: ${bufferD}`);
-
-// Compute e.
-const bufferE = new Float32Array(sizeOfShape([3, 3]));
-context.computeSync(graph, inputs, {'e': bufferE});
-console.log(`values: ${bufferE}`);
-
-
+ + The following code showcases the synchronous computation with optional outputs in a worker. + +
+    const context = navigator.ml.createContextSync();
+
+    // Build a graph with two outputs.
+    const builder = new MLGraphBuilder(context);
+    const descA = {type: 'float32', dimensions: [3, 4]};
+    const a = builder.input('a', descA);
+    const descB = {type: 'float32', dimensions: [4, 3]};
+    const bufferB = new Float32Array(sizeOfShape(descB.dimensions)).fill(0.5);
+    const b = builder.constant(descB, bufferB);
+    const descC = {type: 'float32', dimensions: [3, 3]};
+    const bufferC = new Float32Array(sizeOfShape(descC.dimensions)).fill(1);
+    const c = builder.constant(descC, bufferC);
+    const d = builder.matmul(a, b);
+    const e = builder.add(d, c);
+    const graph = builder.buildSync({'d': d, 'e': e});
+
+    const bufferA = new Float32Array(sizeOfShape(descA.dimensions)).fill(0.5);
+    const inputs = {'a': bufferA};
+
+    // Compute d.
+    const bufferD = new Float32Array(sizeOfShape([3, 3]));
+    context.computeSync(graph, inputs, {'d': bufferD});
+    console.log(`values: ${bufferD}`);
+
+    // Compute e.
+    const bufferE = new Float32Array(sizeOfShape([3, 3]));
+    context.computeSync(graph, inputs, {'e': bufferE});
+    console.log(`values: ${bufferE}`);
+  
+
### The {{MLNamedArrayBufferViews}} transfer algorithm ### {#mlnamedarraybufferviews-transfer-alg}
- -To transfer an {{MLNamedArrayBufferViews}} |views|: - -
-1. Let |transferredViews| be a new {{MLNamedArrayBufferViews}}. -1. For each |key| -> |value| of |views|: - 1. Let |transferredBuffer| be the result of [=ArrayBuffer/transfer|transferring=] the [=underlying buffer=] of |value|. - 1. Let |constructor| be the appropriate [=view constructor=] for the type of {{ArrayBufferView}} |value|. - 1. Let |elementsNumber| be the result of the [=buffer byte length|byte length=] of |value| ÷ [=element size=] of |value|. - 1. Let |transferredView| be [=Construct=](|constructor|, |transferredBuffer|, |value|.\[[ByteOffset]], |elementsNumber|). - 1. Set |transferredViews|[|key|] to |transferredView|. -1. Return |transferredViews|. -
+ + To transfer an {{MLNamedArrayBufferViews}} |views|: + +
+ 1. Let |transferredViews| be a new {{MLNamedArrayBufferViews}}. + 1. For each |key| -> |value| of |views|: + 1. Let |transferredBuffer| be the result of [=ArrayBuffer/transfer|transferring=] the [=underlying buffer=] of |value|. + 1. Let |constructor| be the appropriate [=view constructor=] for the type of {{ArrayBufferView}} |value|. + 1. Let |elementsNumber| be the result of the [=buffer byte length|byte length=] of |value| ÷ [=element size=] of |value|. + 1. Let |transferredView| be [=Construct=](|constructor|, |transferredBuffer|, |value|.\[[ByteOffset]], |elementsNumber|). + 1. Set |transferredViews|[|key|] to |transferredView|. + 1. Return |transferredViews|. +
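
Expressed for a single view as a non-normative JavaScript sketch; the helper name is illustrative, and an ArrayBuffer transfer primitive, as referenced by the steps above, is assumed to be available:

    // Rebuild an equivalent view over the transferred buffer.
    function transferView(view) {
      const Ctor = view.constructor;  // the matching view constructor
      const elements = view.byteLength / Ctor.BYTES_PER_ELEMENT;
      const transferredBuffer = view.buffer.transfer();
      return new Ctor(transferredBuffer, view.byteOffset, elements);
    }
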
### Asynchronous Execution ### {#api-mlcontext-async-execution} @@ -1334,33 +1334,33 @@ partial interface MLContext {
- -The following code showcases the asynchronous computation. - -
-const operandType = {type: 'float32', dimensions: [2, 2]};
-const context = await navigator.ml.createContext();
-const builder = new MLGraphBuilder(context);
-// 1. Create a computational graph 'C = 0.2 * A + B'.
-const constant = builder.constant(0.2);
-const A = builder.input('A', operandType);
-const B = builder.input('B', operandType);
-const C = builder.add(builder.mul(A, constant), B);
-// 2. Compile it into an executable.
-const graph = await builder.build({'C': C});
-// 3. Bind inputs to the graph and execute for the result.
-const bufferA = new Float32Array(4).fill(1.0);
-const bufferB = new Float32Array(4).fill(0.8);
-const bufferC = new Float32Array(4);
-const inputs = {'A': bufferA, 'B': bufferB};
-const outputs = {'C': bufferC};
-const result = await context.compute(graph, inputs, outputs);
-// The computed result of [[1, 1], [1, 1]] is in the buffer associated with
-// the output operand.
-console.log('Output value: ' + result.outputs.C);
-// Note: the result.outputs.C buffer is different from the bufferC, but it
-// shares the same backing memory allocation.
-
+ + The following code showcases the asynchronous computation. + +
+    const operandType = {type: 'float32', dimensions: [2, 2]};
+    const context = await navigator.ml.createContext();
+    const builder = new MLGraphBuilder(context);
+    // 1. Create a computational graph 'C = 0.2 * A + B'.
+    const constant = builder.constant(0.2);
+    const A = builder.input('A', operandType);
+    const B = builder.input('B', operandType);
+    const C = builder.add(builder.mul(A, constant), B);
+    // 2. Compile it into an executable.
+    const graph = await builder.build({'C': C});
+    // 3. Bind inputs to the graph and execute for the result.
+    const bufferA = new Float32Array(4).fill(1.0);
+    const bufferB = new Float32Array(4).fill(0.8);
+    const bufferC = new Float32Array(4);
+    const inputs = {'A': bufferA, 'B': bufferB};
+    const outputs = {'C': bufferC};
+    const result = await context.compute(graph, inputs, outputs);
+    // The computed result of [[1, 1], [1, 1]] is in the buffer associated with
+    // the output operand.
+    console.log('Output value: ' + result.outputs.C);
+    // Note: the result.outputs.C buffer is different from the bufferC, but it
+    // shares the same backing memory allocation.
+  
@@ -1411,7 +1411,7 @@ partial interface MLCommandEncoder { }; -
+
**Arguments:** - *graph*: an {{MLGraph}}. The compiled graph to be initialized with graph constant inputs. @@ -1438,16 +1438,22 @@ partial interface MLCommandEncoder { }; -
+
**Arguments:** - *graph*: an {{MLGraph}}. The compiled graph to be executed. - *inputs*: an {{MLNamedGPUResources}}. The resources of inputs. - *outputs*: an {{MLNamedGPUResources}}. The pre-allocated resources of required outputs. **Returns:** {{undefined}}. +
- 1. If any of the following requirements are unmet, then throw a "{{DataError}}" {{DOMException}} and stop. -
+
+ + The {{MLCommandEncoder/dispatch(graph, inputs, outputs)}} steps are: + +
+ 1. If any of the following requirements are unmet, then throw a "{{DataError}}" {{DOMException}} and stop. +
1. For each |key| -> |value| of |inputs|: 1. |graph|.{{MLGraph/[[inputDescriptors]]}}[|key|] must [=map/exist=]. 1. Let |inputDesc| be |graph|.{{MLGraph/[[inputDescriptors]]}}[|key|]. @@ -1459,16 +1465,16 @@ partial interface MLCommandEncoder { 1. If |value| is a {{GPUBuffer}}, then: 1. |value|.{{GPUBuffer/size}} must equal to [=byte length=] of |outputDesc|.
- - 1. For each |key| -> |value| of |inputs|: - 1. Set the input of |graph|.{{MLGraph/[[implementation]]}} that is associated with |key| to |value|. - 1. For each |key| -> |value| of |outputs|: - 1. Set the output of |graph|.{{MLGraph/[[implementation]]}} that is associated with |key| to |value|. - 1. Issue a compute request of |graph|.{{MLGraph/[[implementation]]}}. - 1. If there is an error returned by |graph|.{{MLGraph/[[implementation]]}}, then: - 1. Throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Return {{undefined}}. -
+ 1. For each |key| -> |value| of |inputs|: + 1. Set the input of |graph|.{{MLGraph/[[implementation]]}} that is associated with |key| to |value|. + 1. For each |key| -> |value| of |outputs|: + 1. Set the output of |graph|.{{MLGraph/[[implementation]]}} that is associated with |key| to |value|. + 1. Issue a compute request of |graph|.{{MLGraph/[[implementation]]}}. + 1. If there is an error returned by |graph|.{{MLGraph/[[implementation]]}}, then: + 1. Throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Return {{undefined}}. +
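
A non-normative usage sketch of recording and submitting the workload. It assumes a context created from a {{GPUDevice}} named device, pre-allocated {{GPUBuffer}} resources whose sizes match the graph's input and output descriptors, and that the command encoder is obtained from the context (for example through a createCommandEncoder() factory in the WebGPU interop part of this specification):

    // Record graph initialization and execution, then submit to the GPU queue.
    const encoder = context.createCommandEncoder();
    encoder.initializeGraph(graph);
    encoder.dispatch(graph, {'input': inputGPUBuffer}, {'output': outputGPUBuffer});
    device.queue.submit([encoder.finish()]);
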
+ ### Generate GPU Command Buffer ### {#api-mlcommandencoder-generate-gpu-command-buffer} Complete the recording of ML workload and return a WebGPU-compatible {{GPUCommandBuffer}} containing the recorded workload. @@ -1719,7 +1725,7 @@ partial interface MLGraphBuilder { 1. If |input| is a 4-D tensor of the *"nchw"* layout, set |options|.axis to 1. 1. If |input| is a 4-D tensor of the *"nhwc"* layout, set |options|.axis to 3. 1. Let |result| be an {{MLOperand}} representing the results. It may use the same underlying data as |input|. - 1. Issue a request to the underlying platform to initialize the batch normalization, given |result| to store the results and |options|. Wait for completion. + 1. Issue a request to the underlying platform to initialize the batch normalization, passing the arguments |input|, |mean|, |variance| and |options| and given |result| to store the results and |options|. Wait for completion.
1. If |options|.activation [=map/exists=], implementations MAY use it to optimize the operation flow.
@@ -3550,52 +3556,53 @@ const context = await navigator.ml.createContext({powerPreference: 'low-power'})
-
- -The following code builds a graph as: +Given the following build graph:
-constant1 ---+
-             +--- Add ---> intermediateOutput1 ---+
-input1    ---+                                    |
-                                                  +--- Mul---> output
-constant2 ---+                                    |
-             +--- Add ---> intermediateOutput2 ---+
-input2    ---+
+    constant1 ---+
+                 +--- Add ---> intermediateOutput1 ---+
+    input1    ---+                                    |
+                                                      +--- Mul---> output
+    constant2 ---+                                    |
+                 +--- Add ---> intermediateOutput2 ---+
+    input2    ---+
 
-
-
-// Use tensors in 4 dimensions.
-const TENSOR_DIMS = [1, 2, 2, 2];
-const TENSOR_SIZE = 8;
+
+ + The following code implements the graph: + +
+    // Use tensors in 4 dimensions.
+    const TENSOR_DIMS = [1, 2, 2, 2];
+    const TENSOR_SIZE = 8;
 
-const builder = new MLGraphBuilder(context);
+    const builder = new MLGraphBuilder(context);
 
-// Create MLOperandDescriptor object.
-const desc = {type: 'float32', dimensions: TENSOR_DIMS};
+    // Create MLOperandDescriptor object.
+    const desc = {type: 'float32', dimensions: TENSOR_DIMS};
 
-// constant1 is a constant MLOperand with the value 0.5.
-const constantBuffer1 = new Float32Array(TENSOR_SIZE).fill(0.5);
-const constant1 = builder.constant(desc, constantBuffer1);
+    // constant1 is a constant MLOperand with the value 0.5.
+    const constantBuffer1 = new Float32Array(TENSOR_SIZE).fill(0.5);
+    const constant1 = builder.constant(desc, constantBuffer1);
 
-// input1 is one of the input MLOperands. Its value will be set before execution.
-const input1 = builder.input('input1', desc);
+    // input1 is one of the input MLOperands. Its value will be set before execution.
+    const input1 = builder.input('input1', desc);
 
-// constant2 is another constant MLOperand with the value 0.5.
-const constantBuffer2 = new Float32Array(TENSOR_SIZE).fill(0.5);
-const constant2 = builder.constant(desc, constantBuffer2);
+    // constant2 is another constant MLOperand with the value 0.5.
+    const constantBuffer2 = new Float32Array(TENSOR_SIZE).fill(0.5);
+    const constant2 = builder.constant(desc, constantBuffer2);
 
-// input2 is another input MLOperand. Its value will be set before execution.
-const input2 = builder.input('input2', desc);
+    // input2 is another input MLOperand. Its value will be set before execution.
+    const input2 = builder.input('input2', desc);
 
-// intermediateOutput1 is the output of the first Add operation.
-const intermediateOutput1 = builder.add(constant1, input1);
+    // intermediateOutput1 is the output of the first Add operation.
+    const intermediateOutput1 = builder.add(constant1, input1);
 
-// intermediateOutput2 is the output of the second Add operation.
-const intermediateOutput2 = builder.add(constant2, input2);
+    // intermediateOutput2 is the output of the second Add operation.
+    const intermediateOutput2 = builder.add(constant2, input2);
 
-// output is the output MLOperand of the Mul operation.
-const output = builder.mul(intermediateOutput1, intermediateOutput2);
-
+ // output is the output MLOperand of the Mul operation. + const output = builder.mul(intermediateOutput1, intermediateOutput2); +
@@ -3612,23 +3619,23 @@ const graph = await builder.build({'output': output}); The following code executes the compiled graph. -
-// Setup the input buffers with value 1.
-const inputBuffer1 = new Float32Array(TENSOR_SIZE).fill(1);
-const inputBuffer2 = new Float32Array(TENSOR_SIZE).fill(1);
-const outputBuffer = new Float32Array(TENSOR_SIZE);
-
-// Execute the compiled graph with the specified inputs.
-const inputs = {
-  'input1': inputBuffer1,
-  'input2': inputBuffer2,
-};
-const outputs = {'output': outputBuffer};
-const result = await context.compute(graph, inputs, outputs);
-
-console.log('Output value: ' + result.outputs.output);
-// Output value: 2.25,2.25,2.25,2.25,2.25,2.25,2.25,2.25
-
+
+    // Setup the input buffers with value 1.
+    const inputBuffer1 = new Float32Array(TENSOR_SIZE).fill(1);
+    const inputBuffer2 = new Float32Array(TENSOR_SIZE).fill(1);
+    const outputBuffer = new Float32Array(TENSOR_SIZE);
+
+    // Execute the compiled graph with the specified inputs.
+    const inputs = {
+      'input1': inputBuffer1,
+      'input2': inputBuffer2,
+    };
+    const outputs = {'output': outputBuffer};
+    const result = await context.compute(graph, inputs, outputs);
+
+    console.log('Output value: ' + result.outputs.output);
+    // Output value: 2.25,2.25,2.25,2.25,2.25,2.25,2.25,2.25
+  
From a6178e9b52ea9c4568142b3bbfa2e34ba2c54222 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Tue, 24 Jan 2023 21:50:47 +0200 Subject: [PATCH 012/112] Add the batch normalization algorithm Signed-off-by: Zoltan Kis --- index.bs | 87 ++++++++++++++++++++++++++++++++++++++++++++++++++++---- 1 file changed, 81 insertions(+), 6 deletions(-) diff --git a/index.bs b/index.bs index 131e9b5e..d2e52831 100644 --- a/index.bs +++ b/index.bs @@ -134,6 +134,36 @@ div.validusage { content: "Valid Usage"; } +/* Box for Informal steps. */ +div.informalsteps { + padding: .5em; + border: thin solid #88e !important; + border-radius: .5em; +} + +/* + * Stylistic labels, for clarity of presentation of these blocks. + * + * NOTE: This text is non-accessible and non-selectable; surrounding + * text must also explain the context. + */ +.informalsteps { + position: relative; +} +.informalsteps::after { + font-weight: bold; + font-style: italic; + font-size: 130%; + color: rgba(0, 0, 0, 0.15); + color: var(--watermark-text); + position: absolute; + right: .3em; + bottom: .1em; +} +.informalsteps::after { + content: "Non-normative"; +} + /* * Ensure that argumentdef blocks don't overflow algorithm section borders. This is made far harder * than it needs to be because the top-level W3C stylesheet has several @media + min-width variants @@ -1316,6 +1346,7 @@ The {{MLGraphBuilder/constant(value, type)}} steps are: ### The batchNormalization() method ### {#api-mlgraphbuilder-batchnorm} Normalize the tensor values of input features across the batch dimension using [[Batch-Normalization]]. For each input feature, the mean and variance values of that feature supplied in this calculation as parameters are previously computed across the batch dimension of the input during the model training phase of this operation. + -
+ +{{MLBatchNormalizationOptions}} has the following members: +
+ : scale + :: + An {{MLOperand}}. Specifies the 1-D tensor of the scaling values whose length is equal to the size of the input dimension denoted by {{MLBatchNormalizationOptions/axis}}. + + : bias + :: + An {{MLOperand}}. Specifies the 1-D tensor of the bias values whose length is equal to the size of the input dimension denoted by {{MLBatchNormalizationOptions/axis}}. + + : axis + :: + A {{long}} scalar. Specifies the index to the feature count dimension of the input shape for which the mean and variance values are. Its value must be in the range [0, N-1] where N is the rank of input tensor. The default value is 1, corresponding to the channel (*"c"*) dimension in the *"nchw"* data layout. + + : epsilon + :: + A {{float}} scalar. Specifies A small value to prevent computational error due to divide-by-zero. + + : activation + :: + An {{MLActivation}} object. Specifies the optional activation function that immediately follows the normalization operation. +
+ +
**Arguments:** - *input*: an {{MLOperand}}. The input N-D tensor. - - *mean*: an {{MLOperand}}. The 1-D tensor of the mean values of the input features across the batch whose length is equal to the size of the input dimension denoted by *options.axis*. - - *variance*: an {{MLOperand}}. The 1-D tensor of the variance values of the input features across the batch whose length is equal to the size of the input dimension denoted by *options.axis*. - - *options*: an optional {{MLBatchNormalizationOptions}}. The optional parameters of the operation. + - *mean*: an {{MLOperand}}. Specifies the 1-D tensor of the mean values of the input features across the batch whose length is equal to the size of the input dimension denoted by {{MLBatchNormalizationOptions/axis}}. + - *variance*: an {{MLOperand}}. The 1-D tensor of the variance values of the input features across the batch whose length is equal to the size of the input dimension denoted by {{MLBatchNormalizationOptions/axis}}. + - *options*: an optional {{MLBatchNormalizationOptions}}. Specifies the optional parameters of the operation. - *scale*: an {{MLOperand}}. The 1-D tensor of the scaling values whose length is equal to the size of the input dimension denoted by *options.axis*. - *bias*: an {{MLOperand}}. The 1-D tensor of the bias values whose length is equal to the size of the input dimension denoted by *options.axis*. - *axis*: an {{unsigned long}} scalar. The index to the feature count dimension of the input shape for which the mean and variance values are. Its value must be in the range [0, N-1] where N is the rank of input tensor. When it's not specified, the default value is 1. @@ -1343,10 +1398,30 @@ partial interface MLGraphBuilder { - *activation*: an {{MLActivation}}. The optional activation function that immediately follows the normalization operation. **Returns:** an {{MLOperand}}. The batch-normalized N-D tensor of the same shape as the input tensor. +
- When *input* is a 4-D tensor of the *"nchw"* or *"nhwc"* layout, *options.axis* should be set to 1 or 3 respectively. The axis value designates the feature or channel count dimension of the input tensor. +
+
+ The {{MLGraphBuilder/batchNormalization()}} method steps are:
+ 1. Let |input| be the first argument. To validate |input|, run these substeps:
+ 1. If |input| is not an [=object=] that [=implements=] {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and abort these steps.
+ 1. Let |mean| be the second argument, representing a vector with the moving mean values for |input|. To validate |mean|, run the following substeps:
+ 1. If |mean| is not an [=object=] that [=implements=] {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and abort these steps.
+ 1. If |mean|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not equal to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} from which the dimension represented by |options|.axis is removed, then throw a "{{TypeError}}" {{DOMException}} and abort these steps.
+ 1. Let |variance| be the third argument, representing the moving variance values of |input|.
+ 1. Let |options| be the fourth argument. To validate |options|, run these substeps:
+ 1. If |options|.axis does not [=map/exist=], let |options|.axis be 1.
+ 1. If |options|.axis is not a number between 0 and the rank of |input|, then throw a "{{TypeError}}" {{DOMException}} and abort these steps.
+ 1. If |input| is a 4-D tensor of the *"nchw"* layout, set |options|.axis to 1.
+ 1. If |input| is a 4-D tensor of the *"nhwc"* layout, set |options|.axis to 3.
+ 1. Let |result| be an {{MLOperand}} representing the results. It may use the same underlying data as |input|.
+ 1. Issue a request to the underlying platform to initialize the batch normalization, given |input|, |mean|, |variance|, |options| and |result| to store the results. Wait for completion.
+ 1. If |options|.activation [=map/exists=], implementations MAY use it to optimize the operation flow. +
+ 1. Return |result|. +
-
+
The behavior of this operation when the input tensor is 4-D of the *"nchw"* layout and the activation is of operator type *relu* can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint.
     const shape = [1,null,1,1];

From 013250ebba0ff8c9a6766a98b0a11ed17e6d6ef0 Mon Sep 17 00:00:00 2001
From: Zoltan Kis 
Date: Wed, 22 Mar 2023 22:41:52 +0200
Subject: [PATCH 013/112] Add the concat algorithm

Squashed from the following commits:
    Add reference to the platform operand object
    Address review comment, change note.

Signed-off-by: Zoltan Kis 
---
 index.bs | 36 ++++++++++++++++++++++++++++++++++--
 1 file changed, 34 insertions(+), 2 deletions(-)

diff --git a/index.bs b/index.bs
index d2e52831..957c6bc3 100644
--- a/index.bs
+++ b/index.bs
@@ -1489,14 +1489,14 @@ partial interface MLGraphBuilder {
     
-### The concat() method ### {#api-mlgraphbuilder-concat} +### The {{MLGraphBuilder/concat()}} method ### {#api-mlgraphbuilder-concat} Concatenates the input tensors along a given axis. -
+
**Arguments:** - *inputs*: a sequence of {{MLOperand}}. All input tensors must have the same shape, except for the size of the dimension to concatenate on. @@ -1507,6 +1507,38 @@ partial interface MLGraphBuilder { that all the inputs concatenated along. The size of that dimension is computed as the sum of all the input sizes of the same dimension.
+
+ The {{MLGraphBuilder/concat(inputs, axis)}} steps are: +
+ The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps. +
+ 1. Let |inputs| be the first argument. + 1. [=Assert=]: the type of |inputs| is sequence of {{MLOperand}} objects. + 1. [=Assert=]: the type of |axis| is `unsigned long`. + 1. [=Assert=]: the shape, i.e. {{MLOperandDescriptor/dimensions}}) of each operand in |inputs| is the same, except on the dimension given by |axis| on which they are concatenated. + 1. [=Assert=]: the {{MLOperandDescriptor/type}} of each operand in |inputs| is the same. + 1. If any of the following steps fail, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |inputs| is not an array of [=objects=], fail. + 1. If |axis| is not a positive integer [=number=], fail. + 1. If |axis| is greater than or equal to the rank of |inputs|, fail. + 1. Let |desc| be |inputs|[0].{{MLOperand/[[descriptor]]}}. + 1. Let |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] be `0`. + 1. For each |index| between 0 and the rank of |inputs|: + 1. If running validate MLOperand given |inputs|[|index|] and [=this=] returns `false`, then fail. + 1. For each |dim| between 0 and the rank of |inputs|[|index|]: +
+ If the shape of each corresponding dimension and type of the operands, except for those of the dimension given by |axis|, is not the same, fail. +
+ 1. If |dim| is not equal to |axis| and if |inputs|[|index|].{{MLOperandDescriptor/dimensions}}[|dim|] is not equal to |inputs|[0].{{MLOperandDescriptor/dimensions}}[|dim|], fail.
+ 1. If |inputs|[|index|].{{MLOperandDescriptor/type}} is not equal to |inputs|[0].{{MLOperandDescriptor/type}}, fail.
+ 1. If |dim| is equal to |axis|, add to |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] the value of |inputs|[|index|].{{MLOperandDescriptor/dimensions}}[|dim|].
+ 1. Let |kind| be |inputs|[0].{{MLOperand/[[kind]]}}.
+ 1. Let |output| be the result of invoking the create MLOperand steps given [=this=], |kind| and |desc|.
+ 1. If that throws an error, re-throw the error and stop.
+ 1. Make a request to the underlying platform to create an operator for this method with |inputs| connected as input and |output| connected as output and store a reference to the [=implementation-defined=] platform object to |operand|.{{MLOperand/[[operand]]}}.
+ 1. If that fails, throw a "{{DataError}}" {{DOMException}} and stop.
+ 1. Return |output|.
+
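
A non-normative usage sketch of the shape rule above; the names are illustrative and a builder is assumed to exist:

    // Concatenating two operands along axis 1:
    // shapes [1, 2, 3] and [1, 4, 3] produce an output of shape [1, 6, 3].
    const a = builder.input('a', {type: 'float32', dimensions: [1, 2, 3]});
    const b = builder.input('b', {type: 'float32', dimensions: [1, 4, 3]});
    const c = builder.concat([a, b], 1);
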
### The conv2d() method ### {#api-mlgraphbuilder-conv2d} Compute a 2-D convolution given 4-D input and filter tensors From ef88d7c7e6e246344d68c69bc71d465755379491 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Thu, 25 May 2023 19:27:06 +0300 Subject: [PATCH 014/112] Add steps for 'rank', 'validate MLOperand'. Fix the concat() steps Signed-off-by: Zoltan Kis --- index.bs | 20 +++++++++++++++++--- 1 file changed, 17 insertions(+), 3 deletions(-) diff --git a/index.bs b/index.bs index 957c6bc3..d53fc649 100644 --- a/index.bs +++ b/index.bs @@ -772,6 +772,11 @@ interface MLOperand {};
+To get the rank of an {{MLOperand}} |operand|, run the following steps: +
+ 1. Return the size of |operand|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. +
+ Since the {{MLOperand/[[builder]]}} object is bound by the {{MLGraphBuilder/constructor()}} constructor to an {{MLContext}} object, an {{MLOperand}} is also always bound to the same {{MLContext}} object. #### Creating {{MLOperand}} #### {#api-mloperand-create} @@ -796,6 +801,16 @@ To check dimensions given |dimensions| and |type|, run the following 1. Return `true`.
+To validate MLOperand given |operand| and |builder|, run the following steps: +
+ 1. If |operand|.{{MLOperand/[[builder]]}} is not an instance of {{MLGraphBuilder}}, return `false`. + 1. If |builder| is not `undefined` and is not equal to |operand|.{{MLOperand/[[builder]]}}, return `false`. + 1. Let |desc| be |operand|.{{MLOperand/[[descriptor]]}}. + 1. If |desc| is not an [=object=] that [=implements=] {{MLOperandDescriptor}}, return `false`. + 1. If |desc|.{{MLOperandDescriptor/dimensions}} [=map/exists=] and invoking check dimensions given |desc|.{{MLOperandDescriptor/dimensions}} and |desc|.{{MLOperandDescriptor/type}} returns `false`, then return `false`. + 1. Return `true`. +
+ ### The MLActivation interface ### {#api-mlactivation} Objects implementing the {{MLActivation}} interface represent activation function types. @@ -1532,10 +1547,9 @@ partial interface MLGraphBuilder { 1. If |dim| is not equal to |axis| and if |inputs|[|index|].{{MLOperandDescriptor/dimensions}}[|dim|] is not equal to |inputs|[0].{{MLOperandDescriptor/dimensions}}[|dim|], fail. 1. If |inputs|[|dim|].{{MLOperandDescriptor/type}} is not equal to |inputs|[0].{{MLOperandDescriptor/type}}. 1. If |dim| is equal to |axis|, add to |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] the value of |inputs|[|index|].{{MLOperandDescriptor/dimensions}}[|dim|]. - 1. Let |kind| be |inputs|[0].{{MLOperand/[[kind]]}}. - 1. Let |output| be the result of invoking the create MLOperand steps given [=this=], |kind| and |desc|. + 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. 1. If that throws an error, re-throw the error and stop. - 1. Make a request to the underlying platform to create an operator for this method with |inputs| connected as input and |output| connected as output and store a reference to the [=implementation-defined=] platform object to |operand|.{{MLOperand/[[operand]]}}. + 1. Make a request to the underlying platform to create an operator for this method with |inputs| connected as input and |output| connected as output and store a reference to the [=implementation-defined=] platform object to |output|.{{MLOperand/[[operand]]}}. 1. If that fails, throw a "{{DataError}}" {{DOMException}} and stop. 1. Return |output|.
From 236bec680f73aed54b5479dc7c3c2e7738d1d200 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Tue, 6 Jun 2023 11:12:47 +0300 Subject: [PATCH 015/112] concat: remove style from title Signed-off-by: Zoltan Kis --- index.bs | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/index.bs b/index.bs index d53fc649..af63e548 100644 --- a/index.bs +++ b/index.bs @@ -1504,7 +1504,7 @@ partial interface MLGraphBuilder {
-### The {{MLGraphBuilder/concat()}} method ### {#api-mlgraphbuilder-concat} +### The concat() method ### {#api-mlgraphbuilder-concat} Concatenates the input tensors along a given axis. -
+
**Arguments:** - *inputs*: a sequence of {{MLOperand}}. All input tensors must have the same shape, except for the size of the dimension to concatenate on. From 98268b1650b81e4b5b2b95f623959664f5e335f3 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Wed, 15 Feb 2023 21:04:24 +0200 Subject: [PATCH 017/112] Add the clamp() algorithm Squashed from the following commits: Replace MLOperand.[[descriptor]] with type and dimensions Clarify the algorithm for only setting up the op Improve the clamp() algorithm, use the prose assuming the create steps for MLOperand and MLActivation Rework clamp with polymorphic behavior. Update for changes in MLOperand. Rework clamp() like constant(), polymorphic forms in separate sections, argument and return descriptions as notes. Fix platform related steps and reference to internal slots Address review, remove note Remove back quotes from title Signed-off-by: Zoltan Kis --- index.bs | 63 ++++++++++++++++++++++++++++++++++++++++++++------------ 1 file changed, 50 insertions(+), 13 deletions(-) diff --git a/index.bs b/index.bs index aa0a6edc..9dbc312d 100644 --- a/index.bs +++ b/index.bs @@ -1464,22 +1464,12 @@ dictionary MLClampOptions { }; partial interface MLGraphBuilder { - MLOperand clamp(MLOperand x, optional MLClampOptions options = {}); + MLOperand clamp(MLOperand operand, optional MLClampOptions options = {}); MLActivation clamp(optional MLClampOptions options = {}); }; -
- **Arguments:** - - *x*: an {{MLOperand}}. The input tensor. - - *options*: an optional {{MLClampOptions}}. The optional parameters of the operation. - - *minValue*: a {{float}} scalar. Specifies the minimum value of the range. When it is not specified, the clamping is not performed on the lower limit of the range. - - *maxValue*: a {{float}} scalar. Specifies the maximum value of the range. When it is not specified, the clamping is not performed on the upper limit of the range. - - **Returns:** - - an {{MLOperand}}. The output tensor of the same shape as *x*. - - an {{MLActivation}}. The activation function representing the clamp operation. -
+
The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the @@ -1501,7 +1491,54 @@ partial interface MLGraphBuilder { } } -
+
+
+To check clamp options given |options|, run the following steps:
+ 1. If |options| is not an object that [=implements=] {{MLClampOptions}}, then return `false`.
+ 1. If |options|.{{MLClampOptions/minValue}} and |options|.{{MLClampOptions/maxValue}} are not a [=numeric type=], then return `false`.
+ 1. If |options|.{{MLClampOptions/minValue}} is greater than |options|.{{MLClampOptions/maxValue}}, then return `false`.
+ 1. Return `true`.
+
+#### The {{MLGraphBuilder/clamp(operand, options)}} method #### {#api-mlgraphbuilder-clamp-operand-options}
+
+ **Arguments:** + - *operand*: an {{MLOperand}}. The input tensor. + - *options*: an optional {{MLClampOptions}}. The optional parameters of the operation. + - *minValue*: a {{float}} scalar. Specifies the minimum value of the range. When it is not specified, the clamping is not performed on the lower limit of the range. + - *maxValue*: a {{float}} scalar. Specifies the maximum value of the range. When it is not specified, the clamping is not performed on the upper limit of the range. + **Returns:** + - an {{MLOperand}}. The output tensor of the same shape as *operand*. +
+
+ The {{MLGraphBuilder/clamp(operand, options)}} method steps are: + 1. Let |operand| be the first argument. + 1. Let |options| be the second argument. + 1. If running the check clamp options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. Let |result| be the result of invoking the copy MLOperand steps given |operand|. + 1. If that throws an error, re-throw the error and abort these steps. + 1. Make a request to the underlying platform to connect |result| with the [=implementation-defined=] platform operator for clamp, and store a reference to the resulting [=implementation-defined=] platform operand object in |result|.{{MLOperand/[[operand]]}}. + 1. If that fails, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Return |result|. +
+ +#### The {{MLGraphBuilder/clamp(options)}} method #### {#api-mlgraphbuilder-clamp-options} +
+ **Arguments:** + - *options*: an optional {{MLClampOptions}}. The optional parameters of the operation. + - *minValue*: a {{float}} scalar. Specifies the minimum value of the range. When it is not specified, the clamping is not performed on the lower limit of the range. + - *maxValue*: a {{float}} scalar. Specifies the maximum value of the range. When it is not specified, the clamping is not performed on the upper limit of the range. + **Returns:** + - an {{MLActivation}}. The operator representing the clamp operation. +
+
+ The {{MLGraphBuilder/clamp(options)}} method steps are: + 1. Let |options| be the first argument. + 1. If running the check clamp options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. Let |op| be the result of invoking the create MLActivation steps with `"clamp"` and |options|. + 1. If that throws an error, re-throw the error and abort these steps. + 1. Make a request to the underlying platform to connect |op| with the [=implementation-defined=] platform operator for clamp, and store a reference to the resulting [=implementation-defined=] platform operator object in |op|.{{MLActivation/[[operator]]}}. + 1. If that fails, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Return |op|.
### The concat() method ### {#api-mlgraphbuilder-concat} From 02e977b464109a0a4163e8fd4d09845113a6051f Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Thu, 25 May 2023 19:10:35 +0300 Subject: [PATCH 018/112] Add the 'copy MLOperand' and 'create MLActivation' steps Signed-off-by: Zoltan Kis --- index.bs | 53 ++++++++++++++++++++++++++++++++++++++++++++++++++--- 1 file changed, 50 insertions(+), 3 deletions(-) diff --git a/index.bs b/index.bs index 9dbc312d..03f04493 100644 --- a/index.bs +++ b/index.bs @@ -792,6 +792,16 @@ To create MLOperand given |builder| and |desc|, run the following ste 1. Return |operand|.
+To copy MLOperand given |operand|, run the following steps: +
+ 1. If |operand| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" and stop. + 1. Let |result| be a new [=object=]. + 1. Set |result|.{{MLOperand/[[builder]]}} to |operand|.{{MLOperand/[[builder]]}}. + 1. Set |result|.{{MLOperand/[[descriptor]]}} to |operand|.{{MLOperand/[[descriptor]]}}. + 1. Set |result|.{{MLOperand/[[name]]}} to |operand|.{{MLOperand/[[name]]}}. + 1. Return |result|. +
+ To check dimensions given |dimensions| and |type|, run the following steps:
1. If |dimensions| is not an array of positive numbers, return `false`; @@ -817,16 +827,53 @@ Objects implementing the {{MLActivation}} interface represent activation functio +
+{{MLActivation}} has the following internal slots: +
+ : \[[name]] of type [=string=] + :: + The {{MLActivation}}'s name. + : \[[builder]] of type {{MLGraphBuilder}} + :: + The graph builder object this {{MLActivation}} belongs to. + : \[[options]] of type [=object=] + :: + A dictionary containing {{MLActivation}} options. + : \[[operator]] of type [=object=] + :: + Reference to {{MLActivation}}'s corresponding [=implementation-defined=] platform operator object. +
+
+
These activations function types are used to create other operations. One such use of this interface is for when an activation function is fused into another operation such as [[#api-mlgraphbuilder-conv2d]] or [[#api-mlgraphbuilder-batchnorm]] during a graph construction session. Such fused activation functions can provide a significant performance improvement when supported natively by the underlying implementation. This is intended as an optimization opportunity for implementers.
+#### Creating {{MLActivation}} #### {#api-mlactivation-create}
-The implementation of the {{MLActivation}} interface can simply be a struct that holds a string type of the activation function along with other properties needed. The actual creation of the activation function e.g. a [[#api-mlgraphbuilder-sigmoid]] or [[#api-mlgraphbuilder-relu]] can then be deferred until when the rest of the graph is ready to connect with it such as during the construction of [[#api-mlgraphbuilder-conv2d]] for example. -
+The {{MLActivation}} objects (including the ones passed as input to methods) are created by the methods of {{MLGraphBuilder}} and are identified by their name. The |options| dictionary is defined by those methods. The actual creation of the activation function e.g. a [[#api-mlgraphbuilder-sigmoid]] or [[#api-mlgraphbuilder-relu]] can then be deferred until when the rest of the graph is ready to connect with it such as during the construction of [[#api-mlgraphbuilder-conv2d]] for example. +
+ +
+ + To create MLActivation given |builder|, |name| and |options|, run the following steps: + +
+ 1. If |builder| is not an instance of {{MLGraphBuilder}}, throw a "{{TypeError}}" and abort these steps. + 1. If |name| is `undefined` or `null`, throw a "{{TypeError}}" and abort these steps. + 1. Let |activation| be a new [=object=]. + 1. Set |activation|.{{MLActivation/[[builder]]}} to |builder|. + 1. Set |activation|.{{MLActivation/[[name]]}} to |name|. + 1. If |options| is an [=object=], set |activation|.{{MLActivation/[[options]]}} to |options|. + 1. Make a request to the underlying platform to bind the [=implementation-defined=] platform operator for |name| to |activation|.{{MLActivation/[[operator]]}}. + 1. If that fails, throw a "{{TypeError}}" and abort these steps. + 1. Return |activation|. +
+
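
A non-normative usage sketch of such a deferred, fused activation; builder, input, mean and variance are assumed to exist as in the other examples of this specification:

    // Create an activation and fuse it into a batch normalization operation.
    const relu = builder.relu();
    const output = builder.batchNormalization(input, mean, variance, {activation: relu});
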
## The MLContext interface ## {#api-mlcontext} The {{MLContext}} interface represents a global state of neural network compute workload and execution processes. Each {{MLContext}} object has associated [=context type=], [=device type=] and [=power preference=]. From 5faf1709990d2663f9ffea4e01ed829e31d324e2 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Fri, 2 Jun 2023 17:26:56 +0300 Subject: [PATCH 019/112] clamp(): improve platform related steps Signed-off-by: Zoltan Kis --- index.bs | 16 +++++++++++----- 1 file changed, 11 insertions(+), 5 deletions(-) diff --git a/index.bs b/index.bs index 03f04493..81636409 100644 --- a/index.bs +++ b/index.bs @@ -798,7 +798,7 @@ To copy MLOperand given |operand|, run the following steps: 1. Let |result| be a new [=object=]. 1. Set |result|.{{MLOperand/[[builder]]}} to |operand|.{{MLOperand/[[builder]]}}. 1. Set |result|.{{MLOperand/[[descriptor]]}} to |operand|.{{MLOperand/[[descriptor]]}}. - 1. Set |result|.{{MLOperand/[[name]]}} to |operand|.{{MLOperand/[[name]]}}. + 1. If |operand|.{{MLOperand/[[name]]}} [=map/exists=], then set |result|.{{MLOperand/[[name]]}} to |operand|.{{MLOperand/[[name]]}}. 1. Return |result|.
@@ -1563,8 +1563,13 @@ To check clamp options given |options|, run the following steps: 1. If running the check clamp options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. 1. Let |result| be the result of invoking the copy MLOperand steps given |operand|. 1. If that throws an error, re-throw the error and abort these steps. - 1. Make a request to the underlying platform to connect |result| with the [=implementation-defined=] platform operator for clamp, and store a reference to the resulting [=implementation-defined=] platform operand object in |result|.{{MLOperand/[[operand]]}}. - 1. If that fails, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Make a request to the underlying platform to create an [=implementation-defined=] platform operand |operandImpl| given |result|.{{MLOperand/[[descriptor]]}}. + 1. Store a reference to |operandImpl| in |result|.{{MLOperand/[[operand]]}}. + 1. Make a request to the underlying platform to create an [=implementation-defined=] platform operator |operatorImpl| for clamp with |options|.{{MLClampOptions/minValue}} and |options|.{{MLClampOptions/minValue}}. + 1. Register the |operand|.{{MLOperand/[[operand]]}} as an input to |operatorImpl|. + 1. Register the |result|.{{MLOperand/[[operand]]}} as output to |operatorImpl|. + 1. Store a reference to |operatorImpl| in |result|.{{MLOperand/[[operator]]}}. 1. Return |result|.
@@ -1583,8 +1588,9 @@ To check clamp options given |options|, run the following steps: 1. If running the check clamp options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. 1. Let |op| be the result of invoking the create MLActivation steps with `"clamp"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. - 1. Make a request to the underlying platform to connect |op| with the [=implementation-defined=] platform operator for clamp, and store a reference to the resulting [=implementation-defined=] platform operator object in |op|.{{MLActivation/[[operator]]}}. - 1. If that fails, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Make a request to the underlying platform to connect |op| with the [=implementation-defined=] platform operator for clamp |operatorImpl|. + 1. Store a reference to |operatorImpl| in |op|.{{MLActivation/[[operator]]}}. 1. Return |op|.
From 1c894021d30ef70c9ecdfbe79b4961c483fc7581 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Mon, 19 Jun 2023 21:45:27 +0300 Subject: [PATCH 020/112] Add stylistic definitions for hiding algorithms, stylistic boxes Signed-off-by: Zoltan Kis --- index.bs | 190 +++++++++++++++++++++++++++++++++++++++++++++++++++---- 1 file changed, 176 insertions(+), 14 deletions(-) diff --git a/index.bs b/index.bs index 81636409..b9d7763a 100644 --- a/index.bs +++ b/index.bs @@ -104,19 +104,19 @@ p, ul, ol, dl { margin: 1em 0; } -/* Box for Valid Usage requirements. */ -div.validusage { - padding: .5em; - border: thin solid #88e !important; - border-radius: .5em; -} - /* * Stylistic labels, for clarity of presentation of these blocks. * * NOTE: This text is non-accessible and non-selectable; surrounding * text must also explain the context. */ + +/* Box for Valid Usage requirements. */ +div.validusage { + padding: .5em; + border: thin solid #88e !important; + border-radius: .5em; +} .validusage { position: relative; } @@ -134,19 +134,51 @@ div.validusage { content: "Valid Usage"; } -/* Box for Informal steps. */ +details { + padding: .5em; + border: thin solid #88e !important; + border-radius: .5em; +} + +summary { + font-weight: bold; + margin: -0.5em -0.5em 0; + padding: 0.5em; +} + +/* Box for algorithm steps. */ + +div.algorithm-steps { + padding: .5em; + background-color: ghostwhite; +} + +.algorithm-steps { + position: relative; + overflow: hidden; +} +.algorithm-steps::after { + font-weight: bold; + font-style: italic; + font-size: 130%; + color: rgba(0, 0, 0, 0.15); + color: var(--watermark-text); + position: absolute; + right: .3em; + bottom: .1em; +} +.algorithm-steps::after { + content: "Algorithm"; +} + +/* Informal steps */ div.informalsteps { padding: .5em; border: thin solid #88e !important; border-radius: .5em; + background-color: ghostwhite; } -/* - * Stylistic labels, for clarity of presentation of these blocks. - * - * NOTE: This text is non-accessible and non-selectable; surrounding - * text must also explain the context. - */ .informalsteps { position: relative; } @@ -164,6 +196,28 @@ div.informalsteps { content: "Non-normative"; } +/* Internal slots */ +div.internal-slots { + padding: .5em; + border: thin solid #88e !important; + border-radius: .5em; + background-color: aliceblue; +} + +.internal-slots { + position: relative; +} +.internal-slots::after { + font-weight: bold; + font-style: italic; + font-size: 130%; + color: rgba(0, 0, 0, 0.15); + color: var(--watermark-text); + position: absolute; + right: .3em; + bottom: .1em; +} + /* * Ensure that argumentdef blocks don't overflow algorithm section borders. 
This is made far harder * than it needs to be because the top-level W3C stylesheet has several @media + min-width variants @@ -262,8 +316,116 @@ th, td { } } + +/* Floating button for collapse/expand all details elements */ + +.collapse-expand-button { + position: fixed; + bottom: 40px; + right: 40px; + width: 40px; + height: 40px; + border: none; + border-radius: 50%; + background-color: green; + color: ghostwhite; + font-size: 32px; + text-align: center; + align-items:center; + justify-content:center; + cursor: pointer; +} + +.collapse-expand-button:hover { + background-color: green; +} + +.collapse-expand-button.expand { + background-color: red; +} + +.collapse-expand-button.expand::before { + content: "+"; +} + +.collapse-expand-button.collapse { + background-color: green; +} + +.collapse-expand-button.collapse::before { + content: "-"; +} + +.collapse-expand-button .tooltiptext { + visibility: hidden; + bottom: 20px; + right: 20px; + width: 120px; + background-color: ghostwhite; + color: black; + font-size: 18px; + text-align: center; + align-items:center; + justify-content:center; + padding: 5px 0; + border-radius: 5px; + + /* position */ + position: absolute; + z-index: 1; + bottom: 100%; + left: 50%; + margin-left: -60px; + /* Use half of the width (120/2 = 60), to center the tooltip */ +} + +.collapse-expand-button:hover .tooltiptext { + visibility: visible; + opacity: 0.75; +} + +/* end of floating collapse/expand button */ + + + + + Introduction {#intro} ===================== From 4a269736c437624216c9a07e74ebaa7c39aa27c1 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Mon, 19 Jun 2023 22:45:56 +0300 Subject: [PATCH 021/112] Adapt the existing main version to new style, without indenting existing algorithms Signed-off-by: Zoltan Kis --- index.bs | 261 ++++++++++++++++++++++++++++++++++++++++--------------- 1 file changed, 189 insertions(+), 72 deletions(-) diff --git a/index.bs b/index.bs index b9d7763a..45ccec70 100644 --- a/index.bs +++ b/index.bs @@ -809,7 +809,11 @@ string "webnn". Its default allowlist is 'self'. ### The {{ML/createContext()}} method ### {#api-ml-createcontext} +
+ The {{ML/createContext()}} method steps are: + +
1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, return [=a new promise=] [=rejected=] with a "{{SecurityError}}" {{DOMException}} and abort these steps. 1. Let |promise| be [=a new promise=]. 1. Return |promise| and run the following steps [=in parallel=]. @@ -826,14 +830,22 @@ The {{ML/createContext()}} method steps are: 1. If |options|["{{powerPreference}}"] [=map/exists=], then set |context|.{{[[powerPreference]]}} to |options|["{{powerPreference}}"]. Otherwise, set |context|.{{[[powerPreference]]}} to "[=power-preference-default|default=]". 1. If the validate MLContext steps given |context| return `false`, [=reject=] |promise| with a "{{NotSupportedError}}" {{DOMException}} and abort these steps. 1. [=Resolve=] |promise| with |context|. +
+
### The {{ML/createContextSync()}} method ### {#api-ml-createcontextsync} +
+ The {{ML/createContextSync()}} method steps are: + +
1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, throw a "{{SecurityError}}" {{DOMException}} and abort these steps. 1. Let |options| be the first argument. 1. Let |context| be the result of running the create context steps given |options|. 1. If the validate MLContext steps given |context| return `false`, throw a "{{NotSupportedError}}" {{DOMException}} and abort these steps. 1. Return |context|. +
+
## The MLGraph interface ## {#api-mlgraph} The {{MLGraph}} interface represents a compiled computational graph. A compiled graph once constructed is immutable and cannot be subsequently changed. @@ -843,9 +855,9 @@ The {{MLGraph}} interface represents a compiled computational graph. A compiled interface MLGraph {}; +
{{MLGraph}} has the following internal slots: - -
+
: \[[context]] of type {{MLContext}} :: The context of type {{MLContext}} associated with this {{MLGraph}}. @@ -861,7 +873,8 @@ interface MLGraph {}; : \[[implementation]] :: The underlying implementation provided by the User Agent. -
+
+
### The MLOperandDescriptor dictionary ### {#api-mloperanddescriptor} -
+
+ The byte length of an {{MLOperandDescriptor}} |desc| is the value returned by the following steps: - + +
1. Let |elementLength| be 1. 1. For each |dimension| of |desc|.{{MLOperandDescriptor/dimensions}}: 1. Set |elementLength| to |elementLength| × |dimension|. 1. Let |elementSize| be the [=element size=] of one of the {{ArrayBufferView}} types that matches |desc|.{{MLOperandDescriptor/type}} according to [this table](#appendices-mloperandtype-arraybufferview-compatibility). 1. Return |elementLength| × |elementSize|. -
+
+ ### The MLOperand interface ### {#api-mloperand} @@ -909,9 +925,9 @@ For instance, an {{MLOperand}} may represent a constant feeding to an operation interface MLOperand {}; -{{MLOperand}} has the following internal slots:
-
+{{MLOperand}} has the following internal slots: +
: \[[builder]] of type {{MLGraphBuilder}} :: The {{MLOperand}}'s associated builder object. @@ -931,7 +947,7 @@ interface MLOperand {}; : \[[operator]] of type [=object=] :: Reference to {{MLOperand}}'s corresponding [=implementation-defined=] platform operator object. -
+
To get the rank of an {{MLOperand}} |operand|, run the following steps: @@ -944,44 +960,60 @@ Since the {{MLOperand/[[builder]]}} object is bound by the {{MLGraphBuilder/cons #### Creating {{MLOperand}} #### {#api-mloperand-create} The {{MLOperand}} objects are created by the methods of {{MLGraphBuilder}}, internally using the following algorithms. -To create MLOperand given |builder| and |desc|, run the following steps: -
+
+ + To create MLOperand given |builder| and |desc|, run the following steps: + +
1. If |builder| is not an instance of {{MLGraphBuilder}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |desc| is not an [=object=] that [=implements=] {{MLOperandDescriptor}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. Let |operand| be a new [=object=]. 1. Set |operand|.{{MLOperand/[[builder]]}} to |builder|. 1. Set |operand|.{{MLOperand/[[descriptor]]}} to |desc|. 1. Return |operand|. -
+
+ -To copy MLOperand given |operand|, run the following steps: -
+
+ + To copy MLOperand given |operand|, run the following steps: + +
1. If |operand| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" and stop. 1. Let |result| be a new [=object=]. 1. Set |result|.{{MLOperand/[[builder]]}} to |operand|.{{MLOperand/[[builder]]}}. 1. Set |result|.{{MLOperand/[[descriptor]]}} to |operand|.{{MLOperand/[[descriptor]]}}. 1. If |operand|.{{MLOperand/[[name]]}} [=map/exists=], then set |result|.{{MLOperand/[[name]]}} to |operand|.{{MLOperand/[[name]]}}. 1. Return |result|. -
+
+ -To check dimensions given |dimensions| and |type|, run the following steps: -
+
+ + To check dimensions given |dimensions| and |type|, run the following steps: + +
1. If |dimensions| is not an array of positive numbers, return `false`; 1. If |dimensions|.length is 0, return `false`. 1. If |dimensions|.length is too large to be supported by the implementation, return `false`. 1. If any element of |dimensions| is not a positive number, or it is too large to be supported by the implementation given |type|, return `false`. 1. Return `true`. -
+
+ -To validate MLOperand given |operand| and |builder|, run the following steps: -
+
+ + To validate MLOperand given |operand| and |builder|, run the following steps: + +
1. If |operand|.{{MLOperand/[[builder]]}} is not an instance of {{MLGraphBuilder}}, return `false`. 1. If |builder| is not `undefined` and is not equal to |operand|.{{MLOperand/[[builder]]}}, return `false`. 1. Let |desc| be |operand|.{{MLOperand/[[descriptor]]}}. 1. If |desc| is not an [=object=] that [=implements=] {{MLOperandDescriptor}}, return `false`. 1. If |desc|.{{MLOperandDescriptor/dimensions}} [=map/exists=] and invoking check dimensions given |desc|.{{MLOperandDescriptor/dimensions}} and |desc|.{{MLOperandDescriptor/type}} returns `false`, then return `false`. 1. Return `true`. -
+
+ ### The MLActivation interface ### {#api-mlactivation} @@ -1073,9 +1105,9 @@ typedef record MLNamedArrayBufferViews; interface MLContext {}; +
{{MLContext}} has the following internal slots: - -
+
: \[[contextType]] of type [=context type=] :: The {{MLContext}}'s [=context type=]. @@ -1085,19 +1117,26 @@ interface MLContext {}; : \[[powerPreference]] of type [=power preference=] :: The {{MLContext}}'s [=power preference=]. -
+
+
When the {{[[contextType]]}} is set to [=default-context|default=] with the {{MLContextOptions}}.{{deviceType}} set to [=device-type-gpu|gpu=], the user agent is responsible for creating an internal GPU device that operates within the context and is capable of ML workload submission on behalf of the calling application. In this setting however, only {{ArrayBufferView}} inputs and outputs are allowed in and out of the graph execution since the application has no way to know what type of internal GPU device is being created on their behalf. In this case, the user agent is responsible for automatic uploads and downloads of the inputs and outputs to and from the GPU memory using this said internal device.
### The {{MLContext}} validation algorithm ### {#api-mlcontext-validate} +
+ To validate {{MLContext}}, given |context|, run these steps: + +
1. If |context|.{{[[contextType]]}} is not "[=webgpu-context|webgpu=]" or "[=default-context|default=]", return `false`. 1. If |context|.{{[[deviceType]]}} is not "[=device-type-cpu|cpu=]" or "[=device-type-gpu|gpu=]", return `false`. 1. If |context|.{{[[powerPreference]]}} is not "[=power-preference-default|default=]" or "[=power-preference-high-performance|high-performance=]" or "[=power-preference-low-power|low-power=]", return `false`. 1. If the user agent cannot support |context|.{{[[contextType]]}}, |context|.{{[[deviceType]]}} and |context|.{{[[powerPreference]]}}, return `false`. 1. Return `true`. +
+
### Synchronous Execution ### {#api-mlcontext-sync-execution} Synchronously carries out the computational workload of a compiled graph {{MLGraph}} on the calling thread, which must be a worker thread, to produce results as defined by the operations in the graph. This method of execution requires an {{MLContext}} created with {{MLContextOptions}}. Otherwise, it throws an "{{OperationError}}" {{DOMException}}. @@ -1164,7 +1203,10 @@ To validate buffer with descriptor given |bufferView| and |descriptor #### Examples #### {#api-mlcontext-sync-execution-examples}
+
+ The following code showcases the synchronous computation with optional outputs in a worker. +
 const context = navigator.ml.createContextSync();
 
@@ -1196,9 +1238,14 @@ context.computeSync(graph, inputs, {'e': bufferE});
 console.log(`values: ${bufferE}`);
 
+ -### The {{MLNamedArrayBufferViews}} transfer algorithm ### {#mlnamedarraybufferviews-transfer} +### The {{MLNamedArrayBufferViews}} transfer algorithm ### {#mlnamedarraybufferviews-transfer-alg} +
+ To transfer an {{MLNamedArrayBufferViews}} |views|: + +
1. Let |transferredViews| be a new {{MLNamedArrayBufferViews}}. 1. For each |key| -> |value| of |views|: 1. Let |transferredBuffer| be the result of [=ArrayBuffer/transfer|transferring=] the [=underlying buffer=] of |value|. @@ -1207,6 +1254,8 @@ To transfer an {{MLNamedArrayBufferView 1. Let |transferredView| be [=Construct=](|constructor|, |transferredBuffer|, |value|.\[[ByteOffset]], |elementsNumber|). 1. Set |transferredViews|[|key|] to |transferredView|. 1. Return |transferredViews|. +
+
### Asynchronous Execution ### {#api-mlcontext-async-execution} Asynchronously carries out the computational workload of a compiled graph {{MLGraph}} on a separate timeline, either on a worker thread for the CPU execution, or on a GPU timeline for the submission of GPU workload on the command queue. The asynchronous nature of this call avoids blocking the calling thread while the computation for result is ongoing. This method of execution requires an {{MLContext}} created with {{MLContextOptions}}. Otherwise, it throws an "{{OperationError}}" {{DOMException}}. @@ -1284,7 +1333,10 @@ partial interface MLContext { #### Examples #### {#api-mlcontext-async-execution-examples}
+
+ The following code showcases the asynchronous computation. +
 const operandType = {type: 'float32', dimensions: [2, 2]};
 const context = await navigator.ml.createContext();
@@ -1309,6 +1361,7 @@ console.log('Output value: ' + result.outputs.C);
 // Note: the result.outputs.C buffer is different from the bufferC, but it
 // shares the same backing memory allocation.
 
+
### WebGPU Interoperability ### {#api-mlcontext-webgpu-interop} @@ -1336,9 +1389,9 @@ typedef record MLNamedGPUResources; interface MLCommandEncoder {}; +
{{MLCommandEncoder}} has the following internal slots: - -
+
: \[[context]] of type {{MLContext}} :: The context of type {{MLContext}} associated with this {{MLCommandEncoder}}. @@ -1346,7 +1399,8 @@ interface MLCommandEncoder {}; : \[[implementation]] :: The underlying implementation provided by the User Agent. -
+
+
### Graph Initialization ### {#api-mlcommandencoder-graph-initialization} Record the initialization of the {{MLGraph}}. This is a necessary step for optimal performance during graph execution as it gives the platform an opportunity to prepare and optimize constant input data for the subsequent execution of the graph. This method should only be called once per graph. @@ -1364,9 +1418,16 @@ partial interface MLCommandEncoder { **Returns:** {{undefined}}.
-
-Graph initialization stage typically involves a process known as "weight preprocessing" where all the constant inputs to the graph are preprocessed and cached at the operating system level for subsequent graph execution calls. The initializing inputs are typically the constant weight data specified through the {{MLGraphBuilder/constant(descriptor, bufferView)|MLGraphBuilder/constant()}} method as constant operands during graph construction time. -
+
+ + The {{MLCommandEncoder/initializeGraph(graph)}} steps are: + +
+
+ Graph initialization stage typically involves a process known as "weight preprocessing" where all the constant inputs to the graph are preprocessed and cached at the operating system level for subsequent graph execution calls. The initializing inputs are typically the constant weight data specified through the {{MLGraphBuilder/constant(descriptor, bufferView)|MLGraphBuilder/constant()}} method as constant operands during graph construction time. +
+
+
### Dispatch Execution Commands ### {#api-mlcommandencoder-dispatch-commands} Record the {{MLGraph}} execution with the inputs {{MLNamedGPUResources}} and outputs {{MLNamedGPUResources}}. @@ -1475,22 +1536,33 @@ Both {{MLGraphBuilder}}.{{MLGraphBuilder/build()}} and {{MLGraphBuilder}}.{{MLGr ### The {{MLGraphBuilder}} constructor ### {#api-mlgraphbuilder-constructor} -The [=new=] {{MLGraphBuilder}} constructor steps are: -1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, throw a "{{SecurityError}}" {{DOMException}} and abort these steps. -1. Let |context| be the first argument. -1. If the validate MLContext steps given |context| return `false`, throw a "{{TypeError}}" and abort these steps. -1. Set {{MLGraphBuilder/[[context]]}} to |context|. +
+ + The [=new=] {{MLGraphBuilder}} constructor steps are: + +
+ 1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, throw a "{{SecurityError}}" {{DOMException}} and abort these steps. + 1. Let |context| be the first argument. + 1. If the validate MLContext steps given |context| return `false`, throw a "{{TypeError}}" and abort these steps. + 1. Set {{MLGraphBuilder/[[context]]}} to |context|. +
+
### The {{MLGraphBuilder/input()}} method ### {#api-mlgraphbuilder-input} Create a named {{MLOperand}} based on a descriptor, that can be used as an input. -
+ +
**Arguments:** - *name*: a [=string=] name of the input. - *descriptor*: an {{MLOperandDescriptor}} object. **Returns:**: an {{MLOperand}} object.
-
+ +
+ The {{MLGraphBuilder/input(name, descriptor)}} steps are: + +
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
@@ -1507,20 +1579,26 @@ Create a named {{MLOperand}} based on a descriptor, that can be used as an input 1. Set |operand|.{{MLOperand/[[name]]}} to |name|. 1. Make a request to the underlying platform to register |operand| as an input and store a reference to the corresponding [=implementation-defined=] platform object in |operand|.{{MLOperand/[[operand]]}}. 1. If that fails, throw an "{{OperationError}}" {{DOMException}} and abort these steps. + 1. Return |operand|. +
+
### The constant() method ### {#api-mlgraphbuilder-constant-method} Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. #### The {{MLGraphBuilder/constant(descriptor, bufferView)}} method #### {#api-mlgraphbuilder-constant} -
+
**Arguments:** - *descriptor*: an {{MLOperandDescriptor}} object - *bufferView*: an {{MLBufferView}} **Returns:**: an {{MLOperand}} object.
-
-The {{MLGraphBuilder/constant(descriptor, bufferView)}} steps are: +
+ + The {{MLGraphBuilder/constant(descriptor, bufferView)}} steps are: + +
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
@@ -1536,19 +1614,23 @@ The {{MLGraphBuilder/constant(descriptor, bufferView)}} steps are: 1. Make a request to the underlying platform to register |operand| as a tensor constant with |bytes| as value and store a reference to the corresponding [=implementation-defined=] object to |operand|.{{MLOperand/[[operand]]}}. 1. If that fails, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Return |operand|. -
+
+ #### The {{MLGraphBuilder/constant(value, type)}} method #### {#api-mlgraphbuilder-constant-value-type} -
+
**Arguments:** - *value*: a number - *type*: an optional {{MLOperandType}}, by default *"float32"*. **Returns:**: an {{MLOperand}} object.
-
-The {{MLGraphBuilder/constant(value, type)}} steps are: +
+ + The {{MLGraphBuilder/constant(value, type)}} steps are: + +
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
@@ -1566,7 +1648,8 @@ The {{MLGraphBuilder/constant(value, type)}} steps are: 1. Make a request to the underlying platform to register |operand| as a scalar constant with |value| as value and store a reference of the [=implementation-defined=] platform object for the corresponding (scalar or tensor constant) operand to |operand|.{{MLOperand/[[operand]]}}. 1. If that throws, re-throw the error and stop. 1. Return |operand|. -
+
+ ### The batchNormalization() method ### {#api-mlgraphbuilder-batchnorm} Normalize the tensor values of input features across the batch dimension using [[Batch-Normalization]]. For each input feature, the mean and variance values of that feature supplied in this calculation as parameters are previously computed across the batch dimension of the input during the model training phase of this operation. @@ -1609,23 +1692,21 @@ partial interface MLGraphBuilder { An {{MLActivation}} object. Specifies the optional activation function that immediately follows the normalization operation. -
+
**Arguments:** - *input*: an {{MLOperand}}. The input N-D tensor. - *mean*: an {{MLOperand}}. Specifies the 1-D tensor of the mean values of the input features across the batch whose length is equal to the size of the input dimension denoted by {{MLBatchNormalizationOptions/axis}}. - *variance*: an {{MLOperand}}. The 1-D tensor of the variance values of the input features across the batch whose length is equal to the size of the input dimension denoted by {{MLBatchNormalizationOptions/axis}}. - *options*: an optional {{MLBatchNormalizationOptions}}. Specifies the optional parameters of the operation. - - *scale*: an {{MLOperand}}. The 1-D tensor of the scaling values whose length is equal to the size of the input dimension denoted by *options.axis*. - - *bias*: an {{MLOperand}}. The 1-D tensor of the bias values whose length is equal to the size of the input dimension denoted by *options.axis*. - - *axis*: an {{unsigned long}} scalar. The index to the feature count dimension of the input shape for which the mean and variance values are. Its value must be in the range [0, N-1] where N is the rank of input tensor. When it's not specified, the default value is 1. - - *epsilon*: a {{float}} scalar. A small value to prevent computational error due to divide-by-zero. The default value is 0.00001 when not specified. - - *activation*: an {{MLActivation}}. The optional activation function that immediately follows the normalization operation. **Returns:** an {{MLOperand}}. The batch-normalized N-D tensor of the same shape as the input tensor.
-
+
+ The {{MLGraphBuilder/batchNormalization()}} method steps are: + +
1. Let |input| be the first argument. To validate |input|, run these substeps: 1. If |input| is not an [=object=] that [=implements=] {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. 1. Let |mean| be the second argument, representing a vector with the moving mean values for |input|. To validate |mean|, run the following substeps: @@ -1638,12 +1719,13 @@ partial interface MLGraphBuilder { 1. If |input| is a 4-D tensor of the *"nchw"* layout, set |options|.axis to 1. 1. If |input| is a 4-D tensor of the *"nhwc"* layout, set |options|.axis to 3. 1. Let |result| be an {{MLOperand}} representing the results. It may use the same underlying data as |input|. - 1. Issue a request to the underlying platform to initialize the batch normalization, given |input|, |mean|, |variance|, |options| and |result| to store the results and |options|. Wait for completion. + 1. Issue a request to the underlying platform to initialize the batch normalization, given |result| to store the results and |options|. Wait for completion.
1. If |options|.activation [=map/exists=], implementations MAY use it to optimize the operation flow.
1. Return |result|. -
+
+
The behavior of this operation when the input tensor is 4-D of the *"nchw"* layout and the activation is of operator type *relu* can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. @@ -1679,11 +1761,14 @@ partial interface MLGraphBuilder {
+
+ The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. -
+  
+
     if (options.minValue === undefined) {
       if (options.maxValue === undefined) {
         return x;
@@ -1699,17 +1784,23 @@ partial interface MLGraphBuilder {
             builder.constant(options.maxValue));
       }
     }
-    
+
-To check clamp options given |options|, run the following steps: - 1. If |options| is not an object that [=implements=] {{MLClampOptions}}, then return `false`. +
+ + To check clamp options given |options|, run the following steps: + +
+ 1. If |options| is not an [=object=] that [=implements=] {{MLClampOptions}}, then return `false`. 1. If |options|.{{MLClampOptions/minValue}} and |options|.{{MLClampOptions/maxValue}} are not a [=numeric type=], then return `false`. 1. If |options|.{{MLClampOptions/minValue}} is greater than |options|.{{MLClampOptions/maxValue}}, then return `false`. 1. Return `true`. +</div>
+
#### The {{MLGraphBuilder/clamp(operand, options)}} method #### {#api-mlgraphbuilder-clamp-operand-options} -
+
**Arguments:** - *operand*: an {{MLOperand}}. The input tensor. - *options*: an optional {{MLClampOptions}}. The optional parameters of the operation. @@ -1718,8 +1809,12 @@ To check clamp options given |options|, run the following steps: **Returns:** - an {{MLOperand}}. The output tensor of the same shape as *operand*.
-
+ +
+ The {{MLGraphBuilder/clamp(operand, options)}} method steps are: + +
1. Let |operand| be the first argument. 1. Let |options| be the second argument. 1. If running the check clamp options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. @@ -1733,10 +1828,11 @@ To check clamp options given |options|, run the following steps: 1. Register the |result|.{{MLOperand/[[operand]]}} as output to |operatorImpl|. 1. Store a reference to |operatorImpl| in |result|.{{MLOperand/[[operator]]}}. 1. Return |result|. -
+
+ #### The {{MLGraphBuilder/clamp(options)}} method #### {#api-mlgraphbuilder-clamp-options} -
+
**Arguments:** - *options*: an optional {{MLClampOptions}}. The optional parameters of the operation. - *minValue*: a {{float}} scalar. Specifies the minimum value of the range. When it is not specified, the clamping is not performed on the lower limit of the range. @@ -1744,8 +1840,12 @@ To check clamp options given |options|, run the following steps: **Returns:** - an {{MLActivation}}. The operator representing the clamp operation.
-
+ +
+ The {{MLGraphBuilder/clamp(options)}} method steps are: + +
1. Let |options| be the first argument. 1. If running the check clamp options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. 1. Let |op| be the result of invoking the create MLActivation steps with `"clamp"` and |options|. @@ -1754,7 +1854,8 @@ To check clamp options given |options|, run the following steps: 1. Make a request to the underlying platform to connect |op| with the [=implementation-defined=] platform operator for clamp |operatorImpl|. 1. Store a reference to |operatorImpl| in |op|.{{MLActivation/[[operator]]}}. 1. Return |op|. -
+
+ ### The concat() method ### {#api-mlgraphbuilder-concat} Concatenates the input tensors along a given axis. @@ -1774,8 +1875,12 @@ partial interface MLGraphBuilder { that all the inputs concatenated along. The size of that dimension is computed as the sum of all the input sizes of the same dimension.
-
+ +
+ The {{MLGraphBuilder/concat(inputs, axis)}} steps are: + +
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
@@ -1804,7 +1909,8 @@ partial interface MLGraphBuilder { 1. Make a request to the underlying platform to create an operator for this method with |inputs| connected as input and |output| connected as output and store a reference to the [=implementation-defined=] platform object to |output|.{{MLOperand/[[operand]]}}. 1. If that fails, throw a "{{DataError}}" {{DOMException}} and stop. 1. Return |output|. -
+
+ ### The conv2d() method ### {#api-mlgraphbuilder-conv2d} Compute a 2-D convolution given 4-D input and filter tensors @@ -3339,12 +3445,15 @@ partial interface MLGraphBuilder { **Returns:** a sequence of {{MLOperand}}. The split output tensors. If *splits* is an {{unsigned long}}, the length of the output sequence equals to *splits*. The shape of each output tensor is the same as *input* except the dimension size of *axis* equals to the quotient of dividing the dimension size of *input* along *axis* by *splits*. If *splits* is a sequence of {{unsigned long}}, the length of the output sequence equals to the length of *splits*. The shape of the i-th output tensor is the same as *input* except along *axis* where the dimension size is *splits[i]*. -</div>
+
+
+ The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. -
+  
+
     // This sample shows the case that the splits parameter is an array.
     const outputs = [];
     let starts = Array(input_rank).fill(0);
@@ -3357,8 +3466,8 @@ partial interface MLGraphBuilder {
       start += size;
     }
     return outputs;
-    
-
+ +
### The squeeze() method ### {#api-mlgraphbuilder-squeeze} @@ -3441,6 +3550,8 @@ const context = await navigator.ml.createContext({powerPreference: 'low-power'})
+
+ The following code builds a graph as:
 constant1 ---+
@@ -3451,6 +3562,7 @@ constant2 ---+                                    |
              +--- Add ---> intermediateOutput2 ---+
 input2    ---+
 
+
 // Use tensors in 4 dimensions.
 const TENSOR_DIMS = [1, 2, 2, 2];
@@ -3484,6 +3596,7 @@ const intermediateOutput2 = builder.add(constant2, input2);
 // output is the output MLOperand of the Mul operation.
 const output = builder.mul(intermediateOutput1, intermediateOutput2);
 
+
@@ -3495,7 +3608,10 @@ const graph = await builder.build({'output': output});
-The following code executes the compiled graph. +
+ + The following code executes the compiled graph. +
 // Setup the input buffers with value 1.
 const inputBuffer1 = new Float32Array(TENSOR_SIZE).fill(1);
@@ -3513,6 +3629,7 @@ const result = await context.compute(graph, inputs, outputs);
 console.log('Output value: ' + result.outputs.output);
 // Output value: 2.25,2.25,2.25,2.25,2.25,2.25,2.25,2.25
 
+
# Appendices # {#appendices} From 75925de43c073d906dfcecc57859369a4610ac45 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Mon, 19 Jun 2023 23:06:33 +0300 Subject: [PATCH 022/112] Indent algorithms and fix make errors Signed-off-by: Zoltan Kis --- index.bs | 389 ++++++++++++++++++++++++++++--------------------------- 1 file changed, 198 insertions(+), 191 deletions(-) diff --git a/index.bs b/index.bs index 45ccec70..00bc14a7 100644 --- a/index.bs +++ b/index.bs @@ -67,19 +67,19 @@ urlPrefix: https://tc39.es/proposal-float16array/; spec: float16array
 {
-	"WEBGPU": {
-		"authors": [
-			"Dzmitry Malyshau",
-			"Kai Ninomiya"
-		],
-		"href": "https://gpuweb.github.io/gpuweb/",
-		"title": "WebGPU",
-		"status": "ED",
-		"publisher": "W3C",
-		"deliveredBy": [
-			"https://www.w3.org/2020/gpu/"
-		]
-	}
+    "WEBGPU": {
+        "authors": [
+            "Dzmitry Malyshau",
+            "Kai Ninomiya"
+        ],
+        "href": "https://gpuweb.github.io/gpuweb/",
+        "title": "WebGPU",
+        "status": "ED",
+        "publisher": "W3C",
+        "deliveredBy": [
+            "https://www.w3.org/2020/gpu/"
+        ]
+    }
 }
 
@@ -811,40 +811,40 @@ Its default allowlist is 'self'. ### The {{ML/createContext()}} method ### {#api-ml-createcontext}
-The {{ML/createContext()}} method steps are: + The {{ML/createContext()}} method steps are:
-1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, return [=a new promise=] [=rejected=] with a "{{SecurityError}}" {{DOMException}} and abort these steps. -1. Let |promise| be [=a new promise=]. -1. Return |promise| and run the following steps [=in parallel=]. -1. Let |options| be the first argument. -1. Run the create context steps given |options|: - 1. Let |context| be a new {{MLContext}} object. - 1. If |options| is a {{GPUDevice}} object, - 1. Set |context|.{{[[contextType]]}} to "[=webgpu-context|webgpu=]". - 1. Set |context|.{{[[deviceType]]}} to "[=device-type-gpu|gpu=]". - 1. Set |context|.{{[[powerPreference]]}} to "[=power-preference-default|default=]". - 1. Otherwise, - 1. Set |context|.{{[[contextType]]}} to "[=default-context|default=]". - 1. If |options|["{{deviceType}}"] [=map/exists=], then set |context|.{{[[deviceType]]}} to |options|["{{deviceType}}"]. Otherwise, set |context|.{{[[deviceType]]}} to "[=device-type-cpu|cpu=]". - 1. If |options|["{{powerPreference}}"] [=map/exists=], then set |context|.{{[[powerPreference]]}} to |options|["{{powerPreference}}"]. Otherwise, set |context|.{{[[powerPreference]]}} to "[=power-preference-default|default=]". -1. If the validate MLContext steps given |context| return `false`, [=reject=] |promise| with a "{{NotSupportedError}}" {{DOMException}} and abort these steps. -1. [=Resolve=] |promise| with |context|. + 1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, return [=a new promise=] [=rejected=] with a "{{SecurityError}}" {{DOMException}} and abort these steps. + 1. Let |promise| be [=a new promise=]. + 1. Return |promise| and run the following steps [=in parallel=]. + 1. Let |options| be the first argument. + 1. Run the create context steps given |options|: + 1. Let |context| be a new {{MLContext}} object. + 1. If |options| is a {{GPUDevice}} object, + 1. Set |context|.{{[[contextType]]}} to "[=webgpu-context|webgpu=]". + 1. Set |context|.{{[[deviceType]]}} to "[=device-type-gpu|gpu=]". + 1. Set |context|.{{[[powerPreference]]}} to "[=power-preference-default|default=]". + 1. Otherwise, + 1. Set |context|.{{[[contextType]]}} to "[=default-context|default=]". + 1. If |options|["{{deviceType}}"] [=map/exists=], then set |context|.{{[[deviceType]]}} to |options|["{{deviceType}}"]. Otherwise, set |context|.{{[[deviceType]]}} to "[=device-type-cpu|cpu=]". + 1. If |options|["{{powerPreference}}"] [=map/exists=], then set |context|.{{[[powerPreference]]}} to |options|["{{powerPreference}}"]. Otherwise, set |context|.{{[[powerPreference]]}} to "[=power-preference-default|default=]". + 1. If the validate MLContext steps given |context| return `false`, [=reject=] |promise| with a "{{NotSupportedError}}" {{DOMException}} and abort these steps. + 1. [=Resolve=] |promise| with |context|.
### The {{ML/createContextSync()}} method ### {#api-ml-createcontextsync}
- -The {{ML/createContextSync()}} method steps are: - -
-1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, throw a "{{SecurityError}}" {{DOMException}} and abort these steps. -1. Let |options| be the first argument. -1. Let |context| be the result of running the create context steps given |options|. -1. If the validate MLContext steps given |context| return `false`, throw a "{{NotSupportedError}}" {{DOMException}} and abort these steps. -1. Return |context|. -
+ + The {{ML/createContextSync()}} method steps are: + +
+ 1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, throw a "{{SecurityError}}" {{DOMException}} and abort these steps. + 1. Let |options| be the first argument. + 1. Let |context| be the result of running the create context steps given |options|. + 1. If the validate MLContext steps given |context| return `false`, throw a "{{NotSupportedError}}" {{DOMException}} and abort these steps. + 1. Return |context|. +
## The MLGraph interface ## {#api-mlgraph} @@ -1126,16 +1126,16 @@ When the {{[[contextType]]}} is set to [=default-context|default=] with the {{ML ### The {{MLContext}} validation algorithm ### {#api-mlcontext-validate}
- -To validate {{MLContext}}, given |context|, run these steps: - -
-1. If |context|.{{[[contextType]]}} is not "[=webgpu-context|webgpu=]" or "[=default-context|default=], return `false`. -1. If |context|.{{[[deviceType]]}} is not "[=device-type-cpu|cpu=]" or "[=device-type-gpu|gpu=]", return `false`. -1. If |context|.{{[[powerPreference]]}} is not "[=power-preference-default|default=]" or "[=power-preference-high-performance|high-performance=]" or "[=power-preference-low-power|low-power=]", return `false`. -1. If the user agent cannot support |context|.{{[[contextType]]}}, |context|.{{[[deviceType]]}} and |context|.{{[[powerPreference]]}}, return `false`. -1. Return `true`; -
+ + To validate {{MLContext}}, given |context|, run these steps: + +
+ 1. If |context|.{{[[contextType]]}} is not "[=webgpu-context|webgpu=]" or "[=default-context|default=]", return `false`. + 1. If |context|.{{[[deviceType]]}} is not "[=device-type-cpu|cpu=]" or "[=device-type-gpu|gpu=]", return `false`. + 1. If |context|.{{[[powerPreference]]}} is not "[=power-preference-default|default=]" or "[=power-preference-high-performance|high-performance=]" or "[=power-preference-low-power|low-power=]", return `false`. + 1. If the user agent cannot support |context|.{{[[contextType]]}}, |context|.{{[[deviceType]]}} and |context|.{{[[powerPreference]]}}, return `false`. + 1. Return `true`. +</div>
### Synchronous Execution ### {#api-mlcontext-sync-execution} @@ -1204,57 +1204,57 @@ To validate buffer with descriptor given |bufferView| and |descriptor
- -The following code showcases the synchronous computation with optional outputs in a worker. - -
-const context = navigator.ml.createContextSync();
-
-// Build a graph with two outputs.
-const builder = new MLGraphBuilder(context);
-const descA = {type: 'float32', dimensions: [3, 4]};
-const a = builder.input('a', descA);
-const descB = {type: 'float32', dimensions: [4, 3]};
-const bufferB = new Float32Array(sizeOfShape(descB.dimensions)).fill(0.5);
-const b = builder.constant(descB, bufferB);
-const descC = {type: 'float32', dimensions: [3, 3]};
-const bufferC = new Float32Array(sizeOfShape(descC.dimensions)).fill(1);
-const c = builder.constant(descC, bufferC);
-const d = builder.matmul(a, b);
-const e = builder.add(d, c);
-const graph = builder.buildSync({'d': d, 'e': e});
-
-const bufferA = new Float32Array(sizeOfShape(descA.dimensions)).fill(0.5);
-const inputs = {'a': bufferA};
-
-// Compute d.
-const bufferD = new Float32Array(sizeOfShape([3, 3]));
-context.computeSync(graph, inputs, {'d': bufferD});
-console.log(`values: ${bufferD}`);
-
-// Compute e.
-const bufferE = new Float32Array(sizeOfShape([3, 3]));
-context.computeSync(graph, inputs, {'e': bufferE});
-console.log(`values: ${bufferE}`);
-
-
+ + The following code showcases the synchronous computation with optional outputs in a worker. + +
+    const context = navigator.ml.createContextSync();
+
+    // Build a graph with two outputs.
+    const builder = new MLGraphBuilder(context);
+    const descA = {type: 'float32', dimensions: [3, 4]};
+    const a = builder.input('a', descA);
+    const descB = {type: 'float32', dimensions: [4, 3]};
+    const bufferB = new Float32Array(sizeOfShape(descB.dimensions)).fill(0.5);
+    const b = builder.constant(descB, bufferB);
+    const descC = {type: 'float32', dimensions: [3, 3]};
+    const bufferC = new Float32Array(sizeOfShape(descC.dimensions)).fill(1);
+    const c = builder.constant(descC, bufferC);
+    const d = builder.matmul(a, b);
+    const e = builder.add(d, c);
+    const graph = builder.buildSync({'d': d, 'e': e});
+
+    const bufferA = new Float32Array(sizeOfShape(descA.dimensions)).fill(0.5);
+    const inputs = {'a': bufferA};
+
+    // Compute d.
+    const bufferD = new Float32Array(sizeOfShape([3, 3]));
+    context.computeSync(graph, inputs, {'d': bufferD});
+    console.log(`values: ${bufferD}`);
+
+    // Compute e.
+    const bufferE = new Float32Array(sizeOfShape([3, 3]));
+    context.computeSync(graph, inputs, {'e': bufferE});
+    console.log(`values: ${bufferE}`);
+  
+
### The {{MLNamedArrayBufferViews}} transfer algorithm ### {#mlnamedarraybufferviews-transfer-alg}
- -To transfer an {{MLNamedArrayBufferViews}} |views|: - -
-1. Let |transferredViews| be a new {{MLNamedArrayBufferViews}}. -1. For each |key| -> |value| of |views|: - 1. Let |transferredBuffer| be the result of [=ArrayBuffer/transfer|transferring=] the [=underlying buffer=] of |value|. - 1. Let |constructor| be the appropriate [=view constructor=] for the type of {{ArrayBufferView}} |value|. - 1. Let |elementsNumber| be the result of the [=buffer byte length|byte length=] of |value| ÷ [=element size=] of |value|. - 1. Let |transferredView| be [=Construct=](|constructor|, |transferredBuffer|, |value|.\[[ByteOffset]], |elementsNumber|). - 1. Set |transferredViews|[|key|] to |transferredView|. -1. Return |transferredViews|. -
+ + To transfer an {{MLNamedArrayBufferViews}} |views|: + +
+ 1. Let |transferredViews| be a new {{MLNamedArrayBufferViews}}. + 1. For each |key| -> |value| of |views|: + 1. Let |transferredBuffer| be the result of [=ArrayBuffer/transfer|transferring=] the [=underlying buffer=] of |value|. + 1. Let |constructor| be the appropriate [=view constructor=] for the type of {{ArrayBufferView}} |value|. + 1. Let |elementsNumber| be the result of the [=buffer byte length|byte length=] of |value| ÷ [=element size=] of |value|. + 1. Let |transferredView| be [=Construct=](|constructor|, |transferredBuffer|, |value|.\[[ByteOffset]], |elementsNumber|). + 1. Set |transferredViews|[|key|] to |transferredView|. + 1. Return |transferredViews|. +
### Asynchronous Execution ### {#api-mlcontext-async-execution} @@ -1334,33 +1334,33 @@ partial interface MLContext {
- -The following code showcases the asynchronous computation. - -
-const operandType = {type: 'float32', dimensions: [2, 2]};
-const context = await navigator.ml.createContext();
-const builder = new MLGraphBuilder(context);
-// 1. Create a computational graph 'C = 0.2 * A + B'.
-const constant = builder.constant(0.2);
-const A = builder.input('A', operandType);
-const B = builder.input('B', operandType);
-const C = builder.add(builder.mul(A, constant), B);
-// 2. Compile it into an executable.
-const graph = await builder.build({'C': C});
-// 3. Bind inputs to the graph and execute for the result.
-const bufferA = new Float32Array(4).fill(1.0);
-const bufferB = new Float32Array(4).fill(0.8);
-const bufferC = new Float32Array(4);
-const inputs = {'A': bufferA, 'B': bufferB};
-const outputs = {'C': bufferC};
-const result = await context.compute(graph, inputs, outputs);
-// The computed result of [[1, 1], [1, 1]] is in the buffer associated with
-// the output operand.
-console.log('Output value: ' + result.outputs.C);
-// Note: the result.outputs.C buffer is different from the bufferC, but it
-// shares the same backing memory allocation.
-
+ + The following code showcases the asynchronous computation. + +
+    const operandType = {type: 'float32', dimensions: [2, 2]};
+    const context = await navigator.ml.createContext();
+    const builder = new MLGraphBuilder(context);
+    // 1. Create a computational graph 'C = 0.2 * A + B'.
+    const constant = builder.constant(0.2);
+    const A = builder.input('A', operandType);
+    const B = builder.input('B', operandType);
+    const C = builder.add(builder.mul(A, constant), B);
+    // 2. Compile it into an executable.
+    const graph = await builder.build({'C': C});
+    // 3. Bind inputs to the graph and execute for the result.
+    const bufferA = new Float32Array(4).fill(1.0);
+    const bufferB = new Float32Array(4).fill(0.8);
+    const bufferC = new Float32Array(4);
+    const inputs = {'A': bufferA, 'B': bufferB};
+    const outputs = {'C': bufferC};
+    const result = await context.compute(graph, inputs, outputs);
+    // The computed result of [[1, 1], [1, 1]] is in the buffer associated with
+    // the output operand.
+    console.log('Output value: ' + result.outputs.C);
+    // Note: the result.outputs.C buffer is different from the bufferC, but it
+    // shares the same backing memory allocation.
+  
@@ -1411,7 +1411,7 @@ partial interface MLCommandEncoder { }; -
+
**Arguments:** - *graph*: an {{MLGraph}}. The compiled graph to be initialized with graph constant inputs. @@ -1438,16 +1438,22 @@ partial interface MLCommandEncoder { }; -
+
**Arguments:** - *graph*: an {{MLGraph}}. The compiled graph to be executed. - *inputs*: an {{MLNamedGPUResources}}. The resources of inputs. - *outputs*: an {{MLNamedGPUResources}}. The pre-allocated resources of required outputs. **Returns:** {{undefined}}. +
- 1. If any of the following requirements are unmet, then throw a "{{DataError}}" {{DOMException}} and stop. -
+
+ + The {{MLCommandEncoder/dispatch(graph, inputs, outputs)}} steps are: + +
+ 1. If any of the following requirements are unmet, then throw a "{{DataError}}" {{DOMException}} and stop. +
1. For each |key| -> |value| of |inputs|: 1. |graph|.{{MLGraph/[[inputDescriptors]]}}[|key|] must [=map/exist=]. 1. Let |inputDesc| be |graph|.{{MLGraph/[[inputDescriptors]]}}[|key|]. @@ -1459,16 +1465,16 @@ partial interface MLCommandEncoder { 1. If |value| is a {{GPUBuffer}}, then: 1. |value|.{{GPUBuffer/size}} must equal to [=byte length=] of |outputDesc|.
- - 1. For each |key| -> |value| of |inputs|: - 1. Set the input of |graph|.{{MLGraph/[[implementation]]}} that is associated with |key| to |value|. - 1. For each |key| -> |value| of |outputs|: - 1. Set the output of |graph|.{{MLGraph/[[implementation]]}} that is associated with |key| to |value|. - 1. Issue a compute request of |graph|.{{MLGraph/[[implementation]]}}. - 1. If there is an error returned by |graph|.{{MLGraph/[[implementation]]}}, then: - 1. Throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Return {{undefined}}. -
+ 1. For each |key| -> |value| of |inputs|: + 1. Set the input of |graph|.{{MLGraph/[[implementation]]}} that is associated with |key| to |value|. + 1. For each |key| -> |value| of |outputs|: + 1. Set the output of |graph|.{{MLGraph/[[implementation]]}} that is associated with |key| to |value|. + 1. Issue a compute request of |graph|.{{MLGraph/[[implementation]]}}. + 1. If there is an error returned by |graph|.{{MLGraph/[[implementation]]}}, then: + 1. Throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Return {{undefined}}. +
+ ### Generate GPU Command Buffer ### {#api-mlcommandencoder-generate-gpu-command-buffer} Complete the recording of ML workload and return a WebGPU-compatible {{GPUCommandBuffer}} containing the recorded workload. @@ -1719,7 +1725,7 @@ partial interface MLGraphBuilder { 1. If |input| is a 4-D tensor of the *"nchw"* layout, set |options|.axis to 1. 1. If |input| is a 4-D tensor of the *"nhwc"* layout, set |options|.axis to 3. 1. Let |result| be an {{MLOperand}} representing the results. It may use the same underlying data as |input|. - 1. Issue a request to the underlying platform to initialize the batch normalization, given |result| to store the results and |options|. Wait for completion. + 1. Issue a request to the underlying platform to initialize the batch normalization, passing the arguments |input|, |mean|, |variance| and |options| and given |result| to store the results and |options|. Wait for completion.
1. If |options|.activation [=map/exists=], implementations MAY use it to optimize the operation flow.
@@ -3550,52 +3556,53 @@ const context = await navigator.ml.createContext({powerPreference: 'low-power'})
-
- -The following code builds a graph as: +Given the following build graph:
-constant1 ---+
-             +--- Add ---> intermediateOutput1 ---+
-input1    ---+                                    |
-                                                  +--- Mul---> output
-constant2 ---+                                    |
-             +--- Add ---> intermediateOutput2 ---+
-input2    ---+
+    constant1 ---+
+                +--- Add ---> intermediateOutput1 ---+
+    input1    ---+                                    |
+                                                    +--- Mul---> output
+    constant2 ---+                                    |
+                +--- Add ---> intermediateOutput2 ---+
+    input2    ---+
 
-
-
-// Use tensors in 4 dimensions.
-const TENSOR_DIMS = [1, 2, 2, 2];
-const TENSOR_SIZE = 8;
+
+ + The following code implements the graph: + +
+    // Use tensors in 4 dimensions.
+    const TENSOR_DIMS = [1, 2, 2, 2];
+    const TENSOR_SIZE = 8;
 
-const builder = new MLGraphBuilder(context);
+    const builder = new MLGraphBuilder(context);
 
-// Create MLOperandDescriptor object.
-const desc = {type: 'float32', dimensions: TENSOR_DIMS};
+    // Create MLOperandDescriptor object.
+    const desc = {type: 'float32', dimensions: TENSOR_DIMS};
 
-// constant1 is a constant MLOperand with the value 0.5.
-const constantBuffer1 = new Float32Array(TENSOR_SIZE).fill(0.5);
-const constant1 = builder.constant(desc, constantBuffer1);
+    // constant1 is a constant MLOperand with the value 0.5.
+    const constantBuffer1 = new Float32Array(TENSOR_SIZE).fill(0.5);
+    const constant1 = builder.constant(desc, constantBuffer1);
 
-// input1 is one of the input MLOperands. Its value will be set before execution.
-const input1 = builder.input('input1', desc);
+    // input1 is one of the input MLOperands. Its value will be set before execution.
+    const input1 = builder.input('input1', desc);
 
-// constant2 is another constant MLOperand with the value 0.5.
-const constantBuffer2 = new Float32Array(TENSOR_SIZE).fill(0.5);
-const constant2 = builder.constant(desc, constantBuffer2);
+    // constant2 is another constant MLOperand with the value 0.5.
+    const constantBuffer2 = new Float32Array(TENSOR_SIZE).fill(0.5);
+    const constant2 = builder.constant(desc, constantBuffer2);
 
-// input2 is another input MLOperand. Its value will be set before execution.
-const input2 = builder.input('input2', desc);
+    // input2 is another input MLOperand. Its value will be set before execution.
+    const input2 = builder.input('input2', desc);
 
-// intermediateOutput1 is the output of the first Add operation.
-const intermediateOutput1 = builder.add(constant1, input1);
+    // intermediateOutput1 is the output of the first Add operation.
+    const intermediateOutput1 = builder.add(constant1, input1);
 
-// intermediateOutput2 is the output of the second Add operation.
-const intermediateOutput2 = builder.add(constant2, input2);
+    // intermediateOutput2 is the output of the second Add operation.
+    const intermediateOutput2 = builder.add(constant2, input2);
 
-// output is the output MLOperand of the Mul operation.
-const output = builder.mul(intermediateOutput1, intermediateOutput2);
-
+ // output is the output MLOperand of the Mul operation. + const output = builder.mul(intermediateOutput1, intermediateOutput2); +
@@ -3612,23 +3619,23 @@ const graph = await builder.build({'output': output}); The following code executes the compiled graph. -
-// Setup the input buffers with value 1.
-const inputBuffer1 = new Float32Array(TENSOR_SIZE).fill(1);
-const inputBuffer2 = new Float32Array(TENSOR_SIZE).fill(1);
-const outputBuffer = new Float32Array(TENSOR_SIZE);
-
-// Execute the compiled graph with the specified inputs.
-const inputs = {
-  'input1': inputBuffer1,
-  'input2': inputBuffer2,
-};
-const outputs = {'output': outputBuffer};
-const result = await context.compute(graph, inputs, outputs);
-
-console.log('Output value: ' + result.outputs.output);
-// Output value: 2.25,2.25,2.25,2.25,2.25,2.25,2.25,2.25
-
+
+    // Setup the input buffers with value 1.
+    const inputBuffer1 = new Float32Array(TENSOR_SIZE).fill(1);
+    const inputBuffer2 = new Float32Array(TENSOR_SIZE).fill(1);
+    const outputBuffer = new Float32Array(TENSOR_SIZE);
+
+    // Execute the compiled graph with the specified inputs.
+    const inputs = {
+    'input1': inputBuffer1,
+    'input2': inputBuffer2,
+    };
+    const outputs = {'output': outputBuffer};
+    const result = await context.compute(graph, inputs, outputs);
+
+    console.log('Output value: ' + result.outputs.output);
+    // Output value: 2.25,2.25,2.25,2.25,2.25,2.25,2.25,2.25
+  
From 8fac5ca49b10e2c5697b6dab871f0c5a25d2a877 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Tue, 20 Jun 2023 21:42:00 +0300 Subject: [PATCH 023/112] Align the algorithms for input, constant, clamp, concat, batch norm etc Signed-off-by: Zoltan Kis --- index.bs | 66 +++++++++++++++++++++++++++++++++++--------------------- 1 file changed, 41 insertions(+), 25 deletions(-) diff --git a/index.bs b/index.bs index 00bc14a7..9562e3c1 100644 --- a/index.bs +++ b/index.bs @@ -1063,8 +1063,10 @@ The {{MLActivation}} objects (including the ones passed as input to methods) are 1. Set |activation|.{{MLActivation/[[builder]]}} to |builder|. 1. Set |activation|.{{MLActivation/[[name]]}} to |name|. 1. If |options| is an [=object=], set |activation|.{{MLActivation/[[options]]}} to |options|. - 1. Make a request to the underlying platform to bind the [=implementation-defined=] platform operator for |name| to |activation|.{{MLActivation/[[operator]]}}. - 1. If that fails, throw a "{{TypeError}}" and abort these steps. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Make a request to the underlying platform to: + 1. Create an [=implementation-defined=] platform operator |opImpl| defined for |name| to |activation|.{{MLActivation/[[operator]]}}, given |options|. + 1. Store a reference of |opImpl| in |activation|.{{MLActivation/[[operator]]}}. 1. Return |activation|.
@@ -1583,8 +1585,11 @@ Create a named {{MLOperand}} based on a descriptor, that can be used as an input 1. Let |operand| be the result of invoking the create MLOperand steps with [=this=] and |descriptor|. 1. If that throws, re-throw the exception and stop. 1. Set |operand|.{{MLOperand/[[name]]}} to |name|. - 1. Make a request to the underlying platform to register |operand| as an input and store a reference to the corresponding [=implementation-defined=] platform object in |operand|.{{MLOperand/[[operand]]}}. - 1. If that fails, throw an "{{OperationError}}" {{DOMException}} and abort these steps. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Make a request to the underlying platform to: + 1. Create an [=implementation-defined=] platform input operand |operandImpl| as an input given |operand|. + 1. Register |operand| as an input. + 1. Store a reference to |operandImpl| in |operand|.{{MLOperand/[[operand]]}}. 1. Return |operand|.
@@ -1617,8 +1622,11 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. 1. Let |operand| be the result of invoking the create MLOperand steps with [=this=] and |descriptor|. 1. If that throws, re-throw the exception and stop. 1. Let |bytes| be the result of invoking [[=get a copy of the bytes held by the buffer source=]] given |bufferView|. - 1. Make a request to the underlying platform to register |operand| as a tensor constant with |bytes| as value and store a reference to the corresponding [=implementation-defined=] object to |operand|.{{MLOperand/[[operand]]}}. - 1. If that fails, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Make a request to the underlying platform to: + 1. Create an [=implementation-defined=] platform operand |constant| for this method, given |operand|. + 1. Register |operand| as a tensor constant with |bytes| as value. + 1. Store a reference of |constant| in |operand|.{{MLOperand/[[operand]]}}. 1. Return |operand|.
@@ -1649,10 +1657,12 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods.
In the case of a scalar constant, |descriptor|.{{MLOperandDescriptor/dimensions}} is ignored.
- 1. Let |operand| be the result of invoking the create MLOperand steps with [=this=] and |descriptor|. - 1. If that throws, re-throw the exception and stop. - 1. Make a request to the underlying platform to register |operand| as a scalar constant with |value| as value and store a reference of the [=implementation-defined=] platform object for the corresponding (scalar or tensor constant) operand to |operand|.{{MLOperand/[[operand]]}}. - 1. If that throws, re-throw the error and stop. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |operand| be the result of invoking the create MLOperand steps with [=this=] and |descriptor|. + 1. Make a request to the underlying platform to: + 1. Create an [=implementation-defined=] platform operand |constant|, given |operand|. + 1. Register |operand| as a scalar constant with |value| as value. + 1. Store a reference of |constant| in |operand|.{{MLOperand/[[operand]]}}. 1. Return |operand|.
@@ -1725,10 +1735,13 @@ partial interface MLGraphBuilder { 1. If |input| is a 4-D tensor of the *"nchw"* layout, set |options|.axis to 1. 1. If |input| is a 4-D tensor of the *"nhwc"* layout, set |options|.axis to 3. 1. Let |result| be an {{MLOperand}} representing the results. It may use the same underlying data as |input|. - 1. Issue a request to the underlying platform to initialize the batch normalization, passing the arguments |input|, |mean|, |variance| and |options| and given |result| to store the results and |options|. Wait for completion. -
- 1. If |options|.activation [=map/exists=], implementations MAY use it to optimize the operation flow. -
+ 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Make a request to the underlying platform to initialize the batch normalization: + 1. Create an [=implementation-defined=] platform operator |batchNormImpl| for this method, given |input|, |mean|, |variance| and |options|. + 1. Register |result| as output to |batchNormImpl|. +
+ 1. If |options|.activation [=map/exists=], implementations MAY use it to optimize the operation flow. +
1. Return |result|.
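The batch normalization steps above can be exercised from script as in this non-normative sketch; `builder` is assumed to be an MLGraphBuilder and the per-channel tensors have length 3 to match the *"nchw"* input below.

    // Informal, non-normative example.
    const input = builder.input('input', {type: 'float32', dimensions: [1, 3, 224, 224]});
    const mean = builder.constant({type: 'float32', dimensions: [3]}, new Float32Array(3).fill(0));
    const variance = builder.constant({type: 'float32', dimensions: [3]}, new Float32Array(3).fill(1));
    // axis 1 selects the channel ("c") dimension of the "nchw" layout.
    const output = builder.batchNormalization(input, mean, variance, {axis: 1, epsilon: 1e-5});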
@@ -1827,12 +1840,14 @@ partial interface MLGraphBuilder { 1. Let |result| be the result of invoking the copy MLOperand steps given |operand|. 1. If that throws an error, re-throw the error and abort these steps. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Make a request to the underlying platform to create an [=implementation-defined=] platform operand |operandImpl| given |result|.{{MLOperand/[[descriptor]]}}. - 1. Store a reference to |operandImpl| in |result|.{{MLOperand/[[operand]]}}. - 1. Make a request to the underlying platform to create an [=implementation-defined=] platform operator |operatorImpl| for clamp with |options|.{{MLClampOptions/minValue}} and |options|.{{MLClampOptions/minValue}}. - 1. Register the |operand|.{{MLOperand/[[operand]]}} as an input to |operatorImpl|. - 1. Register the |result|.{{MLOperand/[[operand]]}} as output to |operatorImpl|. - 1. Store a reference to |operatorImpl| in |result|.{{MLOperand/[[operator]]}}. + 1. Make a request to the underlying platform to: + 1. Create an [=implementation-defined=] platform operand |operandImpl| given |result|.{{MLOperand/[[descriptor]]}}. + 1. Store a reference to |operandImpl| in |result|.{{MLOperand/[[operand]]}}. + 1. Make a request to the underlying platform to: + 1. Create an [=implementation-defined=] platform operator |clampImpl| for this method, given |options|.{{MLClampOptions/minValue}} and |options|.{{MLClampOptions/minValue}}. + 1. Register |operand|.{{MLOperand/[[operand]]}} as input to |clampImpl|. + 1. Register |result|.{{MLOperand/[[operand]]}} as output to |clampImpl|. + 1. Store a reference to |clampImpl| in |result|.{{MLOperand/[[operator]]}}. 1. Return |result|.
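As a non-normative illustration of the two clamp variants handled by the steps above, assuming `builder` and an MLOperand `x` from earlier examples:

    // Informal, non-normative example.
    const clamped = builder.clamp(x, {minValue: 0, maxValue: 6});  // immediate operation
    const relu6 = builder.clamp({minValue: 0, maxValue: 6});       // MLActivation for later fusion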
@@ -1856,9 +1871,6 @@ partial interface MLGraphBuilder { 1. If running the check clamp options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. 1. Let |op| be the result of invoking the create MLActivation steps with `"clamp"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. - 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Make a request to the underlying platform to connect |op| with the [=implementation-defined=] platform operator for clamp |operatorImpl|. - 1. Store a reference to |operatorImpl| in |op|.{{MLActivation/[[operator]]}}. 1. Return |op|.
@@ -1912,8 +1924,12 @@ partial interface MLGraphBuilder { 1. If |dim| is equal to |axis|, add to |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] the value of |inputs|[|index|].{{MLOperandDescriptor/dimensions}}[|dim|]. 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. 1. If that throws an error, re-throw the error and stop. - 1. Make a request to the underlying platform to create an operator for this method with |inputs| connected as input and |output| connected as output and store a reference to the [=implementation-defined=] platform object to |output|.{{MLOperand/[[operand]]}}. - 1. If that fails, throw a "{{DataError}}" {{DOMException}} and stop. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Make a request to the underlying platform to: + 1. Create an [=implementation-defined=] platform operator |concat| for this method, given |inputs| and |axis|. + 1. Register |inputs|.{{MLOperand/[[operand]]}} as an input to |concat|. + 1. Register |output|.{{MLOperand/[[operand]]}} as output to |concat|. + 1. Store a reference of |concat| in |output|.{{MLOperand/[[operator]]}}. 1. Return |output|.
From 30ba12ece0937263df81c1dd90f06214bdfbf802 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Wed, 21 Jun 2023 16:02:49 +0300 Subject: [PATCH 024/112] Adress review comments on aligning the algorithms for input, constant, clamp, concat, batch norm Signed-off-by: Zoltan Kis --- index.bs | 67 +++++++++++++++++++++++++++----------------------------- 1 file changed, 32 insertions(+), 35 deletions(-) diff --git a/index.bs b/index.bs index 9562e3c1..5a601148 100644 --- a/index.bs +++ b/index.bs @@ -1054,7 +1054,7 @@ The {{MLActivation}} objects (including the ones passed as input to methods) are
- To create MLActivation given |builder|, |name| and |options|, run the following steps: + To create MLActivation given |builder|, |name|, |options| and |init-steps|, run the following steps:
1. If |builder| is not an instance of {{MLGraphBuilder}}, throw a "{{TypeError}}" and abort these steps. @@ -1065,8 +1065,10 @@ The {{MLActivation}} objects (including the ones passed as input to methods) are 1. If |options| is an [=object=], set |activation|.{{MLActivation/[[options]]}} to |options|. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Make a request to the underlying platform to: - 1. Create an [=implementation-defined=] platform operator |opImpl| defined for |name| to |activation|.{{MLActivation/[[operator]]}}, given |options|. + 1. Create an [=implementation-defined=] platform operator |opImpl| defined for |name| to |activation|.{{MLActivation/[[operator]]}}. 1. Store a reference of |opImpl| in |activation|.{{MLActivation/[[operator]]}}. + 1. If |init-steps| are defined, run |init-steps| with |options|. + 1. Otherwise, initialize |activation|.{{MLActivation/[[operator]]}} given |options| in an [=implementation-defined=] way for the given |name| operation. 1. Return |activation|.
@@ -1582,14 +1584,13 @@ Create a named {{MLOperand}} based on a descriptor, that can be used as an input 1. If |descriptor|.{{MLOperandDescriptor/dimensions}} [=map/exists=]: 1. If the [=check dimensions=] steps given |descriptor|.{{MLOperandDescriptor/type}} and |descriptor|.{{MLOperandDescriptor/dimensions}} return `false`, throw a "{{DataError}}" {{DOMException}} and stop. 1. If the [=byte length=] of |descriptor| is not supported by the underlying platform, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. Let |operand| be the result of invoking the create MLOperand steps with [=this=] and |descriptor|. - 1. If that throws, re-throw the exception and stop. - 1. Set |operand|.{{MLOperand/[[name]]}} to |name|. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |operand| be the result of invoking the create MLOperand steps with [=this=] and |descriptor|. + 1. Set |operand|.{{MLOperand/[[name]]}} to |name|. 1. Make a request to the underlying platform to: - 1. Create an [=implementation-defined=] platform input operand |operandImpl| as an input given |operand|. + 1. Create an [=implementation-defined=] platform input operand |operandImpl| given |descriptor|. + 1. Store a reference of |operandImpl| in |operand|.{{MLOperand/[[operand]]}}. 1. Register |operand| as an input. - 1. Store a reference to |operandImpl| in |operand|.{{MLOperand/[[operand]]}}. 1. Return |operand|. @@ -1619,14 +1620,13 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. 1. If the [=check dimensions=] steps given |descriptor|.{{MLOperandDescriptor/type}} and |descriptor|.{{MLOperandDescriptor/dimensions}} return `false`, throw a "{{DataError}}" {{DOMException}} and stop. 1. Let |bufferView| be the second argument. 1. If invoking validate buffer with descriptor given |bufferView| and |descriptor| return `false`, then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. Let |operand| be the result of invoking the create MLOperand steps with [=this=] and |descriptor|. - 1. If that throws, re-throw the exception and stop. - 1. Let |bytes| be the result of invoking [[=get a copy of the bytes held by the buffer source=]] given |bufferView|. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |operand| be the result of invoking the create MLOperand steps with [=this=] and |descriptor|. + 1. Let |bytes| be the result of invoking [[=get a copy of the bytes held by the buffer source=]] given |bufferView|. 1. Make a request to the underlying platform to: - 1. Create an [=implementation-defined=] platform operand |constant| for this method, given |operand|. + 1. Create an [=implementation-defined=] platform operand |constantImpl|, given |descriptor|. + 1. Store a reference of |constantImpl| in |operand|.{{MLOperand/[[operand]]}}. 1. Register |operand| as a tensor constant with |bytes| as value. - 1. Store a reference of |constant| in |operand|.{{MLOperand/[[operand]]}}. 1. Return |operand|. @@ -1660,9 +1660,9 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |operand| be the result of invoking the create MLOperand steps with [=this=] and |descriptor|. 1. Make a request to the underlying platform to: - 1. Create an [=implementation-defined=] platform operand |constant|, given |operand|. + 1. 
Create an [=implementation-defined=] platform operand |constantImpl|, given |descriptor|. + 1. Store a reference of |constantImpl| in |operand|.{{MLOperand/[[operand]]}}. 1. Register |operand| as a scalar constant with |value| as value. - 1. Store a reference of |constant| in |operand|.{{MLOperand/[[operand]]}}. 1. Return |operand|. @@ -1734,15 +1734,13 @@ partial interface MLGraphBuilder { 1. If |options|.axis is not a number between 0 and the rank of |input|, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. 1. If |input| is a 4-D tensor of the *"nchw"* layout, set |options|.axis to 1. 1. If |input| is a 4-D tensor of the *"nhwc"* layout, set |options|.axis to 3. - 1. Let |result| be an {{MLOperand}} representing the results. It may use the same underlying data as |input|. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the create MLOperand steps with [=this=] and |descriptor|, that may use the same underlying data as |input|. 1. Make a request to the underlying platform to initialize the batch normalization: 1. Create an [=implementation-defined=] platform operator |batchNormImpl| for this method, given |input|, |mean|, |variance| and |options|. - 1. Register |result| as output to |batchNormImpl|. -
- 1. If |options|.activation [=map/exists=], implementations MAY use it to optimize the operation flow. -
- 1. Return |result|. + 1. If |options|.activation [=map/exists=],register it as activation to |batchNormImpl|. + 1. Connect |output| as output to |batchNormImpl|. + 1. Return |output|. @@ -1837,18 +1835,16 @@ partial interface MLGraphBuilder { 1. Let |operand| be the first argument. 1. Let |options| be the second argument. 1. If running the check clamp options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. - 1. Let |result| be the result of invoking the copy MLOperand steps given |operand|. - 1. If that throws an error, re-throw the error and abort these steps. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Make a request to the underlying platform to: - 1. Create an [=implementation-defined=] platform operand |operandImpl| given |result|.{{MLOperand/[[descriptor]]}}. - 1. Store a reference to |operandImpl| in |result|.{{MLOperand/[[operand]]}}. + 1. Let |output| be the result of invoking the copy MLOperand steps given |operand|. 1. Make a request to the underlying platform to: 1. Create an [=implementation-defined=] platform operator |clampImpl| for this method, given |options|.{{MLClampOptions/minValue}} and |options|.{{MLClampOptions/minValue}}. - 1. Register |operand|.{{MLOperand/[[operand]]}} as input to |clampImpl|. - 1. Register |result|.{{MLOperand/[[operand]]}} as output to |clampImpl|. - 1. Store a reference to |clampImpl| in |result|.{{MLOperand/[[operator]]}}. - 1. Return |result|. + 1. Store a reference of |clampImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| given |output| and |clampImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |operand|.{{MLOperand/[[operand]]}} as input to |clampImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |clampImpl|. + 1. Return |output|. @@ -1922,14 +1918,15 @@ partial interface MLGraphBuilder { 1. If |dim| is not equal to |axis| and if |inputs|[|index|].{{MLOperandDescriptor/dimensions}}[|dim|] is not equal to |inputs|[0].{{MLOperandDescriptor/dimensions}}[|dim|], fail. 1. If |inputs|[|dim|].{{MLOperandDescriptor/type}} is not equal to |inputs|[0].{{MLOperandDescriptor/type}}. 1. If |dim| is equal to |axis|, add to |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] the value of |inputs|[|index|].{{MLOperandDescriptor/dimensions}}[|dim|]. - 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. - 1. If that throws an error, re-throw the error and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. 1. Make a request to the underlying platform to: - 1. Create an [=implementation-defined=] platform operator |concat| for this method, given |inputs| and |axis|. - 1. Register |inputs|.{{MLOperand/[[operand]]}} as an input to |concat|. - 1. Register |output|.{{MLOperand/[[operand]]}} as output to |concat|. - 1. Store a reference of |concat| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operator |concatImpl| for this method, given |inputs| and |axis|. + 1. Store a reference of |concatImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| given |output| and |concatImpl|. + 1. 
Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |inputs| as input to |concatImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |concatImpl|. 1. Return |output|. From 963c9a0bea08885df6dfdfe06a8dfab365d1fa23 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Wed, 21 Jun 2023 21:05:15 +0300 Subject: [PATCH 025/112] ALign steps for creating platform objects for operands Signed-off-by: Zoltan Kis --- index.bs | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/index.bs b/index.bs index 5a601148..96cd6729 100644 --- a/index.bs +++ b/index.bs @@ -1065,7 +1065,7 @@ The {{MLActivation}} objects (including the ones passed as input to methods) are 1. If |options| is an [=object=], set |activation|.{{MLActivation/[[options]]}} to |options|. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Make a request to the underlying platform to: - 1. Create an [=implementation-defined=] platform operator |opImpl| defined for |name| to |activation|.{{MLActivation/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operator |opImpl| for the given |name| operation. 1. Store a reference of |opImpl| in |activation|.{{MLActivation/[[operator]]}}. 1. If |init-steps| are defined, run |init-steps| with |options|. 1. Otherwise, initialize |activation|.{{MLActivation/[[operator]]}} given |options| in an [=implementation-defined=] way for the given |name| operation. @@ -1624,7 +1624,7 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. 1. Let |operand| be the result of invoking the create MLOperand steps with [=this=] and |descriptor|. 1. Let |bytes| be the result of invoking [[=get a copy of the bytes held by the buffer source=]] given |bufferView|. 1. Make a request to the underlying platform to: - 1. Create an [=implementation-defined=] platform operand |constantImpl|, given |descriptor|. + 1. Create an [=implementation-defined=] platform operand |constantImpl| to represent a constant, given |descriptor|. 1. Store a reference of |constantImpl| in |operand|.{{MLOperand/[[operand]]}}. 1. Register |operand| as a tensor constant with |bytes| as value. 1. Return |operand|. @@ -1660,7 +1660,7 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |operand| be the result of invoking the create MLOperand steps with [=this=] and |descriptor|. 1. Make a request to the underlying platform to: - 1. Create an [=implementation-defined=] platform operand |constantImpl|, given |descriptor|. + 1. Create an [=implementation-defined=] platform operand |constantImpl| to represent a constant, given |descriptor|. 1. Store a reference of |constantImpl| in |operand|.{{MLOperand/[[operand]]}}. 1. Register |operand| as a scalar constant with |value| as value. 1. Return |operand|. @@ -1840,7 +1840,7 @@ partial interface MLGraphBuilder { 1. Make a request to the underlying platform to: 1. Create an [=implementation-defined=] platform operator |clampImpl| for this method, given |options|.{{MLClampOptions/minValue}} and |options|.{{MLClampOptions/minValue}}. 1. Store a reference of |clampImpl| in |output|.{{MLOperand/[[operator]]}}. - 1. Create an [=implementation-defined=] platform operand |outputImpl| given |output| and |clampImpl|. + 1. 
Create an [=implementation-defined=] platform operand |outputImpl| to represent clamp output, given |output| and |clampImpl|. 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. 1. Connect |operand|.{{MLOperand/[[operand]]}} as input to |clampImpl|. 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |clampImpl|. @@ -1923,7 +1923,7 @@ partial interface MLGraphBuilder { 1. Make a request to the underlying platform to: 1. Create an [=implementation-defined=] platform operator |concatImpl| for this method, given |inputs| and |axis|. 1. Store a reference of |concatImpl| in |output|.{{MLOperand/[[operator]]}}. - 1. Create an [=implementation-defined=] platform operand |outputImpl| given |output| and |concatImpl|. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent output,given |output| and |concatImpl|. 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. 1. Connect |inputs| as input to |concatImpl|. 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |concatImpl|. From 6c6e033c51de2553fa7d256672debbd09d7a20ba Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Wed, 21 Jun 2023 22:05:54 +0300 Subject: [PATCH 026/112] Add the build() and buildSync() algorithms Signed-off-by: Zoltan Kis --- index.bs | 73 ++++++++++++++++++++++++++++++++++++++++++++++++++++++-- 1 file changed, 71 insertions(+), 2 deletions(-) diff --git a/index.bs b/index.bs index 96cd6729..ab647b5d 100644 --- a/index.bs +++ b/index.bs @@ -1538,12 +1538,29 @@ interface MLGraphBuilder { Both {{MLGraphBuilder}}.{{MLGraphBuilder/build()}} and {{MLGraphBuilder}}.{{MLGraphBuilder/buildSync()}} methods compile the graph builder state up to the specified output operands into a compiled graph according to the type of {{MLContext}} that creates it. Since this operation can be costly in some machine configurations, the calling thread of the {{MLGraphBuilder}}.{{MLGraphBuilder/buildSync()}} method must only be a worker thread to avoid potential disruption of the user experience. When the {{[[contextType]]}} of the {{MLContext}} is set to [=default-context|default=], the compiled graph is initialized right before the {{MLGraph}} is returned. This graph initialization stage is important for optimal performance of the subsequent graph executions. See [[#api-mlcommandencoder-graph-initialization]] for more detail. +{{MLBufferResourceView}} has the following members: +
+ : resource + :: + A {{GPUBuffer}} object. Specifies the GPU buffer source. + + : offset + :: + Specifies an {{unsigned long long}} offset in the buffer source. + + : size + :: + Specifies the {{unsigned long long}} size of the buffer view. +
+ +
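As a non-normative illustration, an MLBufferResourceView is a plain dictionary referencing a WebGPU buffer; the `gpuBuffer` below is assumed to have been created through WebGPU elsewhere.

    // Informal, non-normative example.
    const view = {resource: gpuBuffer, offset: 0, size: 4 * 1024};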
{{MLGraphBuilder}} has the following internal slots: -
+
: \[[context]] of type {{MLContext}} :: The context of type {{MLContext}} associated with this {{MLGraphBuilder}}. -
+
+
### The {{MLGraphBuilder}} constructor ### {#api-mlgraphbuilder-constructor}
@@ -1595,6 +1612,58 @@ Create a named {{MLOperand}} based on a descriptor, that can be used as an input
+### The build() method ### {#api-mlgraphbuilder-build} +Build a composed graph up to a given output operand into a computational graph, asynchronously or synchronously. + +#### The {{MLGraphBuilder/build(outputs)}} method #### {#api-mlgraphbuilder-build-outputs} +
+ + The {{MLGraphBuilder/build(outputs)}} steps are: + +
+
+ The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps. +
+  1. Let |promise| be [=a new promise=].
+  1. Return |promise| and run the following steps [=in parallel=].
+    1. Let |graph| be the result of invoking {{MLGraphBuilder/buildSync(outputs)}} given |outputs|.
+    1. If that throws an error, reject |promise| with that error and stop.
+    1. Resolve |promise| with |graph|.
+
+
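A non-normative sketch of the asynchronous variant, assuming `builder` and an output MLOperand `output` were created earlier:

    // Informal, non-normative example.
    const graph = await builder.build({'output': output});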
+ +#### The {{MLGraphBuilder/buildSync(outputs)}} method #### {#api-mlgraphbuilder-buildsync-outputs} +
+ + The {{MLGraphBuilder/buildSync(outputs)}} steps are: + +
+
+ The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps. +
+ 1. If |outputs| is not an instance of {{MLNamedOperands}}, then throw an "{{TypeError}}" {{DOMException}} and stop. + 1. For each |element| in |outputs|: + 1. If |element|.key is not a [=string=], then throw an "{{TypeError}}" {{DOMException}} and stop. + 1. If |element|.value is not an instance of {{MLOperand}}, then throw an "{{TypeError}}" {{DOMException}} and stop. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |graph| be a new {{MLGraph}}: + 1. Set |graph|.{{MLGraph/[[context]]}} to [=this=].{{MLGraphBuilder/[[context]]}}. + 1. Set |graph|.{{MLGraph/[[outputDescriptors]]}} to |outputs|. + 1. Make a request to the underlying platform to: + 1. Connect |graph| to a new [=implementation-defined=] graph implementation |graphImpl| given |graph|. + 1. Store a reference to |graphImpl| in |graph|.{{MLGraph/[[implementation]]}}. + 1. Make a request to the underlying platform to initialize the graph: + 1. For each |operand| in |outputs|: + 1. If |operand| was created as an input by the underlying platform: + 1. Add |operand| to |graph|.{{MLGraph/[[inputDescriptors]]}}. + 1. Initialize the weights of |operand|. + 1. If |operand| was created as a constant by the underlying platform: + 1. Preprocess and optimize the tensor data of |operand|. + 1. Update |graphImpl| with |operand|.{{MLOperand/[[operand]]}}. + 1. Update |graphImpl| with |operand|.{{MLOperand/[[operator]]}}. + 1. Return |graph|. +
+
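The synchronous variant behaves the same way but may only be called from a worker thread, for example:

    // Informal, non-normative example (worker context only).
    const graph = builder.buildSync({'output': output});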
+ ### The constant() method ### {#api-mlgraphbuilder-constant-method} Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. From 567396bc1890c663f8fb87f2caf59a65126ae1d5 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Wed, 21 Jun 2023 22:39:48 +0300 Subject: [PATCH 027/112] Add the conv2d() and convTranspose2d() algorithms Signed-off-by: Zoltan Kis --- index.bs | 304 +++++++++++++++++++++++++++++++++++++++++++------------ 1 file changed, 238 insertions(+), 66 deletions(-) diff --git a/index.bs b/index.bs index 96cd6729..faf6b015 100644 --- a/index.bs +++ b/index.bs @@ -1963,55 +1963,132 @@ partial interface MLGraphBuilder { MLOperand conv2d(MLOperand input, MLOperand filter, optional MLConv2dOptions options = {}); }; -
- **Arguments:** - - *input*: an {{MLOperand}}. The input 4-D tensor. The logical shape - is interpreted according to the value of *options.inputLayout*. - - *filter*: an {{MLOperand}}. The filter 4-D tensor. The logical shape is - interpreted according to the value of *options.filterLayout* and *options.groups*. - - *options*: an optional {{MLConv2dOptions}}. The optional parameters of the operation. - - *padding*: a sequence of {{unsigned long}} of length 4. The additional rows and columns added to the beginning and ending of each spatial dimension of *input*, [beginning_height, ending_height, beginning_width, ending_width]. If not present, the values are assumed to be [0,0,0,0]. - - *strides*: a sequence of {{unsigned long}} of length 2. The stride of the sliding window for each spatial dimension of *input*, [stride_height, stride_width]. If not present, the values are assumed to be [1,1]. - - *dilations*: a sequence of {{unsigned long}} of length 2. The dilation factor for each spatial dimension of *input*, [dilation_height, dilation_width]. If not present, the values are assumed to be [1,1]. - - *autoPad*: an {{MLAutoPad}}. The automatic input padding options. By default, this argument is set to *"explicit"*, which means that the values in the *options.padding* array should be used for input padding. When the option is set other than *"explicit"*, the values in the *options.padding* array are ignored. With the *"same-upper"* option, the padding values are automatically computed such that the additional ending padding of the spatial input dimensions would allow all of the input values in the corresponding dimension to be filtered. The *"same-lower"* option is similar but padding is applied to the beginning padding of the spatial input dimensions instead of the ending one. - - *groups*: an {{unsigned long}} scalar. The number of groups that input channels and output channels are divided into, default to 1. - - *inputLayout*: an {{MLInputOperandLayout}}. The default value is *"nchw"*. This option specifies the layout format of the input and output tensor as follow: - "nchw": - - input tensor: [batches, input_channels, height, width] - - output tensor: [batches, output_channels, height, width] +{{MLConv2dOptions}} has the following members: +
+ : padding + :: + A sequence of {{unsigned long}} of length 4: [beginning_height, ending_height, beginning_width, ending_width]. + Specifies the additional rows and columns added to the beginning and ending of each spatial dimension of the convolution input. + The default value is [0, 0, 0, 0]. - "nhwc": - - input tensor: [batches, height, width, input_channels] - - output tensor: [batches, height, width, output_channels] + : strides + :: + A sequence of {{unsigned long}} of length 2: [stride_height, stride_width]. + Specifies the stride of the sliding window for each spatial dimension of the convolution input. + The default value is [1, 1]. - - *filterLayout*: an {{MLConv2dFilterOperandLayout}}. The default value is *"oihw"*. This option specifies the layout format of the filter tensor as follow: + : dilations + :: + A sequence of {{unsigned long}} of length 2: [dilation_height, dilation_width]. Specifies the dilation factor for each spatial dimension applied on the convolution filter (kernel). + The default value is [1, 1]. - "oihw": - - [output_channels, input_channels/groups, height, width] + : autoPad + :: + An {{MLAutoPad}} [=string=]. + Specifies the automatic input padding options. + The default value is *"explicit"*, which means that the values in the {{MLConv2dOptions/padding}} array should be used for input padding. + When the option is set other than *"explicit"*, the values in the {{MLConv2dOptions/padding}} array are ignored. - "hwio": - - [height, width, input_channels/groups, output_channels] + With the *"same-upper"* option, the padding values are automatically computed such that the additional ending padding of the spatial input dimensions would allow all of the input values in the corresponding dimension to be filtered. - "ohwi": - - [output_channels, height, width, input_channels/groups] + The *"same-lower"* option is similar but padding is applied to the beginning padding of the spatial input dimensions instead of the ending one. - "ihwo": - - [input_channels/groups, height, width, output_channels] + : groups + :: + An {{unsigned long}} scalar. + Specifies the number of groups that input channels and output channels are divided into. + The default value is 1. - - *bias*: an {{MLOperand}}. The additional 1-D tensor with the shape of [output_channels] whose values are to be added to the convolution result. - - *activation*: an {{MLActivation}}. The optional activation function that immediately follows the convolution operation. + : inputLayout + :: + An {{MLInputOperandLayout}} [=string=]. + Specifies the layout format of the input and output tensor as follows: + - **"nchw"** + - input tensor: *[batches, input_channels, height, width]* + - output tensor: *[batches, output_channels, height, width]* + - **"nhwc"**: + - input tensor: *[batches, height, width, input_channels]* + - output tensor: *[batches, height, width, output_channels]* + The default value is *"nchw"*. + + : filterLayout + :: + An {{MLConv2dFilterOperandLayout}} [=string=]. + Specifies the layout format of the filter tensor as follow: + - **"oihw"**: *[output_channels, input_channels/groups, height, width]* + - **"hwio"**: *[height, width, input_channels/groups, output_channels]* + - **"ohwi"**: *[output_channels, height, width, input_channels/groups]* + - **"ihwo"**: *[input_channels/groups, height, width, output_channels]* + The default value is *"oihw"*. - **Returns:** an {{MLOperand}}. The output 4-D tensor that contains the convolution result. 
The output shape is interpreted according to the *options.inputLayout* value. More specifically, the spatial dimensions or the sizes of the last two dimensions of the output tensor for the *nchw* input layout can be calculated as follow: + : bias + :: + An {{MLOperand}} object. + Specifies the additional 1-D tensor with the shape of *[output_channels]* whose values are to be added to the convolution result. - *output size = 1 + (input size - (filter size - 1) ** *dilation - 1 + beginning padding + ending padding) / stride* + : activation + :: + An {{MLActivation}} object. + Specifies the optional activation function that immediately follows the convolution operation. +
-
+
+ **Arguments:** + - *input*: an {{MLOperand}}. The input 4-D tensor. The logical shape + is interpreted according to the value of *options*.{{MLConv2dOptions/inputLayout}}. + - *filter*: an {{MLOperand}}. The filter 4-D tensor. The logical shape is + interpreted according to the value of *options*.{{MLConv2dOptions/filterLayout}} and *options*.{{MLConv2dOptions/groups}}. + - *options*: an {{MLConv2dOptions}}. The optional parameters of the operation. + + **Returns:** an {{MLOperand}}. The output 4-D tensor that contains the convolution result. The output shape is interpreted according to the *options*.{{MLConv2dOptions/inputLayout}} value. More specifically, the spatial dimensions or the sizes of the last two dimensions of the output tensor for the *nchw* input layout can be calculated as follow: + + *output_size = 1 + (input_size - (filter_size - 1) ** *dilation - 1 + beginning_padding + ending_padding) / stride* +
+ +
A *depthwise* conv2d operation is a variant of grouped convolution, used in models like the MobileNet, where the *options.groups* = input_channels = output_channels and the shape of filter tensor is [options.groups, 1, height, width] for *"oihw"* layout, [height, width, 1, options.groups] for *"hwio"* layout, [options.groups, height, width, 1] for *"ohwi"* layout and [1, height, width, options.groups] for *"ihwo"* layout. -
+
+ + The {{MLGraphBuilder/conv2d(input, filter, options)}} steps are: + +
+  1. If |input| or |filter| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+  1. Let |input_size| be the size of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
+  1. Let |filter_size| be the size of |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
+  1. If |input_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop.
+  1. If |filter_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop.
+  1. If |options| is `undefined`, let |options| be an empty [=object=].
+  1. If |options|.{{MLConv2dOptions/padding}} is `undefined`, set it to `[0, 0, 0, 0]`.
+  1. If |options|.{{MLConv2dOptions/strides}} is `undefined`, set it to `[1, 1]`.
+  1. If |options|.{{MLConv2dOptions/dilations}} is `undefined`, set it to `[1, 1]`.
+  1. If |options|.{{MLConv2dOptions/autoPad}} is `undefined`, set it to `"explicit"`.
+  1. If |options|.{{MLConv2dOptions/groups}} is `undefined`, set it to `1`.
+  1. If |options|.{{MLConv2dOptions/inputLayout}} is `undefined`, set it to `"nchw"`.
+  1. If |options|.{{MLConv2dOptions/filterLayout}} is `undefined`, set it to `"oihw"`.
+  1. If |options|.{{MLConv2dOptions/bias}} [=map/exists=] and it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+  1. If |options|.{{MLConv2dOptions/activation}} [=map/exists=] and it is not an instance of {{MLActivation}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+  1. Let |output_shape| be the result of calculating output dimensions based on input, filter, dilation, padding and stride, taking into account |options|.{{MLConv2dOptions/inputLayout}}.
+  1. Let |desc| be a new {{MLOperandDescriptor}}.
+  1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}.
+  1. Set |desc|.{{MLOperandDescriptor/dimensions}} to |output_shape|.
+  1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop.
+    1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|.
+    1. Make a request to the underlying platform to:
+      1. Create an [=implementation-defined=] platform operator |conv2dImpl| for this method, given |options| and |filter|.
+      1. If |options|.{{MLConv2dOptions/activation}} [=map/exists=], register it as activation to |conv2dImpl|.
+      1. Store a reference of |conv2dImpl| in |output|.{{MLOperand/[[operator]]}}.
+      1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |conv2dImpl|.
+      1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}.
+      1. Connect |input|.{{MLOperand/[[operand]]}} as input to |conv2dImpl|.
+      1. Connect |output|.{{MLOperand/[[operand]]}} as output to |conv2dImpl|.
+  1. Return |output|.
+
+
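A non-normative sketch of a conv2d call; `builder` is assumed to be an MLGraphBuilder and `filterData` a Float32Array of the right length created elsewhere. With a 3x3 filter, stride 2 and padding 1, the formula above gives 1 + floor((224 - 2 - 1 + 1 + 1) / 2) = 112 for each spatial dimension.

    // Informal, non-normative example ("nchw" input, "oihw" filter).
    const input = builder.input('input', {type: 'float32', dimensions: [1, 3, 224, 224]});
    const filter = builder.constant({type: 'float32', dimensions: [32, 3, 3, 3]}, filterData);
    // The output shape is [1, 32, 112, 112].
    const output = builder.conv2d(input, filter, {padding: [1, 1, 1, 1], strides: [2, 2]});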
+ ### The convTranspose2d() method ### {#api-mlgraphbuilder-convtranspose2d} Compute a 2-D transposed convolution given 4-D input and filter tensors -
- **Arguments:** - - *input*: an {{MLOperand}}. The input 4-D tensor. The logical shape - is interpreted according to the value of *options.inputLayout*. - - *filter*: an {{MLOperand}}. The filter 4-D tensor. The logical shape is - interpreted according to the value of *options.filterLayout* and *options.groups*. - - *options*: an optional {{MLConvTranspose2dOptions}}. The optional parameters of the operation. - - *padding*: a sequence of {{unsigned long}} of length 4. The additional rows and columns added to the beginning and ending of each spatial dimension of *input*, [beginning_height, ending_height, beginning_width, ending_width]. If not present, the values are assumed to be [0,0,0,0]. - - *strides*: a sequence of {{unsigned long}} of length 2. The stride of the sliding window for each spatial dimension of *input*, [stride_height, stride_width]. If not present, the values are assumed to be [1,1]. - - *dilations*: a sequence of {{unsigned long}} of length 2. The dilation factor for each spatial dimension of *input*, [dilation_height, dilation_width]. If not present, the values are assumed to be [1,1]. - - *outputPadding*: a sequence of {{unsigned long}} of length 2. The padding values applied to each spatial dimension of the output tensor. This explicit padding values are needed to disambiguate the output tensor shape for transposed convolution when the value of the *options.strides* is greater than 1. Note that these values are only used to disambiguate output shape when needed; it does not necessarily cause any padding value to be written to the output tensor. If not specified, the values are assumed to be [0,0]. - - *outputSizes*: a sequence of {{unsigned long}} of length 2. The sizes of the last two dimensions of the output tensor. When the output sizes are explicitly specified, the output padding values in *options.outputPadding* are ignored. If not specified, the output sizes are automatically computed. - - *autoPad*: an {{MLAutoPad}}. The automatic input padding options. By default, this argument is set to *"explicit"*, which means that the values in the *options.padding* array should be used for input padding. When the option is set other than *"explicit"*, the values in the *options.padding* array are ignored. With the *"same-upper"* option, the padding values are automatically computed such that the additional ending padding of the spatial input dimensions would allow all of the input values in the corresponding dimension to be filtered. The *"same-lower"* option is similar but padding is applied to the beginning padding of the spatial input dimensions instead of the ending one. - - *groups*: an {{unsigned long}} scalar. The number of groups that input channels and output channels are divided into, default to 1. - - *inputLayout*: an {{MLInputOperandLayout}}. The default value is *"nchw"*. This option specifies the layout format of the input and output tensor as follow: - "nchw": - - input tensor: [batches, input_channels, height, width] - - output tensor: [batches, output_channels, height, width] +{{MLConvTranspose2dOptions}} has the following members: +
+ : padding + :: + A sequence of {{unsigned long}} of length 4: [beginning_height, ending_height, beginning_width, ending_width]. + Specifies the additional rows and columns added to the beginning and ending of each spatial dimension of the convolution input. + The default value is [0, 0, 0, 0]. - "nhwc": - - input tensor: [batches, height, width, input_channels] - - output tensor: [batches, height, width, output_channels] + : strides + :: + A sequence of {{unsigned long}} of length 2: [stride_height, stride_width]. + Specifies the stride of the sliding window for each spatial dimension of the convolution input. + The default value is [1, 1]. + + : dilations + :: + A sequence of {{unsigned long}} of length 2: [dilation_height, dilation_width]. Specifies the dilation factor for each spatial dimension applied on the convolution filter (kernel). + The default value is [1, 1]. + + : outputPadding + :: + A sequence of {{unsigned long}} of length 2. + Specifies the padding values applied to each spatial dimension of the output tensor. The explicit padding values are needed to disambiguate the output tensor shape for transposed convolution when the value of the *options*.{{MLConvTranspose2dOptions/strides}} is greater than 1. + + Note that these values are only used to disambiguate output shape when needed; it does not necessarily cause any padding value to be written to the output tensor. - - *filterLayout*: an {{MLConvTranspose2dFilterOperandLayout}}. The default value is *"iohw"*. This option specifies the layout format of the filter tensor as follow: + The default values is [0, 0]. + + : outputSizes + :: + A sequence of {{unsigned long}} of length 2. + Specifies the sizes of the last two dimensions of the output tensor. When the output sizes are explicitly specified, the output padding values in {{MLConvTranspose2dOptions/outputPadding}} are ignored. + + If not specified, the output sizes are automatically computed. + + : autoPad + :: + An {{MLAutoPad}} [=string=]. + Specifies the automatic input padding options. + The default value is *"explicit"*, which means that the values in the {{MLConvTranspose2dOptions/padding}} array should be used for input padding. + + When the option is set other than *"explicit"*, the values in the {{MLConvTranspose2dOptions/padding}} array are ignored. + + With the *"same-upper"* option, the padding values are automatically computed such that the additional ending padding of the spatial input dimensions would allow all of the input values in the corresponding dimension to be filtered. + + The *"same-lower"* option is similar but padding is applied to the beginning padding of the spatial input dimensions instead of the ending one. + + : groups + :: + An {{unsigned long}} scalar. + Specifies the number of groups that input channels and output channels are divided into. + The default value is 1. - "iohw": - - [input_channels, output_channels/groups, height, width] + : inputLayout + :: + An {{MLInputOperandLayout}} [=string=]. + Specifies the layout format of the input and output tensor as follows: + - **"nchw"** + - input tensor: *[batches, input_channels, height, width]* + - output tensor: *[batches, output_channels, height, width]* + - **"nhwc"**: + - input tensor: *[batches, height, width, input_channels]* + - output tensor: *[batches, height, width, output_channels]* + The default value is *"nchw"*. + + : filterLayout + :: + An {{MLConvTranspose2dFilterOperandLayout}} [=string=]. 
+ Specifies the layout format of the filter tensor as follow: + - **"iohw"**: [input_channels, output_channels/groups, height, width] + - **"hwoi"**: [height, width, output_channels/groups, input_channels] + - **"ohwi"**: [output_channels/groups, height, width, input_channels] + The default value is *"iohw"*. - "hwoi": - - [height, width, output_channels/groups, input_channels] + : bias + :: + An {{MLOperand}} object. + Specifies the additional 1-D tensor with the shape of *[output_channels]* whose values are to be added to the convolution result. - "ohwi": - - [output_channels/groups, height, width, input_channels] + : activation + :: + An {{MLActivation}} object. + Specifies the optional activation function that immediately follows the convolution operation. +
- - *bias*: an {{MLOperand}}. The additional 1-D tensor with the shape of [output_channels] whose values are to be added to the transposed convolution result. - - *activation*: an {{MLActivation}}. The optional activation function that immediately follows the transposed convolution operation. +
+ **Arguments:** + - *input*: an {{MLOperand}}. The input 4-D tensor. The logical shape + is interpreted according to the value of *options*.{{MLConvTranspose2dOptions/inputLayout}}. + - *filter*: an {{MLOperand}}. The filter 4-D tensor. The logical shape is + interpreted according to the value of *options*.{{MLConvTranspose2dOptions/filterLayout}} and {{MLConvTranspose2dOptions/groups}}. + - *options*: an optional {{MLConvTranspose2dOptions}}. - **Returns:** an {{MLOperand}}. The output 4-D tensor that contains the transposed convolution result. The output shape is interpreted according to the *options.inputLayout* value. More specifically, unless the *options.outputSizes* values are explicitly specified, the *options.outputPadding* may be needed to compute the spatial dimension values of the output tensor as follow: + **Returns:** an {{MLOperand}}. The output 4-D tensor that contains the transposed convolution result. The output shape is interpreted according to the *options*.{{MLConvTranspose2dOptions/inputLayout}} value. More specifically, unless the *options*.{{MLConvTranspose2dOptions/outputSizes}} values are explicitly specified, the *options*.{{MLConvTranspose2dOptions/outputPadding}} may be needed to compute the spatial dimension values of the output tensor as follow: - *output size = (input size - 1) ** *stride + (filter size - 1) ** *dilation + 1 - beginning padding - ending padding + output padding* + *output_size = (input_size - 1) ** *stride + (filter_size - 1) ** *dilation + 1 - beginning_padding - ending_padding + output_padding*
+
+ + The {{MLGraphBuilder/convTranspose2d(input, filter, options)}} steps are: + +
+  1. If |input| or |filter| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+  1. Let |input_size| be the size of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
+  1. Let |filter_size| be the size of |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
+  1. If |input_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop.
+  1. If |filter_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop.
+  1. If |options| is `undefined`, let |options| be an empty [=object=].
+  1. If |options|.{{MLConvTranspose2dOptions/padding}} is `undefined`, set it to `[0, 0, 0, 0]`.
+  1. If |options|.{{MLConvTranspose2dOptions/strides}} is `undefined`, set it to `[1, 1]`.
+  1. If |options|.{{MLConvTranspose2dOptions/dilations}} is `undefined`, set it to `[1, 1]`.
+  1. If |options|.{{MLConvTranspose2dOptions/outputPadding}} is `undefined`, set it to `[0, 0]`.
+  1. If |options|.{{MLConvTranspose2dOptions/autoPad}} is `undefined`, set it to `"explicit"`.
+  1. If |options|.{{MLConvTranspose2dOptions/groups}} is `undefined`, set it to `1`.
+  1. If |options|.{{MLConvTranspose2dOptions/inputLayout}} is `undefined`, set it to `"nchw"`.
+  1. If |options|.{{MLConvTranspose2dOptions/filterLayout}} is `undefined`, set it to `"iohw"`.
+  1. If |options|.{{MLConvTranspose2dOptions/bias}} [=map/exists=] and it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+  1. If |options|.{{MLConvTranspose2dOptions/activation}} [=map/exists=] and it is not an instance of {{MLActivation}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+  1. Let |output_shape| be the result of calculating output dimensions based on |input|, |filter|, |options|.{{MLConvTranspose2dOptions/dilations}}, |options|.{{MLConvTranspose2dOptions/padding}} and |options|.{{MLConvTranspose2dOptions/strides}}, taking into account |options|.{{MLConvTranspose2dOptions/inputLayout}}.
+  1. Let |desc| be a new {{MLOperandDescriptor}}.
+  1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}.
+  1. Set |desc|.{{MLOperandDescriptor/dimensions}} to |output_shape|.
+  1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop.
+    1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|.
+    1. Make a request to the underlying platform to:
+      1. Create an [=implementation-defined=] platform operator |convTranspose2dImpl| for this method, given |options| and |filter|.
+      1. If |options|.{{MLConvTranspose2dOptions/activation}} [=map/exists=], register it as activation to |convTranspose2dImpl|.
+      1. Store a reference of |convTranspose2dImpl| in |output|.{{MLOperand/[[operator]]}}.
+      1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |convTranspose2dImpl|.
+      1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}.
+      1. Connect |input|.{{MLOperand/[[operand]]}} as input to |convTranspose2dImpl|.
+      1. Connect |output|.{{MLOperand/[[operand]]}} as output to |convTranspose2dImpl|.
+  1. Return |output|.
+
+
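A non-normative convTranspose2d sketch, assuming `builder` and a suitable `filterData` buffer; with stride 2 and a 2x2 filter, the formula above gives (14 - 1) * 2 + (2 - 1) * 1 + 1 = 28 for each spatial dimension.

    // Informal, non-normative example ("nchw" input, "iohw" filter).
    const input = builder.input('input', {type: 'float32', dimensions: [1, 16, 14, 14]});
    const filter = builder.constant({type: 'float32', dimensions: [16, 8, 2, 2]}, filterData);
    // The output shape is [1, 8, 28, 28].
    const output = builder.convTranspose2d(input, filter, {strides: [2, 2]});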
+ ### Element-wise binary operations ### {#api-mlgraphbuilder-binary} Compute the element-wise binary addition, subtraction, multiplication, division, maximum and minimum of the two input tensors. From 5f7a507feea4f04a1473997f1b82423fca35773d Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Wed, 21 Jun 2023 22:57:33 +0300 Subject: [PATCH 028/112] Add the element-wise binary operation algorithms Signed-off-by: Zoltan Kis --- index.bs | 99 ++++++++++++++++++++++++++++++++++++++++++++++++++++---- 1 file changed, 92 insertions(+), 7 deletions(-) diff --git a/index.bs b/index.bs index 96cd6729..4076af14 100644 --- a/index.bs +++ b/index.bs @@ -2087,6 +2087,12 @@ partial interface MLGraphBuilder { ### Element-wise binary operations ### {#api-mlgraphbuilder-binary} Compute the element-wise binary addition, subtraction, multiplication, division, maximum and minimum of the two input tensors. + +The element-wise binary operations will be broadcasted according to +[[!numpy-broadcasting-rule]]. The rank of the output tensor is the maximum +rank of the input tensors. For each dimension of the output tensor, its size +is the maximum size along that dimension of the input tensors. + -
+ +
**Arguments:** - *a*: an {{MLOperand}}. The first input tensor. - *b*: an {{MLOperand}}. The second input tensor. **Returns:** an {{MLOperand}}. The output tensor that contains the result of element-wise binary operation of the two input tensors. - - The element-wise binary operation will be broadcasted according to - [[!numpy-broadcasting-rule]]. The rank of the output tensor is the maximum - rank of the input tensors. For each dimension of the output tensor, its size - is the maximum size along that dimension of the input tensors. - +
+
**Operation types:** - *add*: Add the values of the two input tensors, element-wise. - *sub*: Subtract the values of the second input tensor from the values of the first input tensor, element-wise. @@ -2121,6 +2124,88 @@ partial interface MLGraphBuilder { - *pow*: Compute the values of the values of the first input tensor to the power of the values of the second input tensor, element-wise.
+
+ + To create element-wise binary operation given |op|, |a| and |b|, run the following steps: + +
+  1. [=Assert=]: |op| is one of "add", "sub", "mul", "div", "max", "min", "pow".
+  1. If |a| or |b| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+  1. If |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}} is not equal to |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}, then throw a "{{DataError}}" {{DOMException}} and stop.
+  1. Let |descriptor| be a new {{MLOperandDescriptor}}.
+  1. Set |descriptor|.{{MLOperandDescriptor/type}} to |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}.
+  1. Let |descriptor|.{{MLOperandDescriptor/dimensions}} be the result of running the [=MLGraphBuilder/broadcast-shapes=] steps given |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
+  1. If that throws an error, re-throw the error and stop.
+  1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop.
+    1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |descriptor|.
+    1. Make a request to the underlying platform to:
+      1. Let |opImpl| be an [=implementation-defined=] platform operator for the binary operation |op|, given |a| and |b|.
+      1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}.
+      1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|.
+      1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}.
+      1. Connect |a|.{{MLOperand/[[operand]]}} and |b|.{{MLOperand/[[operand]]}} as inputs to |opImpl|.
+      1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|.
+  1. Return |output|.
+
+
+ +
+ + To broadcast shapes given |shape1| and |shape2|, run the following steps: + +
+ 1. [=Assert=]: The type of |shape1| and |shape2| is `sequence of unsigned long`. + 1. Let |output| be the result of invoking the [=implementation-defined=] shape broadcast on |shape1| and |shape2|. + 1. If that fails, throw a "{{DataError}}" {{DOMException}} and stop. + 1. Return |output|. +
+    In the most common implementations, two shapes are compatible when each of their corresponding dimensions is either equal or one of them is 1. The output shape consists of the maximum of the corresponding dimensions.
+
+
+
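As a non-normative illustration of the common broadcasting rule described in the note above, a JavaScript sketch of such a shape computation could look like this (the function name is illustrative and not part of the API):

    // Informal, non-normative sketch of NumPy-style shape broadcasting.
    function broadcastShapes(shape1, shape2) {
      const rank = Math.max(shape1.length, shape2.length);
      const output = [];
      for (let i = 0; i < rank; ++i) {
        // Align shapes from their trailing dimensions; missing dimensions count as 1.
        const d1 = shape1[shape1.length - 1 - i] ?? 1;
        const d2 = shape2[shape2.length - 1 - i] ?? 1;
        if (d1 !== d2 && d1 !== 1 && d2 !== 1) throw new Error('shapes are not compatible');
        output.unshift(Math.max(d1, d2));
      }
      return output;
    }
    broadcastShapes([4, 256, 256], [256, 1]);  // [4, 256, 256]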
+ +
+ + The element-wise binary operation algorithms invoke the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation =] steps as follows. + +
+ The {{MLGraphBuilder/add(a, b)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation =] given "add", |a| and |b|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/sub(a, b)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation =] given "sub", |a| and |b|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/mul(a, b)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation =] given "mul", |a| and |b|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/div(a, b)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation =] given "div", |a| and |b|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/max(a, b)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation =] given "max", |a| and |b|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/min(a, b)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation =] given "min", |a| and |b|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/pow(a, b)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation =] given "pow", |a| and |b|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. +
+
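A non-normative sketch that exercises a couple of the binary operations above, assuming `builder` is an MLGraphBuilder:

    // Informal, non-normative example.
    const a = builder.input('a', {type: 'float32', dimensions: [2, 3]});
    const b = builder.input('b', {type: 'float32', dimensions: [1, 3]});
    // b is broadcast over the first dimension of a; the output shape is [2, 3].
    const sum = builder.add(a, b);
    const halved = builder.mul(sum, builder.constant(0.5));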
+ ### Element-wise unary operations ### {#api-mlgraphbuilder-unary} Compute the element-wise unary operation for input tensor. -
+ +
**Arguments:** - - *x*: an {{MLOperand}}. The input tensor. + - *input*: an {{MLOperand}}. The input tensor. **Returns:** an {{MLOperand}}. The output tensor that contains the result of element-wise unary operation of the input tensor. The shape of the output tensor is the same as the shape of input tensor. - +
+
**Operation types:** - *abs*: Compute the absolute value of the input tensor, element-wise. - *ceil*: Compute the ceiling of the input tensor, element-wise. @@ -2156,6 +2158,80 @@ partial interface MLGraphBuilder { - *tan*: Compute the tangent of the input tensor, element-wise.
+
+ + To create element-wise unary operation given |op| and |input|, run the following steps: + +
+ 1. [=Assert=]: |op| is one of "abs", "ceil", "cos", "exp", "floor", "log", "neg", "sin", "tan". + 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. Let |kind| be `"output"`. + 1. Let |descriptor| be a new {{MLOperandDescriptor}}. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the unary operation |op|. + 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. + 1. Return |output|. +
+
+ +
+ + The element-wise unary operation algorithms invoke the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation =] steps as follows. + +
+ The {{MLGraphBuilder/abs(input)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation =] given "abs" and |input|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/ceil(input)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation =] given "ceil" and |input|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/cos(input)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation =] given "cos" and |input|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/exp(input)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation =] given "exp" and |input|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/floor(input)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation =] given "floor" and |input|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/log(input)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation =] given "log" and |input|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/neg(input)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation =] given "neg" and |input|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/sin(input)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation =] given "sin" and |input|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/tan(input)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation =] given "tan" and |input|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. +
+
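+
+ The following is a non-normative usage example. It assumes an {{MLGraphBuilder}} instance `builder` and an {{MLOperand}} `x` created elsewhere; every element-wise unary operation follows the same single-argument pattern:
+
+    // Each call returns a new MLOperand of the same shape as x.
+    const absX = builder.abs(x);
+    const expX = builder.exp(x);
+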
+ ### The elu() method ### {#api-mlgraphbuilder-elu} Calculate the exponential linear unit function on the input tensor element-wise. The calculation follows the expression `max(0, x) + alpha * (exp(min(0, x)) - 1)`. -
- **Arguments:** - - *x*: an {{MLOperand}}. The input tensor. - - *options*: an optional {{MLEluOptions}}. The optional parameters of the operation. - - *alpha*: a {{float}} scalar multiplier, default to 1. - **Returns:** - - an {{MLOperand}}. The output tensor of the same shape as *x*. - - an {{MLActivation}}. The activation function representing the elu operation. - -
+
The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the @@ -2192,9 +2184,75 @@ partial interface MLGraphBuilder { builder.exp(builder.min(builder.constant(0), x)), builder.constant(1)))); -
+
+ + To check ELU options given |options|, run the following steps: + +
+ 1. If |options| is not an [=object=] that [=implements=] {{MLEluOptions}}, then return `false`.
+ 1. If |options|.{{MLEluOptions/alpha}} is `undefined`, set |options|.{{MLEluOptions/alpha}} to `1`.
+ 1. Else if |options|.{{MLEluOptions/alpha}} is not a [=numeric type=], then return `false`.
+ 1. Return `true`.
+
+ +#### The {{MLGraphBuilder/elu(input, options)}} method #### {#api-mlgraphbuilder-elu-input-options} +
+ **Arguments:**
+ - *input*: an {{MLOperand}}. The input tensor.
+ - *options*: an optional {{MLEluOptions}}. The optional parameters of the operation.
+ - *alpha*: a {{float}} scalar multiplier, default to 1.
+
+ **Returns:**
+ - an {{MLOperand}}. The output tensor of the same shape as *input*.
+ +
+ + The {{MLGraphBuilder/elu(input, options)}} method steps are: + +
+ 1. Let |input| be the first argument. + 1. Let |options| be the second argument. + 1. If running the check ELU options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the ELU operation, given |options|. + 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. + 1. Return |output|. +
+
+ +#### The {{MLGraphBuilder/elu(options)}} method #### {#api-mlgraphbuilder-elu-options} +
+ **Arguments:** + - *options*: an optional {{MLEluOptions}}. The optional parameters of the operation. + - *alpha*: a {{float}} scalar multiplier, default to 1. + + **Returns:** + - an {{MLActivation}}. The activation function representing the elu operation. +
+ +
+ + The {{MLGraphBuilder/elu(options)}} method steps are: + +
+ 1. Let |options| be the first argument. + 1. If |options| is `undefined`, let |options| be a new {{MLEluOptions}} object. + 1. If running the check ELU options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. Let |op| be the result of invoking the create MLActivation steps with `"elu"` and |options|. + 1. Return |op|. +
+
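+
+ A non-normative usage example of both forms of elu(), assuming an {{MLGraphBuilder}} instance `builder` and an {{MLOperand}} `x`; the alpha value is illustrative:
+
+    // elu() as a regular operation on an existing operand.
+    const y = builder.elu(x, { alpha: 0.5 });
+
+    // elu() as an MLActivation, for example to be fused with another operation.
+    const eluActivation = builder.elu({ alpha: 0.5 });
+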
+ ### The gemm() method ### {#api-mlgraphbuilder-gemm} Calculate the [general matrix multiplication of the Basic Linear Algebra Subprograms](https://en.wikipedia.org/wiki/Basic_Linear_Algebra_Subprograms#Level_3). The calculation follows the expression `alpha * A * B + beta * C`, where `A` is a 2-D tensor with shape [M, K] or [K, M], `B` is a 2-D tensor with shape [K, N] or [N, K], and `C` is broadcastable to the shape [M, N]. `A` and `B` may optionally be transposed prior to the calculation. -
+ +{{MLGemmOptions}} has the following members: +
+ : c
+ ::
+ An {{MLOperand}}. Specifies the third input tensor. It is either a scalar, or of the shape that is unidirectionally broadcastable to the shape [M, N] according to [[!numpy-broadcasting-rule]]. When it is not specified, the computation is done as if *c* is a scalar `0.0`.
+
+ : alpha
+ ::
+ A {{float}} scalar multiplier for the first input. The default value is `1.0`.
+
+ : beta
+ ::
+ A {{float}} scalar multiplier for the third input {{MLGemmOptions/c}}. The default value is `1.0`.
+
+ : aTranspose
+ ::
+ A {{boolean}} indicating if the first input should be transposed prior to calculating the output. The default value is `false`.
+
+ : bTranspose
+ ::
+ A {{boolean}} indicating if the second input should be transposed prior to calculating the output. The default value is `false`.
+ +
**Arguments:** - *a*: an {{MLOperand}}. The first input 2-D tensor with shape [M, K] if *aTranspose* is false, or [K, M] if *aTranspose* is true. - *b*: an {{MLOperand}}. The second input 2-D tensor with shape [K, N] if *bTranspose* is false, or [N, K] if *bTranspose* is true. - *options*: an optional {{MLGemmOptions}}. The optional parameters of the operation. - - *c*: an {{MLOperand}}. The third input tensor. It is either a scalar, or of the shape that is unidirectionally broadcastable to the shape [M, N] according to [[!numpy-broadcasting-rule]]. When it is not specified, the computation is done as if *c* is a scalar 0.0. - - *alpha*: a {{float}} scalar multiplier for the first input, default to 1.0. - - *beta*: a {{float}} scalar multiplier for the third input, default to 1.0. - - *aTranspose*: a {{boolean}} indicating if the first input should be transposed prior to calculating the output, default to false. - - *bTranspose*: a {{boolean}} indicating if the second input should be transposed prior to calculating the output, default to false. **Returns:** an {{MLOperand}}. The output 2-D tensor of shape [M, N] that contains the calculated product of all the inputs. +
-
+
+ + The {{MLGraphBuilder/gemm(a, b, options)}} steps are: + +
+ 1. If |a| or |b| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+ 1. If |options| is `undefined`, let |options| be an empty [=object=].
+ 1. If |options|.{{MLGemmOptions/alpha}} is `undefined`, set it to `1.0`.
+ 1. If |options|.{{MLGemmOptions/beta}} is `undefined`, set it to `1.0`.
+ 1. If |options|.{{MLGemmOptions/aTranspose}} is `undefined`, set it to `false`.
+ 1. If |options|.{{MLGemmOptions/aTranspose}} is not `false`, set it to `true`.
+ 1. If |options|.{{MLGemmOptions/bTranspose}} is `undefined`, set it to `false`.
+ 1. If |options|.{{MLGemmOptions/bTranspose}} is not `false`, set it to `true`.
+ 1. Let |shapeA| be |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeA| the size of |shapeA|.
+ 1. Let |shapeB| be |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeB| the size of |shapeB|.
+ 1. If |sizeA| is not `2` or |sizeB| is not `2`, then throw a "{{DataError}}" {{DOMException}} and stop.
+ 1. If |options|.{{MLGemmOptions/aTranspose}} is `true`, then let |shapeA| be the reverse array of |shapeA|.
+ 1. If |options|.{{MLGemmOptions/bTranspose}} is `true`, then let |shapeB| be the reverse array of |shapeB|.
+ 1. If |shapeA|[1] is not equal to |shapeB|[0], then throw a "{{DataError}}" {{DOMException}} and stop.
+ 1. If |options|.{{MLGemmOptions/c}} [=map/exists=] and is not unidirectionally broadcastable to the shape [|shapeA|[0], |shapeB|[1]] according to the [[!numpy-broadcasting-rule]], then throw a "{{DataError}}" {{DOMException}} and stop.
+ Type compatibility between |a|, |b| and |options|.{{MLGemmOptions/c}} can be also checked. +
+ 1. Let |desc| be a new {{MLOperandDescriptor}}.
+ 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [|shapeA|[0], |shapeB|[1]].
+ 1. Set |desc|.{{MLOperandDescriptor/type}} to |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}.
+ 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop.
+ 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|.
+ 1. Make a request to the underlying platform to:
+ 1. Let |opImpl| be an [=implementation-defined=] platform operator for the GEMM operation, given |options|.
+ 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}.
+ 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|.
+ 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}.
+ 1. Connect |a|.{{MLOperand/[[operand]]}} and |b|.{{MLOperand/[[operand]]}} as inputs to |opImpl|.
+ 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|.
+ 1. Return |output|.
+
+ +
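+
+ A non-normative usage example, assuming an {{MLGraphBuilder}} instance `builder` and {{MLOperand}}s `a`, `b` and `c` created elsewhere; the shapes in the comments are illustrative:
+
+    // a: [3, 4], b: [5, 4], c broadcastable to [3, 5].
+    // With bTranspose set, b is used as [4, 5], so the output shape is [3, 5].
+    const y = builder.gemm(a, b, { c, alpha: 1.0, beta: 1.0, bTranspose: true });
+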
+
+ The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. -
+  
+
     if (options.aTranspose)
       a = builder.transpose(a);
 
@@ -2234,8 +2297,8 @@ partial interface MLGraphBuilder {
 
     let ab = builder.matmul(builder.mul(builder.constant(options.alpha), a), b);
     return (c ? builder.add(ab, builder.mul(builder.constant(options.beta), c)) : ab);
-    
-
+ +
### The gru() method ### {#api-mlgraphbuilder-gru} From c3ec17681cec176b3bb12b9772ca51a848be2bf8 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Thu, 22 Jun 2023 11:25:06 +0300 Subject: [PATCH 032/112] Add the gru() and gruCell() algorithms Signed-off-by: Zoltan Kis --- index.bs | 181 +++++++++++++++++++++++++++++++++++++++++++++++-------- 1 file changed, 155 insertions(+), 26 deletions(-) diff --git a/index.bs b/index.bs index 96cd6729..ab4802bc 100644 --- a/index.bs +++ b/index.bs @@ -2269,28 +2269,98 @@ partial interface MLGraphBuilder { optional MLGruOptions options = {}); }; -
+ +{{MLGruOptions}} has the following members: +
+ : bias
+ ::
+ An {{MLOperand}}. Specifies the 2-D input bias tensor of shape [num_directions, 3 * hidden_size]. The ordering of the bias vectors in the second dimension of the tensor shape is specified according to the {{MLGruOptions/layout}} argument.
+
+ : recurrentBias
+ ::
+ An {{MLOperand}}. Specifies the 2-D recurrent bias tensor of shape [num_directions, 3 * hidden_size]. The ordering of the bias vectors in the second dimension of the tensor shape is specified according to the {{MLGruOptions/layout}} argument.
+
+ : initialHiddenState
+ ::
+ An {{MLOperand}}. The 3-D initial hidden state tensor of shape [num_directions, batch_size, hidden_size].
+ When not specified, implementations SHOULD use a tensor filled with zero.
+
+ : resetAfter
+ ::
+ A {{boolean}} indicating whether to apply the reset gate after or before matrix multiplication. The default value is `true`.
+
+ : returnSequence
+ ::
+ A {{boolean}} indicating whether to also return the entire sequence with every output from each time step in it in addition to the output of the last time step.
+ The default value is `false`.
+
+ : direction
+ ::
+ An {{MLRecurrentNetworkDirection}}. Specifies the processing direction of the input sequence. When set to `"both"`, the size of the first dimension of the weight and the bias tensor shapes must be `2`, and the input is processed in both directions. The default value is `"forward"`.
+
+ : layout
+ ::
+ An {{MLGruWeightLayout}}. The ordering of the weight and bias vectors for the internal gates of GRU, specifically the `update (z)`, `reset (r)`, and `new (n)` gate, as indicated in the second dimension of the weight and bias tensor shape. When not specified, the default layout is `"zrn"`.
+
+ : activations
+ ::
+ A sequence of {{MLActivation}}. Specifies a pair of activation functions with the first function used for the update and reset gate, and the second used for the new gate. When not specified, implementations SHOULD use the pair of sigmoid (`"sigmoid"`) and the hyperbolic tangent (`"tanh"`) functions, respectively.
+ +
**Arguments:** - *input*: an {{MLOperand}}. The input 3-D tensor of shape [steps, batch_size, input_size]. - - *weight*: an {{MLOperand}}. The 3-D input weight tensor of shape [num_directions, 3 * hidden_size, input_size]. The ordering of the weight vectors in the second dimension of the tensor shape is specified according to the *options.layout* argument. - - *recurrentWeight*: an {{MLOperand}}. The 3-D recurrent weight tensor of shape [num_directions, 3 * hidden_size, hidden_size]. The ordering of the weight vectors in the second dimension of the tensor shape is specified according to the *options.layout* argument. + - *weight*: an {{MLOperand}}. The 3-D input weight tensor of shape [num_directions, 3 * hidden_size, input_size]. The ordering of the weight vectors in the second dimension of the tensor shape is specified according to the |options|.{{MLGruOptions/layout}} argument. + - *recurrentWeight*: an {{MLOperand}}. The 3-D recurrent weight tensor of shape [num_directions, 3 * hidden_size, hidden_size]. The ordering of the weight vectors in the second dimension of the tensor shape is specified according to the |options|.{{MLGruOptions/layout}} argument. - *steps*: an {{unsigned long}} scalar. The number of time steps in the recurrent network. The value must be greater than 0. - *hiddenSize*: an {{unsigned long}} scalar. The value of the third dimension of the cell output tensor shape. It indicates the number of features in the hidden state. - *options*: an optional {{MLGruOptions}}. The optional parameters of the operation. - - *bias*: an {{MLOperand}}. The 2-D input bias tensor of shape [num_directions, 3 * hidden_size]. The ordering of the bias vectors in the second dimension of the tensor shape is specified according to the *options.layout* argument. - - *recurrentBias*: an {{MLOperand}}. The 2-D recurrent bias tensor of shape [num_directions, 3 * hidden_size]. The ordering of the bias vectors in the second dimension of the tensor shape is specified according to the *options.layout* argument. - - *initialHiddenState*: an {{MLOperand}}. The 3-D initial hidden state tensor of shape [num_directions, batch_size, hidden_size]. When not specified, it's assumed to be a tensor filled with zero. - - *resetAfter*: a {{boolean}} indicating whether to apply the reset gate after or before matrix multiplication. Default to true. - - *returnSequence*: a {{boolean}} indicating whether to also return the entire sequence with every output from each time step in it in addition to the output of the last time step. Default to false. - - *direction*: an {{MLRecurrentNetworkDirection}}. The processing direction of the input sequence. When set to *"both"*, the size of the first dimension of the weight and the bias tensor shapes must be 2, and the input is processed in both directions. - - *layout*: an {{MLGruWeightLayout}}. The ordering of the weight and bias vectors for the internal gates of GRU, specifically the *update (z)*, *reset (r)*, and *new (n)* gate, as indicated in the second dimension of the weight and bias tensor shape. When not specified, the default layout is *"zrn"*. - - *activations*: a sequence of {{MLActivation}}. A pair of activation functions with the first function used for the update and reset gate, and the second used for the new gate. When not specified, it's assumed to be the sigmoid (*"sigmoid"*) and the hyperbolic tangent (*"tanh"*) function respectively. - **Returns:** a sequence of {{MLOperand}}. 
The first element of the sequence is a 3-D tensor of shape [num_directions, batch_size, hidden_size], the cell output from the last time step of the network. Additionally, if *options.returnSequence* is set to true, the second element is the 4-D output tensor of shape [steps, num_directions, batch_size, hidden_size] containing every cell outputs from each time step in the temporal sequence. + **Returns:** a sequence of {{MLOperand}}. The first element of the sequence is a 3-D tensor of shape [num_directions, batch_size, hidden_size], the cell output from the last time step of the network. Additionally, if |options|.{{MLGruOptions/returnSequence}} is set to `true`, the second element is the 4-D output tensor of shape [steps, num_directions, batch_size, hidden_size] containing every cell outputs from each time step in the temporal sequence. +
-
+
+ + The {{MLGraphBuilder/gru(input, weight, recurrentWeight, steps, hiddenSize, options)}} steps are: + +
+ 1. If |input|, |weight| or |recurrentWeight| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+ 1. If the rank of |input|, |weight| or |recurrentWeight| is not `3`, then throw a "{{DataError}}" {{DOMException}} and stop.
+ 1. If |options| is `undefined`, let |options| be an empty [=object=].
+ 1. If |options|.{{MLGruOptions/bias}} [=map/exists=].
+ 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+ 1. If its rank is not `2`, then throw a "{{DataError}}" {{DOMException}} and stop.
+ 1. If |options|.{{MLGruOptions/recurrentBias}} [=map/exists=].
+ 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+ 1. If its rank is not `2`, then throw a "{{DataError}}" {{DOMException}} and stop.
+ 1. If |options|.{{MLGruOptions/initialHiddenState}} [=map/exists=].
+ 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+ 1. If its rank is not `3`, then throw a "{{DataError}}" {{DOMException}} and stop.
+ 1. If |options|.{{MLGruOptions/resetAfter}} is `undefined`, set it to `true`.
+ 1. If |options|.{{MLGruOptions/returnSequence}} is `undefined`, set it to `false`.
+ 1. If |options|.{{MLGruOptions/direction}} is `undefined`, set it to `"forward"`.
+ 1. If |options|.{{MLGruOptions/direction}} is not one of {{MLRecurrentNetworkDirection}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+ 1. If |options|.{{MLGruOptions/layout}} is `undefined`, set it to `"zrn"`.
+ 1. If |options|.{{MLGruOptions/layout}} is not one of {{MLGruWeightLayout}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+ 1. If |options|.{{MLGruOptions/activations}} [=map/exists=] and is not an array of size `2`, or if any of its elements is not an instance of {{MLActivation}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+ 1. If |steps| is not a [=number=] or it is `0`, then throw a "{{TypeError}}" {{DOMException}} and stop.
+ 1. Let |output| be an empty sequence of {{MLOperand}} objects.
+ 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop.
+ 1. Make a request to the underlying platform to:
+ 1. Let |opImpl| be an [=implementation-defined=] platform operator for `"gru"`, given |weight|, |recurrentWeight|, |steps|, |hiddenSize| and |options| as parameters.
+ 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|.
+ 1. Connect |output| as output to |opImpl|.
+ 1. Return |output|.
+
+ +
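+
+ A non-normative usage example follows. The shapes match the argument descriptions above with illustrative values (steps = 2, batch_size = 4, input_size = 8, hidden_size = 16, `"forward"` direction); `builder` is assumed to be an {{MLGraphBuilder}} and the graph inputs are assumed to be created with the builder's input() method:
+
+    const input = builder.input('input', { type: 'float32', dimensions: [2, 4, 8] });
+    const weight = builder.input('weight', { type: 'float32', dimensions: [1, 3 * 16, 8] });
+    const recurrentWeight = builder.input('recurrentWeight', { type: 'float32', dimensions: [1, 3 * 16, 16] });
+    const [hiddenState] = builder.gru(input, weight, recurrentWeight, 2, 16);
+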
+
+ The behavior of this operation can be generically emulated from the usage of other operations as follows. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. -
+  
+
     const numDirections = (options.direction == "both" ? 2 : 1);
     let hiddenState = options.initialHiddenState;
 
@@ -2346,12 +2416,13 @@ partial interface MLGraphBuilder {
     }
 
     return (sequence ? [hiddenState, sequence] : [hiddenState]);
-    
-
+ +
### The gruCell() method ### {#api-mlgraphbuilder-grucell} A single time step of the Gated Recurrent Unit [[GRU]] recurrent network using an update gate and a reset gate to compute the hidden state that rolls into the output across the temporal sequence of a recurrent network. + -
+ +{{MLGruCellOptions}} has the following members: +
+ : bias
+ ::
+ An {{MLOperand}}. Specifies the 1-D input bias tensor of shape [3 * hidden_size]. The ordering of the bias vectors in the first dimension of the tensor shape is specified according to the {{MLGruCellOptions/layout}} argument.
+
+ : recurrentBias
+ ::
+ An {{MLOperand}}. Specifies the 1-D recurrent bias tensor of shape [3 * hidden_size]. The ordering of the bias vectors in the first dimension of the tensor shape is specified according to the {{MLGruCellOptions/layout}} argument.
+
+ : resetAfter
+ ::
+ A {{boolean}} indicating whether to apply the reset gate after or before matrix multiplication. The default value is `true`.
+
+ : layout
+ ::
+ An {{MLGruWeightLayout}}. The ordering of the weight and bias vectors for the internal gates of GRU, specifically the `update (z)`, `reset (r)`, and `new (n)` gate, as indicated in the first dimension of the weight and bias tensor shape. When not specified, the default layout is `"zrn"`.
+
+ : activations
+ ::
+ A sequence of {{MLActivation}}. Specifies a pair of activation functions with the first function used for the update and reset gate, and the second used for the new gate. When not specified, implementations SHOULD use the pair of sigmoid (`"sigmoid"`) and the hyperbolic tangent (`"tanh"`) functions, respectively.
+ +
**Arguments:** - *input*: an {{MLOperand}}. The input 2-D tensor of shape [batch_size, input_size]. - *weight*: an {{MLOperand}}. The 2-D input weight tensor of shape [3 * hidden_size, input_size]. The ordering of the weight vectors in the first dimension of the tensor shape is specified according to the *options.layout* argument. @@ -2375,17 +2470,51 @@ partial interface MLGraphBuilder { - *hiddenState*: an {{MLOperand}}. The 2-D input hidden state tensor of shape [batch_size, hidden_size]. - *hiddenSize*: an {{unsigned long}} scalar. The value of the second dimension of the output tensor shape. It indicates the number of features in the hidden state. - *options*: an optional {{MLGruCellOptions}}. The optional parameters of the operation. - - *bias*: an {{MLOperand}}. The 1-D input bias tensor of shape [3 * hidden_size]. The ordering of the bias vectors in the first dimension of the tensor shape is specified according to the *options.layout* argument. - - *recurrentBias*: an {{MLOperand}}. The 1-D recurrent bias tensor of shape [3 * hidden_size]. The ordering of the bias vectors in the first dimension of the tensor shape is specified according to the *options.layout* argument. - - *resetAfter*: a {{boolean}} indicating whether to apply the reset gate after or before matrix multiplication. Default to true. - - *layout*: an {{MLGruWeightLayout}}. The ordering of the weight and bias vectors for the internal gates of GRU, specifically the *update (z)*, *reset (r)*, and *new (n)* gate, as indicated in the first dimension of the weight and bias tensor shapes. When not specified, the default layout is *"zrn"*. - - *activations*: a sequence of {{MLActivation}}. A pair of activation functions with the first function used for the *update (z)* and *reset (r)* gate, and the second used for the *new (n)* gate. When not specified, it's default to the sigmoid (*"sigmoid"*) and the hyperbolic tangent (*"tanh"*) function respectively. **Returns:** an {{MLOperand}}. The 2-D tensor of shape [batch_size, hidden_size], the cell output hidden state of a single time step of the recurrent network. +
-
+
+ + The {{MLGraphBuilder/gruCell(input, weight, recurrentWeight, hiddenState, hiddenSize, options)}} steps are: + +
+ 1. If |input|, |weight| or |recurrentWeight| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+ 1. If the rank of |input|, |weight| or |recurrentWeight| is not `2`, then throw a "{{DataError}}" {{DOMException}} and stop.
+ 1. If |options| is `undefined`, let |options| be an empty [=object=].
+ 1. If |options|.{{MLGruCellOptions/bias}} [=map/exists=].
+ 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+ 1. If its rank is not `1`, then throw a "{{DataError}}" {{DOMException}} and stop.
+ 1. If |options|.{{MLGruCellOptions/recurrentBias}} [=map/exists=].
+ 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+ 1. If its rank is not `1`, then throw a "{{DataError}}" {{DOMException}} and stop.
+ 1. If |options|.{{MLGruCellOptions/resetAfter}} is `undefined`, set it to `true`.
+ 1. If |options|.{{MLGruCellOptions/layout}} is `undefined`, set it to `"zrn"`.
+ 1. If |options|.{{MLGruCellOptions/layout}} is not one of {{MLGruWeightLayout}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+ 1. If |options|.{{MLGruCellOptions/activations}} [=map/exists=] and is not an array of size `2`, or if any of its elements is not an instance of {{MLActivation}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+ 1. Let |desc| be a new {{MLOperandDescriptor}}.
+ 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0], |hiddenSize| ].
+ 1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}.
+ 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop.
+ 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|.
+ 1. Make a request to the underlying platform to:
+ 1. Let |opImpl| be an [=implementation-defined=] platform operator for `"gruCell"`, given |weight|, |recurrentWeight|, |hiddenState|, |hiddenSize| and |options| as parameters.
+ 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}.
+ 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|.
+ 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}.
+ 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|.
+ 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|.
+ 1. Return |output|.
+
+ +
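+
+ A non-normative usage example, assuming an {{MLGraphBuilder}} instance `builder` and 2-D {{MLOperand}}s created elsewhere with the shapes noted in the comment (batch_size = 4, input_size = 8, hidden_size = 16 are illustrative):
+
+    // input: [4, 8], weight: [3 * 16, 8], recurrentWeight: [3 * 16, 16], hiddenState: [4, 16].
+    const newHiddenState = builder.gruCell(input, weight, recurrentWeight, hiddenState, 16);
+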
+
+ The behavior of this operation can be generically emulated via other operations as shown below, when the weight layout is the default *"zrn"* layout, and the activation functions of the update/reset gate and new gate are of the operator types *sigmoid* and *tanh* respectively. -
+  
+
     const one = builder.constant(1);
     const zero = builder.constant(0);
 
@@ -2477,8 +2606,8 @@ partial interface MLGraphBuilder {
 
     // compute the new hidden state
     return builder.add(builder.mul(z, hiddenState), builder.mul(n, builder.sub(one, z)));
-    
-
+ +
### The hardSigmoid() method ### {#api-mlgraphbuilder-hard-sigmoid} From 9f40ccd6c9eb4508a7811cc002bbc6aca0b51528 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Thu, 22 Jun 2023 12:10:56 +0300 Subject: [PATCH 033/112] Add the hard-sigmoid algorithms Signed-off-by: Zoltan Kis --- index.bs | 91 +++++++++++++++++++++++++++++++++++++++++++++++--------- 1 file changed, 77 insertions(+), 14 deletions(-) diff --git a/index.bs b/index.bs index 96cd6729..34888b45 100644 --- a/index.bs +++ b/index.bs @@ -2482,7 +2482,7 @@ partial interface MLGraphBuilder {
### The hardSigmoid() method ### {#api-mlgraphbuilder-hard-sigmoid} -Calculate the non-smooth function used in place of a sigmoid function on the input tensor. +Calculate the non-smooth hard sigmoid function on the input tensor, used instead of the sigmoid function for faster computation. -
- **Arguments:** - - *x*: an {{MLOperand}}. The input tensor. - - *options*: an optional {{MLHardSigmoidOptions}}. The optional parameters of the operation. - - *alpha*: a {{float}} scalar multiplier, default to 0.2. - - *beta*: a {{float}} scalar addition, default to 0.5. - - **Returns:** - - an {{MLOperand}}. The output tensor of the same shape as *x*. - - an {{MLActivation}}. The activation function representing the hard sigmoid operation. -
+
The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the @@ -2519,9 +2509,82 @@ partial interface MLGraphBuilder { builder.constant(1)), builder.constant(0)); -
+{{MLHardSigmoidOptions}} has the following members: +
+ : alpha
+ ::
+ A {{float}} scalar multiplier.
+ The default value is `0.2`.
+ : beta
+ ::
+ A {{float}} scalar addition.
+ The default value is `0.5`.
+ +
+ + To check hard-sigmoid options given |options|, run the following steps: + +
+ 1. If |options| is not an [=object=] that [=implements=] {{MLHardSigmoidOptions}}, then return `false`.
+ 1. If |options|.{{MLHardSigmoidOptions/alpha}} is `undefined`, set |options|.{{MLHardSigmoidOptions/alpha}} to `0.2`.
+ 1. Else if |options|.{{MLHardSigmoidOptions/alpha}} is not a [=numeric type=], then return `false`.
+ 1. If |options|.{{MLHardSigmoidOptions/beta}} is `undefined`, set |options|.{{MLHardSigmoidOptions/beta}} to `0.5`.
+ 1. Else if |options|.{{MLHardSigmoidOptions/beta}} is not a [=numeric type=], then return `false`.
+ 1. Return `true`.
+
+ +#### The {{MLGraphBuilder/hardSigmoid(input, options)}} method #### {#api-mlgraphbuilder-hardsigmoid-input-options} +
+ **Arguments:** + - *input*: an {{MLOperand}}. The input tensor. + - *options*: an optional {{MLHardSigmoidOptions}}. The optional parameters of the operation. + + **Returns:** + - an {{MLOperand}}. The output tensor of the same shape as *input*. +
+
+ The {{MLGraphBuilder/hardSigmoid(input, options)}} method steps are: + 1. Let |input| be the first argument. + 1. Let |options| be the second argument. + 1. If running the check hard-sigmoid options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the hard sigmoid operation, given |options|. + 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. + 1. Return |output|. +
+ +#### The {{MLGraphBuilder/hardSigmoid(options)}} method #### {#api-mlgraphbuilder-hardsigmoid-options} +
+ **Arguments:** + - *options*: an optional {{MLHardSigmoidOptions}}. The optional parameters of the operation. + + **Returns:** + - an {{MLActivation}}. The activation function representing the hard sigmoid operation. +
+ +
+ + The {{MLGraphBuilder/hardSigmoid(options)}} method steps are: + +
+ 1. Let |options| be the first argument. + 1. If running the check hard-sigmoid options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. Let |op| be the result of invoking the create MLActivation steps with `"hardSigmoid"` and |options|. + 1. If that throws an error, re-throw the error and abort these steps. + 1. Return |op|. +
+
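+
+ A non-normative usage example of both forms of hardSigmoid(), assuming an {{MLGraphBuilder}} instance `builder` and an {{MLOperand}} `x`; the option values are illustrative:
+
+    // y = max(0, min(1, alpha * x + beta)), computed element-wise.
+    const y = builder.hardSigmoid(x, { alpha: 0.2, beta: 0.5 });
+
+    // hardSigmoid() as an MLActivation to be fused with another operation.
+    const hardSigmoidActivation = builder.hardSigmoid({ alpha: 0.2, beta: 0.5 });
+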
+ ### The hardSwish() method ### {#api-mlgraphbuilder-hard-swish} Computes the nonlinear function `y = x * max(0, min(6, (x + 3))) / 6` that is introduced by [[MobileNetV3]] on the input tensor element-wise. -
- **Arguments:** - - *x*: an {{MLOperand}}. The input tensor. - - **Returns:** - - an {{MLOperand}}. The output tensor of the same shape as *x*. - - an {{MLActivation}}. The activation function representing the hard-swish operation. -
+
+
+ The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. -
+  
+
     return builder.div(
                builder.mul(
                    x,
@@ -2553,10 +2549,58 @@ partial interface MLGraphBuilder {
                            builder.constant(6),
                            builder.add(x, builder.constant(3))))),
                builder.constant(6));
-    
-
+ + +
+ +#### The {{MLGraphBuilder/hardSwish(input)}} method #### {#api-mlgraphbuilder-hardswish-input} +
+ **Arguments:** + - *input*: an {{MLOperand}}. The input tensor. + + **Returns:** + - an {{MLOperand}}. The output tensor of the same shape as *input*.
+
+ + The {{MLGraphBuilder/hardSwish(input)}} method steps are: + +
+ 1. Let |input| be the first argument. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the hard-swish operation. + 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. + 1. Return |output|. +
+
+ +#### The {{MLGraphBuilder/hardSwish()}} method #### {#api-mlgraphbuilder-hardswish} +
+ **Arguments:** + - None. + + **Returns:** + - an {{MLActivation}}. The activation function representing the hard-swish operation. +
+ +
+ + The {{MLGraphBuilder/hardSwish()}} method steps are: + +
+ 1. Let |op| be the result of invoking the create MLActivation steps with `"hardSwish"`. + 1. If that throws an error, re-throw the error and abort these steps. + 1. Return |op|. +
+
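+
+ A non-normative usage example, assuming an {{MLGraphBuilder}} instance `builder` and an {{MLOperand}} `x`:
+
+    // y = x * max(0, min(6, x + 3)) / 6, computed element-wise.
+    const y = builder.hardSwish(x);
+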
+ ### The instanceNormalization() method ### {#api-mlgraphbuilder-instancenorm} Normalize the input features using [[Instance-Normalization]]. Unlike [[#api-mlgraphbuilder-batchnorm]] where the mean and variance values used in the calculation are previously computed across the batch dimension during the model training phase, the mean and variance values used in the calculation of an instance normalization are computed internally on the fly per input feature. -
+ +The {{MLInstanceNormalizationOptions}} members are: +
+ : scale
+ ::
+ An {{MLOperand}}. Specifies the 1-D tensor of the scaling values whose length is equal to the number of channels, i.e. the size of the feature dimension of the input. For example, for an |input| tensor with `nchw` layout, the length is the value of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1].
+
+ : bias
+ ::
+ An {{MLOperand}}. Specifies the 1-D tensor of the bias values whose length is equal to the size of the feature dimension of the input. For example, for an |input| tensor with `nchw` layout, the length is the value of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1].
+
+ : epsilon
+ ::
+ A {{float}} scalar. Specifies a small value to prevent computational error due to divide-by-zero. The default value is `0.00001`.
+
+ : layout
+ ::
+ An {{MLInputOperandLayout}}. Specifies the layout format of the input. The default value is `"nchw"`.
+
+ +
**Arguments:** - *input*: an {{MLOperand}}. The input 4-D tensor. - *options*: an optional {{MLInstanceNormalizationOptions}}. The optional parameters of the operation. - - *scale*: an {{MLOperand}}. The 1-D tensor of the scaling values whose length is equal to the size of the feature dimension of the input e.g. for the input tensor with *nchw* layout, the feature dimension is 1. - - *bias*: an {{MLOperand}}. The 1-D tensor of the bias values whose length is equal to the size of the feature dimension of the input e.g. for the input tensor with *nchw* layout, the feature dimension is 1. - - *epsilon*: a {{float}} scalar. A small value to prevent computational error due to divide-by-zero. The default value is 0.00001 when not specified. - - *layout*: an {{MLInputOperandLayout}}. This option specifies the layout format of the input. The default value is *"nchw"*. **Returns:** an {{MLOperand}}. The instance-normalized 4-D tensor of the same shape as the input tensor. +
-
+
+ + The {{MLGraphBuilder/instanceNormalization(input, options)}} steps are: + +
+ 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+ 1. If the rank of |input| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop.
+ 1. If |options| is `undefined`, let |options| be an empty [=object=].
+ 1. If |options|.{{MLInstanceNormalizationOptions/scale}} [=map/exists=]:
+ 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+ 1. If its rank is not `1` or its length is not equal to the size of the channel dimension of |input|, then throw a "{{DataError}}" {{DOMException}} and stop.
+ 1. If |options|.{{MLInstanceNormalizationOptions/bias}} [=map/exists=]:
+ 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+ 1. If its rank is not `1` or its length is not equal to the size of the channel dimension of |input|, then throw a "{{DataError}}" {{DOMException}} and stop.
+ 1. If |options|.{{MLInstanceNormalizationOptions/epsilon}} is `undefined`, let it be `0.00001`.
+ 1. If |options|.{{MLInstanceNormalizationOptions/layout}} is `undefined`, let it be `"nchw"`.
+ 1. Otherwise if |options|.{{MLInstanceNormalizationOptions/layout}} is not one of {{MLInputOperandLayout}}, then throw a "{{DataError}}" {{DOMException}} and stop.
+ 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop.
+ 1. Let |output| be the result of invoking the copy MLOperand steps given |input|.
+ 1. Make a request to the underlying platform to:
+ 1. Let |opImpl| be an [=implementation-defined=] platform operator for the instance normalization operation, given |options|.
+ 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}.
+ 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|.
+ 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}.
+ 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|.
+ 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|.
+ 1. Return |output|.
+
+ +
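+
+ A non-normative usage example, assuming an {{MLGraphBuilder}} instance `builder`, a 4-D {{MLOperand}} `input` in the default *"nchw"* layout, and 1-D operands `scale` and `bias` created elsewhere; the shapes in the comment are illustrative:
+
+    // input: [1, 2, 5, 5]; scale and bias have length 2 (the channel count).
+    const output = builder.instanceNormalization(input, { scale, bias, epsilon: 1e-5 });
+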
+
+ The behavior of this operation when the input tensor is 4-D of the *"nchw"* layout can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. -
+  
+
     // The mean reductions happen over the spatial dimensions of the input
     // e.g. axis 2 and 3 of the input tensor.
     const reduceOptions = { axes: [2,3], keepDimensions: true };
@@ -2614,8 +2664,8 @@ partial interface MLGraphBuilder {
         ),
       builder.reshape(options.bias, shape)
       );
-    
-
+ +
### The leakyRelu() method ### {#api-mlgraphbuilder-leakyrelu} From a6ef2f0658b7d524d9d60875e52a255587b04267 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Mon, 26 Jun 2023 15:38:59 +0300 Subject: [PATCH 036/112] Add the leaky RELU algorithm Signed-off-by: Zoltan Kis --- index.bs | 90 ++++++++++++++++++++++++++++++++++++++++++++++++-------- 1 file changed, 78 insertions(+), 12 deletions(-) diff --git a/index.bs b/index.bs index 96cd6729..6a0a0758 100644 --- a/index.bs +++ b/index.bs @@ -2620,27 +2620,19 @@ partial interface MLGraphBuilder { ### The leakyRelu() method ### {#api-mlgraphbuilder-leakyrelu} Calculate the leaky version of rectified linear function on the input tensor element-wise. The calculation follows the expression `max(0, x) + alpha ∗ min(0, x)`. + -
- **Arguments:** - - *x*: an {{MLOperand}}. The input tensor. - - *options*: an optional {{MLLeakyReluOptions}}. The optional parameters of the operation. - - *alpha*: a {{float}} scalar multiplier, default to 0.01. - - **Returns:** - - an {{MLOperand}}. The output tensor of the same shape as *x*. - - an {{MLActivation}}. The activation function representing the leaky relu operation. -
+
The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the @@ -2649,9 +2641,83 @@ partial interface MLGraphBuilder { return builder.add(builder.max(builder.constant(0), x), builder.mul(builder.constant(options.alpha), builder.min(builder.constant(0), x))); -
+{{MLLeakyReluOptions}} has the following members: +
+ : alpha + :: + A {{float}} scalar multiplier. + The default value is `0.01`. +
+ +
+ + To check leaky-relu options given |options|, run the following steps: + +
+ 1. If |options| is not an [=object=] that [=implements=] {{MLLeakyReluOptions}}, then return `false`.
+ 1. If |options|.{{MLLeakyReluOptions/alpha}} is `undefined`, set |options|.{{MLLeakyReluOptions/alpha}} to `0.01`.
+ 1. Else if |options|.{{MLLeakyReluOptions/alpha}} is not a [=numeric type=], then return `false`.
+ 1. Return `true`.
+
+ +#### The {{MLGraphBuilder/leakyRelu(input, options)}} method #### {#api-mlgraphbuilder-leaky-relu-input-options} +
+ **Arguments:** + - *input*: an {{MLOperand}}. The input tensor. + - *options*: an optional {{MLLeakyReluOptions}}. The optional parameters of the operation. + + **Returns:** + - an {{MLOperand}}. The output tensor of the same shape as *input*. +
+ +
+ + The {{MLGraphBuilder/leakyRelu(input, options)}} method steps are: + +
+ 1. Let |input| be the first argument. + 1. Let |options| be the second argument. + 1. If |options| is `undefined`, let |options| be a new {{MLLeakyReluOptions}} object. + 1. If running the check leaky-relu options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the Leaky RELU operation, given |options|. + 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. + 1. Return |output|. +
+
+ +#### The {{MLGraphBuilder/leakyRelu(options)}} method #### {#api-mlgraphbuilder-leaky-relu-options} +
+ **Arguments:** + - *options*: an optional {{MLLeakyReluOptions}}. The optional parameters of the operation. + + **Returns:** + - an {{MLActivation}}. The activation function representing the leaky relu operation. +
+ +
+
+ The {{MLGraphBuilder/leakyRelu(options)}} method steps are:
+
+ 1. Let |options| be the first argument. + 1. If |options| is `undefined`, let |options| be a new {{MLLeakyReluOptions}} object. + 1. If running the check leaky-relu options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. Let |op| be the result of invoking the create MLActivation steps with `"leakyRelu"` and |options|. + 1. If that throws an error, re-throw the error and abort these steps. + 1. Return |op|. +
+
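+
+ A non-normative usage example of both forms of leakyRelu(), assuming an {{MLGraphBuilder}} instance `builder` and an {{MLOperand}} `x`:
+
+    // leakyRelu() as a regular operation.
+    const y = builder.leakyRelu(x, { alpha: 0.01 });
+
+    // leakyRelu() as an MLActivation to be fused with another operation.
+    const leakyReluActivation = builder.leakyRelu({ alpha: 0.01 });
+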
+ ### The linear() method ### {#api-mlgraphbuilder-linear} Calculate a linear function `y = alpha * x + beta` on the input tensor. -
- **Arguments:** - - *x*: an {{MLOperand}}. The input tensor. - - *options*: an optional {{MLLinearOptions}}. The optional parameters of the operation. - - *alpha*: a {{float}} scalar multiplier, default to 1. - - *beta*: a {{float}} scalar addition, default to 0. - **Returns:** - - an {{MLOperand}}. The output tensor of the same shape as *x*. - - an {{MLActivation}}. The activation function representing the linear operation. - -
+
The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the @@ -2686,9 +2677,87 @@ partial interface MLGraphBuilder { builder.mul(x, builder.constant(options.alpha)), builder.constant(options.beta)); -
+{{MLLinearOptions}} has the following members: +
+ : alpha + :: + A {{float}} scalar multiplier. + The default value is `1`. + : beta + :: + A {{float}} scalar addition. + The default value is `0`. +
+ +
+ + To check linear options given |options|, run the following steps: + +
+ 1. If |options| is not an [=object=] that [=implements=] {{MLLinearOptions}}, then return `false`.
+ 1. If |options|.{{MLLinearOptions/alpha}} is `undefined`, set |options|.{{MLLinearOptions/alpha}} to `1`.
+ 1. Else if |options|.{{MLLinearOptions/alpha}} is not a [=numeric type=], then return `false`.
+ 1. If |options|.{{MLLinearOptions/beta}} is `undefined`, set |options|.{{MLLinearOptions/beta}} to `0`.
+ 1. Else if |options|.{{MLLinearOptions/beta}} is not a [=numeric type=], then return `false`.
+ 1. Return `true`.
+
+ +#### The {{MLGraphBuilder/linear(input, options)}} method #### {#api-mlgraphbuilder-linear-input-options} +
+ **Arguments:**
+ - *input*: an {{MLOperand}}. The input tensor.
+ - *options*: an optional {{MLLinearOptions}}. The optional parameters of the operation.
+
+ **Returns:**
+ - an {{MLOperand}}. The output tensor of the same shape as *input*.
+ +
+ + The {{MLGraphBuilder/linear(input, options)}} method steps are: + +
+ 1. Let |input| be the first argument. + 1. Let |options| be the second argument. + 1. If running the check linear options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the linear operation, given |options|. + 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. + 1. Return |output|. +
+
+ +#### The {{MLGraphBuilder/linear(options)}} method #### {#api-mlgraphbuilder-linear-options} +
+ **Arguments:** + - *options*: an optional {{MLLinearOptions}}. The optional parameters of the operation. + + **Returns:** + - an {{MLActivation}}. The activation function representing the linear operation. +
+ +
+ + The {{MLGraphBuilder/linear(options)}} method steps are: + +
+ 1. Let |options| be the first argument. + 1. If running the check linear options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. Let |op| be the result of invoking the create MLActivation steps with `"linear"` and |options|. + 1. If that throws an error, re-throw the error and abort these steps. + 1. Return |op|. +
+
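+
+ A non-normative usage example, assuming an {{MLGraphBuilder}} instance `builder` and an {{MLOperand}} `x`; the option values are illustrative:
+
+    // y = alpha * x + beta, computed element-wise.
+    const y = builder.linear(x, { alpha: 2.0, beta: 1.0 });
+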
+ ### The lstm() method ### {#api-mlgraphbuilder-lstm} Long Short-Term Memory [[LSTM]] recurrent network uses an input, output, forget, and cell gate to compute the output state that rolls into the output across the temporal sequence of the network. -
+ +{{MLLstmOptions}} has the following members: +
+ : bias
+ ::
+ An {{MLOperand}}. Specifies the 2-D input bias tensor of shape [num_directions, 4 * hidden_size]. The ordering of the bias vectors in the second dimension of the tensor shape is specified according to {{MLLstmOptions/layout}}.
+
+ : recurrentBias
+ ::
+ An {{MLOperand}}. Specifies the 2-D recurrent bias tensor of shape [num_directions, 4 * hidden_size]. The ordering of the bias vectors in the second dimension of the tensor shape is specified according to {{MLLstmOptions/layout}}.
+
+ : peepholeWeight
+ ::
+ An {{MLOperand}}. Specifies the 2-D weight tensor for peepholes of shape [num_directions, 3 * hidden_size]. The pack ordering of the weight vectors is for the `input (i)`, `output (o)`, and `forget (f)` gate, respectively.
+
+ : initialHiddenState
+ ::
+ An {{MLOperand}}. Specifies the 3-D initial hidden state tensor of shape [num_directions, batch_size, hidden_size]. When not specified, implementations SHOULD use a tensor filled with zero.
+
+ : initialCellState
+ ::
+ An {{MLOperand}}. Specifies the 3-D initial cell state tensor of shape [num_directions, batch_size, hidden_size]. When not specified, implementations SHOULD use a tensor filled with zero.
+
+ : returnSequence
+ ::
+ A {{boolean}} indicating whether to also return the entire sequence with every output from each time step in it in addition to the output of the last time step. The default value is `false`.
+
+ : direction
+ ::
+ An {{MLRecurrentNetworkDirection}}. Specifies the processing direction of the input sequence. When set to `"both"`, the size of the first dimension of the weight and the bias tensor shapes must be `2`, and the input is processed in both directions. The default value is `"forward"`.
+
+ : layout
+ ::
+ An {{MLLstmWeightLayout}}. The ordering of the weight and bias vectors for the internal gates of LSTM, specifically the `input (i)`, `output (o)`, `forget (f)`, and `cell (g)` gate, as indicated in the second dimension of the weight and bias tensor shapes. When not specified, the default layout is `"iofg"`.
+
+ : activations
+ ::
+ A sequence of {{MLActivation}}. A sequence of three activation functions, the first one is used for the `input (i)`, `forget (f)`, and `output (o)` gate, the second one is used for the `cell (g)` gate, and the last used for filtering the output cell state before combining it with the result of the output gate to form the output hidden state. When not specified, implementations SHOULD use the sequence of the sigmoid function (`"sigmoid"`) followed by two hyperbolic tangent functions (`"tanh"`) respectively.
+ +
**Arguments:** - *input*: an {{MLOperand}}. The input 3-D tensor of shape [steps, batch_size, input_size]. - - *weight*: an {{MLOperand}}. The 3-D input weight tensor of shape [num_directions, 4 * hidden_size, input_size]. The ordering of the weight vectors in the second dimension of the tensor shape is specified according to the *options.layout* argument. - - *recurrentWeight*: an {{MLOperand}}. The 3-D recurrent weight tensor of shape [num_directions, 4 * hidden_size, hidden_size]. The ordering of the weight vectors in the second dimension of the tensor shape is specified according to the *options.layout* argument. + - *weight*: an {{MLOperand}}. The 3-D input weight tensor of shape [num_directions, 4 * hidden_size, input_size]. The ordering of the weight vectors in the second dimension of the tensor shape is specified according to the |options|.{{MLLstmOptions/layout}}. + - *recurrentWeight*: an {{MLOperand}}. The 3-D recurrent weight tensor of shape [num_directions, 4 * hidden_size, hidden_size]. The ordering of the weight vectors in the second dimension of the tensor shape is specified according to the |options|.{{MLLstmOptions/layout}} argument. - *steps*: an {{unsigned long}} scalar. The number of time steps in the recurrent network. The value must be greater than 0. - *hiddenSize*: an {{unsigned long}} scalar. The value of the third dimension of the cell output tensor shape. It indicates the number of features in the hidden state. - - *options*: an optional {{MLGruOptions}}. The optional parameters of the operation. - - *bias*: an {{MLOperand}}. The 2-D input bias tensor of shape [num_directions, 4 * hidden_size]. The ordering of the bias vectors in the second dimension of the tensor shape is specified according to the *options.layout* argument. - - *recurrentBias*: an {{MLOperand}}. The 2-D recurrent bias tensor of shape [num_directions, 4 * hidden_size]. The ordering of the bias vectors in the second dimension of the tensor shape is specified according to the *options.layout* argument. - - *peepholeWeight*: an {{MLOperand}}. The 2-D weight tensor for peepholes of shape [num_directions, 3 * hidden_size]. The pack ordering of the weight vectors is for the *input (i)*, *output (o)*, and *forget (f)* gate respectively. - - *initialHiddenState*: an {{MLOperand}}. The 3-D initial hidden state tensor of shape [num_directions, batch_size, hidden_size]. When not specified, it's assumed to be a tensor filled with zero. - - *initialCellState*: an {{MLOperand}}. The 3-D initial hidden state tensor of shape [num_directions, batch_size, hidden_size]. When not specified, it's assumed to be a tensor filled with zero. - - *returnSequence*: a {{boolean}} indicating whether to also return the entire sequence with every output from each time step in it in addition to the output of the last time step. Default to false. - - *direction*: an {{MLRecurrentNetworkDirection}}. The processing direction of the input sequence. When set to *"both"*, the size of the first dimension of the weight and the bias tensor shapes must be 2, and the input is processed in both directions. - - *layout*: an {{MLLstmWeightLayout}}. The ordering of the weight and bias vectors for the internal gates of LSTM, specifically the *input (i)*, *output (o)*, *forget (f)*, and *cell (g)* gate, as indicated in the second dimension of the weight and bias tensor shapes. When not specified, the default layout is *"iofg"*. - - *activations*: a sequence of {{MLActivation}}. 
A sequence of three activation functions, the first one is used for the *input (i)*, *forget (f)*, and *output (o)* gate, the second one is used for the *cell (g)* gate, and the last used for filtering the output cell state before combining it with the result of the output gate to form the output hidden state. When not specified, they are assumed to be of the sigmoid function (*"sigmoid"*) followed by two hyperbolic tangent functions (*"tanh"*) respectively. + - *options*: an optional {{MLLstmOptions}}. The optional parameters of the operation. - **Returns:** a sequence of {{MLOperand}}. The first element of the sequence is a 3-D tensor of shape [num_directions, batch_size, hidden_size], the output hidden state from the last time step of the network. The second element is a 3-D tensor of shape [num_directions, batch_size, hidden_size], the output cell state from the last time step of the network. Additionally, if *options.returnSequence* is set to true, the third element is the 4-D output tensor of shape [steps, num_directions, batch_size, hidden_size] containing every output from each time step in the temporal sequence. + **Returns:** a sequence of {{MLOperand}}. The first element of the sequence is a 3-D tensor of shape [num_directions, batch_size, hidden_size], the output hidden state from the last time step of the network. The second element is a 3-D tensor of shape [num_directions, batch_size, hidden_size], the output cell state from the last time step of the network. Additionally, if |options|.{{MLLstmOptions/returnSequence}} is set to true, the third element is the 4-D output tensor of shape [steps, num_directions, batch_size, hidden_size] containing every output from each time step in the temporal sequence. +
-
+
+ + The {{MLGraphBuilder/lstm(input, weight, recurrentWeight, steps, hiddenSize, options)}} steps are: + +
+ 1. If |options| is `undefined`, let |options| be an empty [=object=]. + 1. If |options|.{{MLLstmOptions/direction}} is `undefined`, set it to `"forward"`. + 1. If |options|.{{MLLstmOptions/direction}} is not one of {{MLRecurrentNetworkDirection}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. Let |num_directions| be `1` if |options|.{{MLLstmOptions/direction}} is `"forward"`, or otherwise let it be `2`. + 1. If |input|, |weight| or |recurrentWeight| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. +
+ The shape of |input|, |weight| or |recurrentWeight| could be also checked here. +
+ 1. If |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not equal to |steps|, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. Let |batch_size| be |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1]. + 1. If |options|.{{MLLstmOptions/bias}} [=map/exists=]. + 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If its rank is not `2`, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |num_directions|, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not 4 * |hiddenSize|, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/recurrentBias}} [=map/exists=]. + 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If its rank is not `2`, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/recurrentBias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |num_directions|, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/recurrentBias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not 4 * |hiddenSize|, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/peepholeWeight}} [=map/exists=]. + 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If its rank is not `2`, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/peepholeWeight}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |num_directions|, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/peepholeWeight}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not 4 * |hiddenSize|, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/initialHiddenState}} [=map/exists=]. + 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If its rank is not `3`, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/initialHiddenState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |num_directions|, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/initialHiddenState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not equal to |batch_size|, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/initialHiddenState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[2] is not |hiddenSize|, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/initialCellState}} [=map/exists=]. + 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If its rank is not `3`, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/initialCellState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |num_directions|, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. 
If |options|.{{MLLstmOptions/initialCellState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not equal to |batch_size|, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/initialCellState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[2] is not |hiddenSize|, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/returnSequence}} is `undefined`, set it to `false`. + 1. If |options|.{{MLLstmOptions/layout}} is `undefined`, set it to `"iofg"`. + 1. If |options|.{{MLLstmOptions/layout}} is not one of {{MLLstmWeightLayout}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/activations}} [=map/exists=]: + 1. If it is not an array of size `3`, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If any of its elements is not an instance of {{MLActivation}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |desc| be a new {{MLOperandDescriptor}}. + 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |num_directions|, |batch_size|, |hiddenSize| ]. + 1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. + 1. Let |output0| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Let |output1| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |steps|, |num_directions|, |batch_size|, |hiddenSize| ]. + 1. Let |output2| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Let |output| be the array [ |output0|, |output1|, |output2| ]. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the LSTM operation, given |weight|, |recurrentWeight|, |steps|, |hiddenSize| and |options|. + 1. Store a reference of |opImpl| in |output0|.{{MLOperand/[[operator]]}}, |output1|.{{MLOperand/[[operator]]}} and |output2|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output0|.{{MLOperand/[[operand]]}}, |output1|.{{MLOperand/[[operand]]}} and |output2|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output| as output to |opImpl|. + 1. Return |output|. +
+
+ +
+
+ The behavior of this operation can be generically emulated from the usage of other operations as follows. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. -
+  
+
     const numDirections = (options.direction == "both" ? 2 : 1);
     let hiddenState = options.initialHiddenState;
     let cellState = options.initialCellState;
@@ -2809,8 +2913,8 @@ partial interface MLGraphBuilder {
     }
 
     return (sequence ? [hiddenState, cellState, sequence] : [hiddenState, cellState]);
-    
-
+ +
### The lstmCell() method ### {#api-mlgraphbuilder-lstmcell} @@ -2831,7 +2935,31 @@ partial interface MLGraphBuilder { optional MLLstmCellOptions options = {}); }; -
+ +{{MLLstmCellOptions}} has the following members: +
+ : bias + :: + An {{MLOperand}}. The 1-D input bias tensor of shape [4 * hidden_size]. The ordering of the bias vectors in the first dimension of the tensor shape is specified according to the {{MLLstmCellOptions/layout}} argument. + + : recurrentBias + :: + An {{MLOperand}}. The 1-D recurrent bias tensor of shape [4 * hidden_size]. The ordering of the bias vectors in the first dimension of the tensor shape is specified according to the {{MLLstmCellOptions/layout}} argument. + + : peepholeWeight + :: + An {{MLOperand}}. The 1-D weight tensor for peepholes of shape [3 * hidden_size]. The pack ordering of the weight vectors is for the `input (i)`, `output (o)`, and `forget (f)` gate, respectively. + + : layout + :: + An {{MLLstmWeightLayout}}. The ordering of the weight and bias vectors for the internal gates of LSTM, specifically the `input (i)`, `output (o)`, `forget (f)`, and `cell (g)` gate, as indicated in the first dimension of the weight and bias tensor shapes. When not specified, the default layout is `"iofg"`. + + : activations + :: + A sequence of {{MLActivation}}. A sequence of three activation functions, the first one is used for the `input (i)`, `forget (f)`, and `output (o)` gate, the second one is used for the `cell (g)` gate, and the last used for filtering the output cell state before combining it with the result of the output gate to form the output hidden state. When not specified, they are assumed to be of the sigmoid function (`"sigmoid"`) followed by two hyperbolic tangent functions (`"tanh"`) respectively. +
+ +
**Arguments:** - *input*: an {{MLOperand}}. The input 2-D tensor of shape [batch_size, input_size]. - *weight*: an {{MLOperand}}. The 2-D input weight tensor of shape [4 * hidden_size, input_size]. The ordering of the weight vectors in the first dimension of the tensor shape is specified according to the *options.layout* argument. @@ -2840,17 +2968,60 @@ partial interface MLGraphBuilder { - *cellState*: an {{MLOperand}}. The 2-D input cell state tensor of shape [batch_size, hidden_size]. - *hiddenSize*: an {{unsigned long}} scalar. The value of the second dimension of the output tensor shape. It indicates the number of features in the hidden state. - *options*: an optional {{MLLstmCellOptions}}. The optional parameters of the operation. - - *bias*: an {{MLOperand}}. The 1-D input bias tensor of shape [4 * hidden_size]. The ordering of the bias vectors in the first dimension of the tensor shape is specified according to the *options.layout* argument. - - *recurrentBias*: an {{MLOperand}}. The 1-D recurrent bias tensor of shape [4 * hidden_size]. The ordering of the bias vectors in the first dimension of the tensor shape is specified according to the *options.layout* argument. - - *peepholeWeight*: an {{MLOperand}}. The 1-D weight tensor for peepholes of shape [3 * hidden_size]. The pack ordering of the weight vectors is for the *input (i)*, *output (o)*, and *forget (f)* gate respectively. - - *layout*: an {{MLLstmWeightLayout}}. The ordering of the weight and bias vectors for the internal gates of LSTM, specifically the *input (i)*, *output (o)*, *forget (f)*, and *cell (g)* gate, as indicated in the first dimension of the weight and bias tensor shapes. When not specified, the default layout is *"iofg"*. - - *activations*: a sequence of {{MLActivation}}. A sequence of three activation functions, the first one is used for the *input (i)*, *forget (f)*, and *output (o)* gate, the second one is used for the *cell (g)* gate, and the last used for filtering the output cell state before combining it with the result of the output gate to form the output hidden state. When not specified, they are assumed to be of the sigmoid function (*"sigmoid"*) followed by two hyperbolic tangent functions (*"tanh"*) respectively. **Returns:** a sequence of {{MLOperand}}. The first element of the sequence is the output hidden state of the current time step of the recurrent network. The following element is the output cell state. Both elements are 2-D tensors of shape [batch_size, hidden_size]. +
-
+
+ + The {{MLGraphBuilder/lstmCell(input, weight, recurrentWeight, hiddenState, cellState, hiddenSize, options)}} steps are: + +
+ 1. If |input|, |weight|, |recurrentWeight|, |hiddenState| or |cellState| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If the rank of |input|, |weight|, |recurrentWeight|, |hiddenState| or |cellState| is not `2`, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. Let |batch_size| be |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0]. + 1. If |options| is `undefined`, let |options| be an empty [=object=]. + 1. If |options|.{{MLLstmCellOptions/bias}} [=map/exists=]. + 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If its rank is not `1`, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmCellOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not 4 * |hiddenSize|, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmCellOptions/recurrentBias}} [=map/exists=]. + 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If its rank is not `1`, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmCellOptions/recurrentBias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not 4 * |hiddenSize|, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmCellOptions/peepholeWeight}} [=map/exists=]. + 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If its rank is not `1`, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmCellOptions/peepholeWeight}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not 3 * |hiddenSize|, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmCellOptions/layout}} is `undefined`, set it to `"iofg"`. + 1. If |options|.{{MLLstmCellOptions/layout}} is not one of {{MLLstmWeightLayout}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmCellOptions/activations}} [=map/exists=]: + 1. If it is not an array of size `3`, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If any of its elements is not an instance of {{MLActivation}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. Let |desc| a new {{MLOperandDescriptor}}. + 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |batch_size|, |hiddenSize| ]. + 1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output0| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Let |output1| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Let |output| be the array [ |output0|, |output1| ]. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the LSTM cell operation, given |weight|, |recurrentWeight|, |hiddenState|, |cellState|, |hiddenSize| and |options|. + 1. Store a reference of |opImpl| in |output0|.{{MLOperand/[[operator]]}} and |output1|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. 
Store a reference to |outputImpl| in |output0|.{{MLOperand/[[operand]]}} and |output1|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output| as output to |opImpl|. + 1. Return |output|. +
+
+ +
+
+ The behavior of this operation can be generically emulated via other operations as shown below, when the weight layout is the default *"iofg"* layout, and the activation functions of the input/forget/output gate and the cell gate/the cell state's filter for the output hidden state are of the operator types *sigmoid* and *tanh* respectively. -
+  
+
     const zero = builder.constant(0);
 
     // input gate (i)
@@ -2958,8 +3129,8 @@ partial interface MLGraphBuilder {
     let ht = builder.mul(o, builder.tanh(ct));
 
     return [ht, ct];
-    
-
+ +
### The matmul() method ### {#api-mlgraphbuilder-matmul} From 35a8de7c7d5c3b0c510f837ee5ad71c0d299aeab Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Mon, 26 Jun 2023 17:09:54 +0300 Subject: [PATCH 039/112] Add the matmul() algorithm Signed-off-by: Zoltan Kis --- index.bs | 74 ++++++++++++++++++++++++++++++++++++++++---------------- 1 file changed, 53 insertions(+), 21 deletions(-) diff --git a/index.bs b/index.bs index 96cd6729..6d63b94b 100644 --- a/index.bs +++ b/index.bs @@ -2969,31 +2969,63 @@ partial interface MLGraphBuilder { MLOperand matmul(MLOperand a, MLOperand b); }; -
+ +
**Arguments:** - - *a*: an {{MLOperand}}. The first input N-D tensor. - - *b*: an {{MLOperand}}. The second input N-D tensor. + - *a*: an {{MLOperand}}. The first N-dimensional input tensor. + - *b*: an {{MLOperand}}. The second N-dimensional input tensor. - **Returns:** an {{MLOperand}}. The output N-D tensor that contains the matrix + **Returns:** an {{MLOperand}}. The output tensor that contains the matrix product of two input tensors. - - Compute the matrix product of two input tensors. It behaves as following: - - If both *a* and *b* are 2-D, they are multiplied like conventional - matrices and produce a 2-D tensor as the output. - - If either *a* or *b* is N-D, N > 2, it is treated as a stack of - matrices with dimensions corresponding to the last two indices. The - matrix multiplication will be broadcasted accordingly by following - [[!numpy-broadcasting-rule]]. The output is a N-D tensor whose rank - is the maximum rank of the input tensors. For each dimension, except - the last two, of the output tensor, its size is the maximum size - along that dimension of the input tensors. - - If *a* is 1-D, it is converted to a 2-D tensor by prepending a 1 to - its dimensions. - - If *b* is 1-D, it is converted to a 2-D tensor by by appending a 1 to - its dimensions. - - If both *a* and *b* are 1-D, the operation is a vector dot-product, - which produces a scalar output.
+
+ Computes the matrix product of two input tensors as follows (see the usage example below): + - If both *a* and *b* are 2-dimensional, they are multiplied like conventional matrices and produce a 2-dimensional tensor as the output. + - If either *a* or *b* is `N`-dimensional where `N > 2`, it is treated as a stack of matrices with dimensions corresponding to the last two indices. The matrix multiplication will be broadcasted accordingly by following the [[!numpy-broadcasting-rule]]. The output is an `N`-dimensional tensor whose rank is the maximum rank of the input tensors. For each dimension, except the last two, of the output tensor, its size is the maximum size along that dimension of the input tensors. + - If *a* is 1-dimensional, it is converted to a 2-dimensional tensor by prepending a 1 to its dimensions. + - If *b* is 1-dimensional, it is converted to a 2-dimensional tensor by appending a 1 to its dimensions. + - If both *a* and *b* are 1-dimensional, the operation is a vector dot-product, which produces a scalar output. +
+ +
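The following non-normative sketch illustrates the shape rules described above. It assumes an {{MLGraphBuilder}} instance `builder` created elsewhere; the operand names and dimensions are illustrative only.

    // Conventional 2-D matrix multiply; the output shape is [3, 5].
    const a = builder.input('a', { type: 'float32', dimensions: [3, 4] });
    const b = builder.input('b', { type: 'float32', dimensions: [4, 5] });
    const c = builder.matmul(a, b);

    // b is broadcast across the leading dimension of batchedA;
    // the output shape is [2, 3, 5].
    const batchedA = builder.input('batchedA', { type: 'float32', dimensions: [2, 3, 4] });
    const batchedC = builder.matmul(batchedA, b);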
+ + To calculate matmul output sizes, given |a| and |b| run the following steps: + +
+ 1. Let |shapeA| be |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeA| the size of |shapeA|. + 1. Let |shapeB| be |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeB| the size of |shapeB|. + 1. If both |sizeA| and |sizeB| are `1`, return `[ 1 ]`. + 1. If |sizeA| is `1` and |sizeB| is not, then insert `1` in the front of |shapeA| to become [ 1, |shapeA| ] and let |sizeA| be `2`. + 1. If |sizeB| is `1` and |sizeA| is not, then append `1` to |shapeB| to become [ |shapeB|, 1 ] and let |sizeB| be `2`. + 1. Let |shape| be an array whose size |size| is the maximum of |sizeA| and |sizeB|. + 1. For each |index| between 0 and |size|: + 1. Set |shape|[|index|] to the maximum of |shapeA|[|index|] and |shapeB|[|index|]. + 1. Return |shape|. +
+
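As a non-normative illustration, the steps above can be sketched in JavaScript as follows. The helper name `matmulOutputShape` is illustrative only, and the sketch additionally fills in the last two output dimensions from the rows of *a* and the columns of *b*, which the matrix product described earlier implies.

    // Non-normative sketch of the output shape calculation.
    function matmulOutputShape(shapeA, shapeB) {
      if (shapeA.length === 1 && shapeB.length === 1) return [1];
      if (shapeA.length === 1) shapeA = [1, ...shapeA];
      if (shapeB.length === 1) shapeB = [...shapeB, 1];
      const rank = Math.max(shapeA.length, shapeB.length);
      const a = Array(rank - shapeA.length).fill(1).concat(shapeA);
      const b = Array(rank - shapeB.length).fill(1).concat(shapeB);
      const shape = [];
      // Broadcast all dimensions except the last two.
      for (let i = 0; i < rank - 2; ++i) shape.push(Math.max(a[i], b[i]));
      // The last two dimensions come from the matrix product itself.
      shape.push(a[rank - 2], b[rank - 1]);
      return shape;
    }

    matmulOutputShape([2, 3, 4], [4, 5]);  // [2, 3, 5]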
+ +
+ + The {{MLGraphBuilder/matmul(a, b)}} steps are: + +
+ 1. If |a| or |b| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. Let |desc| a new {{MLOperandDescriptor}}. + 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to the result of invoking the calculate matmul output sizes given |a| and |b|. + 1. Set |desc|.{{MLOperandDescriptor/type}} to |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the matrix multiplication operation. + 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |a|.{{MLOperand/[[operand]]}} and |b|.{{MLOperand/[[operand]]}} as inputs to |opImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. + 1. Return |output|. +
+
### The pad() method ### {#api-mlgraphbuilder-pad} Inflate the tensor with constant or mirrored values on the edges. From 1b0e8f45bcdcb6f2a5fefa0737221cbc56811ea8 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Mon, 26 Jun 2023 17:19:54 +0300 Subject: [PATCH 040/112] Add the pad() algorithm Signed-off-by: Zoltan Kis --- index.bs | 72 ++++++++++++++++++++++++++++++++++++++++++++++++++------ 1 file changed, 65 insertions(+), 7 deletions(-) diff --git a/index.bs b/index.bs index 96cd6729..e0ff7b79 100644 --- a/index.bs +++ b/index.bs @@ -3017,21 +3017,79 @@ partial interface MLGraphBuilder { optional MLPadOptions options = {}); }; -
+ +{{MLPadOptions}} has the following members: +
+ : mode + :: + An {{MLPaddingMode}} [=string=]. + Specifies the different ways to pad the tensor. + The default value is `"constant"`. + + : value + :: + A {{float}}. + Specifies the padding value when {{MLPadOptions/mode}} is set to `"constant"`. + The default value is `0`. +
+ +
**Arguments:** - *input*: an {{MLOperand}}. The input tensor. - *beginningPadding*: a sequence of {{unsigned long}}. The sequence of unsigned integer values indicating the number of padding values to add at the beginning of each input dimension, of length *N* where *N* is the rank of the input tensor. For each dimension *d* of *input*, *beginningPadding[d]* indicates how many values to add before the content in that dimension. - *endingPadding*: a sequence of {{unsigned long}}. The sequence of unsigned integer values indicating the number of padding values to add at the ending of each input dimension, of length *N* where *N* is the rank of the input tensor. For each dimension *d* of *input*, *endingPadding[d]* indicates how many values to add after the content in that dimension. - *options*: an optional {{MLPadOptions}}. The optional parameters of the operation. - - *mode*: an {{MLPaddingMode}}. The different ways to pad the tensor. When not set, it's assumed to be "constant". - - *value*: a {{float}}. The pad value when the *options.mode* is set to *"constant"*. When not set, it's assumed to be 0. **Returns:** an {{MLOperand}}. The padded output tensor. Each dimension of the output tensor can be calculated as follow: *output size = beginning padding + input size + ending padding* +
-
-
+
+ + To calculate padding output sizes, given |input|, |beginningPadding| and |endingPadding|, run the following steps: + +
+ 1. Let |shape| be a copy of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. + 1. For |index| between `0` and the rank of |shape|: + 1. Add to |shape|[|index|] the value of |beginningPadding|[|index|]. + 1. Add to |shape|[|index|] the value of |endingPadding|[|index|]. + 1. Return |shape|. +
+
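A non-normative sketch of the calculation above: each output dimension is the input dimension plus the padding added on both sides. The helper name `padOutputShape` is illustrative only.

    // Non-normative sketch of the padded output shape calculation.
    function padOutputShape(inputShape, beginningPadding, endingPadding) {
      return inputShape.map(
          (size, d) => beginningPadding[d] + size + endingPadding[d]);
    }

    padOutputShape([2, 3], [1, 2], [1, 2]);  // [4, 7]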
+ +
+ + The {{MLGraphBuilder/pad(input, beginningPadding, endingPadding, options)}} steps are: + +
+ 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |beginningPadding| or |endingPadding| is not a sequence of {{unsigned long}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options| is `undefined`, let |options| be an empty [=object=]. + 1. If |options|.{{MLPadOptions/mode}} is `undefined`, set it to `"constant"`. + 1. Otherwise, if |options|.{{MLPadOptions/mode}} is not one of {{MLPaddingMode}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLPadOptions/value}} is `undefined`, set it to `0`. + 1. Let |desc| be a copy of |input|.{{MLOperand/[[descriptor]]}}. + 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to the result of invoking the calculate padding output sizes given |input|, |beginningPadding| and |endingPadding|. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the padding operation, given |beginningPadding|, |endingPadding| and |options|. + 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. + 1. Return |output|. +
+
+ +
+
+ + Examples for constant, edge, reflection and symmetric padding: + +
     // input: [[1,2,3], [4,5,6]]
     const input = builder.constant(
       { type: 'float32', dimensions: [2,3] }, new Float32Array([1,2,3,4,5,6]));
@@ -3066,8 +3124,8 @@ partial interface MLGraphBuilder {
     //     [5,4,4,5,6,6,5],
     //     [5,4,4,5,6,6,5]]
     builder.pad(input, beginningPadding, endingPadding, { mode: "symmetric" });
-    
-
+
+
### Pooling operations ### {#api-mlgraphbuilder-pool2d} From e2aa350ebc32f2fbd5e3515b6edd49480873bc8e Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Mon, 26 Jun 2023 22:03:56 +0300 Subject: [PATCH 041/112] Add the prelu() algorithm Signed-off-by: Zoltan Kis --- index.bs | 39 +++++++++++++++++++++++++++++++-------- 1 file changed, 31 insertions(+), 8 deletions(-) diff --git a/index.bs b/index.bs index 96cd6729..1f447fce 100644 --- a/index.bs +++ b/index.bs @@ -3144,21 +3144,46 @@ partial interface MLGraphBuilder {
### The prelu() method ### {#api-mlgraphbuilder-prelu} -Calculate the parametric version of rectified linear function (Parametric Relu) on the input tensor element-wise. Parametric Relu is a type of leaky ReLU that, instead of having a scalar slope like 0.01, making the slope (coefficient of leakage) into a parameter that is learned during the model training phase of this operation. The calculation follows the expression `max(0, x) + slope ∗ min(0, x)`. +Calculate the parametric version of rectified linear function (Parametric ReLU) on the input tensor element-wise. Parametric ReLU is a type of leaky ReLU that, instead of having a scalar slope like 0.01, makes the slope (coefficient of leakage) into a parameter that is learned during the model training phase of this operation. The calculation follows the expression `max(0, x) + slope ∗ min(0, x)`. -
+ +
**Arguments:** - - *x*: an {{MLOperand}}. The input tensor. - - *slope*: an {{MLOperand}}. The slope tensor. Its shape is either the same as, or unidirectionally broadcastable to the shape of input tensor *x* according to [[!numpy-broadcasting-rule]]. + - *input*: an {{MLOperand}}. The input tensor. + - *slope*: an {{MLOperand}}. The slope tensor. Its shape is either the same as, or unidirectionally broadcastable to the shape of input tensor *input* according to [[!numpy-broadcasting-rule]]. **Returns:** - an {{MLOperand}}. The output tensor of the same shape as *x*. +
-
+
+ + The {{MLGraphBuilder/prelu(input, slope)}} steps are: + +
+ 1. If |input| or |slope| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. Let |descriptor| be a new {{MLOperandDescriptor}}. + 1. Set |descriptor|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. + 1. Set |descriptor|.{{MLOperandDescriptor/dimensions}} to the result of running the [=MLGraphBuilder/broadcast-shapes=] steps given |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |slope|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. + 1. If that throws an error, re-throw the error and stop. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |descriptor|. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the PreLU operation, given |slope|. + 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. + 1. Return |output|. +
+
+ +
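A brief, non-normative usage sketch follows. It assumes an {{MLGraphBuilder}} instance `builder`; the shapes are illustrative and rely on the unidirectional broadcasting of *slope* described above.

    // A per-channel slope broadcast against an 'nchw' input.
    const input = builder.input('input', { type: 'float32', dimensions: [1, 3, 224, 224] });
    const slope = builder.constant(
        { type: 'float32', dimensions: [3, 1, 1] }, new Float32Array([0.1, 0.15, 0.2]));
    // The slope shape [3, 1, 1] broadcasts to [1, 3, 224, 224].
    const output = builder.prelu(input, slope);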
The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the @@ -3167,10 +3192,8 @@ partial interface MLGraphBuilder { return builder.add(builder.max(builder.constant(0), x), builder.mul(slope, builder.min(builder.constant(0), x))); -
- ### Reduction operations ### {#api-mlgraphbuilder-reduce} Reduce the input along the dimensions given in *axes*. -
+ +
**Arguments:** - *input*: an {{MLOperand}}. The input tensor. - *options*: an optional {{MLReduceOptions}}. The optional parameters of the operation. @@ -3202,7 +3203,9 @@ partial interface MLGraphBuilder { The default value is false. **Returns:** an {{MLOperand}}. The reduced output tensor. +
+
**Reduction types:** - *L1*: Compute the L1 norm of all the input values along the axes. - *L2*: Compute the L2 norm of all the input values along the axes. @@ -3216,6 +3219,82 @@ partial interface MLGraphBuilder { - *SumSquare*: Compute the sum of the square of all the input values along the axes.
+
+ + To create reduce operation given |op|, |input| and |options|, run the following steps: + +
+ 1. [=Assert=]: |op| is one of "reduceL1", "reduceL2", "reduceLogSum", "reduceLogSumExp", "reduceMax", "reduceMean", "reduceMin", "reduceProduct", "reduceSum", "reduceSumSquare". + 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options| is `undefined`, let |options| be a new {{MLReduceOptions}} object with |options|.{{MLReduceOptions/keepDimensions}} set to `false` and |options|.{{MLReduceOptions/axes}} set to `null`. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the |op| reduce operation, given |options|. + 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. + 1. Return |output|. +
+
+ +
+ + The following reduce algorithms are supported. + + The {{MLGraphBuilder/reduceL1(input, options)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation =] given "reduceL1", |input| and |options|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/reduceL2(input, options)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation =] given "reduceL2", |input| and |options|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/reduceLogSum(input, options)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation =] given "reduceLogSum", |input| and |options|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/reduceLogSumExp(input, options)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation =] given "reduceLogSumExp", |input| and |options|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/reduceMax(input, options)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation =] given "reduceMax", |input| and |options|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/reduceMean(input, options)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation =] given "reduceMean", |input| and |options|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/reduceMin(input, options)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation =] given "reduceMin", |input| and |options|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/reduceProduct(input, options)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation =] given "reduceProduct", |input| and |options|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/reduceSum(input, options)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation =] given "reduceSum", |input| and |options|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. + + The {{MLGraphBuilder/reduceSumSquare(input, options)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation =] given "reduceSumSquare", |input| and |options|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. +
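A short, non-normative example of invoking these reduction methods, assuming an {{MLGraphBuilder}} instance `builder`; the tensor contents and axes are illustrative only.

    // input: [[1,2,3], [4,5,6]]
    const input = builder.constant(
      { type: 'float32', dimensions: [2, 3] }, new Float32Array([1, 2, 3, 4, 5, 6]));

    // Sum along axis 1, keeping the reduced dimension:
    // output shape [2, 1], values [[6], [15]].
    const rowSums = builder.reduceSum(input, { axes: [1], keepDimensions: true });

    // Mean with default options reduces all dimensions: value 3.5.
    const mean = builder.reduceMean(input);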
+ ### The relu() method ### {#api-mlgraphbuilder-relu} Compute the rectified linear function of the input tensor. -
- **Arguments:** - - *x*: an {{MLOperand}}. The input tensor. - **Returns:** - - an {{MLOperand}}. The output tensor of the same shape as *x*. - - an {{MLActivation}}. The activation function representing the relu operation. - -
+
The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the @@ -3240,9 +3234,56 @@ partial interface MLGraphBuilder {
     return builder.max(builder.constant(0), x);
     
-
+#### The {{MLGraphBuilder/relu(input)}} method #### {#api-mlgraphbuilder-relu-input} +
+ **Arguments:** - *input*: an {{MLOperand}}. The input tensor. + + **Returns:** - an {{MLOperand}}. The output tensor of the same shape as *input*. +
+ +
+ + The {{MLGraphBuilder/relu(input)}} steps are: + +
+ 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the ReLU operation. + 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. + 1. Return |output|. +
+
+ +#### The {{MLGraphBuilder/relu()}} method #### {#api-mlgraphbuilder-relu} +
+ **Arguments:** + - None. + + **Returns:** + - an {{MLActivation}}. The activation function representing the relu operation. +
+ +
+ + The {{MLGraphBuilder/relu()}} method steps are: + +
+ 1. Let |op| be the result of invoking the create MLActivation steps with `"relu"`. + 1. If that throws an error, re-throw the error and abort these steps. + 1. Return |op|. +
+
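The two variants above serve different purposes, as the following non-normative sketch shows; it assumes an {{MLGraphBuilder}} instance `builder`, and the input shape is illustrative only.

    // The one-argument variant adds a ReLU operator to the graph immediately.
    const input = builder.input('input', { type: 'float32', dimensions: [2, 5] });
    const activated = builder.relu(input);

    // The zero-argument variant returns an MLActivation that can be passed to
    // operations accepting fused activation functions, for example through the
    // activations member of MLLstmCellOptions.
    const reluActivation = builder.relu();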
+ ### The resample2d() method ### {#api-mlgraphbuilder-resample2d} Resample the tensor values from the source to the destination spatial dimensions according to the scaling factors. -
+
**Arguments:** - *input*: an {{MLOperand}}. The input 4-D tensor. - *options*: an optional {{MLResample2dOptions}}. The optional parameters of the operation. - - *mode*: an {{MLInterpolationMode}}. The interpolation algorithm used to fill the output tensor values. - If not set, it is assumed to be the *Nearest Neighbor* interpolation. - - *scales*: a sequence of {{float}} of length 2. Each value represents the scaling factor used to scale in each spatial dimensions of input, [scale_height, scale_width]. If not set, the values are assumed to be [1.0, 1.0]. - - *sizes*: a sequence of {{unsigned long}} of length 2. The target sizes for each spatial dimensions of input, [size_height, size_width]. When the target sizes are specified, the *options.scales* argument is ignored as the scaling factor values are derived from the target sizes of each spatial dimension of input. - - *axes*: a sequence of {{unsigned long}} of length 2. The two consecutive dimensions of the input tensor to which the interpolation algorithm applies. The valid values in the sequence are [0, 1], [1, 2] or [2, 3]. When not specified, the sequence is assumed to be [2, 3]. **Returns:** an {{MLOperand}}. The output 4-D tensor.
-### The reshape() method ### {#api-mlgraphbuilder-reshape} +{{MLResample2dOptions}} has the following members: +
+ : mode + :: + An {{MLInterpolationMode}} [=string=]. + Specifies the interpolation algorithm used to fill the output tensor values. + The default value is `"nearest-neighbor"`, standing for *Nearest Neighbor* interpolation. + + : scales + :: + A sequence of {{float}} of length 2. + Specifies the scaling factor in each spatial dimensions of the input: [scale_height, scale_width]. + The default value is [1.0, 1.0]. + + : sizes + :: + A sequence of {{unsigned long}} of length 2. + Specifies the target sizes for each spatial dimensions of the input: [size_height, size_width]. When the target sizes are specified, the {{MLResample2dOptions/scales}} argument is ignored, since the scaling factor values are derived from the target sizes of each spatial dimension of the input. + + : axes + :: + A sequence of {{unsigned long}} of length 2. + Specifies the two consecutive dimensions of the input tensor to which the interpolation algorithm applies. The valid values in the sequence are [0, 1], [1, 2] or [2, 3]. + The default value is [2, 3]. +
+ +
+ + To check resample options given |options|, run the following steps: + +
+ 1. If |options| is `undefined`, let |options| be a new {{MLResample2dOptions}} object. + 1. If |options|.{{MLResample2dOptions/mode}} is `undefined`, set it to `"nearest-neighbor"`. + 1. Otherwise, if its value is not one of `"nearest-neighbor"` or `"linear"`, return `null`. + 1. If |options|.{{MLResample2dOptions/scales}} is `undefined`, set it to `[1.0, 1.0]`. + 1. Otherwise, if its size is not `2`, or if any of its values is not greater than `0`, return `null`. + 1. If |options|.{{MLResample2dOptions/sizes}} [=map/exists=]: if its size is not `2`, or if any of its values is not greater than `0`, return `null`. + 1. If |options|.{{MLResample2dOptions/axes}} is `undefined`, set it to `[2, 3]`. + 1. Otherwise, if its value is not one of `[0, 1]`, `[1, 2]` or `[2, 3]`, return `null`. + 1. Return |options|. +
+
+ +
+ + To resample output sizes given |input| and |options|, run the following steps: + +
+ 1. Let |desc| be an {{MLOperandDescriptor}} initialized to |input|.{{MLOperand/[[descriptor]]}}. + 1. If |options|.{{MLResample2dOptions/sizes}} [=map/exists=], then for each |index| in |options|.{{MLResample2dOptions/axes}}, set |desc|.{{MLOperandDescriptor/dimensions}}[|index|] to the corresponding value of |options|.{{MLResample2dOptions/sizes}}, and return |desc|. + 1. For |index| between `0` and the rank of |desc|.{{MLOperandDescriptor/dimensions}}: + 1. If |index| is not contained in |options|.{{MLResample2dOptions/axes}}, continue with the next |index|. + 1. Let |inputSize| be |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|index|]. + 1. Let |outputSize| be |inputSize| multiplied by the value of |options|.{{MLResample2dOptions/scales}} that corresponds to |index|. + 1. If that fails or |outputSize| is not a positive [=number=], then throw a "{{DataError}}" {{DOMException}} and stop. + 1. Set |desc|.{{MLOperandDescriptor/dimensions}}[|index|] to |outputSize|. + 1. Return |desc|. +
+
+ +
+ + The {{MLGraphBuilder/resample2d(input, options)}} steps are: + +
+ 1. Check if the input is a 4-dimensional tensor: if the size of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not `4`, throw a "{{DataError}}" {{DOMException}} and stop. + 1. Let |options| be the result of running the check resample options steps given |options|. + 1. If that returns `null`, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. Let |desc| be the result of running the resample output sizes steps given |options|. + 1. If that throws an error, re-throw the error and stop. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the resample 2D operation, given |options|. + 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. + 1. Return |output|. +
+
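A non-normative sketch of the two ways to specify the target spatial dimensions, assuming an {{MLGraphBuilder}} instance `builder`; the input shape is illustrative only.

    const input = builder.input('input', { type: 'float32', dimensions: [1, 1, 2, 4] });

    // Upsample by explicit scaling factors on the default axes [2, 3]:
    // the output shape is [1, 1, 4, 8].
    const byScales = builder.resample2d(input, { scales: [2.0, 2.0] });

    // Upsample to explicit target sizes; scales, if also given, are ignored:
    // the output shape is also [1, 1, 4, 8].
    const bySizes = builder.resample2d(input, { sizes: [4, 8], mode: 'linear' });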
+ +### The reshape() method ### {#api-mlgraphbuilder-reshape-method} Alter the shape of a tensor to a new shape. Reshape does not copy or change the content of the tensor. It just changes the tensor's logical dimensions for the subsequent operations. -
+ +
**Arguments:** - *input*: an {{MLOperand}}. The input 4-D tensor. The logical shape is interpreted according to the value of *options.layout*. - *options*: an optional {{MLPool2dOptions}}. The optional parameters of the operation. - - *windowDimensions*: a sequence of {{unsigned long}} of length 2. The dimensions of the sliding window, - [window_height, window_width]. If not present, the window dimensions are assumed to be the height - and width dimensions of the input shape. - - *padding*: a sequence of {{unsigned long}} of length 4. The additional rows and columns added to the beginning and ending of each spatial dimension of *input*, [beginning_height, ending_height, beginning_width, ending_width]. If not present, the values are assumed to be [0,0,0,0]. - - *strides*: a sequence of {{unsigned long}} of length 2. The stride of the - sliding window for each spatial dimension of *input*, - [stride_height, stride_width]. If not present, the values are assumed to be [1,1]. - - *dilations*: a sequence of {{unsigned long}} of length 2. The dilation factor - for each spatial dimension of *input*, [dilation_height, dilation_width]. - If not present, the values are assumed to be [1,1]. - - *autoPad*: an {{MLAutoPad}}. The automatic input padding options. By default, this argument is set to *"explicit"*, which means that the values in the *options.padding* array should be used for input padding. When the option is set other than *"explicit"*, the values in the *options.padding* array are ignored. With the *"same-upper"* option, the padding values are automatically computed such that the additional ending padding of the spatial input dimensions would allow all of the input values in the corresponding dimension to be filtered. The *"same-lower"* option is similar but padding is applied to the beginning padding of the spatial input dimensions instead of the ending one. - - *layout*: an {{MLInputOperandLayout}}. The default value is *"nchw"*. This option specifies the - layout format of the input and output tensor as follow: - - "nchw": - - input tensor: [batches, channels, height, width] - - output tensor: [batches, channels, height, width] - - "nhwc": - - input tensor: [batches, height, width, channels] - - output tensor: [batches, height, width, channels] - - *roundingType*: an {{MLRoundingType}}. The option specifies the rounding function used to compute the output shape. - - *outputSizes*: a sequence of {{unsigned long}} of length 2. The sizes of the two spacial dimensions of the output tensor. When the output sizes are explicitly specified, the options.roundingType is ignored. If not specified, the output sizes are automatically computed. **Returns:** an {{MLOperand}}. The output 4-D tensor that contains the result of the reduction. The logical shape is interpreted according to the @@ -3133,16 +3111,134 @@ partial interface MLGraphBuilder { or if *options.roundingType* is *"ceil"*: *output size = ceil(1 + (input size - filter size + beginning padding + ending padding) / stride)* +
-
+
A *global* pooling operation such as one for the max pooling operation is a variant of pooling where the window dimensions is the spatial dimensions (last two dimensions) of the input shape, as follow.
     // 'global' max pooling
     builder.maxPool2d(input);
     
-
+{{MLPool2dOptions}} has the following members: +
+ : windowDimensions + :: + A sequence of {{unsigned long}} of length 2: [window_height, window_width]. + Specifies the dimensions of the sliding window. + The default value for the window dimensions are the height and width dimensions of the input shape. + + : padding + :: + A sequence of {{unsigned long}} of length 4: [beginning_height, ending_height, beginning_width, ending_width]. + Specifies the additional rows and columns added to the beginning and ending of each spatial dimension of the convolution input. + The default value is [0,0,0,0]. + + : strides + :: + A sequence of {{unsigned long}} of length 2: [stride_height, stride_width]. + Specifies the stride of the sliding window for each spatial dimension of the convolution input. + The default value is [1,1]. + + : dilations + :: + A sequence of {{unsigned long}} of length 2: [dilation_height, dilation_width]. Specifies the dilation factor for each spatial dimension applied on the convolution filter (kernel). + The default value is [1,1]. + + : autoPad + :: + An {{MLAutoPad}} [=string=]]. + Specifies the automatic input padding options. + The default value is *"explicit"*, which means that the values in the {{MLPool2dOptions/padding}} array should be used for input padding. + When the option is set other than *"explicit"*, the values in the {{MLPool2dOptions/padding}} array are ignored. + + With the *"same-upper"* option, the padding values are automatically computed such that the additional ending padding of the spatial input dimensions would allow all of the input values in the corresponding dimension to be filtered. + + The *"same-lower"* option is similar but padding is applied to the beginning padding of the spatial input dimensions instead of the ending one. + + : layout + :: + An {{MLInputOperandLayout}} [=string=]. + Specifies the layout format of the input and output tensor as follows: + - **"nchw"** + - input tensor: *[batches, input_channels, height, width]* + - output tensor: *[batches, output_channels, height, width]* + - **"nhwc"**: + - input tensor: *[batches, height, width, input_channels]* + - output tensor: *[batches, height, width, output_channels]* + The default value is *"nchw"*. + + : roundingType + :: + An {{MLRoundingType}} [=string=]. + Specifies the rounding function used to compute the output shape. + + : outputSizes + :: + A sequence of {{unsigned long}} of length 2. + Specifies the sizes of the two spacial dimensions of the output tensor. When the output sizes are explicitly specified, the {{MLPool2dOptions/roundingType}} is ignored. + + If not specified, the output sizes are automatically computed. + +
+ +
+ + To create pooling operation given |op|, |input| and |options|, run the following steps: + +
+ 1. [=Assert=]: |op| is one of "averagePool2d", "l2Pool2d", "maxPool2d". + 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options| is `undefined`, let |options| be a new {{MLPool2dOptions}} object. + 1. If |options|.{{MLPool2dOptions/outputSizes}} [=map/exists=], or if |options|.{{MLPool2dOptions/padding}} is `undefined`, set |options|.{{MLPool2dOptions/padding}} to `[0, 0, 0, 0]`. + 1. If |options|.{{MLPool2dOptions/strides}} is `undefined`, set |options|.{{MLPool2dOptions/strides}} to `[1, 1]`. + 1. If |options|.{{MLPool2dOptions/dilations}} is `undefined`, set |options|.{{MLPool2dOptions/dilations}} to `[1, 1]`. + 1. If |options|.{{MLPool2dOptions/autoPad}} is `undefined`, set |options|.{{MLPool2dOptions/autoPad}} to `"explicit`. + 1. If |options|.{{MLPool2dOptions/autoPad}} is not `"explicit"`, set |options|.{{MLPool2dOptions/padding}} to `[0, 0, 0, 0]`. + 1. If |options|.{{MLPool2dOptions/layout}} is `undefined`, set |options|.{{MLPool2dOptions/layout}} to `"nchw"`. + 1. If |options|.{{MLPool2dOptions/roundingType}} is `undefined`, set |options|.{{MLPool2dOptions/roundingType}} to `"floor"`. + 1. Let |desc| be a copy of |input|.{{MLOperand/[[descriptor]]}}. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Make a request to the underlying platform to: + 1. Calculate the output dimensions given |input| and |options|. Let |desc|.{{MLOperandDescriptor/dimensions}} be the result of that. + 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the |op| pooling operation, given |options|. + 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. + 1. Return |output|. +
+
+ +
+ + The following pooling algorithms are supported. + +
+ The {{MLGraphBuilder/averagePool2d(input, options)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/pooling-op | create pooling operation =] given `"averagePool2d"`, |input| and |options|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. +
+ +
+ The {{MLGraphBuilder/l2Pool2d(input, options)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/pooling-op | create pooling operation =] given `"l2Pool2d"`, |input| and |options|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. +
+ +
+ The {{MLGraphBuilder/maxPool2d(input, options)}} steps are: + 1. Let |output| be the result of running the [=MLGraphBuilder/pooling-op | create pooling operation =] given `"maxPool2d"`, |input| and |options|. + 1. If that throws an error, then re-throw the error and stop. + 1. Return |output|. +
+
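A non-normative sketch of an average pooling call corresponding to the steps above, assuming an {{MLGraphBuilder}} instance `builder`; the input shape and option values are illustrative only.

    // 'nchw' input of shape [1, 3, 224, 224].
    const input = builder.input('input', { type: 'float32', dimensions: [1, 3, 224, 224] });

    // 2x2 window moved by 2 in each spatial dimension:
    // the output shape is [1, 3, 112, 112].
    const pooled = builder.averagePool2d(input, {
      windowDimensions: [2, 2],
      strides: [2, 2]
    });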
+ ### The prelu() method ### {#api-mlgraphbuilder-prelu} Calculate the parametric version of rectified linear function (Parametric Relu) on the input tensor element-wise. Parametric Relu is a type of leaky ReLU that, instead of having a scalar slope like 0.01, making the slope (coefficient of leakage) into a parameter that is learned during the model training phase of this operation. The calculation follows the expression `max(0, x) + slope ∗ min(0, x)`. -
+
**Arguments:** - *input*: an {{MLOperand}}. The input tensor. - *newShape*: a sequence of {{nullable}} {{unsigned long}}. The shape of the output tensor. @@ -3297,6 +3297,37 @@ partial interface MLGraphBuilder { tensor is specified by the *newShape* argument.
+
+ + The {{MLGraphBuilder/reshape(input, newShape)}} steps are: + +
+ 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. Let |outputShape| be an empty array of {{unsigned long}}. + 1. If |newShape| is a scalar [=number=], set |outputShape| to `[ 1 ]`. + 1. Otherwise, if |newShape| is an array of {{unsigned long}}: + 1. If the size of |newShape| is `0`, set |outputShape| to `[ 1 ]` (reshaping to scalar). + 1. If |newShape| contains more than one `null` value, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If any value in |newShape| is `0`, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. Let |inputElementCount| be the product of all elements in |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. + 1. If |newShape| contains a `null` value, set that value to |inputElementCount| divided by the product of all other values in |newShape|. + 1. If that value is too large for {{unsigned long}}, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If the product of all values in |newShape| is not equal to |inputElementCount|, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. Set |outputShape| to |newShape|. + 1. Let |desc| be a copy of |input|.{{MLOperand/[[descriptor]]}}. + 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to |outputShape|. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the reshape operation. + 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. + 1. Return |output|. +
+
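A non-normative sketch of the inference of a single `null` dimension described above, assuming an {{MLGraphBuilder}} instance `builder`; the shapes are illustrative only.

    const input = builder.input('input', { type: 'float32', dimensions: [2, 3, 4] });

    // The null dimension is inferred as 24 / (4 * 2) = 3,
    // so the output shape is [4, 3, 2].
    const reshaped = builder.reshape(input, [4, null, 2]);

    // Flatten to a single dimension of 24 elements.
    const flattened = builder.reshape(input, [null]);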
+ ### The sigmoid() method ### {#api-mlgraphbuilder-sigmoid} Compute the sigmoid function of the input tensor. The calculation follows the expression `1 / (exp(-x) + 1)`. -
- **Arguments:** - - *x*: an {{MLOperand}}. The input tensor. - - **Returns:** - - an {{MLOperand}}. The output tensor of the same shape as *x*. - - an {{MLActivation}}. The activation function representing the sigmoid operation. -
+
The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the @@ -3325,9 +3318,56 @@ partial interface MLGraphBuilder { builder.exp(builder.neg(x)), builder.constant(1))); -
+#### The {{MLGraphBuilder/sigmoid(input)}} method #### {#api-mlgraphbuilder-sigmoid-input} +
+ **Arguments:** + - *input*: an {{MLOperand}}. The input tensor. + + **Returns:** + - an {{MLOperand}}. The output tensor of the same shape as *input*. +
+ +
+ + The {{MLGraphBuilder/sigmoid(input)}} steps are: + +
+ 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the sigmoid operation. + 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. + 1. Return |output|. +
+
+ +#### The {{MLGraphBuilder/sigmoid()}} method #### {#api-mlgraphbuilder-sigmoid} +
+ **Arguments:** + - None. + + **Returns:** + - an {{MLActivation}}. The activation function representing the sigmoid operation. +
+ +
+ + The {{MLGraphBuilder/sigmoid()}} method steps are: + +
+ 1. Let |op| be the result of invoking the create MLActivation steps with `"sigmoid"`. + 1. If that throws an error, re-throw the error and abort these steps. + 1. Return |op|. +
+
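A brief, non-normative sketch of both variants, assuming an {{MLGraphBuilder}} instance `builder`; the operand name and shape are illustrative only.

    const logits = builder.input('logits', { type: 'float32', dimensions: [3, 4] });

    // As an immediate operator: element-wise logistic function of the input;
    // the output shape is also [3, 4].
    const probabilities = builder.sigmoid(logits);

    // As an MLActivation object, for operations that accept a fused
    // activation function in their options.
    const sigmoidActivation = builder.sigmoid();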
+ ### The slice() method ### {#api-mlgraphbuilder-slice} Produce a slice of the input tensor. -
+
**Arguments:** - *input*: an {{MLOperand}}. The input tensor. - *starts*: a sequence of {{unsigned long}}. The sequence of unsigned integer values indicating the starting index to slice of each input dimension, of length N where N is the rank of the input tensor. For each dimension *d* of *input*, *starts[d]* indicates the starting index to slice in that dimension. The starting index must be in the range [0, input size - 1] in that dimension. @@ -3344,6 +3344,30 @@ partial interface MLGraphBuilder { **Returns:** an {{MLOperand}}. The output tensor of the same rank as the input tensor with tensor values stripped to the specified starting and ending indices in each dimension.
+
+ + The {{MLGraphBuilder/slice(input, starts, sizes)}} steps are: + +
+ 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |starts| or |sizes| is not a sequence of {{long}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |sizes|.size is 0, then throw a "{{TypeError}}" {{DOMException}} and stop. +
+ Further validation of |starts| and |sizes| given |input| is left [=implementation-defined=]. +
+ 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the slice operation, given |starts| and |sizes|. + 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. + 1. Return |output|. +
+
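A non-normative usage sketch, assuming an {{MLGraphBuilder}} instance `builder`; the tensor contents and indices are illustrative only.

    // input: [[1,2,3,4], [5,6,7,8], [9,10,11,12]]
    const input = builder.constant(
      { type: 'float32', dimensions: [3, 4] },
      new Float32Array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12]));

    // Take 2 rows starting at row 1, and 2 columns starting at column 2:
    // output: [[7,8], [11,12]], of shape [2, 2].
    const sliced = builder.slice(input, [1, 2], [2, 2]);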
+ ### The softmax() method ### {#api-mlgraphbuilder-softmax} Compute the [softmax](https://en.wikipedia.org/wiki/Softmax_function) values of the 2-D input tensor along axis 1. From b17eaac09387c8822345e91ea2a5f71cd451150a Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Tue, 27 Jun 2023 19:01:39 +0300 Subject: [PATCH 049/112] Add the softmax algorithm Signed-off-by: Zoltan Kis --- index.bs | 70 +++++++++++++++++++++++++++++++++++++++++++++----------- 1 file changed, 57 insertions(+), 13 deletions(-) diff --git a/index.bs b/index.bs index 96cd6729..e29795a7 100644 --- a/index.bs +++ b/index.bs @@ -3344,29 +3344,25 @@ partial interface MLGraphBuilder { **Returns:** an {{MLOperand}}. The output tensor of the same rank as the input tensor with tensor values stripped to the specified starting and ending indices in each dimension.
-### The softmax() method ### {#api-mlgraphbuilder-softmax} +### The softmax() method ### {#api-mlgraphbuilder-softmax-method} Compute the [softmax](https://en.wikipedia.org/wiki/Softmax_function) values of the 2-D input tensor along axis 1. -
- **Arguments:** - - *x*: an {{MLOperand}}. The input 2-D tensor. - - **Returns:** - - an {{MLOperand}}. The output 2-D tensor that contains the softmax results, of the same shape as the input tensor. - - an {{MLActivation}}. The activation function representing the softmax operation. -
+
+
+    The behavior of this operation can be generically emulated from the usage of other operations as follows. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint.
-    <pre>
+  
+
     // This sample deploys a well-known implementation trick [1] to compute the
     // exponentials of the distances to the max value, instead of the exponentials
     // of the input values itself, in order to increase the numerical stability of
@@ -3375,10 +3371,58 @@ partial interface MLGraphBuilder {
     const max_x = builder.reduceMax(x, { axes: [1], keepDimensions: true });
     const exp_x = builder.exp(builder.sub(x, max_x));
     return builder.div(exp_x, builder.reduceSum(exp_x, { axes: [1], keepDimensions: true }));
-    
-
+ + +
+ +#### The {{MLGraphBuilder/softmax(input)}} method #### {#api-mlgraphbuilder-softmax-input} +
+ **Arguments:** + - *input*: an {{MLOperand}}. The input 2-D tensor. + + **Returns:** + - an {{MLOperand}}. The output 2-D tensor that contains the softmax results, of the same shape as the input tensor. +
+ +
+ + The {{MLGraphBuilder/softmax(input)}} steps are: + +
+ 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the softmax operation. + 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. + 1. Return |output|. +
+
+ +#### The {{MLGraphBuilder/softmax()}} method #### {#api-mlgraphbuilder-softmax} +
+ **Arguments:** + - None. + + **Returns:** + - an {{MLActivation}}. The activation function representing the softmax operation.
+
+ + The {{MLGraphBuilder/softmax()}} method steps are: + +
+ 1. Let |op| be the result of invoking the create MLActivation steps with `"softmax"`. + 1. If that throws an error, re-throw the error and abort these steps. + 1. Return |op|. +
+
+ ### The softplus() method ### {#api-mlgraphbuilder-softplus} Compute the softplus function of the input tensor. The calculation follows the expression `ln(1 + exp(steepness * x)) / steepness`. -
- **Arguments:** - - *x*: an {{MLOperand}}. The input tensor. - **Returns:** - - an {{MLOperand}}. The output tensor of the same shape as *x*. - - an {{MLActivation}}. The activation function representing the softplus operation. - -
+
The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the @@ -3412,9 +3405,81 @@ partial interface MLGraphBuilder { builder.constant(1))), builder.constant(options.steepness)); -
+{{MLSoftplusOptions}} has the following members: +
+ : steepness + :: + A {{float}} scalar parameter. + The default value is `1`. +
+ +
+ + To check softplus options given |options|, run the following steps: + +
+ 1. If |options| is not an [=object=], then return `false`. + 1. If |options|.{{MLSoftplusOptions/steepness}} is `undefined`, set |options|.{{MLSoftplusOptions/steepness}} to `1`. + 1. Else if |options|.{{MLSoftplusOptions/steepness}} is not a [=numeric type=], then then return `false`. + 1. Return `true`. +
+
+ +#### The {{MLGraphBuilder/softplus(input, options)}} method #### {#api-mlgraphbuilder-softplus-input-options} +
+    **Arguments:**
+        - *input*: an {{MLOperand}}. The input tensor.
+        - *options*: an optional {{MLSoftplusOptions}}. The optional parameters of the operation.
+
+    **Returns:**
+        - an {{MLOperand}}. The output tensor of the same shape as *input*.
+ </div>
+ +
+ + The {{MLGraphBuilder/softplus(input, options)}} method steps are: + +
+ 1. Let |input| be the first argument. + 1. Let |options| be the second argument. + 1. If running the check softplus options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the softplus operation, given |options|. + 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. + 1. Return |output|. +
+
+ +#### The {{MLGraphBuilder/softplus(options)}} method #### {#api-mlgraphbuilder-softplus-options} +
+ **Arguments:** + - *options*: an optional {{MLSoftplusOptions}}. The optional parameters of the operation. + + **Returns:** + - an {{MLActivation}}. The activation function representing the softplus operation. +
+ +
+ + The {{MLGraphBuilder/softplus(options)}} method steps are: + +
+ 1. Let |options| be the first argument. + 1. If running the check softplus options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. Let |op| be the result of invoking the create MLActivation steps with `"softplus"` and |options|. + 1. If that throws an error, re-throw the error and abort these steps. + 1. Return |op|. +
+
+ ### The softsign() method ### {#api-mlgraphbuilder-softsign} Compute the softsign function of the input tensor. The calculation follows the expression `x / (1 + |x|)`. -
- **Arguments:** - - *x*: an {{MLOperand}}. The input tensor. - - **Returns:** - - an {{MLOperand}}. The output tensor of the same shape as *x*. - - an {{MLActivation}}. The activation function representing the softsign operation. -
+
The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the @@ -3439,9 +3432,56 @@ partial interface MLGraphBuilder {
     return builder.div(x, builder.add(builder.constant(1), builder.abs(x)));
     
-
+#### The {{MLGraphBuilder/softsign(input)}} method #### {#api-mlgraphbuilder-softsign-input} +
+    **Arguments:**
+        - *input*: an {{MLOperand}}. The input tensor.
+
+    **Returns:**
+        - an {{MLOperand}}. The output tensor of the same shape as *input*.
+ </div>
+ +
+ + The {{MLGraphBuilder/softsign(input)}} steps are: + +
+    1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+    1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop.
+        1. Let |output| be the result of invoking the copy MLOperand steps given |input|.
+        1. Make a request to the underlying platform to:
+            1. Let |opImpl| be an [=implementation-defined=] platform operator for the softsign operation.
+            1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}.
+            1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|.
+            1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}.
+            1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|.
+            1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|.
+    1. Return |output|.
+ </ol>
+
+ +#### The {{MLGraphBuilder/softsign()}} method #### {#api-mlgraphbuilder-softsign} +
+ **Arguments:** + - None. + + **Returns:** + - an {{MLActivation}}. The activation function representing the softsign operation. +
+ +
+ + The {{MLGraphBuilder/softsign()}} method steps are: + +
+ 1. Let |op| be the result of invoking the create MLActivation steps with `"softsign"`. + 1. If that throws an error, re-throw the error and abort these steps. + 1. Return |op|. +
+
+ ### The split() method ### {#api-mlgraphbuilder-split} Split the input tensor into a number of sub tensors along the given axis. -
+ +
    **Arguments:**
        - *input*: an {{MLOperand}}. The input tensor.
        - *splits*: an {{unsigned long}} or a sequence of {{unsigned long}}. If an {{unsigned long}}, it specifies the number of output tensors along the axis. The number must evenly divide the dimension size of *input* along *options.axis*. If a sequence of {{unsigned long}}, it specifies the sizes of each output tensor along the *options.axis*. The sum of sizes must be equal to the dimension size of *input* along *options.axis*.
        - *options*: an optional {{MLSplitOptions}}. The optional parameters of the operation.
-            - *axis*: an {{unsigned long}} scalar. The dimension along which to split. Its value must be in the range [0, N-1] where N is the rank of input tensor. Default to 0.

    **Returns:** a sequence of {{MLOperand}}. The split output tensors. If *splits* is an {{unsigned long}}, the length of the output sequence equals *splits*. The shape of each output tensor is the same as *input* except the dimension size of *axis* equals the quotient of dividing the dimension size of *input* along *axis* by *splits*. If *splits* is a sequence of {{unsigned long}}, the length of the output sequence equals the length of *splits*. The shape of the i-th output tensor is the same as *input* except along *axis* where the dimension size is *splits[i]*.
+</div>
+ +{{MLSplitOptions}} has the following members: +
+ : axis + :: + An {{unsigned long}} scalar. The dimension along which to split. Its value must be in the range [0, N-1] where N is the rank of input tensor. + The default value is `0`. +
+ +
+ + The {{MLGraphBuilder/split(input, splits, options)}} steps are: + +
+ 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options| is `undefined`, let |options| be an empty [=object=]. + 1. If |options|.{{MLSplitOptions/axis}} is `undefined`, let |options|.{{MLSplitOptions/axis}} be `0`. + 1. If |splits| is not {{unsigned long}} or a sequence of {{unsigned long}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the split operation, given |splits| and |options|. + 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. + 1. Return |output|. +
+
From 5c2c0a1fdc28ec07423de5e1a75aaeb5934f898e Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Tue, 27 Jun 2023 19:34:21 +0300 Subject: [PATCH 053/112] Add the tanh algorithm Signed-off-by: Zoltan Kis --- index.bs | 62 ++++++++++++++++++++++++++++++++++++++++++++++---------- 1 file changed, 51 insertions(+), 11 deletions(-) diff --git a/index.bs b/index.bs index 96cd6729..e61d71ab 100644 --- a/index.bs +++ b/index.bs @@ -3509,23 +3509,16 @@ partial interface MLGraphBuilder { **Returns:** an {{MLOperand}}. The output tensor of the same or reduced rank with the shape dimensions of size 1 eliminated.
-### The tanh() method ### {#api-mlgraphbuilder-tanh} +### The tanh() method ### {#api-mlgraphbuilder-tanh-method} Compute the hyperbolic tangent function of the input tensor. The calculation follows the expression `(exp(2 * x) - 1) / (exp(2 * x) + 1)`. -
- **Arguments:** - - *x*: an {{MLOperand}}. The input tensor. - - **Returns:** - - an {{MLOperand}}. The output tensor of the same shape as *x*. - - an {{MLActivation}}. The activation function representing the tanh operation. -
+
The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the @@ -3535,9 +3528,56 @@ partial interface MLGraphBuilder { builder.sub(builder.exp(builder.mul(builder.constant(2), x)), builder.constant(1)), builder.add(builder.exp(builder.mul(builder.constant(2), x)), builder.constant(1))); -
+#### The {{MLGraphBuilder/tanh(input)}} method #### {#api-mlgraphbuilder-tanh-input} +
+    **Arguments:**
+        - *input*: an {{MLOperand}}. The input tensor.
+
+    **Returns:**
+        - an {{MLOperand}}. The output tensor of the same shape as *input*.
+ </div>
+ +
+ + The {{MLGraphBuilder/tanh(input)}} steps are: + +
+ 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the hyperbolic tangent operation. + 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. + 1. Return |output|. +
+
+ +#### The {{MLGraphBuilder/tanh()}} method #### {#api-mlgraphbuilder-tanh} +
+ **Arguments:** + - None. + + **Returns:** + - an {{MLActivation}}. The activation function representing the tanh operation. +
+ +
+ + The {{MLGraphBuilder/tanh()}} method steps are: + +
+ 1. Let |op| be the result of invoking the create MLActivation steps with `"tanh"`. + 1. If that throws an error, re-throw the error and abort these steps. + 1. Return |op|. +
+
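A non-normative sketch of the two tanh() overloads, assuming an existing {{MLGraphBuilder}} instance `builder`:

<pre highlight="js">
const x = builder.input('x', {type: 'float32', dimensions: [2, 3]});
const y = builder.tanh(x);              // immediate form
const tanhActivation = builder.tanh();  // activation form, e.g. for fusion
</pre>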
+ ### The transpose() method ### {#api-mlgraphbuilder-transpose} Permute the dimensions of the input tensor according to the *permutation* argument. -
+ +
**Arguments:** - *input*: an {{MLOperand}}. The input tensor. - *options*: an optional {{MLSqueezeOptions}}. The optional parameters of the operation. - - *axes*: a sequence of {{unsigned long}}. Indices to the shape dimensions of size 1 to eliminate. The values in the sequence must be in the range [0, N-1] where N is the rank of input tensor. When not specified, every shape dimensions of size 1 in the tensor are eliminated. **Returns:** an {{MLOperand}}. The output tensor of the same or reduced rank with the shape dimensions of size 1 eliminated.
+{{MLSqueezeOptions}} has the following members: +
+    : axes
+    ::
+        A sequence of {{unsigned long}}.
+        Specifies the indices to the shape dimensions of size 1 to eliminate. The values in the sequence must be in the range [0, N-1] where N is the rank of input tensor.
+        When not specified, every shape dimension of size 1 in the tensor is eliminated.
+</div>
+ +
+ + The {{MLGraphBuilder/squeeze(input, options)}} steps are: + +
+ 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options| is `undefined`, let |options| be an empty [=object=]. + 1. If |options|.{{MLSqueezeOptions/axes}} [=map/exists=], then: + 1. Let |dimensions| be |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. + 1. For |index| between 0 and the size of |options|.{{MLSqueezeOptions/axes}}: + 1. Let |oneDimIndex| be |options|.{{MLSqueezeOptions/axes}}[|index|]. + 1. If |dimensions|[|oneDimIndex|] is not `1`, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Make a request to the underlying platform to: + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the squeeze operation, given |options|. + 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. + 1. Return |output|. +
+
+ ### The tanh() method ### {#api-mlgraphbuilder-tanh} Compute the hyperbolic tangent function of the input tensor. The calculation follows the expression `(exp(2 * x) - 1) / (exp(2 * x) + 1)`. -
+ +
**Arguments:** - *input*: an {{MLOperand}}. The input N-D tensor. - *options*: an optional {{MLTransposeOptions}}. The optional parameters of the operation. - - *permutation*: a sequence of {{unsigned long}} values. The values used to permute the output shape. When it's not specified, it's set to [N-1, ..., 0], where N is the rank of the input tensor, e.g. [2,1,0] for a 3-D tensor. These default values cause the output to become a transposed tensor of the input. When specified, the number of values in the sequence must be the same as the rank of the input tensor, and the values in the sequence must be within the range from 0 to N-1 with no two or more same values found in the sequence. **Returns:** an {{MLOperand}}. The permuted or transposed N-D tensor.
+{{MLTransposeOptions}} has the following members: +
+    : permutation
+    ::
+        A sequence of {{unsigned long}} values.
+        Specifies the values used to permute the output shape.
+        The default value is [N-1, ..., 0], where N is the rank of the input tensor, e.g. [2,1,0] for a 3-D tensor.
+        These default values cause the output to become a transposed tensor of the input. When specified, the number of values in the sequence must be the same as the rank of the input tensor, and the values in the sequence must be within the range from 0 to N-1, with no duplicate values.
+</div>
+ +
+ + The {{MLGraphBuilder/transpose(input, options)}} steps are: + +
+    1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+    1. If |options| is `undefined`, let |options| be an empty [=object=].
+    1. If |options|.{{MLTransposeOptions/permutation}} is `undefined`, let |options|.{{MLTransposeOptions/permutation}} be the reversed sequence of all indices for |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
+    1. Otherwise if |options|.{{MLTransposeOptions/permutation}} [=map/exists=]:
+        1. If |options|.{{MLTransposeOptions/permutation}} is not a sequence of {{unsigned long}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+        1. If the size of |options|.{{MLTransposeOptions/permutation}} is not the same as the rank of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+        1. If the values in |options|.{{MLTransposeOptions/permutation}} are not between `0` and the rank of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} minus `1`, then throw a "{{TypeError}}" {{DOMException}} and stop.
+        1. If the values in |options|.{{MLTransposeOptions/permutation}} contain duplicate values, then throw a "{{TypeError}}" {{DOMException}} and stop.
+    1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop.
+        1. Let |output| be the result of invoking the copy MLOperand steps given |input|.
+        1. Make a request to the underlying platform to:
+            1. Let |opImpl| be an [=implementation-defined=] platform operator for the transpose operation, given |options|.
+            1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}.
+            1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|.
+            1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}.
+            1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|.
+            1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|.
+    1. Return |output|.
+ </ol>
+
+ Examples {#examples} ===================== From 41d2b7f225f72314f4199053e8ba8b052555c8d0 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Mon, 26 Jun 2023 22:03:56 +0300 Subject: [PATCH 056/112] Add the prelu() algorithm Signed-off-by: Zoltan Kis --- index.bs | 39 +++++++++++++++++++++++++++++++-------- 1 file changed, 31 insertions(+), 8 deletions(-) diff --git a/index.bs b/index.bs index c896ca16..858bb396 100644 --- a/index.bs +++ b/index.bs @@ -4445,21 +4445,46 @@ partial interface MLGraphBuilder { ### The prelu() method ### {#api-mlgraphbuilder-prelu} -Calculate the parametric version of rectified linear function (Parametric Relu) on the input tensor element-wise. Parametric Relu is a type of leaky ReLU that, instead of having a scalar slope like 0.01, making the slope (coefficient of leakage) into a parameter that is learned during the model training phase of this operation. The calculation follows the expression `max(0, x) + slope ∗ min(0, x)`. +Calculate the parametric version of rectified linear function (Parametric ReLU) on the input tensor element-wise. Parametric ReLU is a type of leaky ReLU that, instead of having a scalar slope like 0.01, making the slope (coefficient of leakage) into a parameter that is learned during the model training phase of this operation. The calculation follows the expression `max(0, x) + slope ∗ min(0, x)`. -
+ +
    **Arguments:**
-        - *x*: an {{MLOperand}}. The input tensor.
-        - *slope*: an {{MLOperand}}. The slope tensor. Its shape is either the same as, or unidirectionally broadcastable to the shape of input tensor *x* according to [[!numpy-broadcasting-rule]].
+        - *input*: an {{MLOperand}}. The input tensor.
+        - *slope*: an {{MLOperand}}. The slope tensor. Its shape is either the same as, or unidirectionally broadcastable to the shape of input tensor *input* according to [[!numpy-broadcasting-rule]].

     **Returns:**
        - an {{MLOperand}}. The output tensor of the same shape as *input*.
+</div>
-
+
+ + The {{MLGraphBuilder/prelu(input, slope)}} steps are: + +
+    1. If |input| or |slope| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+    1. Let |descriptor| be a new {{MLOperandDescriptor}}.
+    1. Set |descriptor|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}.
+    1. Let |descriptor|.{{MLOperandDescriptor/dimensions}} be the result of running the [=MLGraphBuilder/broadcast-shapes=] steps given |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |slope|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
+        1. If that throws an error, re-throw the error and stop.
+    1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop.
+        1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |descriptor|.
+        1. Make a request to the underlying platform to:
+            1. Let |opImpl| be an [=implementation-defined=] platform operator for the PreLU operation, given |slope|.
+            1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}.
+            1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|.
+            1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}.
+            1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|.
+            1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|.
+    1. Return |output|.
+ </ol>
+
+ +
The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the @@ -4468,10 +4493,8 @@ partial interface MLGraphBuilder { return builder.add(builder.max(builder.constant(0), x), builder.mul(slope, builder.min(builder.constant(0), x))); -
- ### Reduction operations ### {#api-mlgraphbuilder-reduce} Reduce the input tensor along all dimensions, or along the axes specified in the {{MLReduceOptions/axes}} array parameter. For each specified axis, the dimension with that index is reduced, i.e. the resulting tensor will not contain it, unless the {{MLReduceOptions/keepDimensions}} option is specified. The values of the resulting tensor are calculated using the specified reduction function that takes as parameters all the values across the reduced dimension. -
+
**Arguments:** - *descriptor*: an optional {{GPUCommandBufferDescriptor}}. Descriptor of the command buffer. **Returns:** {{GPUCommandBuffer}}.
+
+ + The {{MLCommandEncoder/finish(descriptor)}} method steps are: + +
+ 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Make a request to the underlying platform to complete the recording of the ML workload, given |descriptor|. +
+ See the related WebGPU steps. +
+ 1. Return a {{GPUCommandBuffer}} containing the recorded workload. +
+
+ ## The MLGraphBuilder interface ## {#api-mlgraphbuilder} The {{MLGraphBuilder}} interface defines a set of operations as identified by the [[#usecases]] that can be composed into a computational graph. It also represents the intermediate state of a graph building session. From b7eaee9add7dc6d6fb6fa085e9942c2ac1f82c71 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Wed, 12 Jul 2023 19:34:23 +0300 Subject: [PATCH 062/112] Fix #439: add missing conv2d() and convTranspose2d() validation steps Signed-off-by: Zoltan Kis --- index.bs | 4 ++++ 1 file changed, 4 insertions(+) diff --git a/index.bs b/index.bs index faf6b015..c39f6831 100644 --- a/index.bs +++ b/index.bs @@ -2064,6 +2064,8 @@ partial interface MLGraphBuilder { 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLConv2dOptions/padding}} is `undefined`, set it to `[0, 0, 0, 0]`. 1. If |options|.{{MLConv2dOptions/strides}} is `undefined`, set it to `[1, 1]`. + 1. Else if |options|.{{MLConv2dOptions/strides}}.size() is not `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If any element in |options|.{{MLConv2dOptions/strides}} is equal to 0, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |options|.{{MLConv2dOptions/dilations}} is `undefined`, set it to `[1, 1]`. 1. If |options|.{{MLConv2dOptions/autoPad}} is `undefined`, set it to `"explicit"`. 1. If |options|.{{MLConv2dOptions/groups}} is `undefined`, set it to `1`. @@ -2230,6 +2232,8 @@ partial interface MLGraphBuilder { 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLConvTranspose2dOptions/padding}} is `undefined`, set it to `[0, 0, 0, 0]`. 1. If |options|.{{MLConvTranspose2dOptions/strides}} is `undefined`, set it to `[1, 1]`. + 1. Else if |options|.{{MLConv2dOptions/strides}}.size() is not `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If any element in |options|.{{MLConv2dOptions/strides}} is equal to 0, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |options|.{{MLConvTranspose2dOptions/dilations}} is `undefined`, set it to `[1, 1]`. 1. If |options|.{{MLConvTranspose2dOptions/outputPadding}} is `undefined`, set it to `[0, 0]`. 1. If |options|.{{MLConvTranspose2dOptions/autoPad}} is `undefined`, set it to `"explicit"`. From a7991974719de945ff2a68822f8491b72bc20df3 Mon Sep 17 00:00:00 2001 From: Chai Chaoweeraprasit Date: Wed, 9 Aug 2023 10:40:44 -0700 Subject: [PATCH 063/112] Fix a few missing
tags for the relevant dev notes to stay consistent with the new style convention. (#445) --- index.bs | 45 ++++++++++++++++++++++++++++++++++++++++++++- 1 file changed, 44 insertions(+), 1 deletion(-) diff --git a/index.bs b/index.bs index 5708ca76..a3739089 100644 --- a/index.bs +++ b/index.bs @@ -1814,7 +1814,10 @@ partial interface MLGraphBuilder {
+
+ The behavior of this operation when the input tensor is 4-D of the *"nchw"* layout and the activation is of operator type *relu* can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. +
     const shape = [1,null,1,1];
     return builder.relu(
@@ -1829,7 +1832,7 @@ partial interface MLGraphBuilder {
             )),
         builder.reshape(options.bias, shape)));
     
-
+
### The clamp() method ### {#api-mlgraphbuilder-clamp} @@ -2573,10 +2576,13 @@ partial interface MLGraphBuilder {
+
+ The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. +
     return builder.add(
               builder.max(builder.constant(0), x),
@@ -2586,6 +2592,7 @@ partial interface MLGraphBuilder {
                   builder.exp(builder.min(builder.constant(0), x)),
                   builder.constant(1))));
     
+
@@ -3148,10 +3155,13 @@ partial interface MLGraphBuilder {
+
+ The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. +
     return builder.max(
                builder.min(
@@ -3161,6 +3171,7 @@ partial interface MLGraphBuilder {
                    builder.constant(1)),
                builder.constant(0));
     
+
{{MLHardSigmoidOptions}} has the following members: @@ -3442,14 +3453,18 @@ partial interface MLGraphBuilder {
+
+ The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. +
     return builder.add(builder.max(builder.constant(0), x),
               builder.mul(builder.constant(options.alpha), builder.min(builder.constant(0), x)));
     
+
{{MLLeakyReluOptions}} has the following members: @@ -3543,15 +3558,19 @@ partial interface MLGraphBuilder {
+
+ The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. +
     return builder.add(
               builder.mul(x, builder.constant(options.alpha)),
               builder.constant(options.beta));
     
+
{{MLLinearOptions}} has the following members: @@ -4485,14 +4504,18 @@ partial interface MLGraphBuilder {
+
+ The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. +
     return builder.add(builder.max(builder.constant(0), x),
                        builder.mul(slope, builder.min(builder.constant(0), x)));
     
+
### Reduction operations ### {#api-mlgraphbuilder-reduce} @@ -4630,13 +4653,17 @@ partial interface MLGraphBuilder {
+
+ The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. +
     return builder.max(builder.constant(0), x);
     
+
#### The {{MLGraphBuilder/relu(input)}} method #### {#api-mlgraphbuilder-relu-input} @@ -4862,10 +4889,13 @@ partial interface MLGraphBuilder {
+
+ The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. +
     return builder.div(
               builder.constant(1),
@@ -4873,6 +4903,7 @@ partial interface MLGraphBuilder {
                 builder.exp(builder.neg(x)),
                 builder.constant(1)));
     
+
#### The {{MLGraphBuilder/sigmoid(input)}} method #### {#api-mlgraphbuilder-sigmoid-input} @@ -5055,10 +5086,13 @@ partial interface MLGraphBuilder {
+
+ The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. +
     return builder.div(
               builder.log(
@@ -5067,6 +5101,7 @@ partial interface MLGraphBuilder {
                   builder.constant(1))),
               builder.constant(options.steepness));
     
+
{{MLSoftplusOptions}} has the following members: @@ -5152,13 +5187,17 @@ partial interface MLGraphBuilder {
+
+ The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. +
     return builder.div(x, builder.add(builder.constant(1), builder.abs(x)));
     
+
#### The {{MLGraphBuilder/softsign(input)}} method #### {#api-mlgraphbuilder-softsign-input} @@ -5350,15 +5389,19 @@ partial interface MLGraphBuilder {
+
+ The behavior of this operation can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint. +
     return builder.div(
               builder.sub(builder.exp(builder.mul(builder.constant(2), x)), builder.constant(1)),
               builder.add(builder.exp(builder.mul(builder.constant(2), x)), builder.constant(1)));
     
+
#### The {{MLGraphBuilder/tanh(input)}} method #### {#api-mlgraphbuilder-tanh-input} From b26235414f0f4ec177942a6a8a56ae609ba470ab Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Thu, 10 Aug 2023 15:31:48 +0300 Subject: [PATCH 064/112] Fix sigmoid, hardswish and relu cross references Signed-off-by: Zoltan Kis --- index.bs | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/index.bs b/index.bs index 09577337..f44aa8df 100644 --- a/index.bs +++ b/index.bs @@ -1049,7 +1049,7 @@ These activations function types are used to create other operations. One such u #### Creating {{MLActivation}} #### {#api-mlactivation-create}
-The {{MLActivation}} objects (including the ones passed as input to methods) are created by the methods of {{MLGraphBuilder}} and are identified by their name. The |options| dictionary is defined by those methods. The actual creation of the activation function e.g. a [[#api-mlgraphbuilder-sigmoid]] or [[#api-mlgraphbuilder-relu]] can then be deferred until when the rest of the graph is ready to connect with it such as during the construction of [[#api-mlgraphbuilder-conv2d]] for example. +The {{MLActivation}} objects (including the ones passed as input to methods) are created by the methods of {{MLGraphBuilder}} and are identified by their name. The |options| dictionary is defined by those methods. The actual creation of the activation function e.g. a [[#api-mlgraphbuilder-sigmoid-method]] or [[#api-mlgraphbuilder-relu-method]] can then be deferred until when the rest of the graph is ready to connect with it such as during the construction of [[#api-mlgraphbuilder-conv2d]] for example.
@@ -3337,7 +3337,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/hardSwish()}} method steps are: -
+
1. Let |op| be the result of invoking the create MLActivation steps with `"hardSwish"`. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|. From 565587008375a8b471697da7fcefd04b6de67811 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Mon, 14 Aug 2023 13:09:16 +0300 Subject: [PATCH 065/112] Fix buidSync() steps for #446 Signed-off-by: Zoltan Kis --- index.bs | 18 +++++++++--------- 1 file changed, 9 insertions(+), 9 deletions(-) diff --git a/index.bs b/index.bs index f44aa8df..1291700a 100644 --- a/index.bs +++ b/index.bs @@ -1653,26 +1653,26 @@ Build a composed graph up to a given output operand into a computational graph,
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
-    1. If |outputs| is not an instance of {{MLNamedOperands}}, then throw an "{{TypeError}}" {{DOMException}} and stop.
+    1. If |outputs| is not an instance of {{MLNamedOperands}}, or if it is empty, then throw an "{{TypeError}}" {{DOMException}} and stop.
     1. For each |element| in |outputs|:
-        1. If |element|.key is not a [=string=], then throw an "{{TypeError}}" {{DOMException}} and stop.
+        1. If |element|.key is not a [=string=], or if it is empty, then throw an "{{TypeError}}" {{DOMException}} and stop.
         1. If |element|.value is not an instance of {{MLOperand}}, then throw an "{{TypeError}}" {{DOMException}} and stop.
     1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop.
         1. Let |graph| be a new {{MLGraph}}:
             1. Set |graph|.{{MLGraph/[[context]]}} to [=this=].{{MLGraphBuilder/[[context]]}}.
-            1. Set |graph|.{{MLGraph/[[outputDescriptors]]}} to |outputs|.
         1. Make a request to the underlying platform to:
             1. Connect |graph| to a new [=implementation-defined=] graph implementation |graphImpl| given |graph|.
             1. Store a reference to |graphImpl| in |graph|.{{MLGraph/[[implementation]]}}.
         1. Make a request to the underlying platform to initialize the graph:
             1. For each |operand| in |outputs|:
+                1. If running the validate MLOperand steps given |operand| and [=this=] returns `false`, then throw a "{{TypeError}}" {{DOMException}} and stop.
                 1. If |operand| was created as an input by the underlying platform:
-                    1. Add |operand| to |graph|.{{MLGraph/[[inputDescriptors]]}}.
-                    1. Initialize the weights of |operand|.
+                    1. If |operand|.{{MLOperand/[[name]]}} is not unique for |graphImpl|, then throw a "{{TypeError}}" {{DOMException}} and stop.
+                    1. Add |operand|.{{MLOperand/[[descriptor]]}} to |graph|.{{MLGraph/[[inputDescriptors]]}}[|operand|.{{MLOperand/[[name]]}}].
                 1. If |operand| was created as a constant by the underlying platform:
-                    1. Preprocess and optimize the tensor data of |operand|.
-                    1. Update |graphImpl| with |operand|.{{MLOperand/[[operand]]}}.
-                    1. Update |graphImpl| with |operand|.{{MLOperand/[[operator]]}}.
+                    1. Implementations MAY preprocess and optimize the tensor data of |operand| for the underlying platform.
+                1. Register |operand|.{{MLOperand/[[operand]]}} in |graphImpl| as graph output.
+                1. Register |operand|.{{MLOperand/[[operator]]}} to |graphImpl|.
     1. Return |graph|.
 </ol>
@@ -1995,7 +1995,7 @@ partial interface MLGraphBuilder { 1. Let |desc| be |inputs|[0].{{MLOperand/[[descriptor]]}}. 1. Let |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] be `0`. 1. For each |index| between 0 and the rank of |inputs|: - 1. If running validate MLOperand given |inputs|[|index|] and [=this=] returns `false`, then fail. + 1. If running validate MLOperand given |inputs|[|index|] and [=this=] returns `false`, then fail. 1. For each |dim| between 0 and the rank of |inputs|[|index|]:
If the shape of each corresponding dimension and type of the operands, except for those of the dimension given by |axis|, is not the same, fail. From 6fff9d00c51d934721cc8fdef59091deed81af64 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Mon, 14 Aug 2023 23:18:42 +0300 Subject: [PATCH 066/112] Fix conv2d() steps according to review in #446 Signed-off-by: Zoltan Kis --- index.bs | 20 +++++++++++++++----- 1 file changed, 15 insertions(+), 5 deletions(-) diff --git a/index.bs b/index.bs index 1291700a..e23cd227 100644 --- a/index.bs +++ b/index.bs @@ -2144,21 +2144,31 @@ partial interface MLGraphBuilder { 1. If |input| or |filter| is not an instance of {{MLOperand}}, then then throw a "{{TypeError}}" {{DOMException}} and stop. 1. Let |input_size| be the size of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. Let |filter_size| be the size of |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. - 1. If |input_size| is not `4`, then then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |filter_size| is not `4`, then then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |input_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |filter_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If the type of |input| and |filter| is not the same, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLConv2dOptions/padding}} is `undefined`, set it to `[0, 0, 0, 0]`. + 1. Else if |options|.{{MLConv2dOptions/padding}}.length is not 4, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLConv2dOptions/strides}} is `undefined`, set it to `[1, 1]`. - 1. Else if |options|.{{MLConv2dOptions/strides}}.size() is not `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. Else if |options|.{{MLConv2dOptions/strides}}.length is not `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If any element in |options|.{{MLConv2dOptions/strides}} is equal to 0, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |options|.{{MLConv2dOptions/dilations}} is `undefined`, set it to `[1, 1]`. + 1. Else if |options|.{{MLConv2dOptions/dilations}}.length is not `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |options|.{{MLConv2dOptions/autoPad}} is `undefined`, set it to `"explicit"`. 1. If |options|.{{MLConv2dOptions/groups}} is `undefined`, set it to `1`. + 1. Else if |options|.{{MLConv2dOptions/groups}} is 0, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |input_size| / |options|.{{MLConv2dOptions/groups}} is not equal to |filter_size|, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. Else if |input_size| % |options|.{{MLConv2dOptions/groups}} is not 0, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. 1. If |options|.{{MLConv2dOptions/inputLayout}} is `undefined`, set it to `"nchw"`. 1. If |options|.{{MLConv2dOptions/filterLayout}} is `undefined`, set it to `"oihw"`. 1. If |options|.{{MLConv2dOptions/bias}} [=map/exists=] and it is not an instance of {{MLOperand}}, then then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. If |options|.{{MLConv2dOptions/activation}} [=map/exists=] and it is not an instance of {{MLActivation}}, then then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. 
If the length of |options|.{{MLConv2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 1, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLConv2dOptions/activation}} [=map/exists=] and it is not an instance of {{MLActivation}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. Let |output_shape| be the result of calculating output dimensions based on input, filter, dilation, padding and stride, taking into account |options|.{{MLConv2dOptions/inputLayout}}. + 1. If the shape of |options|.{{MLConv2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not the same as |output_shape|, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLConv2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}} is not the same as |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. Let |desc| a new {{MLOperandDescriptor}}. 1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to |output_shape|. @@ -2317,7 +2327,7 @@ partial interface MLGraphBuilder { 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLConvTranspose2dOptions/padding}} is `undefined`, set it to `[0, 0, 0, 0]`. 1. If |options|.{{MLConvTranspose2dOptions/strides}} is `undefined`, set it to `[1, 1]`. - 1. Else if |options|.{{MLConv2dOptions/strides}}.size() is not `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. Else if |options|.{{MLConv2dOptions/strides}}.length is not `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If any element in |options|.{{MLConv2dOptions/strides}} is equal to 0, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |options|.{{MLConvTranspose2dOptions/dilations}} is `undefined`, set it to `[1, 1]`. 1. If |options|.{{MLConvTranspose2dOptions/outputPadding}} is `undefined`, set it to `[0, 0]`. From 0905ab7949627b2ffd4cf800be7b0df7df124310 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Mon, 14 Aug 2023 23:20:14 +0300 Subject: [PATCH 067/112] Remove duplicate 'then's Signed-off-by: Zoltan Kis --- index.bs | 32 ++++++++++++++++---------------- 1 file changed, 16 insertions(+), 16 deletions(-) diff --git a/index.bs b/index.bs index e23cd227..5511b04a 100644 --- a/index.bs +++ b/index.bs @@ -1730,7 +1730,7 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods.
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
- 1. If |value| is not a [=number=], then then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |value| is not a [=number=], then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |type| is `undefined`, let |type| be `"float32"`. 1. Otherwise, if |type| is not one of {{MLOperandType}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. Let |descriptor| be a new {{MLOperandDescriptor}}. @@ -1895,7 +1895,7 @@ partial interface MLGraphBuilder {
1. If |options| is not an [=object=] that [=implements=] {{MLClampOptions}}, then return `false`. - 1. If |options|.{{MLClampOptions/minValue}} and |options|.{{MLClampOptions/maxValue}} are not a [=numeric type=], then then return `false`. + 1. If |options|.{{MLClampOptions/minValue}} and |options|.{{MLClampOptions/maxValue}} are not a [=numeric type=], then return `false`. 1. If |options|.{{MLClampOptions/minValue}} is greater than |options|.{{MLClampOptions/maxValue}}, then return `false`. 1. Return `true`.
@@ -2141,7 +2141,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/conv2d(input, filter, options)}} steps are:
- 1. If |input| or |filter| is not an instance of {{MLOperand}}, then then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |input| or |filter| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. Let |input_size| be the size of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. Let |filter_size| be the size of |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. If |input_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. @@ -2163,7 +2163,7 @@ partial interface MLGraphBuilder { 1. 1. If |options|.{{MLConv2dOptions/inputLayout}} is `undefined`, set it to `"nchw"`. 1. If |options|.{{MLConv2dOptions/filterLayout}} is `undefined`, set it to `"oihw"`. - 1. If |options|.{{MLConv2dOptions/bias}} [=map/exists=] and it is not an instance of {{MLOperand}}, then then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLConv2dOptions/bias}} [=map/exists=] and it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If the length of |options|.{{MLConv2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 1, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |options|.{{MLConv2dOptions/activation}} [=map/exists=] and it is not an instance of {{MLActivation}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. Let |output_shape| be the result of calculating output dimensions based on input, filter, dilation, padding and stride, taking into account |options|.{{MLConv2dOptions/inputLayout}}. @@ -2319,11 +2319,11 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/convTranspose2d(input, filter, options)}} steps are:
- 1. If |input| or |filter| is not an instance of {{MLOperand}}, then then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |input| or |filter| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. Let |input_size| be the size of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. Let |filter_size| be the size of |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. - 1. If |input_size| is not `4`, then then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |filter_size| is not `4`, then then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |input_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |filter_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLConvTranspose2dOptions/padding}} is `undefined`, set it to `[0, 0, 0, 0]`. 1. If |options|.{{MLConvTranspose2dOptions/strides}} is `undefined`, set it to `[1, 1]`. @@ -2335,8 +2335,8 @@ partial interface MLGraphBuilder { 1. If |options|.{{MLConvTranspose2dOptions/groups}} is `undefined`, set it to `1`. 1. If |options|.{{MLConvTranspose2dOptions/inputLayout}} is `undefined`, set it to `"nchw"`. 1. If |options|.{{MLConvTranspose2dOptions/filterLayout}} is `undefined`, set it to `"iohw"`. - 1. If |options|.{{MLConvTranspose2dOptions/bias}} [=map/exists=] and it is not an instance of {{MLOperand}}, then then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. If |options|.{{MLConvTranspose2dOptions/activation}} [=map/exists=] and it is not an instance of {{MLActivation}}, then then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLConvTranspose2dOptions/bias}} [=map/exists=] and it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLConvTranspose2dOptions/activation}} [=map/exists=] and it is not an instance of {{MLActivation}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. Let |output_shape| be the result of calculating output dimensions based on |input|, |filter|, |options|.{{MLConvTranspose2dOptions/dilations}}, |options|.{{MLConvTranspose2dOptions/padding}} and |options|.{{MLConvTranspose2dOptions/strides}}, taking into account |options|.{{MLConvTranspose2dOptions/inputLayout}}. 1. Let |desc| a new {{MLOperandDescriptor}}. 1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. @@ -2629,7 +2629,7 @@ partial interface MLGraphBuilder {
1. If |options| is not an [=object=] that [=implements=] {{MLEluOptions}}, then return `false`. 1. If |options|.{{MLEluOptions/alpha}} is `undefined`, set |options|.{{MLEluOptions/alpha}} to `1`. - 1. Else if |options|.{{MLEluOptions/alpha}} is not a [=numeric type=], then then return `false`. + 1. Else if |options|.{{MLEluOptions/alpha}} is not a [=numeric type=], then return `false`. 1. Return `true`.
@@ -3220,9 +3220,9 @@ partial interface MLGraphBuilder {
1. If |options| is not an [=object=] that [=implements=] {{MLHardSigmoidOptions}}, then return `false`. 1. If |options|.{{MLEluOptions/alpha}} is `undefined`, set |options|.{{MLHardSigmoidOptions/alpha}} to `0.2`. - 1. Else if |options|.{{MLHardSigmoidOptions/alpha}} is not a [=numeric type=], then then return `false`. + 1. Else if |options|.{{MLHardSigmoidOptions/alpha}} is not a [=numeric type=], then return `false`. 1. If |options|.{{MLHardSigmoidOptions/beta}} is `undefined`, set |options|.{{MLHardSigmoidOptions/beta}} to `0.5`. - 1. Else if |options|.{{MLHardSigmoidOptions/beta}} is not a [=numeric type=], then then return `false`. + 1. Else if |options|.{{MLHardSigmoidOptions/beta}} is not a [=numeric type=], then return `false`. 1. Return `true`.
@@ -3509,7 +3509,7 @@ partial interface MLGraphBuilder {
1. If |options| is not an [=object=] that [=implements=] {{MLLeakyReluOptions}}, then return `false`. 1. If |options|.{{MLLeakyReluOptions/alpha}} is `undefined`, set |options|.{{MLLeakyReluOptions/alpha}} to `1`. - 1. Else if |options|.{{MLLeakyReluOptions/alpha}} is not a [=numeric type=], then then return `false`. + 1. Else if |options|.{{MLLeakyReluOptions/alpha}} is not a [=numeric type=], then return `false`. 1. Return `true`.
@@ -3619,9 +3619,9 @@ partial interface MLGraphBuilder {
1. If |options| is not an [=object=] that [=implements=] {{MLLinearOptions}}, then return `false`. 1. If |options|.{{MLEluOptions/alpha}} is `undefined`, set |options|.{{MLLinearOptions/alpha}} to `1`. - 1. Else if |options|.{{MLLinearOptions/alpha}} is not a [=numeric type=], then then return `false`. + 1. Else if |options|.{{MLLinearOptions/alpha}} is not a [=numeric type=], then return `false`. 1. If |options|.{{MLLinearOptions/beta}} is `undefined`, set |options|.{{MLLinearOptions/beta}} to `0`. - 1. Else if |options|.{{MLLinearOptions/beta}} is not a [=numeric type=], then then return `false`. + 1. Else if |options|.{{MLLinearOptions/beta}} is not a [=numeric type=], then return `false`. 1. Return `true`.
@@ -5146,7 +5146,7 @@ partial interface MLGraphBuilder {
1. If |options| is not an [=object=], then return `false`. 1. If |options|.{{MLSoftplusOptions/steepness}} is `undefined`, set |options|.{{MLSoftplusOptions/steepness}} to `1`. - 1. Else if |options|.{{MLSoftplusOptions/steepness}} is not a [=numeric type=], then then return `false`. + 1. Else if |options|.{{MLSoftplusOptions/steepness}} is not a [=numeric type=], then return `false`. 1. Return `true`.
From b77369130c020fcd04d083d3296ae8930ae5689a Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Mon, 14 Aug 2023 23:23:33 +0300 Subject: [PATCH 068/112] Fix typos Signed-off-by: Zoltan Kis --- index.bs | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/index.bs b/index.bs index 5511b04a..40a7a9ee 100644 --- a/index.bs +++ b/index.bs @@ -4399,7 +4399,7 @@ partial interface MLGraphBuilder { : autoPad :: - An {{MLAutoPad}} [=string=]]. + An {{MLAutoPad}} [=string=]. Specifies the automatic input padding options. The default value is *"explicit"*, which means that the values in the {{MLPool2dOptions/padding}} array should be used for input padding. When the option is set other than *"explicit"*, the values in the {{MLPool2dOptions/padding}} array are ignored. @@ -4445,7 +4445,7 @@ partial interface MLGraphBuilder { 1. If |options|.{{MLPool2dOptions/outputSizes}} [=map/exists=], or if |options|.{{MLPool2dOptions/padding}} is `undefined`, set |options|.{{MLPool2dOptions/padding}} to `[0, 0, 0, 0]`. 1. If |options|.{{MLPool2dOptions/strides}} is `undefined`, set |options|.{{MLPool2dOptions/strides}} to `[1, 1]`. 1. If |options|.{{MLPool2dOptions/dilations}} is `undefined`, set |options|.{{MLPool2dOptions/dilations}} to `[1, 1]`. - 1. If |options|.{{MLPool2dOptions/autoPad}} is `undefined`, set |options|.{{MLPool2dOptions/autoPad}} to `"explicit`. + 1. If |options|.{{MLPool2dOptions/autoPad}} is `undefined`, set |options|.{{MLPool2dOptions/autoPad}} to `"explicit"`. 1. If |options|.{{MLPool2dOptions/autoPad}} is not `"explicit"`, set |options|.{{MLPool2dOptions/padding}} to `[0, 0, 0, 0]`. 1. If |options|.{{MLPool2dOptions/layout}} is `undefined`, set |options|.{{MLPool2dOptions/layout}} to `"nchw"`. 1. If |options|.{{MLPool2dOptions/roundingType}} is `undefined`, set |options|.{{MLPool2dOptions/roundingType}} to `"floor"`. From 0ed9441a4d3520e5530155520e3b2bf622651670 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Mon, 14 Aug 2023 23:34:10 +0300 Subject: [PATCH 069/112] Fix clamp() steps according to the review in #446 Signed-off-by: Zoltan Kis --- index.bs | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/index.bs b/index.bs index 40a7a9ee..8d51b98f 100644 --- a/index.bs +++ b/index.bs @@ -1873,16 +1873,16 @@ partial interface MLGraphBuilder {
     if (options.minValue === undefined) {
       if (options.maxValue === undefined) {
-        return x;
+        return operand;
       } else {
-        return builder.min(x, builder.constant(options.maxValue));
+        return builder.min(operand, builder.constant(options.maxValue));
       }
     } else {
       if (options.maxValue === undefined) {
-        return builder.max(x, builder.constant(options.minValue));
+        return builder.max(operand, builder.constant(options.minValue));
       } else {
         return builder.min(
-            builder.max(x, builder.constant(options.minValue)),
+            builder.max(operand, builder.constant(options.minValue)),
             builder.constant(options.maxValue));
       }
     }
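
The snippet patched in this hunk is part of the non-normative emulation of clamp() in terms of min(), max() and constant(). Read as a whole, the corrected code amounts to the helper sketched below; the function wrapper and its name are illustrative assumptions, only the body mirrors the emulation steps above.

<pre highlight="js">
// Non-normative sketch of the emulation path, assuming an MLGraphBuilder
// `builder`, an MLOperand `operand` and an MLClampOptions-like `options`.
function clampByMinMax(builder, operand, options = {}) {
  if (options.minValue === undefined) {
    if (options.maxValue === undefined) {
      return operand;                                            // no bounds: pass through
    } else {
      return builder.min(operand, builder.constant(options.maxValue));
    }
  } else {
    if (options.maxValue === undefined) {
      return builder.max(operand, builder.constant(options.minValue));
    } else {
      return builder.min(
          builder.max(operand, builder.constant(options.minValue)),
          builder.constant(options.maxValue));
    }
  }
}
</pre>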
@@ -1923,7 +1923,7 @@ partial interface MLGraphBuilder {
     1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop.
         1. Let |output| be the result of invoking the copy MLOperand steps given |operand|.
         1. Make a request to the underlying platform to:
-            1. Create an [=implementation-defined=] platform operator |clampImpl| for this method, given |options|.{{MLClampOptions/minValue}} and |options|.{{MLClampOptions/minValue}}.
+            1. Create an [=implementation-defined=] platform operator |clampImpl| for this method, given |options|.{{MLClampOptions/minValue}} and |options|.{{MLClampOptions/maxValue}}.
             1. Store a reference to |clampImpl| in |output|.{{MLOperand/[[operator]]}}.
             1. Create an [=implementation-defined=] platform operand |outputImpl| to represent clamp output, given |output| and |clampImpl|.
             1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}.

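With the corrected step above, both {{MLClampOptions/minValue}} and {{MLClampOptions/maxValue}} are now forwarded to the platform clamp operator. A minimal, non-normative usage sketch follows; it assumes an existing {{MLContext}} |context| and runs inside an async function, and the shapes and variable names are illustrative only.

<pre highlight="js">
// Non-normative sketch; `context` is assumed to be an existing MLContext.
const builder = new MLGraphBuilder(context);
const x = builder.input('x', {type: 'float32', dimensions: [2, 3]});
// Both bounds reach the platform operator, giving a ReLU6-style clamp.
const y = builder.clamp(x, {minValue: 0, maxValue: 6});
const graph = await builder.build({y});
</pre>
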
From 8b36f8d85cf05239389a80ca4af042f5310073bf Mon Sep 17 00:00:00 2001
From: Zoltan Kis 
Date: Mon, 14 Aug 2023 23:49:53 +0300
Subject: [PATCH 070/112] Fix the pooling steps according to the review in #446

Signed-off-by: Zoltan Kis 
---
 index.bs | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/index.bs b/index.bs
index 8d51b98f..b8a2ece9 100644
--- a/index.bs
+++ b/index.bs
@@ -4441,10 +4441,16 @@ partial interface MLGraphBuilder {
   
1. [=Assert=]: |op| is one of "averagePool2d", "l2Pool2d", "maxPool2d". 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If the length of of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 4, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options| is `undefined`, let |options| be a new {{MLPool2dOptions}} object. 1. If |options|.{{MLPool2dOptions/outputSizes}} [=map/exists=], or if |options|.{{MLPool2dOptions/padding}} is `undefined`, set |options|.{{MLPool2dOptions/padding}} to `[0, 0, 0, 0]`. + 1. If the length of of |padding|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 4, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLPool2dOptions/strides}} is `undefined`, set |options|.{{MLPool2dOptions/strides}} to `[1, 1]`. + 1. If the length of of |options|.{{MLPool2dOptions/strides}} is not 2, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If any value in |options|.{{MLPool2dOptions/strides}} is not greater than 0, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLPool2dOptions/dilations}} is `undefined`, set |options|.{{MLPool2dOptions/dilations}} to `[1, 1]`. + 1. If the length of of |options|.{{MLPool2dOptions/dilations}} is not 2, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If any value in |options|.{{MLPool2dOptions/dilations}} is not greater than 0, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLPool2dOptions/autoPad}} is `undefined`, set |options|.{{MLPool2dOptions/autoPad}} to `"explicit"`. 1. If |options|.{{MLPool2dOptions/autoPad}} is not `"explicit"`, set |options|.{{MLPool2dOptions/padding}} to `[0, 0, 0, 0]`. 1. If |options|.{{MLPool2dOptions/layout}} is `undefined`, set |options|.{{MLPool2dOptions/layout}} to `"nchw"`. From 7397e1ac65a889a18f2e2fb3211c4305312f2216 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Tue, 15 Aug 2023 13:07:53 +0300 Subject: [PATCH 071/112] Fix the convTranspose2d() steps and improve conv2d(). Fix typos. Signed-off-by: Zoltan Kis --- index.bs | 32 +++++++++++++++++++++----------- 1 file changed, 21 insertions(+), 11 deletions(-) diff --git a/index.bs b/index.bs index b8a2ece9..e441c728 100644 --- a/index.bs +++ b/index.bs @@ -2160,15 +2160,14 @@ partial interface MLGraphBuilder { 1. Else if |options|.{{MLConv2dOptions/groups}} is 0, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |input_size| / |options|.{{MLConv2dOptions/groups}} is not equal to |filter_size|, then throw a "{{DataError}}" {{DOMException}} and stop. 1. Else if |input_size| % |options|.{{MLConv2dOptions/groups}} is not 0, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. 1. If |options|.{{MLConv2dOptions/inputLayout}} is `undefined`, set it to `"nchw"`. 1. If |options|.{{MLConv2dOptions/filterLayout}} is `undefined`, set it to `"oihw"`. 1. If |options|.{{MLConv2dOptions/bias}} [=map/exists=] and it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If the length of |options|.{{MLConv2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 1, then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. If |options|.{{MLConv2dOptions/activation}} [=map/exists=] and it is not an instance of {{MLActivation}}, then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. 
Let |output_shape| be the result of calculating output dimensions based on input, filter, dilation, padding and stride, taking into account |options|.{{MLConv2dOptions/inputLayout}}. - 1. If the shape of |options|.{{MLConv2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not the same as |output_shape|, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |options|.{{MLConv2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}} is not the same as |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLConv2dOptions/activation}} [=map/exists=] and it is not an instance of {{MLActivation}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. Let |output_shape| be the result of invoking the underlying implementation for calculating output dimensions, given |options|. + 1. If |output_shape| is not the same as the shape of |options|.{{MLConv2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}, then throw a "{{DataError}}" {{DOMException}} and stop. 1. Let |desc| a new {{MLOperandDescriptor}}. 1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to |output_shape|. @@ -2324,20 +2323,32 @@ partial interface MLGraphBuilder { 1. Let |filter_size| be the size of |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. If |input_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |filter_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If the type of |input| and |filter| is not the same, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLConvTranspose2dOptions/padding}} is `undefined`, set it to `[0, 0, 0, 0]`. + 1. Else if |options|.{{MLConvTranspose2dOptions/padding}}.length is not 4, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLConvTranspose2dOptions/strides}} is `undefined`, set it to `[1, 1]`. - 1. Else if |options|.{{MLConv2dOptions/strides}}.length is not `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. Else if |options|.{{MLConvTranspose2dOptions/strides}}.length is not `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If any element in |options|.{{MLConv2dOptions/strides}} is equal to 0, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |options|.{{MLConvTranspose2dOptions/dilations}} is `undefined`, set it to `[1, 1]`. + 1. Else if |options|.{{MLConvTranspose2dOptions/dilations}}.length is not `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |options|.{{MLConvTranspose2dOptions/outputPadding}} is `undefined`, set it to `[0, 0]`. + 1. Else if |options|.{{MLConvTranspose2dOptions/outputPadding}}.length is not `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLConvTranspose2dOptions/outputSizes}} [=map/exists=]: + 1. If |options|.{{MLConvTranspose2dOptions/outputSizes}}.length is not `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If the elements of |options|.{{MLConvTranspose2dOptions/outputSizes}} are not smaller than the elements at the same dimension (index) for |options|.{{MLConvTranspose2dOptions/strides}}, then throw a "{{DataError}}" {{DOMException}} and stop. 1. 
If |options|.{{MLConvTranspose2dOptions/autoPad}} is `undefined`, set it to `"explicit"`. 1. If |options|.{{MLConvTranspose2dOptions/groups}} is `undefined`, set it to `1`. + 1. If |input_size| / |options|.{{MLConvTranspose2dOptions/groups}} is not equal to |filter_size|, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. Else if |input_size| % |options|.{{MLConvTranspose2dOptions/groups}} is not 0, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLConvTranspose2dOptions/inputLayout}} is `undefined`, set it to `"nchw"`. 1. If |options|.{{MLConvTranspose2dOptions/filterLayout}} is `undefined`, set it to `"iohw"`. 1. If |options|.{{MLConvTranspose2dOptions/bias}} [=map/exists=] and it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If the length of |options|.{{MLConvTranspose2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 1, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLConvTranspose2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}} is not the same as |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |options|.{{MLConvTranspose2dOptions/activation}} [=map/exists=] and it is not an instance of {{MLActivation}}, then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. Let |output_shape| be the result of calculating output dimensions based on |input|, |filter|, |options|.{{MLConvTranspose2dOptions/dilations}}, |options|.{{MLConvTranspose2dOptions/padding}} and |options|.{{MLConvTranspose2dOptions/strides}}, taking into account |options|.{{MLConvTranspose2dOptions/inputLayout}}. + 1. Let |output_shape| be the result of invoking the underlying implementation for calculating output dimensions, given |options|. + 1. If |output_shape| is not the same as the shape of |options|.{{MLConvTranspose2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}, then throw a "{{DataError}}" {{DOMException}} and stop. 1. Let |desc| a new {{MLOperandDescriptor}}. 1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to |output_shape|. @@ -2356,8 +2367,7 @@ partial interface MLGraphBuilder { ### Element-wise binary operations ### {#api-mlgraphbuilder-binary} -Compute the element-wise binary addition, subtraction, multiplication, division, -maximum and minimum of the two input tensors. +Compute the element-wise binary addition, subtraction, multiplication, division, power, maximum and minimum of the two input tensors. The element-wise binary operations will be broadcasted according to [[!numpy-broadcasting-rule]]. The rank of the output tensor is the maximum @@ -2404,7 +2414,7 @@ partial interface MLGraphBuilder { 1. If |a| or |b| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}} is not equal to |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}, then throw a "{{DataError}}" {{DOMException}} and stop. 1. Let |descriptor| be a new {{MLOperandDescriptor}}. - 1. Set |descriptor|.{{MLOperandDescriptor/dimensions}}.{{MLOperandDescriptor/type}} to |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. + 1. 
Set |descriptor|.{{MLOperandDescriptor/type}} to |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. 1. Let |descriptor|.{{MLOperandDescriptor/dimensions}} be the result of running the [=MLGraphBuilder/broadcast-shapes=] steps given |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. If that throws an error, re-throw the error and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. @@ -4520,7 +4530,7 @@ partial interface MLGraphBuilder {
1. If |input| or |slope| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. Let |descriptor| be a new {{MLOperandDescriptor}}. - 1. Set |descriptor|.{{MLOperandDescriptor/dimensions}}.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. + 1. Set |descriptor|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. 1. Let |descriptor|.{{MLOperandDescriptor/dimensions}} be the result of running the [=MLGraphBuilder/broadcast-shapes=] steps given |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |slope|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. If that throws an error, re-throw the error and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. @@ -4893,7 +4903,7 @@ partial interface MLGraphBuilder { 1. If the size of |newShape| is `0`, set |outputShape| to `[ 1 ]` (reshaping to scalar). 1. If |newShape| contains more than one `null` value, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If any value in |newShape| is `0`, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. Let |inputElementCount| be the product of all elements in |inputs|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. + 1. Let |inputElementCount| be the product of all elements in |inputs|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. If |newShape| contains a `null` value, set that value to |inputElementCount| divided by the product of all other values in |newShape|. 1. If that value is too large for {{unsigned long}}, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If product of all values in |newShape| is not equal to |inputElementCount|, then throw a "{{DataError}}" {{DOMException}} and stop. From 4d57aadf38d4e8a134d5c2b199111712dfe9a907 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Tue, 15 Aug 2023 17:21:17 +0300 Subject: [PATCH 072/112] Fix the softmax() steps to address the review in #446 Signed-off-by: Zoltan Kis --- index.bs | 1 + 1 file changed, 1 insertion(+) diff --git a/index.bs b/index.bs index e441c728..d52a80c3 100644 --- a/index.bs +++ b/index.bs @@ -5083,6 +5083,7 @@ partial interface MLGraphBuilder {
1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If the length of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 2, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. 1. Make a request to the underlying platform to: From ad720c3f386f1450efc5c59dfe86cab339656fa7 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Tue, 15 Aug 2023 22:56:53 +0300 Subject: [PATCH 073/112] Fix #450: replace unnecessary type checks with asserts Signed-off-by: Zoltan Kis --- index.bs | 136 +++++++++++++++++++++++++++++-------------------------- 1 file changed, 71 insertions(+), 65 deletions(-) diff --git a/index.bs b/index.bs index d52a80c3..a394d073 100644 --- a/index.bs +++ b/index.bs @@ -965,7 +965,7 @@ The {{MLOperand}} objects are created by the methods of {{MLGraphBuilder}}, inte To create MLOperand given |builder| and |desc|, run the following steps:
- 1. If |builder| is not an instance of {{MLGraphBuilder}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |builder| is {{MLGraphBuilder}}. 1. If |desc| is not an [=object=] that [=implements=] {{MLOperandDescriptor}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. Let |operand| be a new [=object=]. 1. Set |operand|.{{MLOperand/[[builder]]}} to |builder|. @@ -979,7 +979,7 @@ The {{MLOperand}} objects are created by the methods of {{MLGraphBuilder}}, inte To copy MLOperand given |operand|, run the following steps:
- 1. If |operand| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" and stop. + 1. [=Assert=]: the type of |operand| is {{MLOperand}}. 1. Let |result| be a new [=object=]. 1. Set |result|.{{MLOperand/[[builder]]}} to |operand|.{{MLOperand/[[builder]]}}. 1. Set |result|.{{MLOperand/[[descriptor]]}} to |operand|.{{MLOperand/[[descriptor]]}}. @@ -1006,7 +1006,7 @@ The {{MLOperand}} objects are created by the methods of {{MLGraphBuilder}}, inte To validate MLOperand given |operand| and |builder|, run the following steps:
- 1. If |operand|.{{MLOperand/[[builder]]}} is not an instance of {{MLGraphBuilder}}, return `false`. + 1. [=Assert=]: the type of |operand|.{{MLOperand/[[builder]]}} is {{MLGraphBuilder}}. 1. If |builder| is not `undefined` and is not equal to |operand|.{{MLOperand/[[builder]]}}, return `false`. 1. Let |desc| be |operand|.{{MLOperand/[[descriptor]]}}. 1. If |desc| is not an [=object=] that [=implements=] {{MLOperandDescriptor}}, return `false`. @@ -1057,7 +1057,7 @@ The {{MLActivation}} objects (including the ones passed as input to methods) are To create MLActivation given |builder|, |name|, |options| and |init-steps|, run the following steps:
- 1. If |builder| is not an instance of {{MLGraphBuilder}}, throw a "{{TypeError}}" and abort these steps. + 1. [=Assert=]: the type of |builder| is {{MLGraphBuilder}}. 1. If |name| is `undefined` or `null`, throw a "{{TypeError}}" and abort these steps. 1. Let |activation| be a new [=object=]. 1. Set |activation|.{{MLActivation/[[builder]]}} to |builder|. @@ -1653,10 +1653,12 @@ Build a composed graph up to a given output operand into a computational graph,
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
- 1. If |outputs| is not an instance of {{MLNamedOperands}} or otherwise if empty, then throw an "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |outputs| is {{MLNamedOperands}}. + 1. If |outputs| is empty, then throw an "{{TypeError}}" {{DOMException}} and stop. 1. For each |element| in |outputs|: - 1. If |element|.key is not a [=string=] or otherwise if empty, then throw an "{{TypeError}}" {{DOMException}} and stop. - 1. If |element|.value is not an instance of {{MLOperand}}, then throw an "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |element|.key is [=string=]. + 1. If |element|.key is empty, then throw an "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |element|.value is {{MLOperand}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |graph| be a new {{MLGraph}}: 1. Set |graph|.{{MLGraph/[[context]]}} to [=this=].{{MLGraphBuilder/[[context]]}}. @@ -2141,7 +2143,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/conv2d(input, filter, options)}} steps are:
- 1. If |input| or |filter| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input| and |filter| is {{MLOperand}}. 1. Let |input_size| be the size of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. Let |filter_size| be the size of |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. If |input_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. @@ -2162,10 +2164,12 @@ partial interface MLGraphBuilder { 1. Else if |input_size| % |options|.{{MLConv2dOptions/groups}} is not 0, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLConv2dOptions/inputLayout}} is `undefined`, set it to `"nchw"`. 1. If |options|.{{MLConv2dOptions/filterLayout}} is `undefined`, set it to `"oihw"`. - 1. If |options|.{{MLConv2dOptions/bias}} [=map/exists=] and it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLConv2dOptions/bias}} [=map/exists=]: + 1. [=Assert=]: the type of |options|.{{MLConv2dOptions/bias}} is {{MLOperand}}. 1. If the length of |options|.{{MLConv2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 1, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |options|.{{MLConv2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}} is not the same as |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}, then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. If |options|.{{MLConv2dOptions/activation}} [=map/exists=] and it is not an instance of {{MLActivation}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLConv2dOptions/activation}} [=map/exists=]: + 1. [=Assert=]: the type of |options|.{{MLConv2dOptions/activation}} is {{MLActivation}}. 1. Let |output_shape| be the result of invoking the underlying implementation for calculating output dimensions, given |options|. 1. If |output_shape| is not the same as the shape of |options|.{{MLConv2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}, then throw a "{{DataError}}" {{DOMException}} and stop. 1. Let |desc| a new {{MLOperandDescriptor}}. @@ -2318,7 +2322,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/convTranspose2d(input, filter, options)}} steps are:
- 1. If |input| or |filter| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input| and |filter| is {{MLOperand}}. 1. Let |input_size| be the size of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. Let |filter_size| be the size of |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. If |input_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. @@ -2343,10 +2347,12 @@ partial interface MLGraphBuilder { 1. Else if |input_size| % |options|.{{MLConvTranspose2dOptions/groups}} is not 0, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLConvTranspose2dOptions/inputLayout}} is `undefined`, set it to `"nchw"`. 1. If |options|.{{MLConvTranspose2dOptions/filterLayout}} is `undefined`, set it to `"iohw"`. - 1. If |options|.{{MLConvTranspose2dOptions/bias}} [=map/exists=] and it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLConvTranspose2dOptions/bias}} [=map/exists=]: + 1. [=Assert=]: the type of |options|.{{MLConvTranspose2dOptions/bias}} is {{MLOperand}}. 1. If the length of |options|.{{MLConvTranspose2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 1, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |options|.{{MLConvTranspose2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}} is not the same as |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}, then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. If |options|.{{MLConvTranspose2dOptions/activation}} [=map/exists=] and it is not an instance of {{MLActivation}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLConvTranspose2dOptions/activation}} [=map/exists=]: + 1. [=Assert=]: the type of |options|.{{MLConvTranspose2dOptions/activation}} is {{MLActivation}}. 1. Let |output_shape| be the result of invoking the underlying implementation for calculating output dimensions, given |options|. 1. If |output_shape| is not the same as the shape of |options|.{{MLConvTranspose2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}, then throw a "{{DataError}}" {{DOMException}} and stop. 1. Let |desc| a new {{MLOperandDescriptor}}. @@ -2411,7 +2417,7 @@ partial interface MLGraphBuilder {
1. [=Assert=]: |op| is one of "add", "sub", "mul", "div", "max", "min", "pow". - 1. If |a| or |b| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |a| and |b| is {{MLOperand}}. 1. If |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}} is not equal to |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}, then throw a "{{DataError}}" {{DOMException}} and stop. 1. Let |descriptor| be a new {{MLOperandDescriptor}}. 1. Set |descriptor|.{{MLOperandDescriptor/type}} to |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. @@ -2530,7 +2536,7 @@ partial interface MLGraphBuilder {
1. [=Assert=]: |op| is one of "abs", "ceil", "cos", "exp", "floor", "log", "neg", "sin", "tan". - 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. Let |kind| be `"output"`. 1. Let |descriptor| be a new {{MLOperandDescriptor}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. @@ -2753,7 +2759,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/gemm(a, b, options)}} steps are:
- 1. If |a| or |b| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |a| and |b| is {{MLOperand}}. 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLGemmOptions/alpha}} is `undefined`, set it to `1.0`. 1. If |options|.{{MLGemmOptions/beta}} is `undefined`, set it to `1.0`. @@ -2891,18 +2897,18 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/gru(input, weight, recurrentWeight, steps, hiddenSize, options)}} steps are:
- 1. If |input|, |weight| or |recurrentWeight| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input|, |weight| and |recurrentWeight| is {{MLOperand}}. 1. If the rank of |input| or |weight| is not `3`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If the rank of |weight| or |recurrentWeight| is not `2`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLGruOptions/bias}} [=map/exists=]. - 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `2`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLGruOptions/recurrentBias}} [=map/exists=]. - 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `2`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLGruOptions/initialHiddenState}} [=map/exists=]. - 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `3`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLGruOptions/resetAfter}} is `undefined`, set it to `true`. 1. If |options|.{{MLGruOptions/returnSequence}} is `undefined`, set it to `false`. @@ -2910,7 +2916,7 @@ partial interface MLGraphBuilder { 1. If |options|.{{MLGruOptions/direction}} is not one of {{MLRecurrentNetworkDirection}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |options|.{{MLGruOptions/layout}} is `undefined`, set it to `"zrn"`. 1. If |options|.{{MLGruOptions/layout}} is not one of {{MLGruWeightLayout}}, then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. If |options|.{{MLGruOptions/activations}} [=map/exists=] and is not an array of size `2`, or if any of its elements is not an instance of {{MLActivation}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLGruOptions/activations}} [=map/exists=] and is not an array of {{MLActivation}} objects with size `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |steps| is not a [=number=] or it is `0`, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. Let |output| be an empty sequence of {{MLOperand}} objects. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. @@ -3046,20 +3052,20 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/gruCell(input, weight, recurrentWeight, hiddenState, hiddenSize, options)}} steps are:
- 1. If |input|, |weight| or |recurrentWeight| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input|, |weight| and |recurrentWeight| is {{MLOperand}}. 1. If the rank of |input| or |weight| is not `3`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If the rank of |weight| or |recurrentWeight| is not `2`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options| is `undefined`, let |options| be an empty [=object=]. - 1. If |options|.{{MLGruOptions/bias}} [=map/exists=]. - 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLGruOptions/bias}} [=map/exists=]: + 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `1`, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLGruOptions/recurrentBias}} [=map/exists=]. - 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLGruOptions/recurrentBias}} [=map/exists=]: + 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `1`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLGruOptions/resetAfter}} is `undefined`, set it to `true`. 1. If |options|.{{MLGruOptions/layout}} is `undefined`, set it to `"zrn"`. 1. If |options|.{{MLGruOptions/layout}} is not one of {{MLGruWeightLayout}}, then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. If |options|.{{MLGruOptions/activations}} [=map/exists=] and is not an array of size `2`, or if any of its elements is not an instance of {{MLActivation}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLGruOptions/activations}} [=map/exists=] and is not an array of {{MLActivation}} objects with size `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. Let |desc| a new {{MLOperandDescriptor}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |input|.{{MLOperandDescriptor/dimensions}}[0], |hiddenSize| ]. 1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. @@ -3414,12 +3420,12 @@ The {{MLInstanceNormalizationOptions}} members are: The {{MLGraphBuilder/instanceNormalization(input, options)}} steps are:
- 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If the rank of |input| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options| is `undefined`, let |options| be an empty [=object=]. - 1. If |options|.{{MLInstanceNormalizationOptions/scale}} is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |options|.{{MLInstanceNormalizationOptions/scale}} is {{MLOperand}}. 1. If the rank of |options|.{{MLInstanceNormalizationOptions/scale}} is not equal to the size of the channel dimension of |input|, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLInstanceNormalizationOptions/bias}} is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |options|.{{MLInstanceNormalizationOptions/bias}} is {{MLOperand}}. 1. If the rank of |options|.{{MLInstanceNormalizationOptions/bias}} is not equal to the size of the channel dimension of |input|, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLInstanceNormalizationOptions/epsilon}} is `undefined`, let it be `0.00001`. 1. If |options|.{{MLInstanceNormalizationOptions/layout}} is `undefined`, let it be `"nchw"`. @@ -3777,35 +3783,35 @@ partial interface MLGraphBuilder { 1. If |options|.{{MLLstmOptions/direction}} is `undefined`, set it to `"forward"`. 1. If |options|.{{MLLstmOptions/direction}} is not one of {{MLRecurrentNetworkDirection}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. Let |num_directions| be `1` if |options|.{{MLLstmOptions/direction}} is `"forward"`, or otherwise let it be `2`. - 1. If |input|, |weight| or |recurrentWeight| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input|, |weight| and |recurrentWeight| is {{MLOperand}}.
The shape of |input|, |weight| or |recurrentWeight| could be also checked here.
1. If |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not equal to |steps|, then throw a "{{DataError}}" {{DOMException}} and stop. 1. Let |batch_size| be |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1]. - 1. If |options|.{{MLLstmOptions/bias}} [=map/exists=]. - 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/bias}} [=map/exists=]: + 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `2`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |num_directions|, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not 4 * |hiddenSize|, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLLstmOptions/recurrentBias}} [=map/exists=]. - 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/recurrentBias}} [=map/exists=]: + 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `2`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmOptions/recurrentBias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |num_directions|, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmOptions/recurrentBias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not 4 * |hiddenSize|, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLLstmOptions/peepholeWeight}} [=map/exists=]. - 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/peepholeWeight}} [=map/exists=]: + 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `2`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmOptions/peepholeWeight}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |num_directions|, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmOptions/peepholeWeight}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not 4 * |hiddenSize|, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLLstmOptions/initialHiddenState}} [=map/exists=]. - 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/initialHiddenState}} [=map/exists=]: + 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `3`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmOptions/initialHiddenState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |num_directions|, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmOptions/initialHiddenState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not equal to |batch_size|, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmOptions/initialHiddenState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[2] is not |hiddenSize|, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLLstmOptions/initialCellState}} [=map/exists=]. - 1. 
If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/initialCellState}} [=map/exists=]: + 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `3`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmOptions/initialCellState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |num_directions|, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmOptions/initialCellState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not equal to |batch_size|, then throw a "{{DataError}}" {{DOMException}} and stop. @@ -3815,7 +3821,7 @@ partial interface MLGraphBuilder { 1. If |options|.{{MLLstmOptions/layout}} is not one of {{MLLstmWeightLayout}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmOptions/activations}} [=map/exists=]: 1. If it is not an array of size `3`, then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. If any of its elements is not an instance of {{MLActivation}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of its elements is {{MLActivation}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |desc| a new {{MLOperandDescriptor}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |nume_directions|, |batch_size|, |hiddenSize| ]. @@ -3977,27 +3983,27 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/lstmCell(input, weight, recurrentWeight, hiddenState, cellState, hiddenSize, options)}} steps are:
- 1. If |input|, |weight|, |recurrentWeight|, |hiddenState| or |cellState| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input|, |weight|, |recurrentWeight|, |hiddenState| and |cellState| is {{MLOperand}}. 1. If the rank of |input|, |weight|, |recurrentWeight|, |hiddenState| or |cellState| is not `2`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. Let |batch_size| be |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0]. 1. If |options| is `undefined`, let |options| be an empty [=object=]. - 1. If |options|.{{MLLstmCellOptions/bias}} [=map/exists=]. - 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmCellOptions/bias}} [=map/exists=]: + 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `1`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmCellOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not 4 * |hiddenSize|, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLLstmCellOptions/recurrentBias}} [=map/exists=]. - 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmCellOptions/recurrentBias}} [=map/exists=]: + 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `1`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmCellOptions/recurrentBias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not 4 * |hiddenSize|, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLLstmCellOptions/peepholeWeight}} [=map/exists=]. - 1. If it is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmCellOptions/peepholeWeight}} [=map/exists=]: + 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `1`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmCellOptions/peepholeWeight}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not 3 * |hiddenSize|, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmCellOptions/layout}} is `undefined`, set it to `"iofg"`. 1. If |options|.{{MLLstmCellOptions/layout}} is not one of {{MLLstmWeightLayout}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmCellOptions/activations}} [=map/exists=]: 1. If it is not an array of size `3`, then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. If any of its elements is not an instance of {{MLActivation}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of its elements is {{MLActivation}}. 1. Let |desc| a new {{MLOperandDescriptor}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |batch_size|, |hiddenSize| ]. 1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. @@ -4181,7 +4187,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/matmul(a, b)}} steps are:
- 1. If |a| or |b| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |a| and |b| is {{MLOperand}}. 1. Let |desc| a new {{MLOperandDescriptor}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to the result of invoking the calculate matmul output sizes given |a| and |b|. 1. Set |desc|.{{MLOperandDescriptor/type}} to |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. @@ -4266,7 +4272,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/pad(input, beginningPadding, endingPadding, options)}} steps are:
- 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If |beginningPadding| or |endingPadding| is not a sequence of {{unsigned long}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLPadOptions/mode}} is `undefined`, set it to `"constant"`. @@ -4450,7 +4456,7 @@ partial interface MLGraphBuilder {
1. [=Assert=]: |op| is one of "averagePool2d", "l2Pool2d", "maxPool2d". - 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If the length of of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 4, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options| is `undefined`, let |options| be a new {{MLPool2dOptions}} object. 1. If |options|.{{MLPool2dOptions/outputSizes}} [=map/exists=], or if |options|.{{MLPool2dOptions/padding}} is `undefined`, set |options|.{{MLPool2dOptions/padding}} to `[0, 0, 0, 0]`. @@ -4528,7 +4534,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/prelu(input, slope)}} steps are:
- 1. If |input| or |slope| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input| and |slope| is {{MLOperand}}. 1. Let |descriptor| be a new {{MLOperandDescriptor}}. 1. Set |descriptor|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. 1. Let |descriptor|.{{MLOperandDescriptor/dimensions}} be the result of running the [=MLGraphBuilder/broadcast-shapes=] steps given |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |slope|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. @@ -4615,7 +4621,7 @@ partial interface MLGraphBuilder {
1. [=Assert=]: |op| is one of "reduceL1", "reduceL2", "reduceLogSum", "reduceLogSumExp", "reduceMax", "reduceMean", "reduceMin", "reduceProduct", "reduceSum", "reduceSumSquare". - 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If |options| is `undefined`, let |options| be a new {{MLReduceOptions}} object with |options|.{{MLReduceOptions/keepDimensions}} set to `false` and |options|.{{MLReduceOptions/axes}} set to `null`. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. @@ -4723,7 +4729,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/relu(input)}} steps are:
- 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. 1. Make a request to the underlying platform to: @@ -4896,7 +4902,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/reshape(input, newShape)}} steps are:
- 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. Let |outputShape| be an empty array of {{unsigned long}}. 1. If |newShape| is a scalar [=number=], set |outputShape| to `[ 1 ]`. 1. Otherwise, if |newShape| is an array of {{unsigned long}}: @@ -4963,7 +4969,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/sigmoid(input)}} steps are:
- 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. 1. Make a request to the underlying platform to: @@ -5018,7 +5024,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/slice(input, starts, sizes)}} steps are:
- 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If |starts| or |sizes| is not a sequence of {{long}}, then throw a "{{TypeError}}" {{DOMException}} and stop. 1. If |sizes|.size is 0, then throw a "{{TypeError}}" {{DOMException}} and stop.
@@ -5082,7 +5088,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/softmax(input)}} steps are:
- 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If the length of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 2, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. @@ -5258,7 +5264,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/softsign(input)}} steps are:
- 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. 1. Make a request to the underlying platform to: @@ -5327,7 +5333,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/split(input, splits, options)}} steps are:
- 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLSplitOptions/axis}} is `undefined`, let |options|.{{MLSplitOptions/axis}} be `0`. 1. If |splits| is not {{unsigned long}} or a sequence of {{unsigned long}}, then throw a "{{TypeError}}" {{DOMException}} and stop. @@ -5403,7 +5409,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/squeeze(input, options)}} steps are:
- 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLSqueezeOptions/axes}} [=map/exists=], then: 1. Let |dimensions| be |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. @@ -5462,7 +5468,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/tanh(input)}} steps are:
- 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. 1. Make a request to the underlying platform to: @@ -5531,7 +5537,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/transpose(input, options)}} steps are:
- 1. If |input| is not an instance of {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLTransposeOptions/permutation}} is `undefined`, let |options|.{{MLTransposeOptions/permutation}} be the reversed sequence of all indices for |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. Otherwise if |options|.{{MLTransposeOptions/permutation}} [=map/exists=]: From 434a9e09d172840b08727a16b6bc2ecd28681917 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Tue, 15 Aug 2023 23:04:09 +0300 Subject: [PATCH 074/112] Fix #450: use TypeError simple exceptions Signed-off-by: Zoltan Kis --- index.bs | 125 +++++++++++++++++++++++++++---------------------------- 1 file changed, 62 insertions(+), 63 deletions(-) diff --git a/index.bs b/index.bs index a394d073..2abbbae1 100644 --- a/index.bs +++ b/index.bs @@ -966,7 +966,7 @@ The {{MLOperand}} objects are created by the methods of {{MLGraphBuilder}}, inte
1. [=Assert=]: the type of |builder| is {{MLGraphBuilder}}. - 1. If |desc| is not an [=object=] that [=implements=] {{MLOperandDescriptor}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |desc| is not an [=object=] that [=implements=] {{MLOperandDescriptor}}, then [=exception/throw=] a {{TypeError}} and stop. 1. Let |operand| be a new [=object=]. 1. Set |operand|.{{MLOperand/[[builder]]}} to |builder|. 1. Set |operand|.{{MLOperand/[[descriptor]]}} to |desc|. @@ -1607,9 +1607,9 @@ Create a named {{MLOperand}} based on a descriptor, that can be used as an input The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
1. Let |name| be the first argument. - 1. If |name| is `undefined` or an empty [=string=], then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |name| is `undefined` or an empty [=string=], then [=exception/throw=] a {{TypeError}} and stop. 1. Let |descriptor| be the second argument. - 1. If |descriptor| is not an an object that [=implements=] {{MLOperandDescriptor}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |descriptor| is not an an object that [=implements=] {{MLOperandDescriptor}}, then [=exception/throw=] a {{TypeError}} and stop. 1. [=Assert=]: If |descriptor|.{{MLOperandDescriptor/dimensions}} does not [=map/exist=], then |descriptor| defines a scalar input. 1. If |descriptor|.{{MLOperandDescriptor/dimensions}} [=map/exists=]: 1. If the [=check dimensions=] steps given |descriptor|.{{MLOperandDescriptor/type}} and |descriptor|.{{MLOperandDescriptor/dimensions}} return `false`, throw a "{{DataError}}" {{DOMException}} and stop. @@ -1654,10 +1654,10 @@ Build a composed graph up to a given output operand into a computational graph, The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
1. [=Assert=]: the type of |outputs| is {{MLNamedOperands}}. - 1. If |outputs| is empty, then throw an "{{TypeError}}" {{DOMException}} and stop. + 1. If |outputs| is empty, then [=exception/throw=] a {{TypeError}} and stop. 1. For each |element| in |outputs|: 1. [=Assert=]: the type of |element|.key is [=string=]. - 1. If |element|.key is empty, then throw an "{{TypeError}}" {{DOMException}} and stop. + 1. If |element|.key is empty, then [=exception/throw=] a {{TypeError}} and stop. 1. [=Assert=]: the type of |element|.value is {{MLOperand}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |graph| be a new {{MLGraph}}: @@ -1667,9 +1667,9 @@ Build a composed graph up to a given output operand into a computational graph, 1. Store a reference to |graphImpl| in |graph|.{{MLGraph/[[implementation]]}}. 1. Make a request to the underlying platform to initialize the graph: 1. For each |operand| in |outputs|: - 1. If running the validate MLOperand given |operand| and [=this=] returns `false`, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If running the validate MLOperand given |operand| and [=this=] returns `false`, then [=exception/throw=] a {{TypeError}} and stop. 1. If |operand| was created as an input by the underlying platform: - 1. If |operand|.{{MLOperand/[[name]]}}] is not unique for |graphImpl|, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |operand|.{{MLOperand/[[name]]}}] is not unique for |graphImpl|, then [=exception/throw=] a {{TypeError}} and stop. 1. Add |operand|.{{MLOperand/[[descriptor]]}} to |graph|.{{MLGraph/[[inputDescriptors]]}}[|operand|.{{MLOperand/[[name]]}}]. 1. If |operand| was created as a constant by the underlying platform: 1. Implementations MAY preprocess and optimize the tensor data of |operand| for the underlying platform. @@ -1699,11 +1699,11 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
1. Let |descriptor| be the first argument. - 1. If |descriptor| is not an an object that [=implements=] {{MLOperandDescriptor}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |descriptor| is not an an object that [=implements=] {{MLOperandDescriptor}}, then [=exception/throw=] a {{TypeError}} and stop. 1. If the [=byte length=] of |descriptor| is not supported by the underlying platform, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If the [=check dimensions=] steps given |descriptor|.{{MLOperandDescriptor/type}} and |descriptor|.{{MLOperandDescriptor/dimensions}} return `false`, throw a "{{DataError}}" {{DOMException}} and stop. 1. Let |bufferView| be the second argument. - 1. If invoking validate buffer with descriptor given |bufferView| and |descriptor| return `false`, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If invoking validate buffer with descriptor given |bufferView| and |descriptor| return `false`, then [=exception/throw=] a {{TypeError}} and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |operand| be the result of invoking the create MLOperand steps with [=this=] and |descriptor|. 1. Let |bytes| be the result of invoking [[=get a copy of the bytes held by the buffer source=]] given |bufferView|. @@ -1732,9 +1732,8 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods.
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
-    1. If |value| is not a [=number=], then throw a "{{TypeError}}" {{DOMException}} and stop.
-    1. If |type| is `undefined`, let |type| be `"float32"`.
-    1. Otherwise, if |type| is not one of {{MLOperandType}}, then throw a "{{TypeError}}" {{DOMException}} and stop.
+    1. If |value| is not a [=number=], then [=exception/throw=] a {{TypeError}} and stop.
+    1. If |type| is not one of {{MLOperandType}}, then [=exception/throw=] a {{TypeError}} and stop.
    1. Let |descriptor| be a new {{MLOperandDescriptor}}.
    1. Set |descriptor|.{{MLOperandDescriptor/type}} to |type|.
    1. Set |descriptor|.{{MLOperandDescriptor/dimensions}} to `undefined`.
@@ -1808,14 +1807,14 @@ partial interface MLGraphBuilder {
    1. Let |input| be the first argument. To validate |input|, run these substeps:
-        1. If |input| is not an [=object=] that [=implements=] {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and abort these steps.
+        1. If |input| is not an [=object=] that [=implements=] {{MLOperand}}, then [=exception/throw=] a {{TypeError}} and abort these steps.
    1. Let |mean| be the second argument, representing a vector with the moving mean values for |input|. To validate |mean|, run the following substeps:
-        1. If |mean| is not an [=object=] that [=implements=] {{MLOperand}}, then throw a "{{TypeError}}" {{DOMException}} and abort these steps.
-        1. If |mean|.{{MLOperand/[[descriptor]]}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not equal with |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} from which the dimension represented by |options|.axis is removed, then throw a "{{TypeError}}" {{DOMException}} and abort these steps.
+        1. If |mean| is not an [=object=] that [=implements=] {{MLOperand}}, then [=exception/throw=] a {{TypeError}} and abort these steps.
+        1. If |mean|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not equal to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} from which the dimension represented by |options|.axis is removed, then [=exception/throw=] a {{TypeError}} and abort these steps.
    1. Let |variance| be the third argument, representing the moving variance values of |input|.
    1. Let |options| be the fourth argument. To validate |options|, run these substeps:
        1. If |options|.axis does not [=map/exist=], let |options|."axis" be 1.
-        1. If |options|.axis is not a number between 0 and the rank of |input|, then throw a "{{TypeError}}" {{DOMException}} and abort these steps.
+        1. If |options|.axis is not a number between 0 and the rank of |input|, then [=exception/throw=] a {{TypeError}} and abort these steps.
        1. If |input| is a 4-D tensor of the *"nchw"* layout, set |options|.axis to 1.
        1. If |input| is a 4-D tensor of the *"nhwc"* layout, set |options|.axis to 3.
    1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop.
@@ -1921,7 +1920,7 @@ partial interface MLGraphBuilder {
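<div class=example>
    A non-normative sketch of building a graph around {{MLGraphBuilder/batchNormalization()}}, assuming an {{MLGraphBuilder}} named <code>builder</code> and an *"nchw"* input so that {{MLBatchNormalizationOptions/axis}} keeps its default value of 1 (the channel dimension). The statistics are illustrative placeholders.
    <pre highlight="js">
    // 1-D per-channel statistics; their length equals the size of dimension 1 of the input.
    const channels = 3;
    const statsDesc = {type: 'float32', dimensions: [channels]};
    const input = builder.input('input', {type: 'float32', dimensions: [1, channels, 224, 224]});
    const mean = builder.constant(statsDesc, new Float32Array([0.1, 0.2, 0.3]));
    const variance = builder.constant(statsDesc, new Float32Array([1.0, 1.0, 1.0]));
    const scale = builder.constant(statsDesc, new Float32Array([1.0, 1.0, 1.0]));
    const bias = builder.constant(statsDesc, new Float32Array([0.0, 0.0, 0.0]));
    const normalized = builder.batchNormalization(input, mean, variance,
                                                  {scale, bias, epsilon: 1e-5});
    </pre>
</div>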
1. Let |operand| be the first argument. 1. Let |options| be the second argument. - 1. If running the check clamp options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. If running the check clamp options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |operand|. 1. Make a request to the underlying platform to: @@ -1951,7 +1950,7 @@ partial interface MLGraphBuilder {
1. Let |options| be the first argument. - 1. If running the check clamp options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. If running the check clamp options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. Let |op| be the result of invoking the create MLActivation steps with `"clamp"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|. @@ -2148,15 +2147,15 @@ partial interface MLGraphBuilder { 1. Let |filter_size| be the size of |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. If |input_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |filter_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If the type of |input| and |filter| is not the same, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If the type of |input| and |filter| is not the same, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLConv2dOptions/padding}} is `undefined`, set it to `[0, 0, 0, 0]`. 1. Else if |options|.{{MLConv2dOptions/padding}}.length is not 4, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLConv2dOptions/strides}} is `undefined`, set it to `[1, 1]`. - 1. Else if |options|.{{MLConv2dOptions/strides}}.length is not `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. If any element in |options|.{{MLConv2dOptions/strides}} is equal to 0, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. Else if |options|.{{MLConv2dOptions/strides}}.length is not `2`, then [=exception/throw=] a {{TypeError}} and stop. + 1. If any element in |options|.{{MLConv2dOptions/strides}} is equal to 0, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options|.{{MLConv2dOptions/dilations}} is `undefined`, set it to `[1, 1]`. - 1. Else if |options|.{{MLConv2dOptions/dilations}}.length is not `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. Else if |options|.{{MLConv2dOptions/dilations}}.length is not `2`, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options|.{{MLConv2dOptions/autoPad}} is `undefined`, set it to `"explicit"`. 1. If |options|.{{MLConv2dOptions/groups}} is `undefined`, set it to `1`. 1. Else if |options|.{{MLConv2dOptions/groups}} is 0, then throw a "{{DataError}}" {{DOMException}} and stop. @@ -2166,8 +2165,8 @@ partial interface MLGraphBuilder { 1. If |options|.{{MLConv2dOptions/filterLayout}} is `undefined`, set it to `"oihw"`. 1. If |options|.{{MLConv2dOptions/bias}} [=map/exists=]: 1. [=Assert=]: the type of |options|.{{MLConv2dOptions/bias}} is {{MLOperand}}. - 1. If the length of |options|.{{MLConv2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 1, then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. If |options|.{{MLConv2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}} is not the same as |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If the length of |options|.{{MLConv2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 1, then [=exception/throw=] a {{TypeError}} and stop. + 1. 
If |options|.{{MLConv2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}} is not the same as |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options|.{{MLConv2dOptions/activation}} [=map/exists=]: 1. [=Assert=]: the type of |options|.{{MLConv2dOptions/activation}} is {{MLActivation}}. 1. Let |output_shape| be the result of invoking the underlying implementation for calculating output dimensions, given |options|. @@ -2327,19 +2326,19 @@ partial interface MLGraphBuilder { 1. Let |filter_size| be the size of |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. If |input_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |filter_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If the type of |input| and |filter| is not the same, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If the type of |input| and |filter| is not the same, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLConvTranspose2dOptions/padding}} is `undefined`, set it to `[0, 0, 0, 0]`. 1. Else if |options|.{{MLConvTranspose2dOptions/padding}}.length is not 4, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLConvTranspose2dOptions/strides}} is `undefined`, set it to `[1, 1]`. - 1. Else if |options|.{{MLConvTranspose2dOptions/strides}}.length is not `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. If any element in |options|.{{MLConv2dOptions/strides}} is equal to 0, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. Else if |options|.{{MLConvTranspose2dOptions/strides}}.length is not `2`, then [=exception/throw=] a {{TypeError}} and stop. + 1. If any element in |options|.{{MLConv2dOptions/strides}} is equal to 0, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options|.{{MLConvTranspose2dOptions/dilations}} is `undefined`, set it to `[1, 1]`. - 1. Else if |options|.{{MLConvTranspose2dOptions/dilations}}.length is not `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. Else if |options|.{{MLConvTranspose2dOptions/dilations}}.length is not `2`, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options|.{{MLConvTranspose2dOptions/outputPadding}} is `undefined`, set it to `[0, 0]`. - 1. Else if |options|.{{MLConvTranspose2dOptions/outputPadding}}.length is not `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. Else if |options|.{{MLConvTranspose2dOptions/outputPadding}}.length is not `2`, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options|.{{MLConvTranspose2dOptions/outputSizes}} [=map/exists=]: - 1. If |options|.{{MLConvTranspose2dOptions/outputSizes}}.length is not `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLConvTranspose2dOptions/outputSizes}}.length is not `2`, then [=exception/throw=] a {{TypeError}} and stop. 1. If the elements of |options|.{{MLConvTranspose2dOptions/outputSizes}} are not smaller than the elements at the same dimension (index) for |options|.{{MLConvTranspose2dOptions/strides}}, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLConvTranspose2dOptions/autoPad}} is `undefined`, set it to `"explicit"`. 1. If |options|.{{MLConvTranspose2dOptions/groups}} is `undefined`, set it to `1`. @@ -2349,8 +2348,8 @@ partial interface MLGraphBuilder { 1. 
If |options|.{{MLConvTranspose2dOptions/filterLayout}} is `undefined`, set it to `"iohw"`. 1. If |options|.{{MLConvTranspose2dOptions/bias}} [=map/exists=]: 1. [=Assert=]: the type of |options|.{{MLConvTranspose2dOptions/bias}} is {{MLOperand}}. - 1. If the length of |options|.{{MLConvTranspose2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 1, then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. If |options|.{{MLConvTranspose2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}} is not the same as |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If the length of |options|.{{MLConvTranspose2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 1, then [=exception/throw=] a {{TypeError}} and stop. + 1. If |options|.{{MLConvTranspose2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}} is not the same as |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options|.{{MLConvTranspose2dOptions/activation}} [=map/exists=]: 1. [=Assert=]: the type of |options|.{{MLConvTranspose2dOptions/activation}} is {{MLActivation}}. 1. Let |output_shape| be the result of invoking the underlying implementation for calculating output dimensions, given |options|. @@ -2668,7 +2667,7 @@ partial interface MLGraphBuilder {
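<div class=example>
    A non-normative sketch of {{MLGraphBuilder/conv2d()}} relying on the defaults filled in by the validation steps above (*"nchw"* input layout, *"oihw"* filter layout, explicit padding). The shapes and the <code>builder</code> variable are illustrative assumptions.
    <pre highlight="js">
    const input = builder.input('x', {type: 'float32', dimensions: [1, 4, 8, 8]});
    // "oihw" filter layout: [outputChannels, inputChannels / groups, height, width].
    const filter = builder.constant({type: 'float32', dimensions: [8, 2, 3, 3]},
                                    new Float32Array(8 * 2 * 3 * 3));
    const conv = builder.conv2d(input, filter, {
      padding: [1, 1, 1, 1],  // defaults to [0, 0, 0, 0] when not given
      strides: [1, 1],        // must have length 2 and contain no zeros
      dilations: [1, 1],
      groups: 2
    });
    </pre>
</div>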
1. Let |input| be the first argument. 1. Let |options| be the second argument. - 1. If running the check ELU options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. If running the check ELU options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. 1. Make a request to the underlying platform to: @@ -2699,7 +2698,7 @@ partial interface MLGraphBuilder {
1. Let |options| be the first argument. 1. If |options| is `undefined`, let |options| be a new {{MLEluOptions}} object. - 1. If running the check ELU options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. If running the check ELU options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. Let |op| be the result of invoking the create MLActivation steps with `"elu"` and |options|. 1. Return |op|.
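<div class=example>
    A non-normative sketch of the two {{MLGraphBuilder/elu()}} variants: the options-only form returns an {{MLActivation}} that can be fused into another operation, while the operand form returns a new {{MLOperand}}. <code>builder</code>, <code>input</code> and <code>filter</code> are assumed to exist already.
    <pre highlight="js">
    const eluActivation = builder.elu({alpha: 1.0});
    const conv = builder.conv2d(input, filter, {activation: eluActivation}); // fused activation
    const standalone = builder.elu(conv, {alpha: 0.5});                      // immediate operation
    </pre>
</div>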
@@ -2913,11 +2912,11 @@ partial interface MLGraphBuilder { 1. If |options|.{{MLGruOptions/resetAfter}} is `undefined`, set it to `true`. 1. If |options|.{{MLGruOptions/returnSequence}} is `undefined`, set it to `false`. 1. If |options|.{{MLGruOptions/direction}} is `undefined`, set it to `"forward"`. - 1. If |options|.{{MLGruOptions/direction}} is not one of {{MLRecurrentNetworkDirection}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLGruOptions/direction}} is not one of {{MLRecurrentNetworkDirection}}, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options|.{{MLGruOptions/layout}} is `undefined`, set it to `"zrn"`. - 1. If |options|.{{MLGruOptions/layout}} is not one of {{MLGruWeightLayout}}, then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. If |options|.{{MLGruOptions/activations}} [=map/exists=] and is not an array of {{MLActivation}} objects with size `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. If |steps| is not a [=number=] or it is `0`, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLGruOptions/layout}} is not one of {{MLGruWeightLayout}}, then [=exception/throw=] a {{TypeError}} and stop. + 1. If |options|.{{MLGruOptions/activations}} [=map/exists=] and is not an array of {{MLActivation}} objects with size `2`, then [=exception/throw=] a {{TypeError}} and stop. + 1. If |steps| is not a [=number=] or it is `0`, then [=exception/throw=] a {{TypeError}} and stop. 1. Let |output| be an empty sequence of {{MLOperand}} objects. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Make a request to the underlying platform to: @@ -3064,8 +3063,8 @@ partial interface MLGraphBuilder { 1. If its rank is not `1`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLGruOptions/resetAfter}} is `undefined`, set it to `true`. 1. If |options|.{{MLGruOptions/layout}} is `undefined`, set it to `"zrn"`. - 1. If |options|.{{MLGruOptions/layout}} is not one of {{MLGruWeightLayout}}, then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. If |options|.{{MLGruOptions/activations}} [=map/exists=] and is not an array of {{MLActivation}} objects with size `2`, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLGruOptions/layout}} is not one of {{MLGruWeightLayout}}, then [=exception/throw=] a {{TypeError}} and stop. + 1. If |options|.{{MLGruOptions/activations}} [=map/exists=] and is not an array of {{MLActivation}} objects with size `2`, then [=exception/throw=] a {{TypeError}} and stop. 1. Let |desc| a new {{MLOperandDescriptor}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |input|.{{MLOperandDescriptor/dimensions}}[0], |hiddenSize| ]. 1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. @@ -3256,7 +3255,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/hardSigmoid(input, options)}} method steps are: 1. Let |input| be the first argument. 1. Let |options| be the second argument. - 1. If running the check hard-sigmoid options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. If running the check hard-sigmoid options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. 
Let |output| be the result of invoking the copy MLOperand steps given |input|. 1. Make a request to the underlying platform to: @@ -3284,7 +3283,7 @@ partial interface MLGraphBuilder {
1. Let |options| be the first argument. - 1. If running the check hard-sigmoid options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. If running the check hard-sigmoid options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. Let |op| be the result of invoking the create MLActivation steps with `"hardSigmoid"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|. @@ -3548,7 +3547,7 @@ partial interface MLGraphBuilder { 1. Let |input| be the first argument. 1. Let |options| be the second argument. 1. If |options| is `undefined`, let |options| be a new {{MLLeakyReluOptions}} object. - 1. If running the check leaky-relu options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. If running the check leaky-relu options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. 1. Make a request to the underlying platform to: @@ -3578,7 +3577,7 @@ partial interface MLGraphBuilder {
1. Let |options| be the first argument. 1. If |options| is `undefined`, let |options| be a new {{MLLeakyReluOptions}} object. - 1. If running the check leaky-relu options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. If running the check leaky-relu options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. Let |op| be the result of invoking the create MLActivation steps with `"leakyRelu"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|. @@ -3659,7 +3658,7 @@ partial interface MLGraphBuilder {
1. Let |input| be the first argument. 1. Let |options| be the second argument. - 1. If running the check linear options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. If running the check linear options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. 1. Make a request to the underlying platform to: @@ -3688,7 +3687,7 @@ partial interface MLGraphBuilder {
1. Let |options| be the first argument. - 1. If running the check linear options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. If running the check linear options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. Let |op| be the result of invoking the create MLActivation steps with `"linear"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|. @@ -3781,7 +3780,7 @@ partial interface MLGraphBuilder {
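<div class=example>
    A non-normative sketch of {{MLGraphBuilder/linear()}}, which computes <code>alpha * x + beta</code> element-wise. <code>builder</code> and <code>x</code> are assumed to be an existing {{MLGraphBuilder}} and {{MLOperand}}.
    <pre highlight="js">
    const rescaled = builder.linear(x, {alpha: 0.5, beta: 1.0});      // returns a new MLOperand
    const linearActivation = builder.linear({alpha: 0.2, beta: 0.5}); // MLActivation for fusion
    </pre>
</div>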
1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLLstmOptions/direction}} is `undefined`, set it to `"forward"`. - 1. If |options|.{{MLLstmOptions/direction}} is not one of {{MLRecurrentNetworkDirection}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/direction}} is not one of {{MLRecurrentNetworkDirection}}, then [=exception/throw=] a {{TypeError}} and stop. 1. Let |num_directions| be `1` if |options|.{{MLLstmOptions/direction}} is `"forward"`, or otherwise let it be `2`. 1. [=Assert=]: the type of |input|, |weight| and |recurrentWeight| is {{MLOperand}}.
@@ -3818,9 +3817,9 @@ partial interface MLGraphBuilder { 1. If |options|.{{MLLstmOptions/initialCellState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[2] is not |hiddenSize|, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmOptions/returnSequence}} is `undefined`, set it to `false`. 1. If |options|.{{MLLstmOptions/layout}} is `undefined`, set it to `"iofg"`. - 1. If |options|.{{MLLstmOptions/layout}} is not one of {{MLLstmWeightLayout}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmOptions/layout}} is not one of {{MLLstmWeightLayout}}, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options|.{{MLLstmOptions/activations}} [=map/exists=]: - 1. If it is not an array of size `3`, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If it is not an array of size `3`, then [=exception/throw=] a {{TypeError}} and stop. 1. [=Assert=]: the type of its elements is {{MLActivation}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |desc| a new {{MLOperandDescriptor}}. @@ -4000,9 +3999,9 @@ partial interface MLGraphBuilder { 1. If its rank is not `1`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmCellOptions/peepholeWeight}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not 3 * |hiddenSize|, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmCellOptions/layout}} is `undefined`, set it to `"iofg"`. - 1. If |options|.{{MLLstmCellOptions/layout}} is not one of {{MLLstmWeightLayout}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLLstmCellOptions/layout}} is not one of {{MLLstmWeightLayout}}, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options|.{{MLLstmCellOptions/activations}} [=map/exists=]: - 1. If it is not an array of size `3`, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If it is not an array of size `3`, then [=exception/throw=] a {{TypeError}} and stop. 1. [=Assert=]: the type of its elements is {{MLActivation}}. 1. Let |desc| a new {{MLOperandDescriptor}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |batch_size|, |hiddenSize| ]. @@ -4273,10 +4272,10 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input| is {{MLOperand}}. - 1. If |beginningPadding| or |endingPadding| is not a sequence of {{unsigned long}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |beginningPadding| or |endingPadding| is not a sequence of {{unsigned long}}, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLPadOptions/mode}} is `undefined`, set it to `"constant"`. - 1. Otherwise, if |options|.{{MLPadOptions/mode}} is not one of {{MLPaddingMode}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. Otherwise, if |options|.{{MLPadOptions/mode}} is not one of {{MLPaddingMode}}, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options|.{{MLPadOptions/value}} is `undefined`, set it to `0`. 1. Let |desc| be a copy of |input|.{{MLOperand/[[descriptor]]}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to the result of invoking the calculate padding output sizes given |input|, |beginningPadding| and |endingPadding|. @@ -5025,8 +5024,8 @@ partial interface MLGraphBuilder {
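<div class=example>
    A non-normative sketch of {{MLGraphBuilder/pad()}}: padding a [2, 3] tensor with one extra row before and after the first dimension yields a [4, 3] output per the output size calculation referenced above. <code>builder</code> is assumed to exist.
    <pre highlight="js">
    const x = builder.input('x', {type: 'float32', dimensions: [2, 3]});
    // beginningPadding and endingPadding carry one entry per input dimension.
    const padded = builder.pad(x, [1, 0], [1, 0], {mode: 'constant', value: 0});
    </pre>
</div>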
1. [=Assert=]: the type of |input| is {{MLOperand}}. - 1. If |starts| or |sizes| is not a sequence of {{long}}, then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. If |sizes|.size is 0, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |starts| or |sizes| is not a sequence of {{long}}, then [=exception/throw=] a {{TypeError}} and stop. + 1. If |sizes|.size is 0, then [=exception/throw=] a {{TypeError}} and stop.
Further validation of |starts| and |sizes| given |input| is left [=implementation-defined=].
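<div class=example>
    A non-normative sketch of {{MLGraphBuilder/slice()}} consistent with the checks above: <code>starts</code> and <code>sizes</code> are sequences with one entry per sliced dimension. <code>builder</code> is assumed to exist.
    <pre highlight="js">
    const x = builder.input('x', {type: 'float32', dimensions: [4, 4]});
    // Take a 2x2 window starting at row 1, column 1.
    const window = builder.slice(x, [1, 1], [2, 2]);
    </pre>
</div>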
@@ -5191,7 +5190,7 @@ partial interface MLGraphBuilder {
1. Let |input| be the first argument. 1. Let |options| be the second argument. - 1. If running the check softplus options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. If running the check softplus options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. 1. Make a request to the underlying platform to: @@ -5220,7 +5219,7 @@ partial interface MLGraphBuilder {
1. Let |options| be the first argument. - 1. If running the check softplus options steps with |options| returns `false`, then throw a "{{TypeError}}" {{DOMException}} and abort these steps. + 1. If running the check softplus options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. Let |op| be the result of invoking the create MLActivation steps with `"softplus"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|. @@ -5336,7 +5335,7 @@ partial interface MLGraphBuilder { 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLSplitOptions/axis}} is `undefined`, let |options|.{{MLSplitOptions/axis}} be `0`. - 1. If |splits| is not {{unsigned long}} or a sequence of {{unsigned long}}, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |splits| is not {{unsigned long}} or a sequence of {{unsigned long}}, then [=exception/throw=] a {{TypeError}} and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. 1. Make a request to the underlying platform to: @@ -5415,7 +5414,7 @@ partial interface MLGraphBuilder { 1. Let |dimensions| be |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. For |index| between 0 and the size of |options|.{{MLSqueezeOptions/axes}}: 1. Let |oneDimIndex| be |options|.{{MLSqueezeOptions/axes}}[|index|]. - 1. If |dimensions|[|oneDimIndex|] is not `1`, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |dimensions|[|oneDimIndex|] is not `1`, then [=exception/throw=] a {{TypeError}} and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. 1. Make a request to the underlying platform to: @@ -5541,10 +5540,10 @@ partial interface MLGraphBuilder { 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLTransposeOptions/permutation}} is `undefined`, let |options|.{{MLTransposeOptions/permutation}} be the reversed sequence of all indices for |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. Otherwise if |options|.{{MLTransposeOptions/permutation}} [=map/exists=]: - 1. If |options|.{{MLTransposeOptions/permutation}} is not a sequence of {{unsigned long}}, then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. If the rank of |options|.{{MLTransposeOptions/permutation}} is not the same as the rank of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}, then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. If the values in |options|.{{MLTransposeOptions/permutation}} are not between `0` and the rank of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} minus `1`, then throw a "{{TypeError}}" {{DOMException}} and stop. - 1. If the values in |options|.{{MLTransposeOptions/permutation}} contain duplicate value, then throw a "{{TypeError}}" {{DOMException}} and stop. + 1. If |options|.{{MLTransposeOptions/permutation}} is not a sequence of {{unsigned long}}, then [=exception/throw=] a {{TypeError}} and stop. + 1. 
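<div class=example>
    A non-normative sketch of the two {{MLGraphBuilder/softplus()}} variants, assuming an existing <code>builder</code> and operand <code>x</code>; the comment reflects the common softplus definition with a steepness factor.
    <pre highlight="js">
    // Roughly ln(1 + exp(steepness * x)) / steepness, applied element-wise.
    const smoothed = builder.softplus(x, {steepness: 1});
    const softplusActivation = builder.softplus({steepness: 2}); // MLActivation for fusion
    </pre>
</div>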
If the rank of |options|.{{MLTransposeOptions/permutation}} is not the same as the rank of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}, then [=exception/throw=] a {{TypeError}} and stop. + 1. If the values in |options|.{{MLTransposeOptions/permutation}} are not between `0` and the rank of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} minus `1`, then [=exception/throw=] a {{TypeError}} and stop. + 1. If the values in |options|.{{MLTransposeOptions/permutation}} contain duplicate value, then [=exception/throw=] a {{TypeError}} and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. 1. Make a request to the underlying platform to: From 7e6177f467889b4e045b1ff280273d74b845fe6b Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Wed, 16 Aug 2023 14:42:22 +0300 Subject: [PATCH 075/112] Fix #450: remove unnecessary checks that are covered by WebIDL bindings Signed-off-by: Zoltan Kis --- index.bs | 232 ++++++++++++++----------------------------------------- 1 file changed, 58 insertions(+), 174 deletions(-) diff --git a/index.bs b/index.bs index 2abbbae1..c00028c0 100644 --- a/index.bs +++ b/index.bs @@ -811,13 +811,12 @@ Its default allowlist is 'self'. ### The {{ML/createContext()}} method ### {#api-ml-createcontext}
- The {{ML/createContext()}} method steps are: + The {{ML/createContext(options)}} method steps are:
1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, return [=a new promise=] [=rejected=] with a "{{SecurityError}}" {{DOMException}} and abort these steps. 1. Let |promise| be [=a new promise=]. 1. Return |promise| and run the following steps [=in parallel=]. - 1. Let |options| be the first argument. 1. Run the create context steps given |options|: 1. Let |context| be a new {{MLContext}} object. 1. If |options| is a {{GPUDevice}} object, @@ -836,11 +835,10 @@ Its default allowlist is 'self'. ### The {{ML/createContextSync()}} method ### {#api-ml-createcontextsync}
- The {{ML/createContextSync()}} method steps are: + The {{ML/createContextSync(options)}} method steps are:
1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, throw a "{{SecurityError}}" {{DOMException}} and abort these steps. - 1. Let |options| be the first argument. 1. Let |context| be the result of running the create context steps given |options|. 1. If the validate MLContext steps given |context| return `false`, throw a "{{NotSupportedError}}" {{DOMException}} and abort these steps. 1. Return |context|. @@ -1578,11 +1576,10 @@ Both {{MLGraphBuilder}}.{{MLGraphBuilder/build()}} and {{MLGraphBuilder}}.{{MLGr ### The {{MLGraphBuilder}} constructor ### {#api-mlgraphbuilder-constructor}
- The [=new=] {{MLGraphBuilder}} constructor steps are: + The [=new=] {{MLGraphBuilder(context)}} constructor steps are:
1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, throw a "{{SecurityError}}" {{DOMException}} and abort these steps. - 1. Let |context| be the first argument. 1. If the validate MLContext steps given |context| return `false`, throw a "{{TypeError}}" and abort these steps. 1. Set {{MLGraphBuilder/[[context]]}} to |context|.
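<div class=example>
    A non-normative sketch of obtaining an {{MLContext}} and constructing an {{MLGraphBuilder}} with the promise-based {{ML/createContext()}} steps above. The option-less call is illustrative; per the create context steps, a {{GPUDevice}} may also be passed.
    <pre highlight="js">
    const context = await navigator.ml.createContext();
    const builder = new MLGraphBuilder(context);
    // builder.input(), builder.constant() and the operation methods can now be used to
    // describe a graph, which is then compiled with builder.build() or builder.buildSync().
    </pre>
</div>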
@@ -1606,14 +1603,12 @@ Create a named {{MLOperand}} based on a descriptor, that can be used as an input
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
- 1. Let |name| be the first argument. - 1. If |name| is `undefined` or an empty [=string=], then [=exception/throw=] a {{TypeError}} and stop. - 1. Let |descriptor| be the second argument. - 1. If |descriptor| is not an an object that [=implements=] {{MLOperandDescriptor}}, then [=exception/throw=] a {{TypeError}} and stop. - 1. [=Assert=]: If |descriptor|.{{MLOperandDescriptor/dimensions}} does not [=map/exist=], then |descriptor| defines a scalar input. - 1. If |descriptor|.{{MLOperandDescriptor/dimensions}} [=map/exists=]: - 1. If the [=check dimensions=] steps given |descriptor|.{{MLOperandDescriptor/type}} and |descriptor|.{{MLOperandDescriptor/dimensions}} return `false`, throw a "{{DataError}}" {{DOMException}} and stop. - 1. If the [=byte length=] of |descriptor| is not supported by the underlying platform, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |name| is empty, then [=exception/throw=] a {{TypeError}} and stop. + 1. [=Assert=]: the type of |descriptor| is {{MLOperandDescriptor}}. + 1. [=Assert=]: If |descriptor|.{{MLOperandDescriptor/dimensions}} does not [=map/exist=], then |descriptor| defines a scalar input. + 1. If |descriptor|.{{MLOperandDescriptor/dimensions}} [=map/exists=]: + 1. If the [=check dimensions=] steps given |descriptor|.{{MLOperandDescriptor/type}} and |descriptor|.{{MLOperandDescriptor/dimensions}} return `false`, throw a "{{DataError}}" {{DOMException}} and stop. + 1. If the [=byte length=] of |descriptor| is not supported by the underlying platform, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |operand| be the result of invoking the create MLOperand steps with [=this=] and |descriptor|. 1. Set |operand|.{{MLOperand/[[name]]}} to |name|. @@ -1698,12 +1693,10 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods.
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
- 1. Let |descriptor| be the first argument. - 1. If |descriptor| is not an an object that [=implements=] {{MLOperandDescriptor}}, then [=exception/throw=] a {{TypeError}} and stop. - 1. If the [=byte length=] of |descriptor| is not supported by the underlying platform, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If the [=check dimensions=] steps given |descriptor|.{{MLOperandDescriptor/type}} and |descriptor|.{{MLOperandDescriptor/dimensions}} return `false`, throw a "{{DataError}}" {{DOMException}} and stop. - 1. Let |bufferView| be the second argument. - 1. If invoking validate buffer with descriptor given |bufferView| and |descriptor| return `false`, then [=exception/throw=] a {{TypeError}} and stop. + 1. [=Assert=]: the type of |descriptor| is {{MLOperandDescriptor}}. + 1. If the [=byte length=] of |descriptor| is not supported by the underlying platform, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If the [=check dimensions=] steps given |descriptor|.{{MLOperandDescriptor/type}} and |descriptor|.{{MLOperandDescriptor/dimensions}} return `false`, throw a "{{DataError}}" {{DOMException}} and stop. + 1. If invoking validate buffer with descriptor given |bufferView| and |descriptor| return `false`, then [=exception/throw=] a {{TypeError}} and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |operand| be the result of invoking the create MLOperand steps with [=this=] and |descriptor|. 1. Let |bytes| be the result of invoking [[=get a copy of the bytes held by the buffer source=]] given |bufferView|. @@ -1803,17 +1796,14 @@ partial interface MLGraphBuilder {
- The {{MLGraphBuilder/batchNormalization()}} method steps are: + The {{MLGraphBuilder/batchNormalization(input, mean, variance, options)}} method steps are:
- 1. Let |input| be the first argument. To validate |input|, run these substeps: - 1. If |input| is not an [=object=] that [=implements=] {{MLOperand}}, then [=exception/throw=] a {{TypeError}} and abort these steps. - 1. Let |mean| be the second argument, representing a vector with the moving mean values for |input|. To validate |mean|, run the following substeps: - 1. If |mean| is not an [=object=] that [=implements=] {{MLOperand}}, then [=exception/throw=] a {{TypeError}} and abort these steps. + 1. [=Assert=]: the type of |input|, |mean| and |variance| is {{MLOperand}}. + 1. [=Assert=]: the type of |mean| is + 1. To validate |mean|, run the following substeps: 1. If |mean|.{{MLOperand/[[descriptor]]}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not equal with |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} from which the dimension represented by |options|.axis is removed, then [=exception/throw=] a {{TypeError}} and abort these steps. - 1. Let |variance| be the third argument, representing the moving variance values of |input|. - 1. Let |options| be the fourth argument. To validate |options|, run these substeps: - 1. If |options|.axis does not [=map/exist=], let |options|."axis" be 1. + 1. To validate |options|, run these substeps: 1. If |options|.axis is not a number between 0 and the rank of |input|, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. If |input| is a 4-D tensor of the *"nchw"* layout, set |options|.axis to 1. 1. If |input| is a 4-D tensor of the *"nhwc"* layout, set |options|.axis to 3. @@ -1895,8 +1885,6 @@ partial interface MLGraphBuilder { To check clamp options given |options|, run the following steps:
- 1. If |options| is not an [=object=] that [=implements=] {{MLClampOptions}}, then return `false`. - 1. If |options|.{{MLClampOptions/minValue}} and |options|.{{MLClampOptions/maxValue}} are not a [=numeric type=], then return `false`. 1. If |options|.{{MLClampOptions/minValue}} is greater than |options|.{{MLClampOptions/maxValue}}, then return `false`. 1. Return `true`.
@@ -1918,9 +1906,8 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/clamp(operand, options)}} method steps are:
- 1. Let |operand| be the first argument. - 1. Let |options| be the second argument. - 1. If running the check clamp options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. + 1. [=Assert=]: the type of |operand| is {{MLOperand}}. + 1. If running the check clamp options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |operand|. 1. Make a request to the underlying platform to: @@ -1949,8 +1936,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/clamp(options)}} method steps are:
- 1. Let |options| be the first argument. - 1. If running the check clamp options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. + 1. If running the check clamp options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. Let |op| be the result of invoking the create MLActivation steps with `"clamp"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|. @@ -1984,36 +1970,33 @@ partial interface MLGraphBuilder {
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
- 1. Let |inputs| be the first argument. - 1. [=Assert=]: the type of |inputs| is sequence of {{MLOperand}} objects. - 1. [=Assert=]: the type of |axis| is `unsigned long`. - 1. [=Assert=]: the shape, i.e. {{MLOperandDescriptor/dimensions}}) of each operand in |inputs| is the same, except on the dimension given by |axis| on which they are concatenated. - 1. [=Assert=]: the {{MLOperandDescriptor/type}} of each operand in |inputs| is the same. - 1. If any of the following steps fail, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |inputs| is not an array of [=objects=], fail. - 1. If |axis| is not a positive integer [=number=], fail. - 1. If |axis| is greater than or equal to the rank of |inputs|, fail. - 1. Let |desc| be |inputs|[0].{{MLOperand/[[descriptor]]}}. - 1. Let |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] be `0`. - 1. For each |index| between 0 and the rank of |inputs|: - 1. If running validate MLOperand given |inputs|[|index|] and [=this=] returns `false`, then fail. - 1. For each |dim| between 0 and the rank of |inputs|[|index|]: -
- If the shape of each corresponding dimension and type of the operands, except for those of the dimension given by |axis|, is not the same, fail. -
- 1. If |dim| is not equal to |axis| and if |inputs|[|index|].{{MLOperandDescriptor/dimensions}}[|dim|] is not equal to |inputs|[0].{{MLOperandDescriptor/dimensions}}[|dim|], fail. - 1. If |inputs|[|dim|].{{MLOperandDescriptor/type}} is not equal to |inputs|[0].{{MLOperandDescriptor/type}}. - 1. If |dim| is equal to |axis|, add to |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] the value of |inputs|[|index|].{{MLOperandDescriptor/dimensions}}[|dim|]. - 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. - 1. Make a request to the underlying platform to: - 1. Create an [=implementation-defined=] platform operator |concatImpl| for this method, given |inputs| and |axis|. - 1. Store a reference of |concatImpl| in |output|.{{MLOperand/[[operator]]}}. - 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent output,given |output| and |concatImpl|. - 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. - 1. Connect |inputs| as input to |concatImpl|. - 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |concatImpl|. - 1. Return |output|. + 1. [=Assert=]: the type of |inputs| is sequence of {{MLOperand}} objects. + 1. [=Assert=]: the type of |axis| is `unsigned long`. + 1. [=Assert=]: the shape, i.e. {{MLOperandDescriptor/dimensions}}) of each operand in |inputs| is the same, except on the dimension given by |axis| on which they are concatenated. + 1. [=Assert=]: the {{MLOperandDescriptor/type}} of each operand in |inputs| is the same. + 1. If any of the following steps fail, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |axis| is greater than or equal to the rank of |inputs|, fail. + 1. Let |desc| be |inputs|[0].{{MLOperand/[[descriptor]]}}. + 1. Let |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] be `0`. + 1. For each |index| between 0 and the rank of |inputs|: + 1. If running validate MLOperand given |inputs|[|index|] and [=this=] returns `false`, then fail. + 1. For each |dim| between 0 and the rank of |inputs|[|index|]: +
+ If the shape of each corresponding dimension and type of the operands, except for those of the dimension given by |axis|, is not the same, fail. +
+ 1. If |dim| is not equal to |axis| and if |inputs|[|index|].{{MLOperandDescriptor/dimensions}}[|dim|] is not equal to |inputs|[0].{{MLOperandDescriptor/dimensions}}[|dim|], fail. + 1. If |inputs|[|dim|].{{MLOperandDescriptor/type}} is not equal to |inputs|[0].{{MLOperandDescriptor/type}}. + 1. If |dim| is equal to |axis|, add to |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] the value of |inputs|[|index|].{{MLOperandDescriptor/dimensions}}[|dim|]. + 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Make a request to the underlying platform to: + 1. Create an [=implementation-defined=] platform operator |concatImpl| for this method, given |inputs| and |axis|. + 1. Store a reference of |concatImpl| in |output|.{{MLOperand/[[operator]]}}. + 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent output,given |output| and |concatImpl|. + 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. + 1. Connect |inputs| as input to |concatImpl|. + 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |concatImpl|. + 1. Return |output|.
@@ -2637,18 +2620,6 @@ partial interface MLGraphBuilder {
-
- - To check ELU options given |options|, run the following steps: - -
- 1. If |options| is not an [=object=] that [=implements=] {{MLEluOptions}}, then return `false`. - 1. If |options|.{{MLEluOptions/alpha}} is `undefined`, set |options|.{{MLEluOptions/alpha}} to `1`. - 1. Else if |options|.{{MLEluOptions/alpha}} is not a [=numeric type=], then return `false`. - 1. Return `true`. -
-
- #### The {{MLGraphBuilder/elu(input, options)}} method #### {#api-mlgraphbuilder-elu-input-options}
**Arguments:** @@ -2665,9 +2636,6 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/elu(input, options)}} method steps are:
- 1. Let |input| be the first argument. - 1. Let |options| be the second argument. - 1. If running the check ELU options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. 1. Make a request to the underlying platform to: @@ -2696,9 +2664,6 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/elu(options)}} method steps are:
- 1. Let |options| be the first argument. - 1. If |options| is `undefined`, let |options| be a new {{MLEluOptions}} object. - 1. If running the check ELU options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. Let |op| be the result of invoking the create MLActivation steps with `"elu"` and |options|. 1. Return |op|.
@@ -3228,20 +3193,6 @@ partial interface MLGraphBuilder { The default value is `0.5`. -
- - To check hard-sigmoid options given |options|, run the following steps: - -
- 1. If |options| is not an [=object=] that [=implements=] {{MLHardSigmoidOptions}}, then return `false`. - 1. If |options|.{{MLEluOptions/alpha}} is `undefined`, set |options|.{{MLHardSigmoidOptions/alpha}} to `0.2`. - 1. Else if |options|.{{MLHardSigmoidOptions/alpha}} is not a [=numeric type=], then return `false`. - 1. If |options|.{{MLHardSigmoidOptions/beta}} is `undefined`, set |options|.{{MLHardSigmoidOptions/beta}} to `0.5`. - 1. Else if |options|.{{MLHardSigmoidOptions/beta}} is not a [=numeric type=], then return `false`. - 1. Return `true`. -
-
- #### The {{MLGraphBuilder/hardSigmoid(input, options)}} method #### {#api-mlgraphbuilder-hardsigmoid-input-options}
**Arguments:** @@ -3253,9 +3204,6 @@ partial interface MLGraphBuilder {
The {{MLGraphBuilder/hardSigmoid(input, options)}} method steps are: - 1. Let |input| be the first argument. - 1. Let |options| be the second argument. - 1. If running the check hard-sigmoid options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. 1. Make a request to the underlying platform to: @@ -3282,8 +3230,6 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/hardSigmoid(options)}} method steps are:
- 1. Let |options| be the first argument. - 1. If running the check hard-sigmoid options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. Let |op| be the result of invoking the create MLActivation steps with `"hardSigmoid"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|. @@ -3335,7 +3281,6 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/hardSwish(input)}} method steps are:
- 1. Let |input| be the first argument. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. 1. Make a request to the underlying platform to: @@ -3517,18 +3462,6 @@ partial interface MLGraphBuilder { The default value is `0.01`. -
- - To check leaky-relu options given |options|, run the following steps: - -
- 1. If |options| is not an [=object=] that [=implements=] {{MLLeakyReluOptions}}, then return `false`. - 1. If |options|.{{MLLeakyReluOptions/alpha}} is `undefined`, set |options|.{{MLLeakyReluOptions/alpha}} to `1`. - 1. Else if |options|.{{MLLeakyReluOptions/alpha}} is not a [=numeric type=], then return `false`. - 1. Return `true`. -
-
- #### The {{MLGraphBuilder/leakyRelu(input, options)}} method #### {#api-mlgraphbuilder-leaky-relu-input-options}
**Arguments:** @@ -3544,10 +3477,6 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/leakyRelu(input, options)}} method steps are:
- 1. Let |input| be the first argument. - 1. Let |options| be the second argument. - 1. If |options| is `undefined`, let |options| be a new {{MLLeakyReluOptions}} object. - 1. If running the check leaky-relu options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. 1. Make a request to the underlying platform to: @@ -3572,12 +3501,9 @@ partial interface MLGraphBuilder {
- The {{MLGraphBuilder/elu(options)}} method steps are: + The {{MLGraphBuilder/leakyRelu(options)}} method steps are:
- 1. Let |options| be the first argument. - 1. If |options| is `undefined`, let |options| be a new {{MLLeakyReluOptions}} object. - 1. If running the check leaky-relu options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. Let |op| be the result of invoking the create MLActivation steps with `"leakyRelu"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|. @@ -3627,20 +3553,6 @@ partial interface MLGraphBuilder { The default value is `0`. -
- - To check linear options given |options|, run the following steps: - -
- 1. If |options| is not an [=object=] that [=implements=] {{MLLinearOptions}}, then return `false`. - 1. If |options|.{{MLEluOptions/alpha}} is `undefined`, set |options|.{{MLLinearOptions/alpha}} to `1`. - 1. Else if |options|.{{MLLinearOptions/alpha}} is not a [=numeric type=], then return `false`. - 1. If |options|.{{MLLinearOptions/beta}} is `undefined`, set |options|.{{MLLinearOptions/beta}} to `0`. - 1. Else if |options|.{{MLLinearOptions/beta}} is not a [=numeric type=], then return `false`. - 1. Return `true`. -
-
- #### The {{MLGraphBuilder/linear(input, options)}} method #### {#api-mlgraphbuilder-linear-input-options}
**Arguments:** @@ -3648,7 +3560,7 @@ partial interface MLGraphBuilder { - *options*: an optional {{MLLinearOptions}}. The optional parameters of the operation. **Returns:** - - an {{MLOperand}}. The output tensor of the same shape as *x*. + - an {{MLOperand}}. The output tensor of the same shape as *input*.
@@ -3656,9 +3568,6 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/linear(input, options)}} method steps are:
- 1. Let |input| be the first argument. - 1. Let |options| be the second argument. - 1. If running the check linear options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. 1. Make a request to the underlying platform to: @@ -3686,8 +3595,6 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/linear(options)}} method steps are:
- 1. Let |options| be the first argument. - 1. If running the check linear options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. Let |op| be the result of invoking the create MLActivation steps with `"linear"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|. @@ -4820,18 +4727,13 @@ partial interface MLGraphBuilder { To check resample options given |options|, run the following steps:
-    1. If |options| is `undefined`, let |options| be a new {{MLResample2dOptions}} object.
-    1. If |options|.{{MLResample2dOptions/mode}} [=map/exists=]:
-        1. If its value is not one of `"nearest-neighbor"` or `"linear"`, return `null`.
-    1. Otherwise, set |options|.{{MLResample2dOptions/mode}} to `"nearest-neighbor"`.
-    1. If |options|.{{MLResample2dOptions/scales}} [=map/exists=]:
-        1. If its size is not `2`, or if any of its values is not greater than `0`, return `null`.
-    1. Otherwise, set |options|.{{MLResample2dOptions/scales}} to `[1.0, 1.0]`.
-    1. If |options|.{{MLResample2dOptions/sizes}} [=map/exists=]: if its size is not `2`, or if any of its values is not greater than `0`, return `null`.
-    1. If |options|.{{MLResample2dOptions/axes}} [=map/exists=]:
-        1. If its value is not one of `[0, 1], [1, 2], [2, 3]`, return `null`.
-    1. Otherwise, set |options|.{{MLResample2dOptions/axes}} to `[2, 3]`.
-    1. Return |options|.
+    1. If |options|.{{MLResample2dOptions/mode}} [=map/exists=], and if its value is not one of `"nearest-neighbor"` or `"linear"`, return `false`.
+    1. If |options|.{{MLResample2dOptions/scales}} does not [=map/exist=], set it to `[1.0, 1.0]`.
+    1. Otherwise, if any of its values is not greater than `0`, return `false`.
+    1. If |options|.{{MLResample2dOptions/sizes}} [=map/exists=], and if its size is not `2`, or if any of its values is not greater than `0`, return `false`.
+    1. If |options|.{{MLResample2dOptions/axes}} does not [=map/exist=], set it to `[2, 3]`.
+    1. Otherwise, if its value is not one of `[0, 1], [1, 2], [2, 3]`, return `false`.
+    1. Return `true`.
@@ -4857,8 +4759,7 @@ partial interface MLGraphBuilder {
1. Check if the input is a 4-dimensional tensor: if the size of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not `4`, throw a "{{DataError}}" {{DOMException}} and stop. - 1. Let |options| be the result of running the check resample options steps given |options|. - 1. If that returns `null`, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If running the check resample options steps given |options| returns `false`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. Let |desc| be the result of running the resample output sizes steps given |options|. 1. If that throws an error, re-throw the error and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. @@ -5161,18 +5062,6 @@ partial interface MLGraphBuilder { The default value is `1`. -
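<div class=example>
    A non-normative sketch of {{MLGraphBuilder/resample2d()}} consistent with the option checks above: by default the spatial axes [2, 3] of a 4-D tensor are resampled. <code>builder</code> is assumed to exist.
    <pre highlight="js">
    const x = builder.input('x', {type: 'float32', dimensions: [1, 1, 2, 2]});
    // Upscale height and width by a factor of two using linear interpolation.
    const upsampled = builder.resample2d(x, {mode: 'linear', scales: [2.0, 2.0]});
    </pre>
</div>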
- - To check softplus options given |options|, run the following steps: - -
- 1. If |options| is not an [=object=], then return `false`. - 1. If |options|.{{MLSoftplusOptions/steepness}} is `undefined`, set |options|.{{MLSoftplusOptions/steepness}} to `1`. - 1. Else if |options|.{{MLSoftplusOptions/steepness}} is not a [=numeric type=], then return `false`. - 1. Return `true`. -
-
- #### The {{MLGraphBuilder/softplus(input, options)}} method #### {#api-mlgraphbuilder-softplus-input-options}
**Arguments:** @@ -5188,9 +5077,6 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/softplus(input, options)}} method steps are:
- 1. Let |input| be the first argument. - 1. Let |options| be the second argument. - 1. If running the check softplus options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. 1. Make a request to the underlying platform to: @@ -5218,8 +5104,6 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/softplus(options)}} method steps are:
- 1. Let |options| be the first argument. - 1. If running the check softplus options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. Let |op| be the result of invoking the create MLActivation steps with `"softplus"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|. From ba52f5a7ea89df9c04603d6c16173306b9d5f951 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Wed, 16 Aug 2023 15:01:42 +0300 Subject: [PATCH 076/112] Fix #450: replace undefined with map/exists and remove default inits Signed-off-by: Zoltan Kis --- index.bs | 77 ++++++++++++-------------------------------------------- 1 file changed, 16 insertions(+), 61 deletions(-) diff --git a/index.bs b/index.bs index c00028c0..1888b2b1 100644 --- a/index.bs +++ b/index.bs @@ -1005,7 +1005,7 @@ The {{MLOperand}} objects are created by the methods of {{MLGraphBuilder}}, inte
1. [=Assert=]: the type of |operand|.{{MLOperand/[[builder]]}} is {{MLGraphBuilder}}.
- 1. If |builder| is not `undefined` and is not equal to |operand|.{{MLOperand/[[builder]]}}, return `false`.
+ 1. If |builder| [=map/exists=] and is not equal to |operand|.{{MLOperand/[[builder]]}}, return `false`.
 1. Let |desc| be |operand|.{{MLOperand/[[descriptor]]}}.
 1. If |desc| is not an [=object=] that [=implements=] {{MLOperandDescriptor}}, return `false`.
 1. If |desc|.{{MLOperandDescriptor/dimensions}} [=map/exists=] and invoking check dimensions given |desc|.{{MLOperandDescriptor/dimensions}} and |desc|.{{MLOperandDescriptor/type}} returns `false`, then return `false`.
 1. Return `true`.
@@ -1056,7 +1056,7 @@ The {{MLActivation}} objects (including the ones passed as input to methods) are
1. [=Assert=]: the type of |builder| is {{MLGraphBuilder}}. - 1. If |name| is `undefined` or `null`, throw a "{{TypeError}}" and abort these steps. + 1. If |name| is empty, then throw a "{{TypeError}}" and abort these steps. 1. Let |activation| be a new [=object=]. 1. Set |activation|.{{MLActivation/[[builder]]}} to |builder|. 1. Set |activation|.{{MLActivation/[[name]]}} to |name|. @@ -2131,21 +2131,17 @@ partial interface MLGraphBuilder { 1. If |input_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |filter_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If the type of |input| and |filter| is not the same, then [=exception/throw=] a {{TypeError}} and stop. - 1. If |options| is `undefined`, let |options| be an empty [=object=]. - 1. If |options|.{{MLConv2dOptions/padding}} is `undefined`, set it to `[0, 0, 0, 0]`. + 1. If |options|.{{MLConv2dOptions/padding}} does not [=map/exist=], set it to `[0, 0, 0, 0]`. 1. Else if |options|.{{MLConv2dOptions/padding}}.length is not 4, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLConv2dOptions/strides}} is `undefined`, set it to `[1, 1]`. + 1. If |options|.{{MLConv2dOptions/strides}} does not [=map/exist=], set it to `[1, 1]`. 1. Else if |options|.{{MLConv2dOptions/strides}}.length is not `2`, then [=exception/throw=] a {{TypeError}} and stop. 1. If any element in |options|.{{MLConv2dOptions/strides}} is equal to 0, then [=exception/throw=] a {{TypeError}} and stop. - 1. If |options|.{{MLConv2dOptions/dilations}} is `undefined`, set it to `[1, 1]`. + 1. If |options|.{{MLConv2dOptions/dilations}} does not [=map/exist=], set it to `[1, 1]`. 1. Else if |options|.{{MLConv2dOptions/dilations}}.length is not `2`, then [=exception/throw=] a {{TypeError}} and stop. - 1. If |options|.{{MLConv2dOptions/autoPad}} is `undefined`, set it to `"explicit"`. - 1. If |options|.{{MLConv2dOptions/groups}} is `undefined`, set it to `1`. - 1. Else if |options|.{{MLConv2dOptions/groups}} is 0, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If |options|.{{MLConv2dOptions/autoPad}} does not [=map/exist=], set it to `"explicit"`. + 1. If |options|.{{MLConv2dOptions/groups}} is 0, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |input_size| / |options|.{{MLConv2dOptions/groups}} is not equal to |filter_size|, then throw a "{{DataError}}" {{DOMException}} and stop. 1. Else if |input_size| % |options|.{{MLConv2dOptions/groups}} is not 0, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLConv2dOptions/inputLayout}} is `undefined`, set it to `"nchw"`. - 1. If |options|.{{MLConv2dOptions/filterLayout}} is `undefined`, set it to `"oihw"`. 1. If |options|.{{MLConv2dOptions/bias}} [=map/exists=]: 1. [=Assert=]: the type of |options|.{{MLConv2dOptions/bias}} is {{MLOperand}}. 1. If the length of |options|.{{MLConv2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 1, then [=exception/throw=] a {{TypeError}} and stop. @@ -2310,25 +2306,20 @@ partial interface MLGraphBuilder { 1. If |input_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |filter_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If the type of |input| and |filter| is not the same, then [=exception/throw=] a {{TypeError}} and stop. - 1. If |options| is `undefined`, let |options| be an empty [=object=]. - 1. 
If |options|.{{MLConvTranspose2dOptions/padding}} is `undefined`, set it to `[0, 0, 0, 0]`. + 1. If |options|.{{MLConvTranspose2dOptions/padding}} does not [=map/exist=], set it to `[0, 0, 0, 0]`. 1. Else if |options|.{{MLConvTranspose2dOptions/padding}}.length is not 4, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLConvTranspose2dOptions/strides}} is `undefined`, set it to `[1, 1]`. + 1. If |options|.{{MLConvTranspose2dOptions/strides}} does not [=map/exist=], set it to `[1, 1]`. 1. Else if |options|.{{MLConvTranspose2dOptions/strides}}.length is not `2`, then [=exception/throw=] a {{TypeError}} and stop. 1. If any element in |options|.{{MLConv2dOptions/strides}} is equal to 0, then [=exception/throw=] a {{TypeError}} and stop. - 1. If |options|.{{MLConvTranspose2dOptions/dilations}} is `undefined`, set it to `[1, 1]`. + 1. If |options|.{{MLConvTranspose2dOptions/dilations}} does not [=map/exist=], set it to `[1, 1]`. 1. Else if |options|.{{MLConvTranspose2dOptions/dilations}}.length is not `2`, then [=exception/throw=] a {{TypeError}} and stop. - 1. If |options|.{{MLConvTranspose2dOptions/outputPadding}} is `undefined`, set it to `[0, 0]`. + 1. If |options|.{{MLConvTranspose2dOptions/outputPadding}} does not [=map/exist=], set it to `[0, 0]`. 1. Else if |options|.{{MLConvTranspose2dOptions/outputPadding}}.length is not `2`, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options|.{{MLConvTranspose2dOptions/outputSizes}} [=map/exists=]: 1. If |options|.{{MLConvTranspose2dOptions/outputSizes}}.length is not `2`, then [=exception/throw=] a {{TypeError}} and stop. 1. If the elements of |options|.{{MLConvTranspose2dOptions/outputSizes}} are not smaller than the elements at the same dimension (index) for |options|.{{MLConvTranspose2dOptions/strides}}, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLConvTranspose2dOptions/autoPad}} is `undefined`, set it to `"explicit"`. - 1. If |options|.{{MLConvTranspose2dOptions/groups}} is `undefined`, set it to `1`. 1. If |input_size| / |options|.{{MLConvTranspose2dOptions/groups}} is not equal to |filter_size|, then throw a "{{DataError}}" {{DOMException}} and stop. 1. Else if |input_size| % |options|.{{MLConvTranspose2dOptions/groups}} is not 0, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLConvTranspose2dOptions/inputLayout}} is `undefined`, set it to `"nchw"`. - 1. If |options|.{{MLConvTranspose2dOptions/filterLayout}} is `undefined`, set it to `"iohw"`. 1. If |options|.{{MLConvTranspose2dOptions/bias}} [=map/exists=]: 1. [=Assert=]: the type of |options|.{{MLConvTranspose2dOptions/bias}} is {{MLOperand}}. 1. If the length of |options|.{{MLConvTranspose2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 1, then [=exception/throw=] a {{TypeError}} and stop. @@ -2724,13 +2715,6 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |a| and |b| is {{MLOperand}}. - 1. If |options| is `undefined`, let |options| be an empty [=object=]. - 1. If |options|.{{MLGemmOptions/alpha}} is `undefined`, set it to `1.0`. - 1. If |options|.{{MLGemmOptions/beta}} is `undefined`, set it to `1.0`. - 1. If |options|.{{MLGemmOptions/aTranspose}} is `undefined`, set it to `false`. - 1. If |options|.{{MLGemmOptions/aTranspose}} is not `false`, set it to `true`. - 1. If |options|.{{MLGemmOptions/bTranspose}} is `undefined`, set it to `false`. - 1. If |options|.{{MLGemmOptions/bTranspose}} is not `false`, set it to `true`. 1. Let |shapeA| be |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeA| the size of |shapeA|. 1. Let |shapeB| be |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeB| the size of |shapeB|. 1. If |sizeA| is not `2` or |sizeB| is not `2`, then throw a "{{DataError}}" {{DOMException}} and stop. @@ -2864,7 +2848,6 @@ partial interface MLGraphBuilder { 1. [=Assert=]: the type of |input|, |weight| and |recurrentWeight| is {{MLOperand}}. 1. If the rank of |input| or |weight| is not `3`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If the rank of |weight| or |recurrentWeight| is not `2`, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLGruOptions/bias}} [=map/exists=]. 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `2`, then throw a "{{DataError}}" {{DOMException}} and stop. @@ -2874,11 +2857,7 @@ partial interface MLGraphBuilder { 1. If |options|.{{MLGruOptions/initialHiddenState}} [=map/exists=]. 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `3`, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLGruOptions/resetAfter}} is `undefined`, set it to `true`. - 1. If |options|.{{MLGruOptions/returnSequence}} is `undefined`, set it to `false`. - 1. If |options|.{{MLGruOptions/direction}} is `undefined`, set it to `"forward"`. 1. If |options|.{{MLGruOptions/direction}} is not one of {{MLRecurrentNetworkDirection}}, then [=exception/throw=] a {{TypeError}} and stop. - 1. If |options|.{{MLGruOptions/layout}} is `undefined`, set it to `"zrn"`. 1. If |options|.{{MLGruOptions/layout}} is not one of {{MLGruWeightLayout}}, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options|.{{MLGruOptions/activations}} [=map/exists=] and is not an array of {{MLActivation}} objects with size `2`, then [=exception/throw=] a {{TypeError}} and stop. 1. If |steps| is not a [=number=] or it is `0`, then [=exception/throw=] a {{TypeError}} and stop. @@ -3019,15 +2998,12 @@ partial interface MLGraphBuilder { 1. [=Assert=]: the type of |input|, |weight| and |recurrentWeight| is {{MLOperand}}. 1. If the rank of |input| or |weight| is not `3`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If the rank of |weight| or |recurrentWeight| is not `2`, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLGruOptions/bias}} [=map/exists=]: 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `1`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLGruOptions/recurrentBias}} [=map/exists=]: 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `1`, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. 
If |options|.{{MLGruOptions/resetAfter}} is `undefined`, set it to `true`. - 1. If |options|.{{MLGruOptions/layout}} is `undefined`, set it to `"zrn"`. 1. If |options|.{{MLGruOptions/layout}} is not one of {{MLGruWeightLayout}}, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options|.{{MLGruOptions/activations}} [=map/exists=] and is not an array of {{MLActivation}} objects with size `2`, then [=exception/throw=] a {{TypeError}} and stop. 1. Let |desc| a new {{MLOperandDescriptor}}. @@ -3366,13 +3342,10 @@ The {{MLInstanceNormalizationOptions}} members are:
1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If the rank of |input| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. [=Assert=]: the type of |options|.{{MLInstanceNormalizationOptions/scale}} is {{MLOperand}}. 1. If the rank of |options|.{{MLInstanceNormalizationOptions/scale}} is not equal to the size of the channel dimension of |input|, then throw a "{{DataError}}" {{DOMException}} and stop. 1. [=Assert=]: the type of |options|.{{MLInstanceNormalizationOptions/bias}} is {{MLOperand}}. 1. If the rank of |options|.{{MLInstanceNormalizationOptions/bias}} is not equal to the size of the channel dimension of |input|, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLInstanceNormalizationOptions/epsilon}} is `undefined`, let it be `0.00001`. - 1. If |options|.{{MLInstanceNormalizationOptions/layout}} is `undefined`, let it be `"nchw"`. 1. Otherwise if |options|.{{MLInstanceNormalizationOptions/layout}} is not one of {{MLInputOperandLayout}}, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. @@ -3685,8 +3658,6 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/lstm(input, weight, recurrentWeight, steps, hiddenSize, options)}} steps are:
- 1. If |options| is `undefined`, let |options| be an empty [=object=]. - 1. If |options|.{{MLLstmOptions/direction}} is `undefined`, set it to `"forward"`. 1. If |options|.{{MLLstmOptions/direction}} is not one of {{MLRecurrentNetworkDirection}}, then [=exception/throw=] a {{TypeError}} and stop. 1. Let |num_directions| be `1` if |options|.{{MLLstmOptions/direction}} is `"forward"`, or otherwise let it be `2`. 1. [=Assert=]: the type of |input|, |weight| and |recurrentWeight| is {{MLOperand}}. @@ -3722,8 +3693,6 @@ partial interface MLGraphBuilder { 1. If |options|.{{MLLstmOptions/initialCellState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |num_directions|, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmOptions/initialCellState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not equal to |batch_size|, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmOptions/initialCellState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[2] is not |hiddenSize|, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLLstmOptions/returnSequence}} is `undefined`, set it to `false`. - 1. If |options|.{{MLLstmOptions/layout}} is `undefined`, set it to `"iofg"`. 1. If |options|.{{MLLstmOptions/layout}} is not one of {{MLLstmWeightLayout}}, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options|.{{MLLstmOptions/activations}} [=map/exists=]: 1. If it is not an array of size `3`, then [=exception/throw=] a {{TypeError}} and stop. @@ -3892,7 +3861,6 @@ partial interface MLGraphBuilder { 1. [=Assert=]: the type of |input|, |weight|, |recurrentWeight|, |hiddenState| and |cellState| is {{MLOperand}}. 1. If the rank of |input|, |weight|, |recurrentWeight|, |hiddenState| or |cellState| is not `2`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. Let |batch_size| be |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0]. - 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLLstmCellOptions/bias}} [=map/exists=]: 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `1`, then throw a "{{DataError}}" {{DOMException}} and stop. @@ -3905,7 +3873,6 @@ partial interface MLGraphBuilder { 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `1`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLLstmCellOptions/peepholeWeight}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not 3 * |hiddenSize|, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLLstmCellOptions/layout}} is `undefined`, set it to `"iofg"`. 1. If |options|.{{MLLstmCellOptions/layout}} is not one of {{MLLstmWeightLayout}}, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options|.{{MLLstmCellOptions/activations}} [=map/exists=]: 1. If it is not an array of size `3`, then [=exception/throw=] a {{TypeError}} and stop. @@ -4180,10 +4147,7 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If |beginningPadding| or |endingPadding| is not a sequence of {{unsigned long}}, then [=exception/throw=] a {{TypeError}} and stop. - 1. If |options| is `undefined`, let |options| be an empty [=object=]. - 1. If |options|.{{MLPadOptions/mode}} is `undefined`, set it to `"constant"`. - 1. Otherwise, if |options|.{{MLPadOptions/mode}} is not one of {{MLPaddingMode}}, then [=exception/throw=] a {{TypeError}} and stop. - 1. If |options|.{{MLPadOptions/value}} is `undefined`, set it to `0`. + 1. If |options|.{{MLPadOptions/mode}} is not one of {{MLPaddingMode}}, then [=exception/throw=] a {{TypeError}} and stop. 1. Let |desc| be a copy of |input|.{{MLOperand/[[descriptor]]}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to the result of invoking the calculate padding output sizes given |input|, |beginningPadding| and |endingPadding|. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. @@ -4364,19 +4328,15 @@ partial interface MLGraphBuilder { 1. [=Assert=]: |op| is one of "averagePool2d", "l2Pool2d", "maxPool2d". 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If the length of of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 4, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options| is `undefined`, let |options| be a new {{MLPool2dOptions}} object. - 1. If |options|.{{MLPool2dOptions/outputSizes}} [=map/exists=], or if |options|.{{MLPool2dOptions/padding}} is `undefined`, set |options|.{{MLPool2dOptions/padding}} to `[0, 0, 0, 0]`. + 1. If |options|.{{MLPool2dOptions/outputSizes}} [=map/exists=], or if |options|.{{MLPool2dOptions/padding}} does not [=map/exist=], set |options|.{{MLPool2dOptions/padding}} to `[0, 0, 0, 0]`. 1. If the length of of |padding|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 4, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLPool2dOptions/strides}} is `undefined`, set |options|.{{MLPool2dOptions/strides}} to `[1, 1]`. + 1. If |options|.{{MLPool2dOptions/strides}} does not [=map/exist=], set |options|.{{MLPool2dOptions/strides}} to `[1, 1]`. 1. If the length of of |options|.{{MLPool2dOptions/strides}} is not 2, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If any value in |options|.{{MLPool2dOptions/strides}} is not greater than 0, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLPool2dOptions/dilations}} is `undefined`, set |options|.{{MLPool2dOptions/dilations}} to `[1, 1]`. + 1. If |options|.{{MLPool2dOptions/dilations}} does not [=map/exist=], set |options|.{{MLPool2dOptions/dilations}} to `[1, 1]`. 1. If the length of of |options|.{{MLPool2dOptions/dilations}} is not 2, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If any value in |options|.{{MLPool2dOptions/dilations}} is not greater than 0, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLPool2dOptions/autoPad}} is `undefined`, set |options|.{{MLPool2dOptions/autoPad}} to `"explicit"`. 1. If |options|.{{MLPool2dOptions/autoPad}} is not `"explicit"`, set |options|.{{MLPool2dOptions/padding}} to `[0, 0, 0, 0]`. - 1. If |options|.{{MLPool2dOptions/layout}} is `undefined`, set |options|.{{MLPool2dOptions/layout}} to `"nchw"`. - 1. If |options|.{{MLPool2dOptions/roundingType}} is `undefined`, set |options|.{{MLPool2dOptions/roundingType}} to `"floor"`. 1. Let |desc| be a copy of |input|.{{MLOperand/[[descriptor]]}}. 
1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Make a request to the underlying platform to: @@ -4528,7 +4488,6 @@ partial interface MLGraphBuilder {
1. [=Assert=]: |op| is one of "reduceL1", "reduceL2", "reduceLogSum", "reduceLogSumExp", "reduceMax", "reduceMean", "reduceMin", "reduceProduct", "reduceSum", "reduceSumSquare". 1. [=Assert=]: the type of |input| is {{MLOperand}}. - 1. If |options| is `undefined`, let |options| be a new {{MLReduceOptions}} object with |options|.{{MLReduceOptions/keepDimensions}} set to `false` and |options|.{{MLReduceOptions/axes}} set to `null`. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. 1. Make a request to the underlying platform to: @@ -5217,8 +5176,6 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input| is {{MLOperand}}. - 1. If |options| is `undefined`, let |options| be an empty [=object=]. - 1. If |options|.{{MLSplitOptions/axis}} is `undefined`, let |options|.{{MLSplitOptions/axis}} be `0`. 1. If |splits| is not {{unsigned long}} or a sequence of {{unsigned long}}, then [=exception/throw=] a {{TypeError}} and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. @@ -5293,7 +5250,6 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input| is {{MLOperand}}. - 1. If |options| is `undefined`, let |options| be an empty [=object=]. 1. If |options|.{{MLSqueezeOptions/axes}} [=map/exists=], then: 1. Let |dimensions| be |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. For |index| between 0 and the size of |options|.{{MLSqueezeOptions/axes}}: @@ -5421,8 +5377,7 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input| is {{MLOperand}}. - 1. If |options| is `undefined`, let |options| be an empty [=object=]. - 1. If |options|.{{MLTransposeOptions/permutation}} is `undefined`, let |options|.{{MLTransposeOptions/permutation}} be the reversed sequence of all indices for |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. + 1. If |options|.{{MLTransposeOptions/permutation}} does not [=map/exist=], let |options|.{{MLTransposeOptions/permutation}} be the reversed sequence of all indices for |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. Otherwise if |options|.{{MLTransposeOptions/permutation}} [=map/exists=]: 1. If |options|.{{MLTransposeOptions/permutation}} is not a sequence of {{unsigned long}}, then [=exception/throw=] a {{TypeError}} and stop. 1. If the rank of |options|.{{MLTransposeOptions/permutation}} is not the same as the rank of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}, then [=exception/throw=] a {{TypeError}} and stop. From 6129789bbeb1765936b58a4294ff186650cb1b0d Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Wed, 16 Aug 2023 15:10:52 +0300 Subject: [PATCH 077/112] Fix invocations of 'create MLActivation' Signed-off-by: Zoltan Kis --- index.bs | 24 ++++++++++++------------ 1 file changed, 12 insertions(+), 12 deletions(-) diff --git a/index.bs b/index.bs index 1888b2b1..02c600b7 100644 --- a/index.bs +++ b/index.bs @@ -1937,7 +1937,7 @@ partial interface MLGraphBuilder {
1. If running the check clamp options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. - 1. Let |op| be the result of invoking the create MLActivation steps with `"clamp"` and |options|. + 1. Let |op| be the result of invoking the create MLActivation steps with [=this=], `"clamp"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|.
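As a non-normative illustration of the two overloads these and the preceding steps cover, the sketch below applies clamp() directly to an operand and also creates a clamp {{MLActivation}} for fusion into another operation; the builder and the input and filter operands are assumed to exist.

<pre highlight="js">
// Immediate form: clamps the values of an existing operand into [0, 6].
const clamped = builder.clamp(input, {minValue: 0, maxValue: 6});

// Activation-object form: no input operand is given; the resulting MLActivation
// can be passed wherever an activation option is accepted, e.g. to conv2d().
const relu6 = builder.clamp({minValue: 0, maxValue: 6});
const conv = builder.conv2d(input, filter, {activation: relu6});
</pre>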
@@ -2655,7 +2655,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/elu(options)}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with `"elu"` and |options|. + 1. Let |op| be the result of invoking the create MLActivation steps with [=this=], `"elu"` and |options|. 1. Return |op|.
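A similar non-normative sketch for elu(), assuming a builder and an operand x; the alpha option shown is defined with the elu() variants elsewhere in this specification.

<pre highlight="js">
// Immediate form: y = x for x >= 0, and alpha * (exp(x) - 1) for x < 0.
const y = builder.elu(x, {alpha: 1.0});

// Activation-object form, suitable for fusing into a preceding operation.
const eluActivation = builder.elu({alpha: 0.5});
</pre>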
@@ -3206,7 +3206,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/hardSigmoid(options)}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with `"hardSigmoid"` and |options|. + 1. Let |op| be the result of invoking the create MLActivation steps with [=this=], `"hardSigmoid"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|.
@@ -3284,7 +3284,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/hardSwish()}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with `"hardSwish"`. + 1. Let |op| be the result of invoking the create MLActivation steps with [=this=] and `"hardSwish"`. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|.
@@ -3477,7 +3477,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/leakyRelu(options)}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with `"leakyRelu"` and |options|. + 1. Let |op| be the result of invoking the create MLActivation steps with [=this=], `"leakyRelu"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|.
@@ -3568,7 +3568,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/linear(options)}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with `"linear"` and |options|. + 1. Let |op| be the result of invoking the create MLActivation steps with [=this=], `"linear"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|.
@@ -4622,7 +4622,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/relu()}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with `"relu"`. + 1. Let |op| be the result of invoking the create MLActivation steps with [=this=] and `"relu"`. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|.
@@ -4856,7 +4856,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/sigmoid()}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with `"sigmoid"`. + 1. Let |op| be the result of invoking the create MLActivation steps with [=this=] and `"sigmoid"`. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|.
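Activation objects created by steps like these are typically consumed by other builder methods rather than applied immediately. The non-normative sketch below assumes gru() arguments that satisfy the argument checks earlier in this patch series and passes a two-element activations array as described for {{MLGruOptions}}.

<pre highlight="js">
// Two MLActivation objects; gru() accepts exactly two of them in its activations option.
const activations = [builder.sigmoid(), builder.tanh()];

// input, weight, recurrentWeight, steps and hiddenSize are assumed to satisfy
// the gru() argument checks earlier in this patch series.
const outputs = builder.gru(input, weight, recurrentWeight, steps, hiddenSize, {activations});
</pre>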
@@ -4975,7 +4975,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/softmax()}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with `"softmax"`.
+ 1. Let |op| be the result of invoking the create MLActivation steps with [=this=] and `"softmax"`.
 1. If that throws an error, re-throw the error and abort these steps.
 1. Return |op|.
@@ -5063,7 +5063,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/softplus(options)}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with `"softplus"` and |options|. + 1. Let |op| be the result of invoking the create MLActivation steps with [=this=], `"softplus"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|.
@@ -5133,7 +5133,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/softsign()}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with `"softsign"`. + 1. Let |op| be the result of invoking the create MLActivation steps with [=this=] and `"softsign"`. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|.
@@ -5335,7 +5335,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/tanh()}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with `"tanh"`. + 1. Let |op| be the result of invoking the create MLActivation steps with [=this=] and `"tanh"`. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|.
From 6cea0c6d2143373eb06637359755b9af2c957afe Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Thu, 17 Aug 2023 11:01:19 +0300 Subject: [PATCH 078/112] Style improvements according to suggestions in #450 Signed-off-by: Zoltan Kis --- index.bs | 219 +++++++++++++++++++++++++++---------------------------- 1 file changed, 108 insertions(+), 111 deletions(-) diff --git a/index.bs b/index.bs index 02c600b7..570380af 100644 --- a/index.bs +++ b/index.bs @@ -827,7 +827,7 @@ Its default allowlist is 'self'. 1. Set |context|.{{[[contextType]]}} to "[=default-context|default=]". 1. If |options|["{{deviceType}}"] [=map/exists=], then set |context|.{{[[deviceType]]}} to |options|["{{deviceType}}"]. Otherwise, set |context|.{{[[deviceType]]}} to "[=device-type-cpu|cpu=]". 1. If |options|["{{powerPreference}}"] [=map/exists=], then set |context|.{{[[powerPreference]]}} to |options|["{{powerPreference}}"]. Otherwise, set |context|.{{[[powerPreference]]}} to "[=power-preference-default|default=]". - 1. If the validate MLContext steps given |context| return `false`, [=reject=] |promise| with a "{{NotSupportedError}}" {{DOMException}} and abort these steps. + 1. If validating MLContext given |context| return `false`, [=reject=] |promise| with a "{{NotSupportedError}}" {{DOMException}} and abort these steps. 1. [=Resolve=] |promise| with |context|.
@@ -840,7 +840,7 @@ Its default allowlist is 'self'.
1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, throw a "{{SecurityError}}" {{DOMException}} and abort these steps.
 1. Let |context| be the result of running the create context steps given |options|.
- 1. If the validate MLContext steps given |context| return `false`, throw a "{{NotSupportedError}}" {{DOMException}} and abort these steps.
+ 1. If validating MLContext given |context| returns `false`, throw a "{{NotSupportedError}}" {{DOMException}} and abort these steps.
 1. Return |context|.
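The following non-normative sketch shows both context creation paths described above. It assumes the API is exposed on navigator.ml and that the synchronous variant, createContextSync(), is available in the calling scope, for example a dedicated worker.

<pre highlight="js">
// Illustrative only; assumes an async scope and that the API is exposed on navigator.ml.
const context = await navigator.ml.createContext({deviceType: 'gpu', powerPreference: 'low-power'});

// Synchronous variant, where exposed (e.g. in a dedicated worker); throws on failure.
const workerContext = navigator.ml.createContextSync({deviceType: 'cpu'});
</pre>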
@@ -905,7 +905,7 @@ dictionary MLOperandDescriptor {
1. Let |elementLength| be 1. - 1. For each |dimension| of |desc|.{{MLOperandDescriptor/dimensions}}: + 1. [=map/For each=] |dimension| of |desc|.{{MLOperandDescriptor/dimensions}}: 1. Set |elementLength| to |elementLength| × |dimension|. 1. Let |elementSize| be the [=element size=] of one of the {{ArrayBufferView}} types that matches |desc|.{{MLOperandDescriptor/type}} according to [this table](#appendices-mloperandtype-arraybufferview-compatibility). 1. Return |elementLength| × |elementSize|. @@ -960,11 +960,11 @@ The {{MLOperand}} objects are created by the methods of {{MLGraphBuilder}}, inte
- To create MLOperand given |builder| and |desc|, run the following steps: + To create an MLOperand given |builder| and |desc|, run the following steps:
1. [=Assert=]: the type of |builder| is {{MLGraphBuilder}}. - 1. If |desc| is not an [=object=] that [=implements=] {{MLOperandDescriptor}}, then [=exception/throw=] a {{TypeError}} and stop. + 1. [=Assert=]: the type of |desc| is {{MLOperandDescriptor}}. 1. Let |operand| be a new [=object=]. 1. Set |operand|.{{MLOperand/[[builder]]}} to |builder|. 1. Set |operand|.{{MLOperand/[[descriptor]]}} to |desc|. @@ -974,7 +974,7 @@ The {{MLOperand}} objects are created by the methods of {{MLGraphBuilder}}, inte
- To copy MLOperand given |operand|, run the following steps: + To copy an MLOperand given |operand|, run the following steps:
1. [=Assert=]: the type of |operand| is {{MLOperand}}. @@ -1007,7 +1007,6 @@ The {{MLOperand}} objects are created by the methods of {{MLGraphBuilder}}, inte 1. [=Assert=]: the type of |operand|.{{MLOperand/[[builder]]}} is {{MLGraphBuilder}}. 1. If |builder| [=map/exists=]] and is not equal to |operand|.{{MLOperand/[[builder]]}}, return `false`. 1. Let |desc| be |operand|.{{MLOperand/[[descriptor]]}}. - 1. If |desc| is not an [=object=] that [=implements=] {{MLOperandDescriptor}}, return `false`. 1. If |desc|.{{MLOperandDescriptor/dimensions}} [=map/exists=] and invoking check dimensions given |desc|.{{MLOperandDescriptor/dimensions}} and |desc|.{{MLOperandDescriptor/type}} returns `false`, then return `false`. 1. Return `true`.
@@ -1052,7 +1051,7 @@ The {{MLActivation}} objects (including the ones passed as input to methods) are
- To create MLActivation given |builder|, |name|, |options| and |init-steps|, run the following steps: + To create an MLActivation given |builder|, |name|, |options| and |init-steps|, run the following steps:
1. [=Assert=]: the type of |builder| is {{MLGraphBuilder}}. @@ -1129,7 +1128,7 @@ When the {{[[contextType]]}} is set to [=default-context|default=] with the {{ML ### The {{MLContext}} validation algorithm ### {#api-mlcontext-validate}
- To validate {{MLContext}}, given |context|, run these steps: + To validate MLContext, given |context|, run these steps:
1. If |context|.{{[[contextType]]}} is not "[=webgpu-context|webgpu=]" or "[=default-context|default=], return `false`. @@ -1166,8 +1165,8 @@ partial interface MLContext {
1. If |graph|.{{MLGraph/[[context]]}}.{{MLContext/[[contextType]]}} is not "[=default-context|default=], throw an "{{OperationError}}" {{DOMException}} and stop. - 1. If invoking the validate graph resources algorithm given |inputs| and |graph|.{{MLGraph/[[inputDescriptors]]}} returns `false`, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If invoking the validate graph resources algorithm given |outputs| and |graph|.{{MLGraph/[[outputDescriptors]]}} returns `false`, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If validating graph resources given |inputs| and |graph|.{{MLGraph/[[inputDescriptors]]}} returns `false`, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If validating graph resources given |outputs| and |graph|.{{MLGraph/[[outputDescriptors]]}} returns `false`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. Invoke execute graph given |graph|, |inputs| and |outputs|. 1. If that throws an error, re-throw the error and stop. 1. Return {{undefined}}. @@ -1180,10 +1179,10 @@ partial interface MLContext {
1. [=Assert=]: the type of |resources| is {{MLNamedArrayBufferViews}}.
- 1. For each [=record=] <|key|, |value|> of |resources|:
+ 1. [=map/For each=] [=record=] <|key|, |value|> of |resources|:
 1. If |descriptors|[|key|] does not [=map/exist=], return `false`.
 1. [=Assert=]: the type of |value| is {{ArrayBufferView}}.
- 1. If running the validate buffer with descriptor given |value| and |descriptors|[|key|] return `false`, return `false`.
+ 1. If validating buffer with descriptor given |value| and |descriptors|[|key|] returns `false`, return `false`.
 1. Return `true`.
@@ -1206,7 +1205,7 @@ partial interface MLContext {
1. [=Assert=]: the type of |inputs| is {{MLNamedArrayBufferViews}}. 1. Let |inputResources| denote the input resources of |graph|.{{MLGraph/[[implementation]]}}. - 1. For each <|key|, |inputValue|> of |inputs|: + 1. [=map/For each=] <|key|, |inputValue|> of |inputs|: 1. Let |inputDescriptor| be |graph|.{{MLGraph/[[inputDescriptors]]}}[|key|]. 1. Let |inputTensor| be a new tensor for |graph|.{{MLGraph/[[implementation]]}} as follows: 1. Set the data type of |inputTensor| to the one that matches the [=element type=] of |inputValue|. @@ -1214,7 +1213,7 @@ partial interface MLContext { 1. Set the values of elements in |inputTensor| to the values of elements in |inputValue|. 1. Request the underlying implementation of |graph| to bind |inputResources|[|key|] to |inputTensor|. 1. [=Assert=]: the type of |outputs| is {{MLNamedArrayBufferViews}}. - 1. For each <|key|, |outputValue|> of |outputs|: + 1. [=map/For each=] <|key|, |outputValue|> of |outputs|: 1. Issue a compute request to |graph|.{{MLGraph/[[implementation]]}} given |key| and |inputResources| and wait for completion. 1. If that returns an error, then throw an "{{OperationError}}" {{DOMException}} and stop. 1. Otherwise, store the result in |outputTensor|. @@ -1273,7 +1272,7 @@ partial interface MLContext {
1. Let |transferredViews| be a new {{MLNamedArrayBufferViews}}. - 1. For each |key| -> |value| of |views|: + 1. [=map/For each=] |key| → |value| of |views|: 1. Let |transferredBuffer| be the result of [=ArrayBuffer/transfer|transferring=] the [=underlying buffer=] of |value|. 1. Let |constructor| be the appropriate [=view constructor=] for the type of {{ArrayBufferView}} |value|. 1. Let |elementsNumber| be the result of the [=buffer byte length|byte length=] of |value| ÷ [=element size=] of |value|. @@ -1318,8 +1317,8 @@ partial interface MLContext { 1. Let |promise| be [=a new promise=]. 1. Return |promise| and run the following steps [=in parallel=]: 1. If |graph|.{{MLGraph/[[context]]}}.{{MLContext/[[contextType]]}} is not "[=default-context|default=], [=reject=] |promise| with an "{{OperationError}}" {{DOMException}} and stop. - 1. If invoking the validate graph resources algorithm given |inputs| and |graph|.{{MLGraph/[[inputDescriptors]]}} returns `false`, then [=reject=] |promise| with a "{{DataError}}" {{DOMException}} and stop. - 1. If invoking the validate graph resources algorithm given |outputs| and |graph|.{{MLGraph/[[outputDescriptors]]}} returns `false`, then [=reject=] |promise| with a "{{DataError}}" {{DOMException}} and stop. + 1. If validating graph resources given |inputs| and |graph|.{{MLGraph/[[inputDescriptors]]}} returns `false`, then [=reject=] |promise| with a "{{DataError}}" {{DOMException}} and stop. + 1. If validating graph resources given |outputs| and |graph|.{{MLGraph/[[outputDescriptors]]}} returns `false`, then [=reject=] |promise| with a "{{DataError}}" {{DOMException}} and stop. 1. Let |transferredInputs| be the result of [=MLNamedArrayBufferViews/transfer|transferring=] {{MLNamedArrayBufferViews}} |inputs|. 1. Let |transferredOutputs| be the result of [=MLNamedArrayBufferViews/transfer|transferring=] {{MLNamedArrayBufferViews}} |outputs|. 1. Invoke execute graph given |graph|, |transferredInputs| and |transferredOutputs|. @@ -1455,20 +1454,20 @@ partial interface MLCommandEncoder {
1. If any of the following requirements are unmet, then throw a "{{DataError}}" {{DOMException}} and stop.
- 1. For each |key| -> |value| of |inputs|: + 1. [=map/For each=] |key| → |value| of |inputs|: 1. |graph|.{{MLGraph/[[inputDescriptors]]}}[|key|] must [=map/exist=]. 1. Let |inputDesc| be |graph|.{{MLGraph/[[inputDescriptors]]}}[|key|]. 1. If |value| is a {{GPUBuffer}}, then: 1. |value|.{{GPUBuffer/size}} must equal to [=byte length=] of |inputDesc|. - 1. For each |key| -> |value| of |outputs|: + 1. [=map/For each=] |key| → |value| of |outputs|: 1. |graph|.{{MLGraph/[[outputDescriptors]]}}[|key|] must [=map/exist=]. 1. Let |outputDesc| be |graph|.{{MLGraph/[[outputDescriptors]]}}[|key|]. 1. If |value| is a {{GPUBuffer}}, then: 1. |value|.{{GPUBuffer/size}} must equal to [=byte length=] of |outputDesc|.
- 1. For each |key| -> |value| of |inputs|: + 1. [=map/For each=] |key| → |value| of |inputs|: 1. Set the input of |graph|.{{MLGraph/[[implementation]]}} that is associated with |key| to |value|. - 1. For each |key| -> |value| of |outputs|: + 1. [=map/For each=] |key| → |value| of |outputs|: 1. Set the output of |graph|.{{MLGraph/[[implementation]]}} that is associated with |key| to |value|. 1. Issue a compute request of |graph|.{{MLGraph/[[implementation]]}}. 1. If there is an error returned by |graph|.{{MLGraph/[[implementation]]}}, then: @@ -1580,7 +1579,7 @@ Both {{MLGraphBuilder}}.{{MLGraphBuilder/build()}} and {{MLGraphBuilder}}.{{MLGr
1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, throw a "{{SecurityError}}" {{DOMException}} and abort these steps.
- 1. If the validate MLContext steps given |context| return `false`, throw a "{{TypeError}}" and abort these steps.
+ 1. If validating MLContext given |context| returns `false`, throw a "{{TypeError}}" and abort these steps.
 1. Set {{MLGraphBuilder/[[context]]}} to |context|.
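To connect the constructor steps above with the graph building steps that follow, here is a non-normative sketch of a minimal graph. The descriptor shape follows {{MLOperandDescriptor}}, the add() operation stands in for any builder method, and error handling is omitted.

<pre highlight="js">
// Illustrative only; `context` is an MLContext created as sketched earlier.
const builder = new MLGraphBuilder(context);

// A named input and a constant, both 2x2 float32 tensors.
const desc = {type: 'float32', dimensions: [2, 2]};
const x = builder.input('x', desc);
const w = builder.constant(desc, new Float32Array([1, 2, 3, 4]));

// One element-wise operation feeding a named output.
const y = builder.add(x, w);

// Compile the composed graph; execution then goes through the MLContext
// compute()/computeSync() steps described earlier in this patch.
const graph = await builder.build({y});
</pre>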
@@ -1610,7 +1609,7 @@ Create a named {{MLOperand}} based on a descriptor, that can be used as an input 1. If the [=check dimensions=] steps given |descriptor|.{{MLOperandDescriptor/type}} and |descriptor|.{{MLOperandDescriptor/dimensions}} return `false`, throw a "{{DataError}}" {{DOMException}} and stop. 1. If the [=byte length=] of |descriptor| is not supported by the underlying platform, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |operand| be the result of invoking the create MLOperand steps with [=this=] and |descriptor|. + 1. Let |operand| be the result of creating an MLOperand given [=this=] and |descriptor|. 1. Set |operand|.{{MLOperand/[[name]]}} to |name|. 1. Make a request to the underlying platform to: 1. Create an [=implementation-defined=] platform input operand |operandImpl| given |descriptor|. @@ -1650,7 +1649,7 @@ Build a composed graph up to a given output operand into a computational graph,
1. [=Assert=]: the type of |outputs| is {{MLNamedOperands}}. 1. If |outputs| is empty, then [=exception/throw=] a {{TypeError}} and stop. - 1. For each |element| in |outputs|: + 1. [=map/For each=] |element| in |outputs|: 1. [=Assert=]: the type of |element|.key is [=string=]. 1. If |element|.key is empty, then [=exception/throw=] a {{TypeError}} and stop. 1. [=Assert=]: the type of |element|.value is {{MLOperand}}. @@ -1661,8 +1660,8 @@ Build a composed graph up to a given output operand into a computational graph, 1. Connect |graph| to a new [=implementation-defined=] graph implementation |graphImpl| given |graph|. 1. Store a reference to |graphImpl| in |graph|.{{MLGraph/[[implementation]]}}. 1. Make a request to the underlying platform to initialize the graph: - 1. For each |operand| in |outputs|: - 1. If running the validate MLOperand given |operand| and [=this=] returns `false`, then [=exception/throw=] a {{TypeError}} and stop. + 1. [=map/For each=] |operand| in |outputs|: + 1. If validating MLOperand given |operand| and [=this=] returns `false`, then [=exception/throw=] a {{TypeError}} and stop. 1. If |operand| was created as an input by the underlying platform: 1. If |operand|.{{MLOperand/[[name]]}}] is not unique for |graphImpl|, then [=exception/throw=] a {{TypeError}} and stop. 1. Add |operand|.{{MLOperand/[[descriptor]]}} to |graph|.{{MLGraph/[[inputDescriptors]]}}[|operand|.{{MLOperand/[[name]]}}]. @@ -1696,10 +1695,10 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. 1. [=Assert=]: the type of |descriptor| is {{MLOperandDescriptor}}. 1. If the [=byte length=] of |descriptor| is not supported by the underlying platform, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If the [=check dimensions=] steps given |descriptor|.{{MLOperandDescriptor/type}} and |descriptor|.{{MLOperandDescriptor/dimensions}} return `false`, throw a "{{DataError}}" {{DOMException}} and stop. - 1. If invoking validate buffer with descriptor given |bufferView| and |descriptor| return `false`, then [=exception/throw=] a {{TypeError}} and stop. + 1. If validating buffer with descriptor given |bufferView| and |descriptor| return `false`, then [=exception/throw=] a {{TypeError}} and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |operand| be the result of invoking the create MLOperand steps with [=this=] and |descriptor|. - 1. Let |bytes| be the result of invoking [[=get a copy of the bytes held by the buffer source=]] given |bufferView|. + 1. Let |operand| be the result of creating an MLOperand given [=this=] and |descriptor|. + 1. Let |bytes| be the result of invoking the [[=get a copy of the bytes held by the buffer source=]] steps given |bufferView|. 1. Make a request to the underlying platform to: 1. Create an [=implementation-defined=] platform operand |constantImpl| to represent a constant, given |descriptor|. 1. Store a reference of |constantImpl| in |operand|.{{MLOperand/[[operand]]}}. @@ -1734,7 +1733,7 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. In the case of a scalar constant, |descriptor|.{{MLOperandDescriptor/dimensions}} is ignored.
1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |operand| be the result of invoking the create MLOperand steps with [=this=] and |descriptor|. + 1. Let |operand| be the result of creating an MLOperand given [=this=] and |descriptor|. 1. Make a request to the underlying platform to: 1. Create an [=implementation-defined=] platform operand |constantImpl| to represent a constant, given |descriptor|. 1. Store a reference of |constantImpl| in |operand|.{{MLOperand/[[operand]]}}. @@ -1801,14 +1800,12 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input|, |mean| and |variance| is {{MLOperand}}. 1. [=Assert=]: the type of |mean| is - 1. To validate |mean|, run the following substeps: - 1. If |mean|.{{MLOperand/[[descriptor]]}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not equal with |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} from which the dimension represented by |options|.axis is removed, then [=exception/throw=] a {{TypeError}} and abort these steps. - 1. To validate |options|, run these substeps: - 1. If |options|.axis is not a number between 0 and the rank of |input|, then [=exception/throw=] a {{TypeError}} and abort these steps. - 1. If |input| is a 4-D tensor of the *"nchw"* layout, set |options|.axis to 1. - 1. If |input| is a 4-D tensor of the *"nhwc"* layout, set |options|.axis to 3. + 1. If |mean|.{{MLOperand/[[descriptor]]}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not equal with |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} from which the dimension represented by |options|.axis is removed, then [=exception/throw=] a {{TypeError}} and abort these steps. + 1. If |options|.axis is not a number between 0 and the rank of |input|, then [=exception/throw=] a {{TypeError}} and abort these steps. + 1. If |input| is a 4-D tensor of the *"nchw"* layout, set |options|.axis to 1. + 1. If |input| is a 4-D tensor of the *"nhwc"* layout, set |options|.axis to 3. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the create MLOperand steps with [=this=] and |descriptor|, that may use the same underlying data as |input|. + 1. Let |output| be the result of creating an MLOperand given [=this=] and |descriptor|, that may use the same underlying data as |input|. 1. Make a request to the underlying platform to initialize the batch normalization: 1. Create an [=implementation-defined=] platform operator |batchNormImpl| for this method, given |input|, |mean|, |variance| and |options|. 1. If |options|.activation [=map/exists=],register it as activation to |batchNormImpl|. @@ -1907,9 +1904,9 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |operand| is {{MLOperand}}. - 1. If running the check clamp options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. + 1. If running the check clamp options steps given |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the copy MLOperand steps given |operand|. + 1. Let |output| be the result of copying an MLOperand given |operand|. 1. Make a request to the underlying platform to: 1. Create an [=implementation-defined=] platform operator |clampImpl| for this method, given |options|.{{MLClampOptions/minValue}} and |options|.{{MLClampOptions/maxValue}}. 1. Store a reference of |clampImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -1936,8 +1933,8 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/clamp(options)}} method steps are:
- 1. If running the check clamp options steps with |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. - 1. Let |op| be the result of invoking the create MLActivation steps with [=this=], `"clamp"` and |options|. + 1. If running the check clamp options steps given |options| returns `false`, then [=exception/throw=] a {{TypeError}} and abort these steps. + 1. Let |op| be the result of creating an MLActivation given [=this=], `"clamp"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|.
@@ -1978,9 +1975,9 @@ partial interface MLGraphBuilder { 1. If |axis| is greater than or equal to the rank of |inputs|, fail. 1. Let |desc| be |inputs|[0].{{MLOperand/[[descriptor]]}}. 1. Let |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] be `0`. - 1. For each |index| between 0 and the rank of |inputs|: - 1. If running validate MLOperand given |inputs|[|index|] and [=this=] returns `false`, then fail. - 1. For each |dim| between 0 and the rank of |inputs|[|index|]: + 1. [=map/For each=] |index| between 0 and the rank of |inputs|: + 1. If validating MLOperand given |inputs|[|index|] and [=this=] returns `false`, then fail. + 1. [=map/For each=] |dim| between 0 and the rank of |inputs|[|index|]:
If the shape of each corresponding dimension and type of the operands, except for those of the dimension given by |axis|, is not the same, fail.
@@ -1988,7 +1985,7 @@ partial interface MLGraphBuilder { 1. If |inputs|[|dim|].{{MLOperandDescriptor/type}} is not equal to |inputs|[0].{{MLOperandDescriptor/type}}. 1. If |dim| is equal to |axis|, add to |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] the value of |inputs|[|index|].{{MLOperandDescriptor/dimensions}}[|dim|]. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Let |output| be the result of creating an MLOperand given [=this=] and |desc|. 1. Make a request to the underlying platform to: 1. Create an [=implementation-defined=] platform operator |concatImpl| for this method, given |inputs| and |axis|. 1. Store a reference of |concatImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -2131,12 +2128,12 @@ partial interface MLGraphBuilder { 1. If |input_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |filter_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If the type of |input| and |filter| is not the same, then [=exception/throw=] a {{TypeError}} and stop. - 1. If |options|.{{MLConv2dOptions/padding}} does not [=map/exist=], set it to `[0, 0, 0, 0]`. + 1. If |options|.{{MLConv2dOptions/padding}} does not [=map/exist=], set it to `« 0, 0, 0, 0 »`. 1. Else if |options|.{{MLConv2dOptions/padding}}.length is not 4, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLConv2dOptions/strides}} does not [=map/exist=], set it to `[1, 1]`. + 1. If |options|.{{MLConv2dOptions/strides}} does not [=map/exist=], set it to `« 1, 1 »`. 1. Else if |options|.{{MLConv2dOptions/strides}}.length is not `2`, then [=exception/throw=] a {{TypeError}} and stop. 1. If any element in |options|.{{MLConv2dOptions/strides}} is equal to 0, then [=exception/throw=] a {{TypeError}} and stop. - 1. If |options|.{{MLConv2dOptions/dilations}} does not [=map/exist=], set it to `[1, 1]`. + 1. If |options|.{{MLConv2dOptions/dilations}} does not [=map/exist=], set it to `« 1, 1 »`. 1. Else if |options|.{{MLConv2dOptions/dilations}}.length is not `2`, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options|.{{MLConv2dOptions/autoPad}} does not [=map/exist=], set it to `"explicit"`. 1. If |options|.{{MLConv2dOptions/groups}} is 0, then throw a "{{DataError}}" {{DOMException}} and stop. @@ -2154,7 +2151,7 @@ partial interface MLGraphBuilder { 1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to |output_shape|. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Let |output| be the result of creating an MLOperand given [=this=] and |desc|. 1. Make a request to the underlying platform to: 1. Create an [=implementation-defined=] platform operator |conv2dImpl| for this method, given |options| and |filter|. 1. If |options|.{{MLConv2dOptions/activation}} [=map/exists=],register it as activation to |conv2dImpl|. @@ -2306,14 +2303,14 @@ partial interface MLGraphBuilder { 1. If |input_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |filter_size| is not `4`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. 
If the type of |input| and |filter| is not the same, then [=exception/throw=] a {{TypeError}} and stop. - 1. If |options|.{{MLConvTranspose2dOptions/padding}} does not [=map/exist=], set it to `[0, 0, 0, 0]`. + 1. If |options|.{{MLConvTranspose2dOptions/padding}} does not [=map/exist=], set it to `« 0, 0, 0, 0 »`. 1. Else if |options|.{{MLConvTranspose2dOptions/padding}}.length is not 4, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLConvTranspose2dOptions/strides}} does not [=map/exist=], set it to `[1, 1]`. + 1. If |options|.{{MLConvTranspose2dOptions/strides}} does not [=map/exist=], set it to `« 1, 1 »`. 1. Else if |options|.{{MLConvTranspose2dOptions/strides}}.length is not `2`, then [=exception/throw=] a {{TypeError}} and stop. 1. If any element in |options|.{{MLConv2dOptions/strides}} is equal to 0, then [=exception/throw=] a {{TypeError}} and stop. - 1. If |options|.{{MLConvTranspose2dOptions/dilations}} does not [=map/exist=], set it to `[1, 1]`. + 1. If |options|.{{MLConvTranspose2dOptions/dilations}} does not [=map/exist=], set it to `« 1, 1 »`. 1. Else if |options|.{{MLConvTranspose2dOptions/dilations}}.length is not `2`, then [=exception/throw=] a {{TypeError}} and stop. - 1. If |options|.{{MLConvTranspose2dOptions/outputPadding}} does not [=map/exist=], set it to `[0, 0]`. + 1. If |options|.{{MLConvTranspose2dOptions/outputPadding}} does not [=map/exist=], set it to `« 0, 0 »`. 1. Else if |options|.{{MLConvTranspose2dOptions/outputPadding}}.length is not `2`, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options|.{{MLConvTranspose2dOptions/outputSizes}} [=map/exists=]: 1. If |options|.{{MLConvTranspose2dOptions/outputSizes}}.length is not `2`, then [=exception/throw=] a {{TypeError}} and stop. @@ -2332,7 +2329,7 @@ partial interface MLGraphBuilder { 1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to |output_shape|. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Let |output| be the result of creating an MLOperand given [=this=] and |desc|. 1. Make a request to the underlying platform to: 1. Create an [=implementation-defined=] platform operator |convTranspose2dImpl| for this method, given |options| and |filter|. 1. If |options|.{{MLConvTranspose2dOptions/activation}} [=map/exists=],register it as activation to |convTranspose2dImpl|. @@ -2397,7 +2394,7 @@ partial interface MLGraphBuilder { 1. Let |descriptor|.{{MLOperandDescriptor/dimensions}} be the result of running the [=MLGraphBuilder/broadcast-shapes=] steps given |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. If that throws an error, re-throw the error and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |descriptor|. + 1. Let |output| be the result of creating an MLOperand given [=this=] and |descriptor|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the binary operation |op|, given |a| and |b|. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. 
@@ -2513,7 +2510,7 @@ partial interface MLGraphBuilder { 1. Let |kind| be `"output"`. 1. Let |descriptor| be a new {{MLOperandDescriptor}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the unary operation |op|. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -2628,7 +2625,7 @@ partial interface MLGraphBuilder {
1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the ELU operation, given |options|. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -2655,7 +2652,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/elu(options)}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with [=this=], `"elu"` and |options|. + 1. Let |op| be the result of creating an MLActivation given [=this=], `"elu"` and |options|. 1. Return |op|.
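Non-normative illustration: elu() exists both as an operator applied directly to an {{MLOperand}} and as an {{MLActivation}} that other operations can fuse. The operand shapes and the conv2d() options in this sketch are illustrative assumptions.

    // Non-normative sketch; 'builder' is an MLGraphBuilder, shapes are illustrative.
    const x = builder.input('x', {type: 'float32', dimensions: [1, 3, 224, 224]});
    const filter = builder.constant(
        {type: 'float32', dimensions: [8, 3, 3, 3]},
        new Float32Array(8 * 3 * 3 * 3));
    const immediate = builder.elu(x, {alpha: 1.0});      // operator form, applied to x now
    const activation = builder.elu({alpha: 1.0});        // MLActivation form, applied later
    const y = builder.conv2d(x, filter, {activation});   // fused into the convolution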
@@ -2729,7 +2726,7 @@ partial interface MLGraphBuilder { 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [|shapeA|[0], |shapeB|[1]]. 1. Set |desc|.{{MLOperandDescriptor/type}} to |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Let |output| be the result of creating an MLOperand given [=this=] and |desc|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the GEMM operation, given |options|. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -3010,7 +3007,7 @@ partial interface MLGraphBuilder { 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |input|.{{MLOperandDescriptor/dimensions}}[0], |hiddenSize| ]. 1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Let |output| be the result of creating an MLOperand given [=this=] and |desc|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for `"gruCell"`, given |weight|, |recurrentWeight|, |hiddenState|, |hiddenSize| and |options| as parameters. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -3181,7 +3178,7 @@ partial interface MLGraphBuilder {
The {{MLGraphBuilder/hardSigmoid(input, options)}} method steps are: 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the hard sigmoid operation, given |options|. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -3206,7 +3203,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/hardSigmoid(options)}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with [=this=], `"hardSigmoid"` and |options|. + 1. Let |op| be the result of creating an MLActivation given [=this=], `"hardSigmoid"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|.
@@ -3258,7 +3255,7 @@ partial interface MLGraphBuilder {
1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the hard-swish operation. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -3284,7 +3281,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/hardSwish()}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with [=this=] and `"hardSwish"`. + 1. Let |op| be the result of creating an MLActivation given [=this=] and `"hardSwish"`. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|.
@@ -3348,7 +3345,7 @@ The {{MLInstanceNormalizationOptions}} members are: 1. If the rank of |options|.{{MLInstanceNormalizationOptions/bias}} is not equal to the size of the channel dimension of |input|, then throw a "{{DataError}}" {{DOMException}} and stop. 1. Otherwise if |options|.{{MLInstanceNormalizationOptions/layout}} is not one of {{MLInputOperandLayout}}, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the instance normalization operation, given |options|. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -3451,7 +3448,7 @@ partial interface MLGraphBuilder {
1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the Leaky RELU operation, given |options|. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -3477,7 +3474,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/leakyRelu(options)}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with [=this=], `"leakyRelu"` and |options|. + 1. Let |op| be the result of creating an MLActivation given [=this=], `"leakyRelu"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|.
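Non-normative illustration: leakyRelu() scales only the negative part of its input by alpha. The values below are illustrative.

    // Non-normative sketch: with alpha = 0.1, an input of [-2, 0, 3]
    // yields [-0.2, 0, 3]; omitting alpha is assumed to use the default of 0.01.
    const x = builder.input('x', {type: 'float32', dimensions: [3]});
    const y = builder.leakyRelu(x, {alpha: 0.1});
    const activation = builder.leakyRelu({alpha: 0.1});  // MLActivation form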
@@ -3542,7 +3539,7 @@ partial interface MLGraphBuilder {
1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the linear operation, given |options|. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -3568,7 +3565,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/linear(options)}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with [=this=], `"linear"` and |options|. + 1. Let |op| be the result of creating an MLActivation given [=this=], `"linear"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|.
@@ -3701,10 +3698,10 @@ partial interface MLGraphBuilder { 1. Let |desc| a new {{MLOperandDescriptor}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |nume_directions|, |batch_size|, |hiddenSize| ]. 1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. - 1. Let |output0| be the result of invoking the create MLOperand steps given [=this=] and |desc|. - 1. Let |output1| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Let |output0| be the result of creating an MLOperand given [=this=] and |desc|. + 1. Let |output1| be the result of creating an MLOperand given [=this=] and |desc|. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |steps|, |nume_directions|, |batch_size|, |hiddenSize| ]. - 1. Let |output2| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Let |output2| be the result of creating an MLOperand given [=this=] and |desc|. 1. Let |output| be the array [ |output0|, |output1|, |output2 ]. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the LSTM operation, given |weight|, |recurrentWeight|, |steps|, |hiddenSize| and |options|. @@ -3881,8 +3878,8 @@ partial interface MLGraphBuilder { 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |batch_size|, |hiddenSize| ]. 1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output0| be the result of invoking the create MLOperand steps given [=this=] and |desc|. - 1. Let |output1| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Let |output0| be the result of creating an MLOperand given [=this=] and |desc|. + 1. Let |output1| be the result of creating an MLOperand given [=this=] and |desc|. 1. Let |output| be the array [ |output0|, |output1| ]. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the LSTM cell operation, given |weight|, |recurrentWeight|, |hiddenState|, |cellState|, |hiddenSize| and |options|. @@ -4045,11 +4042,11 @@ partial interface MLGraphBuilder {
1. Let |shapeA| be |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeA| the size of |shapeA|. 1. Let |shapeB| be |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeB| the size of |shapeB|. - 1. If |sizeA| and |sizeB| is `1`, return `[ 1 ]`. + 1. If |sizeA| and |sizeB| is `1`, return `« 1 »`. 1. If | sizeA| is `1` and |sizeB| is not, then insert `1` in the front of |shapeA| to become [ 1 | |shapeA| ] and let |sizeA| be `2`. 1. If | sizeB| is `1` and |sizeA| is not, then insert `1` in the front of |shapeB| to become [ 1 | |shapeB| ] and let |sizeB| be `2`. 1. Let |shape| be an array whose size |size| is the maximum of |sizeA| and |sizeB|. - 1. For each |index| between 0 and |size|: + 1. [=map/For each=] |index| between 0 and |size|: 1. Set |shape|[|index|] to the maximum of |shapeA|[|index|] and |shapeB|[|index|]. 1. Return |shape|.
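Non-normative illustration: the shape calculation above corresponds to matmul() broadcasting the leading (batch) dimensions of its inputs. The shapes in this sketch are illustrative.

    // Non-normative sketch of the resulting shapes.
    const a = builder.input('a', {type: 'float32', dimensions: [4, 2, 3]});
    const b = builder.input('b', {type: 'float32', dimensions: [1, 3, 5]});
    // The batch dimension broadcasts (max(4, 1) = 4) and the trailing two
    // dimensions follow the matrix product rule, giving an output shape of [4, 2, 5].
    const c = builder.matmul(a, b);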
@@ -4062,10 +4059,10 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |a| and |b| is {{MLOperand}}. 1. Let |desc| a new {{MLOperandDescriptor}}. - 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to the result of invoking the calculate matmul output sizes given |a| and |b|. + 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to the result of invoking the calculate matmul output sizes steps given |a| and |b|. 1. Set |desc|.{{MLOperandDescriptor/type}} to |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Let |output| be the result of creating an MLOperand given [=this=] and |desc|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the matrix multiplication operation. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -4149,9 +4146,9 @@ partial interface MLGraphBuilder { 1. If |beginningPadding| or |endingPadding| is not a sequence of {{unsigned long}}, then [=exception/throw=] a {{TypeError}} and stop. 1. If |options|.{{MLPadOptions/mode}} is not one of {{MLPaddingMode}}, then [=exception/throw=] a {{TypeError}} and stop. 1. Let |desc| be a copy of |input|.{{MLOperand/[[descriptor]]}}. - 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to the result of invoking the calculate padding output sizes given |input|, |beginningPadding| and |endingPadding|. + 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to the result of invoking the calculate padding output sizes steps given |input|, |beginningPadding| and |endingPadding|. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Let |output| be the result of creating an MLOperand given [=this=] and |desc|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the padding operation, given |beginningPadding|, |endingPadding| and |options|. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -4328,20 +4325,20 @@ partial interface MLGraphBuilder { 1. [=Assert=]: |op| is one of "averagePool2d", "l2Pool2d", "maxPool2d". 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If the length of of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 4, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLPool2dOptions/outputSizes}} [=map/exists=], or if |options|.{{MLPool2dOptions/padding}} does not [=map/exist=], set |options|.{{MLPool2dOptions/padding}} to `[0, 0, 0, 0]`. + 1. If |options|.{{MLPool2dOptions/outputSizes}} [=map/exists=], or if |options|.{{MLPool2dOptions/padding}} does not [=map/exist=], set |options|.{{MLPool2dOptions/padding}} to `« 0, 0, 0, 0 »`. 1. If the length of of |padding|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 4, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLPool2dOptions/strides}} does not [=map/exist=], set |options|.{{MLPool2dOptions/strides}} to `[1, 1]`. + 1. If |options|.{{MLPool2dOptions/strides}} does not [=map/exist=], set |options|.{{MLPool2dOptions/strides}} to `« 1, 1 »`. 1. If the length of of |options|.{{MLPool2dOptions/strides}} is not 2, then throw a "{{DataError}}" {{DOMException}} and stop. 1. 
If any value in |options|.{{MLPool2dOptions/strides}} is not greater than 0, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLPool2dOptions/dilations}} does not [=map/exist=], set |options|.{{MLPool2dOptions/dilations}} to `[1, 1]`. + 1. If |options|.{{MLPool2dOptions/dilations}} does not [=map/exist=], set |options|.{{MLPool2dOptions/dilations}} to `« 1, 1 »`. 1. If the length of of |options|.{{MLPool2dOptions/dilations}} is not 2, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If any value in |options|.{{MLPool2dOptions/dilations}} is not greater than 0, then throw a "{{DataError}}" {{DOMException}} and stop. - 1. If |options|.{{MLPool2dOptions/autoPad}} is not `"explicit"`, set |options|.{{MLPool2dOptions/padding}} to `[0, 0, 0, 0]`. + 1. If |options|.{{MLPool2dOptions/autoPad}} is not `"explicit"`, set |options|.{{MLPool2dOptions/padding}} to `« 0, 0, 0, 0 »`. 1. Let |desc| be a copy of |input|.{{MLOperand/[[descriptor]]}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Make a request to the underlying platform to: 1. Calculate the output dimensions given |input| and |options|. Let |desc|.{{MLOperandDescriptor/dimensions}} be the result of that. - 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Let |output| be the result of creating an MLOperand given [=this=] and |desc|. 1. Let |opImpl| be an [=implementation-defined=] platform operator for the |op| pooling operation, given |options|. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. @@ -4406,7 +4403,7 @@ partial interface MLGraphBuilder { 1. Let |descriptor|.{{MLOperandDescriptor/dimensions}} be the result of running the [=MLGraphBuilder/broadcast-shapes=] steps given |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |slope|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. If that throws an error, re-throw the error and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |descriptor|. + 1. Let |output| be the result of creating an MLOperand given [=this=] and |descriptor|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the PreLU operation, given |slope|. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -4489,7 +4486,7 @@ partial interface MLGraphBuilder { 1. [=Assert=]: |op| is one of "reduceL1", "reduceL2", "reduceLogSum", "reduceLogSumExp", "reduceMax", "reduceMean", "reduceMin", "reduceProduct", "reduceSum", "reduceSumSquare". 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the |op| reduce operation, given |options|. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -4596,7 +4593,7 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the ReLU operation. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -4622,7 +4619,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/relu()}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with [=this=] and `"relu"`. + 1. Let |op| be the result of creating an MLActivation given [=this=] and `"relu"`. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|.
@@ -4687,11 +4684,11 @@ partial interface MLGraphBuilder {
1. If |options|.{{MLResample2dOptions/mode}} [=map/exists=], and if its value is not one of `"nearest-neighbor"` or `"linear"`, return `false`. - 1. If |options|.{{MLResample2dOptions/scales}} does not [=map/exist=], set it to to `[1.0, 1.0]`. + 1. If |options|.{{MLResample2dOptions/scales}} does not [=map/exist=], set it to `« 1.0, 1.0 »`. 1. Otherwise, if any of its values is not greater than `0`, return `false`. 1. If |options|.{{MLResample2dOptions/sizes}} [=map/exists=], and if its size is not `2`, or if any of its values is not greater than `0`, return `false`. - 1. If |options|.{{MLResample2dOptions/axes}} does not [=map/exists=], set it to `[2, 3]`. + 1. If |options|.{{MLResample2dOptions/axes}} does not [=map/exist=], set it to `« 2, 3 »`. - 1. Otherwise, if its value is not one of `[0, 1], [1, 2], [2, 3]`, return `false`. + 1. Otherwise, if its value is not one of `« 0, 1 », « 1, 2 », « 2, 3 »`, return `false`. 1. Return `true`.
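Non-normative illustration: when omitted, axes default to « 2, 3 » (the spatial dimensions of an *"nchw"* tensor) and scales default to « 1.0, 1.0 ». The option values and shapes below are illustrative.

    // Non-normative sketch of the two ways to request a 2x spatial upsample.
    const x = builder.input('x', {type: 'float32', dimensions: [1, 1, 2, 2]});
    const byScales = builder.resample2d(x, {mode: 'linear', scales: [2.0, 2.0]});
    const bySizes = builder.resample2d(x, {mode: 'nearest-neighbor', sizes: [4, 4]});
    // Both produce an output of shape [1, 1, 4, 4] over the default axes [2, 3].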
@@ -4722,7 +4719,7 @@ partial interface MLGraphBuilder { 1. Let |desc| be the result of running the resample output sizes steps given |options|. 1. If that throws an error, re-throw the error and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Let |output| be the result of creating an MLOperand given [=this=] and |desc|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the resample 2D operation, given |options|. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -4763,9 +4760,9 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. Let |outputShape| be an empty array of {{unsigned long}}. - 1. If |newShape| is a scalar [=number=], set |outputShape| to `[ 1 ]`. + 1. If |newShape| is a scalar [=number=], set |outputShape| to `« 1 »`. 1. Otherwise, if |newShape| is an array of {{unsigned long}}: - 1. If the size of |newShape| is `0`, set |outputShape| to `[ 1 ]` (reshaping to scalar). + 1. If the size of |newShape| is `0`, set |outputShape| to `« 1 »` (reshaping to scalar). 1. If |newShape| contains more than one `null` value, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If any value in |newShape| is `0`, then throw a "{{DataError}}" {{DOMException}} and stop. 1. Let |inputElementCount| be the product of all elements in |inputs|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. @@ -4775,7 +4772,7 @@ partial interface MLGraphBuilder { 1. Let |desc| be a copy of |input|.{{MLOperand/[[descriptor]]}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to |newShape|. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the create MLOperand steps given [=this=] and |desc|. + 1. Let |output| be the result of creating an MLOperand given [=this=] and |desc|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the reshape operation. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -4830,7 +4827,7 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the sigmoid operation. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -4856,7 +4853,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/sigmoid()}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with [=this=] and `"sigmoid"`. + 1. Let |op| be the result of creating an MLActivation given [=this=] and `"sigmoid"`. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|.
@@ -4890,7 +4887,7 @@ partial interface MLGraphBuilder { Further validation of |starts| and |sizes| given |input| is left [=implementation-defined=].
1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the slice operation, given |starts| and |sizes|. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -4950,7 +4947,7 @@ partial interface MLGraphBuilder { 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If the length of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 2, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the softmax operation. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -4975,7 +4972,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/softmax()}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with and `"softmax"`. + 1. Let |op| be the result of creating an MLActivation given [=this=] and `"softmax"`. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|.
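Non-normative illustration: softmax() takes a 2-D input, and its {{MLActivation}} form can be passed wherever an activation option is accepted. Shapes are illustrative.

    // Non-normative sketch; each row of the 2-D output sums to 1.
    const logits = builder.input('logits', {type: 'float32', dimensions: [1, 10]});
    const probabilities = builder.softmax(logits);   // same shape as logits
    const activation = builder.softmax();            // MLActivation form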
@@ -5037,7 +5034,7 @@ partial interface MLGraphBuilder {
1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the softplus operation, given |options|. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -5063,7 +5060,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/softplus(options)}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with [=this=], `"softplus"` and |options|. + 1. Let |op| be the result of creating an MLActivation given [=this=], `"softplus"` and |options|. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|.
@@ -5108,7 +5105,7 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the softsign operation, given |options|. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -5133,7 +5130,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/softsign()}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with [=this=] and `"softsign"`. + 1. Let |op| be the result of creating an MLActivation given [=this=] and `"softsign"`. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|.
@@ -5178,7 +5175,7 @@ partial interface MLGraphBuilder { 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If |splits| is not {{unsigned long}} or a sequence of {{unsigned long}}, then [=exception/throw=] a {{TypeError}} and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the split operation, given |splits| and |options|. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -5256,7 +5253,7 @@ partial interface MLGraphBuilder { 1. Let |oneDimIndex| be |options|.{{MLSqueezeOptions/axes}}[|index|]. 1. If |dimensions|[|oneDimIndex|] is not `1`, then [=exception/throw=] a {{TypeError}} and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the squeeze operation, given |options|. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -5309,7 +5306,7 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the hyperbolic tangent operation. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. @@ -5335,7 +5332,7 @@ partial interface MLGraphBuilder { The {{MLGraphBuilder/tanh()}} method steps are:
- 1. Let |op| be the result of invoking the create MLActivation steps with [=this=] and `"tanh"`. + 1. Let |op| be the result of creating an MLActivation given [=this=] and `"tanh"`. 1. If that throws an error, re-throw the error and abort these steps. 1. Return |op|.
@@ -5384,7 +5381,7 @@ partial interface MLGraphBuilder { 1. If the values in |options|.{{MLTransposeOptions/permutation}} are not between `0` and the rank of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} minus `1`, then [=exception/throw=] a {{TypeError}} and stop. 1. If the values in |options|.{{MLTransposeOptions/permutation}} contain duplicate value, then [=exception/throw=] a {{TypeError}} and stop. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. - 1. Let |output| be the result of invoking the copy MLOperand steps given |input|. + 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the transpose operation, given |options|. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. From e69e71c852eb623a5ca40dfb8967e578ac8e081f Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Thu, 17 Aug 2023 11:51:46 +0300 Subject: [PATCH 079/112] Fix typos for #446 Signed-off-by: Zoltan Kis --- index.bs | 28 ++++++++++++++++------------ 1 file changed, 16 insertions(+), 12 deletions(-) diff --git a/index.bs b/index.bs index 570380af..1966de4f 100644 --- a/index.bs +++ b/index.bs @@ -1005,7 +1005,7 @@ The {{MLOperand}} objects are created by the methods of {{MLGraphBuilder}}, inte
1. [=Assert=]: the type of |operand|.{{MLOperand/[[builder]]}} is {{MLGraphBuilder}}. - 1. If |builder| [=map/exists=]] and is not equal to |operand|.{{MLOperand/[[builder]]}}, return `false`. + 1. If |builder| is not equal to |operand|.{{MLOperand/[[builder]]}}, return `false`. 1. Let |desc| be |operand|.{{MLOperand/[[descriptor]]}}. 1. If |desc|.{{MLOperandDescriptor/dimensions}} [=map/exists=] and invoking check dimensions given |desc|.{{MLOperandDescriptor/dimensions}} and |desc|.{{MLOperandDescriptor/type}} returns `false`, then return `false`. 1. Return `true`. @@ -1799,7 +1799,6 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input|, |mean| and |variance| is {{MLOperand}}. - 1. [=Assert=]: the type of |mean| is 1. If |mean|.{{MLOperand/[[descriptor]]}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not equal with |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} from which the dimension represented by |options|.axis is removed, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. If |options|.axis is not a number between 0 and the rank of |input|, then [=exception/throw=] a {{TypeError}} and abort these steps. 1. If |input| is a 4-D tensor of the *"nchw"* layout, set |options|.axis to 1. @@ -2507,7 +2506,6 @@ partial interface MLGraphBuilder {
1. [=Assert=]: |op| is one of "abs", "ceil", "cos", "exp", "floor", "log", "neg", "sin", "tan". 1. [=Assert=]: the type of |input| is {{MLOperand}}. - 1. Let |kind| be `"output"`. 1. Let |descriptor| be a new {{MLOperandDescriptor}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of copying an MLOperand given |input|. @@ -2616,7 +2614,7 @@ partial interface MLGraphBuilder { - *alpha*: a {{float}} scalar multiplier, default to 1. **Returns:** - - an {{MLOperand}}. The output tensor of the same shape as *x*. + - an {{MLOperand}}. The output tensor of the same shape as *input*.
@@ -3162,7 +3160,7 @@ partial interface MLGraphBuilder { The default value is `0.2`. : beta :: - A {{float}} point scalar addition. + A {{float}} scalar addition. The default value is `0.5`. @@ -3175,8 +3173,13 @@ partial interface MLGraphBuilder { **Returns:** - an {{MLOperand}}. The output tensor of the same shape as *input*.
-
+ +
+ The {{MLGraphBuilder/hardSigmoid(input, options)}} method steps are: + +
+ 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: @@ -3187,7 +3190,8 @@ partial interface MLGraphBuilder { 1. Connect |input|.{{MLOperand/[[operand]]}} as input to |opImpl|. 1. Connect |output|.{{MLOperand/[[operand]]}} as output to |opImpl|. 1. Return |output|. -
+
+ #### The {{MLGraphBuilder/hardSigmoid(options)}} method #### {#api-mlgraphbuilder-hardsigmoid-options}
@@ -4324,14 +4328,14 @@ partial interface MLGraphBuilder {
1. [=Assert=]: |op| is one of "averagePool2d", "l2Pool2d", "maxPool2d". 1. [=Assert=]: the type of |input| is {{MLOperand}}. - 1. If the length of of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 4, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If the length of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 4, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLPool2dOptions/outputSizes}} [=map/exists=], or if |options|.{{MLPool2dOptions/padding}} does not [=map/exist=], set |options|.{{MLPool2dOptions/padding}} to `« 0, 0, 0, 0 »`. - 1. If the length of of |padding|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 4, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If the length of |padding|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 4, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLPool2dOptions/strides}} does not [=map/exist=], set |options|.{{MLPool2dOptions/strides}} to `« 1, 1 »`. - 1. If the length of of |options|.{{MLPool2dOptions/strides}} is not 2, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If the length of |options|.{{MLPool2dOptions/strides}} is not 2, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If any value in |options|.{{MLPool2dOptions/strides}} is not greater than 0, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLPool2dOptions/dilations}} does not [=map/exist=], set |options|.{{MLPool2dOptions/dilations}} to `« 1, 1 »`. - 1. If the length of of |options|.{{MLPool2dOptions/dilations}} is not 2, then throw a "{{DataError}}" {{DOMException}} and stop. + 1. If the length of |options|.{{MLPool2dOptions/dilations}} is not 2, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If any value in |options|.{{MLPool2dOptions/dilations}} is not greater than 0, then throw a "{{DataError}}" {{DOMException}} and stop. 1. If |options|.{{MLPool2dOptions/autoPad}} is not `"explicit"`, set |options|.{{MLPool2dOptions/padding}} to `« 0, 0, 0, 0 »`. 1. Let |desc| be a copy of |input|.{{MLOperand/[[descriptor]]}}. @@ -4389,7 +4393,7 @@ partial interface MLGraphBuilder { - *slope*: an {{MLOperand}}. The slope tensor. Its shape is either the same as, or unidirectionally broadcastable to the shape of input tensor *input* according to [[!numpy-broadcasting-rule]]. **Returns:** - - an {{MLOperand}}. The output tensor of the same shape as *x*. + - an {{MLOperand}}. The output tensor of the same shape as *input*.
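Non-normative illustration: the prelu() slope operand is unidirectionally broadcastable to the input shape, so a single per-channel slope tensor is a common choice. Shapes and values in this sketch are illustrative assumptions.

    // Non-normative sketch; a slope of shape [32, 1, 1] broadcasts to [1, 32, 56, 56].
    const x = builder.input('x', {type: 'float32', dimensions: [1, 32, 56, 56]});
    const slope = builder.constant(
        {type: 'float32', dimensions: [32, 1, 1]},
        new Float32Array(32).fill(0.25));
    const y = builder.prelu(x, slope);   // output shape matches x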
From 33bb5e44b8924d9e38351fb3998ebf43ad90ad80 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Thu, 17 Aug 2023 21:43:51 +0300 Subject: [PATCH 080/112] Fix more typos Signed-off-by: Zoltan Kis --- index.bs | 76 ++++++++++++++++++++++++++++---------------------------- 1 file changed, 38 insertions(+), 38 deletions(-) diff --git a/index.bs b/index.bs index 1966de4f..8e9f7b44 100644 --- a/index.bs +++ b/index.bs @@ -1790,7 +1790,7 @@ partial interface MLGraphBuilder { - *variance*: an {{MLOperand}}. The 1-D tensor of the variance values of the input features across the batch whose length is equal to the size of the input dimension denoted by {{MLBatchNormalizationOptions/axis}}. - *options*: an optional {{MLBatchNormalizationOptions}}. Specifies the optional parameters of the operation. - **Returns:** an {{MLOperand}}. The batch-normalized N-D tensor of the same shape as the input tensor. + **Returns:** an {{MLOperand}}. The batch-normalized N-D tensor of the same shape as *input*.
@@ -2422,41 +2422,41 @@ partial interface MLGraphBuilder {
- The element-wise binary operation algorithms invoke the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation =] steps as follows. + The element-wise binary operation algorithms invoke the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation=] steps as follows.
The {{MLGraphBuilder/add(a, b)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation =] given "add", |a| and |b|. + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation=] given "add", |a| and |b|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/sub(a, b)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation =] given "sub", |a| and |b|. + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation=] given "sub", |a| and |b|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/mul(a, b)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation =] given "mul", |a| and |b|. + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation=] given "mul", |a| and |b|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/div(a, b)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation =] given "div", |a| and |b|. + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation=] given "div", |a| and |b|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/max(a, b)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation =] given "max", |a| and |b|. + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation=] given "max", |a| and |b|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/min(a, b)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation =] given "min", |a| and |b|. + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation=] given "min", |a| and |b|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/pow(a, b)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation =] given "pow", |a| and |b|. + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation=] given "pow", |a| and |b|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|.
@@ -2522,51 +2522,51 @@ partial interface MLGraphBuilder {
- The element-wise unary operation algorithms invoke the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation =] steps as follows. + The element-wise unary operation algorithms invoke the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation=] steps as follows.
The {{MLGraphBuilder/abs(input)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation =] given "abs" and |input|. + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation=] given "abs" and |input|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/ceil(input)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation =] given "ceil" and |input|. + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation=] given "ceil" and |input|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/cos(input)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation =] given "cos" and |input|. + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation=] given "cos" and |input|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/exp(input)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation =] given "exp" and |input|. + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation=] given "exp" and |input|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/floor(input)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation =] given "floor" and |input|. + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation=] given "floor" and |input|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/log(input)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation =] given "log" and |input|. + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation=] given "log" and |input|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/neg(input)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation =] given "neg" and |input|. + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation=] given "neg" and |input|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/sin(input)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation =] given "sin" and |input|. + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation=] given "sin" and |input|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/tan(input)}} steps are: - 1. 
Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation =] given "tan" and |input|. + 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation=] given "tan" and |input|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|.
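Non-normative illustration: the unary operations compose into larger expressions; here log(1 + exp(x)) is assembled from exp(), add() and log(). The scalar constant() overload and its default type are illustrative assumptions.

    // Non-normative sketch of composing unary and binary element-wise operations.
    const x = builder.input('x', {type: 'float32', dimensions: [2, 2]});
    const one = builder.constant(1.0);   // scalar constant, type assumed to default to float32
    const y = builder.log(builder.add(builder.exp(x), one));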
@@ -3333,7 +3333,7 @@ The {{MLInstanceNormalizationOptions}} members are: - *input*: an {{MLOperand}}. The input 4-D tensor. - *options*: an optional {{MLInstanceNormalizationOptions}}. The optional parameters of the operation. - **Returns:** an {{MLOperand}}. The instance-normalized 4-D tensor of the same shape as the input tensor. + **Returns:** an {{MLOperand}}. The instance-normalized 4-D tensor of the same shape as *input*.
@@ -4359,21 +4359,21 @@ partial interface MLGraphBuilder {
The {{MLGraphBuilder/averagePool2d(input, options)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/pooling-op | create pooling operation =] given `"averagePool2d"`, |input| and |options|. + 1. Let |output| be the result of running the [=MLGraphBuilder/pooling-op | create pooling operation=] given `"averagePool2d"`, |input| and |options|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|.
The {{MLGraphBuilder/l2Pool2d(input, options)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/pooling-op | create pooling operation =] given `"l2Pool2d"`, |input| and |options|. + 1. Let |output| be the result of running the [=MLGraphBuilder/pooling-op | create pooling operation=] given `"l2Pool2d"`, |input| and |options|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|.
The {{MLGraphBuilder/maxPool2d(input, options)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/pooling-op | create pooling operation =] given `"maxPool2d"`, |input| and |options|. + 1. Let |output| be the result of running the [=MLGraphBuilder/pooling-op | create pooling operation=] given `"maxPool2d"`, |input| and |options|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|.
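Non-normative illustration: omitting windowDimensions is assumed to pool over the whole spatial extent (global pooling), while the other options fall back to the defaults described above. Shapes are illustrative.

    // Non-normative sketch of local and global pooling on an "nchw" tensor.
    const x = builder.input('x', {type: 'float32', dimensions: [1, 64, 28, 28]});
    const local = builder.maxPool2d(x, {windowDimensions: [2, 2], strides: [2, 2]});  // [1, 64, 14, 14]
    const global = builder.averagePool2d(x);                                          // [1, 64, 1, 1]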
@@ -4507,52 +4507,52 @@ partial interface MLGraphBuilder { The following reduce algorithms are supported. The {{MLGraphBuilder/reduceL1(input, options)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation =] given "reduceL1", |input| and |options|. + 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation=] given "reduceL1", |input| and |options|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/reduceL2(input, options)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation =] given "reduceL2", |input| and |options|. + 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation=] given "reduceL2", |input| and |options|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/reduceLogSum(input, options)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation =] given "reduceLogSum", |input| and |options|. + 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation=] given "reduceLogSum", |input| and |options|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/reduceLogSumExp(input, options)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation =] given "reduceLogSumExp", |input| and |options|. + 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation=] given "reduceLogSumExp", |input| and |options|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/reduceMax(input, options)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation =] given "reduceMax", |input| and |options|. + 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation=] given "reduceMax", |input| and |options|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/reduceMean(input, options)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation =] given "reduceMean", |input| and |options|. + 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation=] given "reduceMean", |input| and |options|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/reduceMin(input, options)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation =] given "reduceMin", |input| and |options|. + 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation=] given "reduceMin", |input| and |options|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/reduceProduct(input, options)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation =] given "reduceProduct", |input| and |options|. + 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation=] given "reduceProduct", |input| and |options|. 1. 
If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/reduceSum(input, options)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation =] given "reduceSum", |input| and |options|. + 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation=] given "reduceSum", |input| and |options|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|. The {{MLGraphBuilder/reduceSumSquare(input, options)}} steps are: - 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation =] given "reduceSumSquare", |input| and |options|. + 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation=] given "reduceSumSquare", |input| and |options|. 1. If that throws an error, then re-throw the error and stop. 1. Return |output|.
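Non-normative illustration: each reduce variant shares the axes and keepDimensions options; the shapes noted in the comments below are illustrative.

    // Non-normative sketch of the reduce family on a [2, 3, 4] tensor.
    const x = builder.input('x', {type: 'float32', dimensions: [2, 3, 4]});
    const total = builder.reduceSum(x);                                    // all axes reduced to a scalar
    const mean = builder.reduceMean(x, {axes: [2]});                       // shape [2, 3]
    const kept = builder.reduceMax(x, {axes: [1], keepDimensions: true});  // shape [2, 1, 4]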
@@ -4587,7 +4587,7 @@ partial interface MLGraphBuilder { - *input*: an {{MLOperand}}. The input tensor. **Returns:** - - an {{MLOperand}}. The output tensor of the same shape as *x*. + - an {{MLOperand}}. The output tensor of the same shape as *input*.
@@ -4940,7 +4940,7 @@ partial interface MLGraphBuilder { - *input*: an {{MLOperand}}. The input 2-D tensor. **Returns:** - - an {{MLOperand}}. The output 2-D tensor that contains the softmax results, of the same shape as the input tensor. + - an {{MLOperand}}. The output 2-D tensor that contains the softmax results, of the same shape as *input*.
@@ -5029,7 +5029,7 @@ partial interface MLGraphBuilder { - *options*: an optional {{MLSoftplusOptions}}. The optional parameters of the operation. **Returns:** - - an {{MLOperand}}. The output tensor of the same shape as *x*. + - an {{MLOperand}}. The output tensor of the same shape as *input*.
@@ -5099,7 +5099,7 @@ partial interface MLGraphBuilder { - *input*: an {{MLOperand}}. The input tensor. **Returns:** - - an {{MLOperand}}. The output tensor of the same shape as *x*. + - an {{MLOperand}}. The output tensor of the same shape as *input*.
@@ -5300,7 +5300,7 @@ partial interface MLGraphBuilder { - *input*: an {{MLOperand}}. The input tensor. **Returns:** - - an {{MLOperand}}. The output tensor of the same shape as *x*. + - an {{MLOperand}}. The output tensor of the same shape as *input*.
From d6d8c3eb69b7699b4498256beff1c3a3c8de4ed0 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Fri, 18 Aug 2023 00:26:02 +0300 Subject: [PATCH 081/112] Remove 'and stop' after throwing. Fix throw clauses. Signed-off-by: Zoltan Kis --- index.bs | 466 +++++++++++++++++++++++++++---------------------------- 1 file changed, 233 insertions(+), 233 deletions(-) diff --git a/index.bs b/index.bs index 8e9f7b44..090dc9c7 100644 --- a/index.bs +++ b/index.bs @@ -814,7 +814,7 @@ Its default allowlist is 'self'. The {{ML/createContext(options)}} method steps are:
- 1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, return [=a new promise=] [=rejected=] with a "{{SecurityError}}" {{DOMException}} and abort these steps. + 1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, return [=a new promise=] [=rejected=] with a "{{SecurityError}}" {{DOMException}}. 1. Let |promise| be [=a new promise=]. 1. Return |promise| and run the following steps [=in parallel=]. 1. Run the create context steps given |options|: @@ -827,7 +827,7 @@ Its default allowlist is 'self'. 1. Set |context|.{{[[contextType]]}} to "[=default-context|default=]". 1. If |options|["{{deviceType}}"] [=map/exists=], then set |context|.{{[[deviceType]]}} to |options|["{{deviceType}}"]. Otherwise, set |context|.{{[[deviceType]]}} to "[=device-type-cpu|cpu=]". 1. If |options|["{{powerPreference}}"] [=map/exists=], then set |context|.{{[[powerPreference]]}} to |options|["{{powerPreference}}"]. Otherwise, set |context|.{{[[powerPreference]]}} to "[=power-preference-default|default=]". - 1. If validating MLContext given |context| return `false`, [=reject=] |promise| with a "{{NotSupportedError}}" {{DOMException}} and abort these steps. + 1. If validating MLContext given |context| returns `false`, [=reject=] |promise| with a "{{NotSupportedError}}" {{DOMException}}. 1. [=Resolve=] |promise| with |context|.
@@ -838,9 +838,9 @@ Its default allowlist is 'self'. The {{ML/createContextSync(options)}} method steps are:
- 1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, throw a "{{SecurityError}}" {{DOMException}} and abort these steps. + 1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, then [=exception/throw=] a "{{SecurityError}}" {{DOMException}}. 1. Let |context| be the result of running the create context steps given |options|. - 1. If validating MLContext given |context| return `false`, throw a "{{NotSupportedError}}" {{DOMException}} and abort these steps. + 1. If validating MLContext given |context| return `false`, then [=exception/throw=] a "{{NotSupportedError}}" {{DOMException}}. 1. Return |context|.
@@ -1055,12 +1055,12 @@ The {{MLActivation}} objects (including the ones passed as input to methods) are
1. [=Assert=]: the type of |builder| is {{MLGraphBuilder}}. - 1. If |name| is empty, then throw a "{{TypeError}}" and abort these steps. + 1. If |name| is empty, then [=exception/throw=] a "{{TypeError}}" and abort these steps. 1. Let |activation| be a new [=object=]. 1. Set |activation|.{{MLActivation/[[builder]]}} to |builder|. 1. Set |activation|.{{MLActivation/[[name]]}} to |name|. 1. If |options| is an [=object=], set |activation|.{{MLActivation/[[options]]}} to |options|. - 1. If any of the following sub-steps fail, throw an "{{OperationError}}" {{DOMException}} and stop. + 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Make a request to the underlying platform to: 1. Create an [=implementation-defined=] platform operator |opImpl| for the given |name| operation. 1. Store a reference of |opImpl| in |activation|.{{MLActivation/[[operator]]}}. @@ -1140,7 +1140,7 @@ When the {{[[contextType]]}} is set to [=default-context|default=] with the {{ML ### Synchronous Execution ### {#api-mlcontext-sync-execution} -Synchronously carries out the computational workload of a compiled graph {{MLGraph}} on the calling thread, which must be a worker thread, to produce results as defined by the operations in the graph. This method of execution requires an {{MLContext}} created with {{MLContextOptions}}. Otherwise, it throws an "{{OperationError}}" {{DOMException}}. +Synchronously carries out the computational workload of a compiled graph {{MLGraph}} on the calling thread, which must be a worker thread, to produce results as defined by the operations in the graph. This method of execution requires an {{MLContext}} created with {{MLContextOptions}}. Otherwise, it[=exception/throws=] an "{{OperationError}}" {{DOMException}}.
-Both {{MLGraphBuilder}}.{{MLGraphBuilder/build()}} and {{MLGraphBuilder}}.{{MLGraphBuilder/buildSync()}} methods compile the graph builder state up to the specified output operands into a compiled graph according to the type of {{MLContext}} that creates it. Since this operation can be costly in some machine configurations, the calling thread of the {{MLGraphBuilder}}.{{MLGraphBuilder/buildSync()}} method must only be a worker thread to avoid potential disruption of the user experience. When the {{[[contextType]]}} of the {{MLContext}} is set to [=default-context|default=], the compiled graph is initialized right before the {{MLGraph}} is returned. This graph initialization stage is important for optimal performance of the subsequent graph executions. See [[#api-mlcommandencoder-graph-initialization]] for more detail. +Both {{MLGraphBuilder}}.{{MLGraphBuilder/build()}} and {{MLGraphBuilder}}.{{MLGraphBuilder/buildSync()}} methods compile the graph builder state up to the specified output operands into a compiled graph according to the type of {{MLContext}} that creates it. Since this operation can be costly in some machine configurations, the calling thread of the {{MLGraphBuilder}}.{{MLGraphBuilder/buildSync()}} method must only be a worker thread to avoid potential disruption of the user experience. When the {{[[contextType]]}} of the {{MLContext}} is set to "[=default-context|default=]", the compiled graph is initialized right before the {{MLGraph}} is returned. This graph initialization stage is important for optimal performance of the subsequent graph executions. See [[#api-mlcommandencoder-graph-initialization]] for more detail.
{{MLBufferResourceView}} has the following members: @@ -1698,7 +1698,7 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. 1. If validating buffer with descriptor given |bufferView| and |descriptor| returns `false`, then [=exception/throw=] a {{TypeError}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |operand| be the result of creating an MLOperand given [=this=] and |descriptor|. - 1. Let |bytes| be the result of invoking the [[=get a copy of the bytes held by the buffer source=]] steps given |bufferView|. + 1. Let |bytes| be the result of invoking the [=get a copy of the bytes held by the buffer source=] steps given |bufferView|. 1. Make a request to the underlying platform to: 1. Create an [=implementation-defined=] platform operand |constantImpl| to represent a constant, given |descriptor|. 1. Store a reference of |constantImpl| in |operand|.{{MLOperand/[[operand]]}}. @@ -4148,7 +4148,6 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input| is {{MLOperand}}. - 1. If |beginningPadding| or |endingPadding| is not a sequence of {{unsigned long}}, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLPadOptions/mode}} is not one of {{MLPaddingMode}}, then [=exception/throw=] a {{TypeError}}. 1. Let |desc| be a copy of |input|.{{MLOperand/[[descriptor]]}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to the result of invoking the calculate padding output sizes steps given |input|, |beginningPadding| and |endingPadding|. @@ -4891,7 +4890,6 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input| is {{MLOperand}}. - 1. If |starts| or |sizes| is not a sequence of {{long}}, then [=exception/throw=] a {{TypeError}}. 1. If |sizes|.size is 0, then [=exception/throw=] a {{TypeError}}. 1. If the [=list/size=] of |starts| and |sizes| is not equal to the rank of |input|, then [=exception/throw=] a {{TypeError}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. @@ -5388,7 +5386,6 @@ partial interface MLGraphBuilder { 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If |options|.{{MLTransposeOptions/permutation}} does not [=map/exist=], let |options|.{{MLTransposeOptions/permutation}} be the reversed sequence of all indices for |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. Otherwise if |options|.{{MLTransposeOptions/permutation}} [=map/exists=]: - 1. If |options|.{{MLTransposeOptions/permutation}} is not a sequence of {{unsigned long}}, then [=exception/throw=] a {{TypeError}}. 1. If the [=rank=] of |options|.{{MLTransposeOptions/permutation}} is not the same as the [=rank=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}, then [=exception/throw=] a {{TypeError}}. 1. If the values in |options|.{{MLTransposeOptions/permutation}} are not between `0` and the [=rank=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} minus `1`, then [=exception/throw=] a {{TypeError}}. 1. If the values in |options|.{{MLTransposeOptions/permutation}} contain duplicate value, then [=exception/throw=] a {{TypeError}}. From 268cbce3e1409cb1686a85fd1065eff6aae8dceb Mon Sep 17 00:00:00 2001 From: Joshua Bell Date: Mon, 21 Aug 2023 14:25:19 -0700 Subject: [PATCH 091/112] more tidying fix b issue --- index.bs | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/index.bs b/index.bs index 06fb3959..9f57eb44 100644 --- a/index.bs +++ b/index.bs @@ -4040,13 +4040,14 @@ partial interface MLGraphBuilder { - If both *a* and *b* are 1-dimensional, the operation is a vector dot-product, which produces a scalar output.
+
To calculate matmul output sizes, given |a| and |b| run the following steps:
1. Let |shapeA| be |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeA| the [=list/size=] of |shapeA|. - 1. Let |shapeB| be |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeB| the [=list/size=] of |shapeB|. + 1. Let |shapeB| be |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeB| the [=list/size=] of |shapeB|. 1. If |sizeA| and |sizeB| is `1`, return `« 1 »`. 1. If | sizeA| is `1` and |sizeB| is not, then insert `1` in the front of |shapeA| to become [ 1 | |shapeA| ] and let |sizeA| be `2`. 1. If | sizeB| is `1` and |sizeA| is not, then insert `1` in the front of |shapeB| to become [ 1 | |shapeB| ] and let |sizeB| be `2`. @@ -4056,6 +4057,7 @@ partial interface MLGraphBuilder { 1. Return |shape|.
+
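A non-normative sketch of the "calculate matmul output sizes" steps above, written as a small script-level helper. The function name is illustrative and the logic simply follows the steps as written.
<pre highlight="js">
// Non-normative sketch of the shape calculation described above.
// shapeA and shapeB are plain arrays of unsigned integers.
function calculateMatmulOutputSizes(shapeA, shapeB) {
  let a = shapeA.slice(), b = shapeB.slice();
  // Two 1-D operands produce a scalar, represented here as « 1 ».
  if (a.length === 1 && b.length === 1) return [1];
  // Promote a 1-D operand by inserting a leading 1.
  if (a.length === 1) a = [1, ...a];
  if (b.length === 1) b = [1, ...b];
  // The output rank is the larger of the two ranks; each dimension is the
  // maximum of the corresponding input dimensions, per the steps above.
  const size = Math.max(a.length, b.length);
  const shape = [];
  for (let index = 0; index < size; ++index) {
    shape[index] = Math.max(a[index] ?? 1, b[index] ?? 1);
  }
  return shape;
}
// e.g. calculateMatmulOutputSizes([3, 4], [4]) returns [3, 4] per these steps.
</pre>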
From 54e703bede305ea48415171a8fdc7bce3c251b57 Mon Sep 17 00:00:00 2001 From: Joshua Bell Date: Mon, 21 Aug 2023 14:44:20 -0700 Subject: [PATCH 092/112] Split out createContext/createContextSync overloads --- index.bs | 77 +++++++++++++++++++++++++++++++++++++++++++++----------- 1 file changed, 62 insertions(+), 15 deletions(-) diff --git a/index.bs b/index.bs index 9f57eb44..4edad5f6 100644 --- a/index.bs +++ b/index.bs @@ -809,41 +809,88 @@ string "webnn". Its default allowlist is 'self'. ### The {{ML/createContext()}} method ### {#api-ml-createcontext} + +
+
+ + To create a context given |options|, run these steps: + +
+ 1. Let |context| be a new {{MLContext}} object. + 1. If |options| is a {{GPUDevice}} object, + 1. Set |context|.{{[[contextType]]}} to "[=webgpu-context|webgpu=]". + 1. Set |context|.{{[[deviceType]]}} to "[=device-type-gpu|gpu=]". + 1. Set |context|.{{[[powerPreference]]}} to "[=power-preference-default|default=]". + 1. Otherwise, + 1. Set |context|.{{[[contextType]]}} to "[=default-context|default=]". + 1. If |options|["{{deviceType}}"] [=map/exists=], then set |context|.{{[[deviceType]]}} to |options|["{{deviceType}}"]. Otherwise, set |context|.{{[[deviceType]]}} to "[=device-type-cpu|cpu=]". + 1. If |options|["{{powerPreference}}"] [=map/exists=], then set |context|.{{[[powerPreference]]}} to |options|["{{powerPreference}}"]. Otherwise, set |context|.{{[[powerPreference]]}} to "[=power-preference-default|default=]". + 1. Return |context|. +
+
+
+ +
- The {{ML/createContext(options)}} method steps are: + The createContext(|options|) steps are: -
+
1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, return [=a new promise=] [=rejected=] with a "{{SecurityError}}" {{DOMException}}. 1. Let |promise| be [=a new promise=]. 1. Return |promise| and run the following steps [=in parallel=]. - 1. Run the create context steps given |options|: - 1. Let |context| be a new {{MLContext}} object. - 1. If |options| is a {{GPUDevice}} object, - 1. Set |context|.{{[[contextType]]}} to "[=webgpu-context|webgpu=]". - 1. Set |context|.{{[[deviceType]]}} to "[=device-type-gpu|gpu=]". - 1. Set |context|.{{[[powerPreference]]}} to "[=power-preference-default|default=]". - 1. Otherwise, - 1. Set |context|.{{[[contextType]]}} to "[=default-context|default=]". - 1. If |options|["{{deviceType}}"] [=map/exists=], then set |context|.{{[[deviceType]]}} to |options|["{{deviceType}}"]. Otherwise, set |context|.{{[[deviceType]]}} to "[=device-type-cpu|cpu=]". - 1. If |options|["{{powerPreference}}"] [=map/exists=], then set |context|.{{[[powerPreference]]}} to |options|["{{powerPreference}}"]. Otherwise, set |context|.{{[[powerPreference]]}} to "[=power-preference-default|default=]". + 1. Let |context| be the result of [=creating a context=] given |options|. 1. If validating MLContext given |context| returns `false`, [=reject=] |promise| with a "{{NotSupportedError}}" {{DOMException}}. 1. [=Resolve=] |promise| with |context|.
+
+ +
+
+ + The createContext(|gpuDevice|) method steps are: + +
+ 1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, return [=a new promise=] [=rejected=] with a "{{SecurityError}}" {{DOMException}}. + 1. Let |promise| be [=a new promise=]. + 1. Return |promise| and run the following steps [=in parallel=]. + 1. Let |context| be the result of [=creating a context=] given |gpuDevice|. + 1. If validating MLContext given |context| returns `false`, [=reject=] |promise| with a "{{NotSupportedError}}" {{DOMException}}. + 1. [=Resolve=] |promise| with |context|. +
+
+
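A non-normative usage sketch of the two overloads above, assuming the {{ML}} interface is exposed as `navigator.ml`; the option values are illustrative.
<pre highlight="js">
// Default context created from MLContextOptions.
const context = await navigator.ml.createContext({
  deviceType: 'gpu',
  powerPreference: 'high-performance'
});

// WebGPU interop context created from a GPUDevice.
const adapter = await navigator.gpu.requestAdapter();
const gpuDevice = await adapter.requestDevice();
const webgpuContext = await navigator.ml.createContext(gpuDevice);
</pre>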
### The {{ML/createContextSync()}} method ### {#api-ml-createcontextsync} + +
+
+ + The createContextSync(|options|) method steps are: + +
+ 1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, then [=exception/throw=] a "{{SecurityError}}" {{DOMException}}. + 1. Let |context| be the result [=creating a context=] |options|. + 1. If validating MLContext given |context| return `false`, then [=exception/throw=] a "{{NotSupportedError}}" {{DOMException}}. + 1. Return |context|. +
+ 1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, then [=exception/throw=] a "{{SecurityError}}" {{DOMException}}.
+ 1. Let |context| be the result of [=creating a context=] given |options|.
+ 1. If validating MLContext given |context| returns `false`, then [=exception/throw=] a "{{NotSupportedError}}" {{DOMException}}.
+ 1. Return |context|.
+
+
+ +
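A non-normative sketch of the synchronous variant, which is intended to be called from a worker; the option value is illustrative.
<pre highlight="js">
// In a dedicated worker.
const context = navigator.ml.createContextSync({ deviceType: 'cpu' });
</pre>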
- The {{ML/createContextSync(options)}} method steps are: + The createContextSync(|gpuDevice|) method steps are: -
+
1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, then [=exception/throw=] a "{{SecurityError}}" {{DOMException}}. - 1. Let |context| be the result of running the create context steps given |options|. + 1. Let |context| be the result [=creating a context=] with |gpuDevice|. 1. If validating MLContext given |context| return `false`, then [=exception/throw=] a "{{NotSupportedError}}" {{DOMException}}. 1. Return |context|.
+
## The MLGraph interface ## {#api-mlgraph} The {{MLGraph}} interface represents a compiled computational graph. A compiled graph once constructed is immutable and cannot be subsequently changed. From 5cad5651acdc27a607a119a16cd4a5eec86a4b38 Mon Sep 17 00:00:00 2001 From: Joshua Bell Date: Mon, 21 Aug 2023 15:09:21 -0700 Subject: [PATCH 093/112] Drop some unnecessary lt attributes --- index.bs | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/index.bs b/index.bs index 4edad5f6..7d04b966 100644 --- a/index.bs +++ b/index.bs @@ -1175,7 +1175,7 @@ When the {{[[contextType]]}} is set to [=default-context|default=] with the {{ML ### The {{MLContext}} validation algorithm ### {#api-mlcontext-validate}
- To validate MLContext, given |context|, run these steps: + To validate MLContext, given |context|, run these steps:
1. If |context|.{{[[contextType]]}} is not "[=webgpu-context|webgpu=]" or "[=default-context|default=]", return `false`. @@ -1222,7 +1222,7 @@ partial interface MLContext {
- To validate graph resources, given |resources| and |descriptors|, run the following steps: + To validate graph resources, given |resources| and |descriptors|, run the following steps:
1. [=Assert=]: the type of |resources| is {{MLNamedArrayBufferViews}}. @@ -1247,7 +1247,7 @@ partial interface MLContext {
- To execute graph, given |graph|, |inputs| and |outputs|, run the following steps: + To execute graph, given |graph|, |inputs| and |outputs|, run the following steps:
1. [=Assert=]: the type of |inputs| is {{MLNamedArrayBufferViews}}. From ef939ec0b89b6731e91f2e7adb495af75d6b01f9 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Tue, 22 Aug 2023 13:09:03 +0300 Subject: [PATCH 094/112] Fix some of the review comments for lstm() and lstmCell(). Fix more typos for #446. Signed-off-by: Zoltan Kis --- index.bs | 21 +++++++++++---------- 1 file changed, 11 insertions(+), 10 deletions(-) diff --git a/index.bs b/index.bs index 7d04b966..8c07bd6c 100644 --- a/index.bs +++ b/index.bs @@ -2455,7 +2455,7 @@ partial interface MLGraphBuilder {
- To broadcast shapes given |shape1| and |shape2|, run the following steps: + To broadcast-shapes given |shape1| and |shape2|, run the following steps:
1. [=Assert=]: The type of |shape1| and |shape2| is `sequence of unsigned long`. @@ -2554,7 +2554,6 @@ partial interface MLGraphBuilder {
1. [=Assert=]: |op| is one of "abs", "ceil", "cos", "exp", "floor", "log", "neg", "sin", "tan". 1. [=Assert=]: the type of |input| is {{MLOperand}}. - 1. Let |descriptor| be a new {{MLOperandDescriptor}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: @@ -2759,7 +2758,7 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |a| and |b| is {{MLOperand}}. 1. Let |shapeA| be |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeA| the [=list/size=] of |shapeA|. - 1. Let |shapeB| be |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeB| the [=list/size=] of |shapeB|. + 1. Let |shapeB| be |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeB| the [=list/size=] of |shapeB|. 1. If |sizeA| is not `2` or |sizeB| is not `2`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLGemmOptions/aTranspose}} is `true`, then let |shapeA| be the reverse array of |shapeA|. 1. If |options|.{{MLGemmOptions/bTranspose}} is `true`, then let |shapeB| be the reverse array of |shapeB|. @@ -3663,7 +3662,7 @@ partial interface MLGraphBuilder { : peepholeWeight :: - An {{MLOperand}}. Specifies the 2-D weight tensor for peepholes of shape [num_directions, 4 * hidden_size]. The pack ordering of the weight vectors is for the `input (i)`, `output (o)`, and `forget (f)` gate, respectively. + An {{MLOperand}}. Specifies the 2-D weight tensor for peepholes of shape [num_directions, 3 * hidden_size]. The pack ordering of the weight vectors is for the `input (i)`, `output (o)`, and `forget (f)` gate, respectively. : initialHiddenState :: @@ -3748,13 +3747,15 @@ partial interface MLGraphBuilder { 1. [=Assert=]: the type of its elements is {{MLActivation}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |desc| a new {{MLOperandDescriptor}}. - 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |nume_directions|, |batch_size|, |hiddenSize| ]. + 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |num_directions|, |batch_size|, |hiddenSize| ]. 1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. 1. Let |output0| be the result of creating an MLOperand given [=this=] and |desc|. 1. Let |output1| be the result of creating an MLOperand given [=this=] and |desc|. - 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |steps|, |nume_directions|, |batch_size|, |hiddenSize| ]. - 1. Let |output2| be the result of creating an MLOperand given [=this=] and |desc|. - 1. Let |output| be the array [ |output0|, |output1|, |output2 ]. + 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |steps|, |num_directions|, |batch_size|, |hiddenSize| ]. + 1. If |options|.{{MLLstmOptions/returnSequence}} is set to true: + 1. Let |output2| be the result of creating an MLOperand given [=this=] and |desc|. + 1. Let |output| be the array [ |output0|, |output1|, |output2| ]. + 1. Otherwise, Let |output| be the array [ |output0|, |output1| ]. 1. Make a request to the underlying platform to: 1. Let |opImpl| be an [=implementation-defined=] platform operator for the LSTM operation, given |weight|, |recurrentWeight|, |steps|, |hiddenSize| and |options|. 1. Store a reference of |opImpl| in |output0|.{{MLOperand/[[operator]]}}, |output1|.{{MLOperand/[[operator]]}} and |output2|.{{MLOperand/[[operator]]}}. @@ -4823,7 +4824,7 @@ partial interface MLGraphBuilder { 1. If the [=list/size=] of |newShape| is `0`, set |outputShape| to `« 1 »` (reshaping to scalar). 1. If |newShape| contains more than one `null` value, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If any value in |newShape| is `0`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. 
Let |inputElementCount| be the product of all elements in |inputs|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. + 1. Let |inputElementCount| be the product of all elements in |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. If |newShape| contains a `null` value, set that value to |inputElementCount| divided by the product of all other values in |newShape|. 1. If that value is too large for {{unsigned long}}, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If product of all values in |newShape| is not equal to |inputElementCount|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. @@ -5162,7 +5163,7 @@ partial interface MLGraphBuilder { 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: - 1. Let |opImpl| be an [=implementation-defined=] platform operator for the softsign operation, given |options|. + 1. Let |opImpl| be an [=implementation-defined=] platform operator for the softsign operation. 1. Store a reference of |opImpl| in |output|.{{MLOperand/[[operator]]}}. 1. Create an [=implementation-defined=] platform operand |outputImpl| to represent the output, given |output| and |opImpl|. 1. Store a reference to |outputImpl| in |output|.{{MLOperand/[[operand]]}}. From f7d29a2e3c2a54aa1de780cc8ec52ad25633a783 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Tue, 22 Aug 2023 13:14:33 +0300 Subject: [PATCH 095/112] Replace 'between' with [=the range=] definitions for #446. Signed-off-by: Zoltan Kis --- index.bs | 16 ++++++++-------- 1 file changed, 8 insertions(+), 8 deletions(-) diff --git a/index.bs b/index.bs index 8c07bd6c..a3f84a1c 100644 --- a/index.bs +++ b/index.bs @@ -1846,7 +1846,7 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input|, |mean| and |variance| is {{MLOperand}}. - 1. If |options|.axis is not a number between 0 and the [=rank=] of |input|, then [=exception/throw=] a {{TypeError}}. + 1. If |options|.axis is not a number in [=the range=] 0 to the [=rank=] of |input|, exclusive, then [=exception/throw=] a {{TypeError}}. 1. If the [=list/size=] of |mean|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not equal with |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|options|.{{MLBatchNormalizationOptions/axis}}], then [=exception/throw=] a {{TypeError}}. 1. If the [=list/size=] of |variance|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not equal with |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|options|.{{MLBatchNormalizationOptions/axis}}], then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLBatchNormalizationOptions/scale}} [=map/exists=] and its [=list/size=] is not equal with |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|options|.{{MLBatchNormalizationOptions/axis}}], then [=exception/throw=] a {{TypeError}}. @@ -2022,9 +2022,9 @@ partial interface MLGraphBuilder { 1. Let |desc| be |inputs|[0].{{MLOperand/[[descriptor]]}}. 1. If |axis| is greater than or equal to the [=rank=] of |desc|, fail. 1. Let |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] be `0`. - 1. [=map/For each=] |index| between 0 and the [=rank=] of |inputs|: + 1. [=map/For each=] |index| in [=the range=] 0 to the [=rank=] of |inputs|, exclusive: 1. If validating MLOperand given |inputs|[|index|] and [=this=] returns `false`, then fail. - 1. [=map/For each=] |dim| between 0 and the [=rank=] of |inputs|[|index|].{{MLOperandDescriptor/dimensions}}: + 1. [=map/For each=] |dim| in [=the range=] 0 to the [=rank=] of |inputs|[|index|].{{MLOperandDescriptor/dimensions}}, exclusive:
If the shape of each corresponding dimension and type of the operands, except for those of the dimension given by |axis|, is not the same, fail.
@@ -4100,7 +4100,7 @@ partial interface MLGraphBuilder { 1. If | sizeA| is `1` and |sizeB| is not, then insert `1` in the front of |shapeA| to become [ 1 | |shapeA| ] and let |sizeA| be `2`. 1. If | sizeB| is `1` and |sizeA| is not, then insert `1` in the front of |shapeB| to become [ 1 | |shapeB| ] and let |sizeB| be `2`. 1. Let |shape| be an array whose size |size| is the maximum of |sizeA| and |sizeB|. - 1. [=map/For each=] |index| between 0 and |size|: + 1. [=map/For each=] |index| in [=the range=] 0 to |size|, exclusive: 1. Set |shape|[|index|] to the maximum of |shapeA|[|index|] and |shapeB|[|index|]. 1. Return |shape|.
@@ -4185,7 +4185,7 @@ partial interface MLGraphBuilder {
1. Let |shape| be a copy of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. - 1. For |index| between `0` and the [=rank=] of |shape|: + 1. For |index| in [=the range=] 0 to the [=rank=] of |shape|, exclusive: 1. Add to |shape|[|index|] the value of |beginningPadding|[|index|]. 1. Add to |shape|[|index|] the value of |endingPadding|[|index|]. 1. Return |shape|. @@ -4759,7 +4759,7 @@ partial interface MLGraphBuilder {
1. Let |desc| be an {{MLOperandDescriptor}} initialized to |input|.{{MLOperand/[[descriptor]]}}. 1. If |options|.{{MLResample2dOptions/sizes}} [=map/exists=], then set |desc|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} to |options|.{{MLResample2dOptions/sizes}} and return |desc|. - 1. For |index| between `0` and the [=rank=] of |desc|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}: + 1. For |index| in [=the range=] 0 to the [=rank=] of |desc|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}, exclusive: 1. Let |inputSize| be the [=list/size=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|index|]. 1. Let |outputSize| be |inputSize| multiplied by |options|.{{MLResample2dOptions/scales}}. 1. If that fails or |outputSize| is not a positive [=number=], then [=exception/throw=] a "{{DataError}}" {{DOMException}}. @@ -5309,7 +5309,7 @@ partial interface MLGraphBuilder { 1. Let |dimensions| be |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. Let |axesLength| be the [=list/size=] of |options|.{{MLSqueezeOptions/axes}}. 1. If |axesLength| is not smaller than the rank of |dimensions|, - 1. For |index| between 0 and |axesLength|: + 1. For |index| in [=the range=] 0 to |axesLength|, exclusive: 1. Let |oneDimIndex| be |options|.{{MLSqueezeOptions/axes}}[|index|]. 1. If |dimensions|[|oneDimIndex|] is not `1`, then [=exception/throw=] a {{TypeError}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. @@ -5437,7 +5437,7 @@ partial interface MLGraphBuilder { 1. If |options|.{{MLTransposeOptions/permutation}} does not [=map/exist=], let |options|.{{MLTransposeOptions/permutation}} be the reversed sequence of all indices for |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. Otherwise if |options|.{{MLTransposeOptions/permutation}} [=map/exists=]: 1. If the [=rank=] of |options|.{{MLTransposeOptions/permutation}} is not the same as the [=rank=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}, then [=exception/throw=] a {{TypeError}}. - 1. If the values in |options|.{{MLTransposeOptions/permutation}} are not between `0` and the [=rank=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} minus `1`, then [=exception/throw=] a {{TypeError}}. + 1. If the values in |options|.{{MLTransposeOptions/permutation}} are not in [=the range=] 0 and the [=rank=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} exclusive, then [=exception/throw=] a {{TypeError}}. 1. If the values in |options|.{{MLTransposeOptions/permutation}} contain duplicate value, then [=exception/throw=] a {{TypeError}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of copying an MLOperand given |input|. From a06557c639be63cf76dafb38aff0d708532f0852 Mon Sep 17 00:00:00 2001 From: Joshua Bell Date: Mon, 21 Aug 2023 13:07:01 -0700 Subject: [PATCH 096/112] git squash commit for zk-conventions-integration. bca82eb627bb0dff61f94c4d2b95633fec0dc43c mostly done except dupe dom-x ids c8981d14180c0d5e5c11c31fe4ad1a3c1bb3320f one more --- index.bs | 477 ++++++++++++++++++++++++++++++++++++++----------------- 1 file changed, 329 insertions(+), 148 deletions(-) diff --git a/index.bs b/index.bs index a3f84a1c..9886518d 100644 --- a/index.bs +++ b/index.bs @@ -995,10 +995,12 @@ interface MLOperand {};
+
To get the rank of an {{MLOperand}} |operand|, run the following steps: -
+
1. Return the [=list/size=] of |operand|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}.
+
Since the {{MLOperand/[[builder]]}} object is bound by the {{MLGraphBuilder/constructor()}} constructor to an {{MLContext}} object, an {{MLOperand}} is also always bound to the same {{MLContext}} object. @@ -1206,11 +1208,12 @@ partial interface MLContext { **Returns:** {{undefined}}.
+
- The {{MLContext/computeSync(graph, inputs, outputs)}} method steps are: + The computeSync(|graph|, |inputs|, |outputs|) method steps are: -
+
1. If |graph|.{{MLGraph/[[context]]}}.{{MLContext/[[contextType]]}} is not "[=default-context|default=]", [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. If validating graph resources given |inputs| and |graph|.{{MLGraph/[[inputDescriptors]]}} returns `false`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If validating graph resources given |outputs| and |graph|.{{MLGraph/[[outputDescriptors]]}} returns `false`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. @@ -1219,6 +1222,7 @@ partial interface MLContext { 1. Return {{undefined}}.
+
@@ -1356,11 +1360,12 @@ partial interface MLContext { **Returns:** Promise<{{MLComputeResult}}>.
+
- The {{MLContext/compute(graph, inputs, outputs)}} method steps are: + The compute(|graph|, |inputs|, |outputs|) method steps are: -
+
1. Let |promise| be [=a new promise=]. 1. Return |promise| and run the following steps [=in parallel=]: 1. If |graph|.{{MLGraph/[[context]]}}.{{MLContext/[[contextType]]}} is not "[=default-context|default=]", [=reject=] |promise| with an "{{OperationError}}" {{DOMException}}. @@ -1377,6 +1382,7 @@ partial interface MLContext { 1. [=Resolve=] |promise| with |result|.
+
#### Examples #### {#api-mlcontext-async-execution-examples}
@@ -1465,16 +1471,18 @@ partial interface MLCommandEncoder { **Returns:** {{undefined}}.
+
- The {{MLCommandEncoder/initializeGraph(graph)}} steps are: + The initializeGraph(|graph|) method steps are: -
+
- Graph initialization stage typically involves a process known as "weight preprocessing" where all the constant inputs to the graph are preprocessed and cached at the operating system level for subsequent graph execution calls. The initializing inputs are typically the constant weight data specified through the {{MLGraphBuilder/constant(descriptor, bufferView)|MLGraphBuilder/constant()}} method as constant operands during graph construction time. + Graph initialization stage typically involves a process known as "weight preprocessing" where all the constant inputs to the graph are preprocessed and cached at the operating system level for subsequent graph execution calls. The initializing inputs are typically the constant weight data specified through the {{MLGraphBuilder/constant(descriptor, bufferView)|MLGraphBuilder/constant(value, type)}} method as constant operands during graph construction time.
+
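A non-normative sketch of explicit graph initialization. It assumes |context| was created from a {{GPUDevice}}, |graph| was built with constant weights, and that an {{MLCommandEncoder}} is obtained from such a context via a `createCommandEncoder()` method; that accessor name is an assumption for illustration only.
<pre highlight="js">
// Assumed accessor: how an MLCommandEncoder is obtained from a WebGPU-interop
// context; initializeGraph() then preprocesses the constant weights.
const commandEncoder = context.createCommandEncoder();
commandEncoder.initializeGraph(graph);
</pre>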
### Dispatch Execution Commands ### {#api-mlcommandencoder-dispatch-commands} Record the {{MLGraph}} execution with the inputs {{MLNamedGPUResources}} and outputs {{MLNamedGPUResources}}. @@ -1494,11 +1502,12 @@ partial interface MLCommandEncoder { **Returns:** {{undefined}}.
+
- The {{MLCommandEncoder/dispatch(graph, inputs, outputs)}} steps are: + The dispatch(|graph|, |inputs|, |outputs|) method steps are: -
+
1. If any of the following requirements are unmet, then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
1. [=map/For each=] |key| → |value| of |inputs|: @@ -1522,6 +1531,7 @@ partial interface MLCommandEncoder { 1. Return {{undefined}}.
+
### Generate GPU Command Buffer ### {#api-mlcommandencoder-generate-gpu-command-buffer} Complete the recording of ML workload and return a WebGPU-compatible {{GPUCommandBuffer}} containing the recorded workload. @@ -1539,11 +1549,12 @@ partial interface MLCommandEncoder { **Returns:** {{GPUCommandBuffer}}.
+
- The {{MLCommandEncoder/finish(descriptor)}} method steps are: + The finish(|descriptor|) method steps are: -
+
1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Make a request to the underlying platform to complete the recording of the ML workload, given |descriptor|.
@@ -1552,6 +1563,7 @@ partial interface MLCommandEncoder { 1. Return a {{GPUCommandBuffer}} containing the recorded workload.
+
## The MLGraphBuilder interface ## {#api-mlgraphbuilder} @@ -1620,9 +1632,11 @@ Both {{MLGraphBuilder}}.{{MLGraphBuilder/build()}} and {{MLGraphBuilder}}.{{MLGr
### The {{MLGraphBuilder}} constructor ### {#api-mlgraphbuilder-constructor} + +
- The [=new=] {{MLGraphBuilder(context)}} constructor steps are: + The [=new=] MLGraphBuilder(context) constructor steps are:
1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, then [=exception/throw=] a "{{SecurityError}}" {{DOMException}}. @@ -1630,6 +1644,7 @@ Both {{MLGraphBuilder}}.{{MLGraphBuilder/build()}} and {{MLGraphBuilder}}.{{MLGr 1. Set {{MLGraphBuilder/[[context]]}} to |context|.
+
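A non-normative sketch of constructing a builder from a context; the option value is illustrative.
<pre highlight="js">
const context = await navigator.ml.createContext({ deviceType: 'cpu' });
const builder = new MLGraphBuilder(context);
</pre>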
### The {{MLGraphBuilder/input()}} method ### {#api-mlgraphbuilder-input} Create a named {{MLOperand}} based on a descriptor, that can be used as an input. @@ -1641,11 +1656,12 @@ Create a named {{MLOperand}} based on a descriptor, that can be used as an input **Returns:**: an {{MLOperand}} object.
+
- The {{MLGraphBuilder/input(name, descriptor)}} steps are: + The input(|name|, |descriptor|) method steps are: -
+
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
@@ -1665,16 +1681,19 @@ Create a named {{MLOperand}} based on a descriptor, that can be used as an input 1. Return |operand|.
+
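A non-normative sketch of declaring a graph input, assuming |builder| is an {{MLGraphBuilder}}; the operand name and dimensions are illustrative.
<pre highlight="js">
const input = builder.input('input', {
  type: 'float32',
  dimensions: [1, 3, 224, 224]
});
</pre>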
### The build() method ### {#api-mlgraphbuilder-build} Build a composed graph up to a given output operand into a computational graph, asynchronously or synchronously. #### The {{MLGraphBuilder/build(outputs)}} method #### {#api-mlgraphbuilder-build-outputs} + +
- The {{MLGraphBuilder/build(outputs)}} steps are: + The build(|outputs|) method steps are: -
+
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
@@ -1684,13 +1703,16 @@ Build a composed graph up to a given output operand into a computational graph, 1. If that [=exception/throws=], re-[=exception/throw=] the error.
+
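A non-normative sketch of compiling the builder state into an {{MLGraph}}, assuming |outputOperand| is an {{MLOperand}} produced by earlier builder calls; the output name is illustrative. The synchronous variant shown second is defined next.
<pre highlight="js">
// Asynchronous build.
const graph = await builder.build({ 'output': outputOperand });

// Synchronous build, intended for worker contexts.
const graphSync = builder.buildSync({ 'output': outputOperand });
</pre>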
#### The {{MLGraphBuilder/buildSync(outputs)}} method #### {#api-mlgraphbuilder-buildsync-outputs} + +
- The {{MLGraphBuilder/buildSync(outputs)}} steps are: + The buildSync(|outputs|) method steps are: -
+
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
@@ -1719,6 +1741,7 @@ Build a composed graph up to a given output operand into a computational graph, 1. Return |graph|.
+
### The constant() method ### {#api-mlgraphbuilder-constant-method} Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. @@ -1731,11 +1754,12 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. **Returns:**: an {{MLOperand}} object.
+
- The {{MLGraphBuilder/constant(descriptor, bufferView)}} steps are: + The constant(|descriptor|, |bufferView|) method steps are: -
+
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
@@ -1753,6 +1777,7 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. 1. Return |operand|.
+
#### The {{MLGraphBuilder/constant(value, type)}} method #### {#api-mlgraphbuilder-constant-value-type} @@ -1763,11 +1788,12 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. **Returns:**: an {{MLOperand}} object.
+
- The {{MLGraphBuilder/constant(value, type)}} steps are: + The constant(|value|, |type|) method steps are: -
+
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
@@ -1788,6 +1814,7 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. 1. Return |operand|.
+
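A non-normative sketch of the two constant() overloads above, assuming |builder| is an {{MLGraphBuilder}}; the values are illustrative.
<pre highlight="js">
// Tensor constant from a typed array matching the descriptor.
const weights = builder.constant(
  { type: 'float32', dimensions: [2, 2] },
  new Float32Array([0.1, 0.2, 0.3, 0.4]));

// Scalar constant with an explicit operand type.
const epsilon = builder.constant(1e-5, 'float32');
</pre>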
### The batchNormalization() method ### {#api-mlgraphbuilder-batchnorm} Normalize the tensor values of input features across the batch dimension using [[Batch-Normalization]]. For each input feature, the mean and variance values of that feature supplied in this calculation as parameters are previously computed across the batch dimension of the input during the model training phase of this operation. @@ -1840,11 +1867,12 @@ partial interface MLGraphBuilder { **Returns:** an {{MLOperand}}. The batch-normalized N-D tensor of the same shape as *input*.
+
- The {{MLGraphBuilder/batchNormalization(input, mean, variance, options)}} method steps are: + The batchNormalization(|input|, |mean|, |variance|, |options|) method steps are: -
+
1. [=Assert=]: the type of |input|, |mean| and |variance| is {{MLOperand}}. 1. If |options|.axis is not a number in [=the range=] 0 to the [=rank=] of |input|, exclusive, then [=exception/throw=] a {{TypeError}}. 1. If the [=list/size=] of |mean|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not equal with |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|options|.{{MLBatchNormalizationOptions/axis}}], then [=exception/throw=] a {{TypeError}}. @@ -1860,6 +1888,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
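A non-normative usage sketch with the mean and variance supplied as constants, assuming |input| is an "nchw" {{MLOperand}} of shape [1, 3, 224, 224]; the shapes and option values are illustrative.
<pre highlight="js">
// mean and variance are 1-D tensors whose length equals the channel count
// (the dimension selected by axis 1).
const meanData = new Float32Array([0.485, 0.456, 0.406]);
const varianceData = new Float32Array([0.229, 0.224, 0.225]);
const mean = builder.constant({ type: 'float32', dimensions: [3] }, meanData);
const variance = builder.constant({ type: 'float32', dimensions: [3] }, varianceData);
const normalized = builder.batchNormalization(
  input, mean, variance, { axis: 1, epsilon: 1e-5 });
</pre>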
@@ -1945,11 +1974,12 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *operand*.
+
- The {{MLGraphBuilder/clamp(operand, options)}} method steps are: + The clamp(|operand|, |options|) method steps are: -
+
1. [=Assert=]: the type of |operand| is {{MLOperand}}. 1. If running the check clamp options steps given |options| returns `false`, then [=exception/throw=] a {{TypeError}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. @@ -1964,6 +1994,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
#### The {{MLGraphBuilder/clamp(options)}} method #### {#api-mlgraphbuilder-clamp-options}
@@ -1975,17 +2006,19 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The operator representing the clamp operation.
+
- The {{MLGraphBuilder/clamp(options)}} method steps are: + The clamp(|options|) method steps are: -
+
1. If running the check clamp options steps given |options| returns `false`, then [=exception/throw=] a {{TypeError}}. 1. Let |op| be the result of creating an MLActivation given [=this=], `"clamp"` and |options|. 1. If that [=exception/throws=] an error, re-[=exception/throw=] the error. 1. Return |op|.
+
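A non-normative sketch of both clamp() forms, assuming |x| is an {{MLOperand}}; the option member names and values are illustrative.
<pre highlight="js">
// Immediate form: clamps |x| into [0, 6] and returns an MLOperand.
const clamped = builder.clamp(x, { minValue: 0, maxValue: 6 });

// Activation form: returns an MLActivation for use wherever a fused
// activation function is accepted.
const relu6 = builder.clamp({ minValue: 0, maxValue: 6 });
</pre>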
### The concat() method ### {#api-mlgraphbuilder-concat} Concatenates the input tensors along a given axis. @@ -2006,11 +2039,12 @@ partial interface MLGraphBuilder { computed as the sum of all the input sizes of the same dimension.
+
- The {{MLGraphBuilder/concat(inputs, axis)}} steps are: + The concat(|inputs|, |axis|) method steps are: -
+
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
@@ -2043,6 +2077,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
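A non-normative sketch of concatenating two tensors along an axis, assuming |a| and |b| are {{MLOperand}}s; the shapes are illustrative.
<pre highlight="js">
// a has shape [2, 3] and b has shape [2, 4]; concatenating along axis 1
// produces a tensor of shape [2, 7].
const c = builder.concat([a, b], 1);
</pre>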
### The conv2d() method ### {#api-mlgraphbuilder-conv2d} Compute a 2-D convolution given 4-D input and filter tensors @@ -2164,11 +2199,12 @@ partial interface MLGraphBuilder { for *"oihw"* layout, [height, width, 1, options.groups] for *"hwio"* layout, [options.groups, height, width, 1] for *"ohwi"* layout and [1, height, width, options.groups] for *"ihwo"* layout.
+
- The {{MLGraphBuilder/conv2d(input, filter, options)}} steps are: + The conv2d(|input|, |filter|, |options|) method steps are: -
+
1. [=Assert=]: the type of |input| and |filter| is {{MLOperand}}. 1. Let |input_size| be the [=list/size=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. Let |filter_size| be the [=list/size=] of |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. @@ -2210,6 +2246,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
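A non-normative usage sketch with the default *"nchw"* input layout and *"oihw"* filter layout, assuming |input| and |filter| are {{MLOperand}}s; the shapes and option values are illustrative.
<pre highlight="js">
// input:  [1, 3, 224, 224]  (batches, channels, height, width)
// filter: [32, 3, 3, 3]     (outputChannels, inputChannels, height, width)
const conv = builder.conv2d(input, filter, {
  padding: [1, 1, 1, 1],
  strides: [2, 2]
});
</pre>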
### The convTranspose2d() method ### {#api-mlgraphbuilder-convtranspose2d} Compute a 2-D transposed convolution given 4-D input and filter tensors @@ -2339,11 +2376,12 @@ partial interface MLGraphBuilder { *output_size = (input_size - 1) ** *stride + (filter_size - 1) ** *dilation + 1 - beginning_padding - ending_padding + output_padding*
+
- The {{MLGraphBuilder/convTranspose2d(input, filter, options)}} steps are: + The convTranspose2d(|input|, |filter|, |options|) method steps are: -
+
1. [=Assert=]: the type of |input| and |filter| is {{MLOperand}}. 1. Let |input_size| be the [=list/size=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. Let |filter_size| be the [=list/size=] of |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. @@ -2388,6 +2426,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
### Element-wise binary operations ### {#api-mlgraphbuilder-binary} Compute the element-wise binary addition, subtraction, multiplication, division, power, maximum and minimum of the two input tensors. @@ -2428,11 +2467,12 @@ partial interface MLGraphBuilder { - *pow*: Compute the values of the values of the first input tensor to the power of the values of the second input tensor, element-wise.
+
To create element-wise binary operation given |op|, |a| and |b|, run the following steps: -
+
1. [=Assert=]: |op| is one of "add", "sub", "mul", "div", "max", "min", "pow". 1. [=Assert=]: the type of |a| and |b| is {{MLOperand}}. 1. If |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}} is not equal to |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. @@ -2452,12 +2492,14 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
+
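A non-normative sketch of an element-wise operation whose operands are broadcast per the broadcast-shapes steps below, assuming |a| and |bias| are {{MLOperand}}s; the shapes are illustrative.
<pre highlight="js">
// a has shape [2, 3] and bias has shape [3]; the bias is broadcast across
// the first dimension, so the result has shape [2, 3].
const sum = builder.add(a, bias);
</pre>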
To broadcast-shapes given |shape1| and |shape2|, run the following steps: -
+
1. [=Assert=]: The type of |shape1| and |shape2| is `sequence of unsigned long`. 1. Let |output| be the result of invoking the [=implementation-defined=] shape broadcast on |shape1| and |shape2|. 1. If that fails, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. @@ -2467,46 +2509,61 @@ partial interface MLGraphBuilder {
+
The element-wise binary operation algorithms invoke the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation=] steps as follows.
- The {{MLGraphBuilder/add(a, b)}} steps are: +
+ The add(|a|, |b|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation=] given "add", |a| and |b|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/sub(a, b)}} steps are: +
+ The sub(|a|, |b|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation=] given "sub", |a| and |b|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/mul(a, b)}} steps are: +
+ The mul(|a|, |b|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation=] given "mul", |a| and |b|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/div(a, b)}} steps are: +
+ The div(|a|, |b|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation=] given "div", |a| and |b|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/max(a, b)}} steps are: +
+ The max(|a|, |b|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation=] given "max", |a| and |b|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/min(a, b)}} steps are: +
+ The min(|a|, |b|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation=] given "min", |a| and |b|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/pow(a, b)}} steps are: +
+ The pow(|a|, |b|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-binary-op | create element-wise binary operation=] given "pow", |a| and |b|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
@@ -2547,11 +2604,12 @@ partial interface MLGraphBuilder { - *tan*: Compute the tangent of the input tensor, element-wise.
+
To create element-wise unary operation given |op| and |input|, run the following steps: -
+
1. [=Assert=]: |op| is one of "abs", "ceil", "cos", "exp", "floor", "log", "neg", "sin", "tan". 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. @@ -2566,56 +2624,75 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
The element-wise unary operation algorithms invoke the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation=] steps as follows.
- The {{MLGraphBuilder/abs(input)}} steps are: +
+ The abs(|input|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation=] given "abs" and |input|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/ceil(input)}} steps are: +
+ The ceil(|input|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation=] given "ceil" and |input|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/cos(input)}} steps are: +
+ The cos(|input|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation=] given "cos" and |input|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/exp(input)}} steps are: +
+ The exp(|input|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation=] given "exp" and |input|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/floor(input)}} steps are: +
+ The floor(|input|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation=] given "floor" and |input|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/log(input)}} steps are: +
+ The log(|input|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation=] given "log" and |input|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/neg(input)}} steps are: +
+ The neg(|input|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation=] given "neg" and |input|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/sin(input)}} steps are: +
+ The sin(|input|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation=] given "sin" and |input|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/tan(input)}} steps are: +
+ The tan(|input|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/element-wise-unary-op | create element-wise unary operation=] given "tan" and |input|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
@@ -2664,11 +2741,12 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *input*.
+
- The {{MLGraphBuilder/elu(input, options)}} method steps are: + The elu(|input|, |options|) method steps are: -
+
1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: @@ -2681,6 +2759,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
#### The {{MLGraphBuilder/elu(options)}} method #### {#api-mlgraphbuilder-elu-options}
@@ -2692,15 +2771,17 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The activation function representing the elu operation.
+
- The {{MLGraphBuilder/elu(options)}} method steps are: + The elu(|options|) method steps are: -
+
1. Let |op| be the result of creating an MLActivation given [=this=], `"elu"` and |options|. 1. Return |op|.
+
### The gemm() method ### {#api-mlgraphbuilder-gemm} Calculate the [general matrix multiplication of the Basic Linear Algebra Subprograms](https://en.wikipedia.org/wiki/Basic_Linear_Algebra_Subprograms#Level_3). The calculation follows the expression `alpha * A * B + beta * C`, where `A` is a 2-D tensor with shape [M, K] or [K, M], `B` is a 2-D tensor with shape [K, N] or [N, K], and `C` is broadcastable to the shape [M, N]. `A` and `B` may optionally be transposed prior to the calculation. @@ -2751,11 +2832,12 @@ partial interface MLGraphBuilder { **Returns:** an {{MLOperand}}. The output 2-D tensor of shape [M, N] that contains the calculated product of all the inputs.
+
- The {{MLGraphBuilder/gemm(a, b, options)}} steps are: + The gemm(|a|, |b|, |options|) method steps are: -
+
1. [=Assert=]: the type of |a| and |b| is {{MLOperand}}. 1. Let |shapeA| be |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeA| the [=list/size=] of |shapeA|. 1. Let |shapeB| be |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeB| the [=list/size=] of |shapeB|. @@ -2782,6 +2864,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
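A non-normative sketch of the expression `alpha * A * B + beta * C`, assuming |A|, |B| and |C| are {{MLOperand}}s; the shapes and option values are illustrative.
<pre highlight="js">
// A: [3, 4], B: [4, 5], C broadcastable to [3, 5].
const result = builder.gemm(A, B, { c: C, alpha: 1.0, beta: 1.0 });
</pre>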
@@ -2882,11 +2965,12 @@ partial interface MLGraphBuilder { **Returns:** a sequence of {{MLOperand}}. The first element of the sequence is a 3-D tensor of shape [num_directions, batch_size, hidden_size], the cell output from the last time step of the network. Additionally, if |options|.{{MLGruOptions/returnSequence}} is set to `true`, the second element is the 4-D output tensor of shape [steps, num_directions, batch_size, hidden_size] containing every cell outputs from each time step in the temporal sequence.
+
- The {{MLGraphBuilder/gru(input, weight, recurrentWeight, steps, hiddenSize, options)}} steps are: + The gru(|input|, |weight|, |recurrentWeight|, |steps|, |hiddenSize|, |options|) method steps are: -
+
1. [=Assert=]: the type of |input|, |weight| and |recurrentWeight| is {{MLOperand}}. 1. If the [=rank=] of |input| or |weight| is not `3`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If the [=rank=] of |weight| or |recurrentWeight| is not `2`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. @@ -2912,6 +2996,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
@@ -3032,11 +3117,12 @@ partial interface MLGraphBuilder { **Returns:** an {{MLOperand}}. The 2-D tensor of shape [batch_size, hidden_size], the cell output hidden state of a single time step of the recurrent network.
+
- The {{MLGraphBuilder/gruCell(input, weight, recurrentWeight, hiddenState, hiddenSize, options)}} steps are: + The gruCell(|input|, |weight|, |recurrentWeight|, |hiddenState|, |hiddenSize|, |options|) method steps are: -
+
1. [=Assert=]: the type of |input|, |weight| and |recurrentWeight| is {{MLOperand}}. 1. If the [=rank=] of |input| or |weight| is not `3`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If the [=rank=] of |weight| or |recurrentWeight| is not `2`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. @@ -3063,6 +3149,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
@@ -3221,11 +3308,12 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *input*.
+
- The {{MLGraphBuilder/hardSigmoid(input, options)}} method steps are: + The hardSigmoid(|input|, |options|) method steps are: -
+
1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of copying an MLOperand given |input|. @@ -3239,6 +3327,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
#### The {{MLGraphBuilder/hardSigmoid(options)}} method #### {#api-mlgraphbuilder-hardsigmoid-options}
@@ -3249,16 +3338,18 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The activation function representing the hard sigmoid operation.
+
- The {{MLGraphBuilder/hardSigmoid(options)}} method steps are: + The hardSigmoid(|options|) method steps are: -
+
1. Let |op| be the result of creating an MLActivation given [=this=], `"hardSigmoid"` and |options|. 1. If that [=exception/throws=] an error, re-[=exception/throw=] the error. 1. Return |op|.
+
### The hardSwish() method ### {#api-mlgraphbuilder-hard-swish} Computes the nonlinear function `y = x * max(0, min(6, (x + 3))) / 6` that is introduced by [[MobileNetV3]] on the input tensor element-wise. @@ -3300,11 +3391,12 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *input*.
+
- The {{MLGraphBuilder/hardSwish(input)}} method steps are: + The hardSwish(|input|) method steps are: -
+
1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: @@ -3317,6 +3409,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
#### The {{MLGraphBuilder/hardSwish()}} method #### {#api-mlgraphbuilder-hardswish}
@@ -3327,16 +3420,18 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The activation function representing the hard-swish operation.
+
- The {{MLGraphBuilder/hardSwish()}} method steps are: + The hardSwish() method steps are: -
+
1. Let |op| be the result of creating an MLActivation given [=this=] and `"hardSwish"`. 1. If that [=exception/throws=] an error, re-[=exception/throw=] the error. 1. Return |op|.
+
### The instanceNormalization() method ### {#api-mlgraphbuilder-instancenorm} Normalize the input features using [[Instance-Normalization]]. Unlike [[#api-mlgraphbuilder-batchnorm]] where the mean and variance values used in the calculation are previously computed across the batch dimension during the model training phase, the mean and variance values used in the calculation of an instance normalization are computed internally on the fly per input feature. @@ -3383,11 +3478,12 @@ The {{MLInstanceNormalizationOptions}} members are: **Returns:** an {{MLOperand}}. The instance-normalized 4-D tensor of the same shape as *input*.
+
- The {{MLGraphBuilder/instanceNormalization(input, options)}} steps are: + The instanceNormalization(|input|, |options|) method steps are: -
+
1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If the [=rank=] of |input| is not `4`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. [=Assert=]: the type of |options|.{{MLInstanceNormalizationOptions/scale}} is {{MLOperand}}. @@ -3407,6 +3503,7 @@ The {{MLInstanceNormalizationOptions}} members are: 1. Return |output|.
+
@@ -3493,11 +3590,12 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *input*.
+
- The {{MLGraphBuilder/leakyRelu(input, options)}} method steps are: + The leakyRelu(|input|, |options|) method steps are: -
+
1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: @@ -3510,6 +3608,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
#### The {{MLGraphBuilder/leakyRelu(options)}} method #### {#api-mlgraphbuilder-leaky-relu-options}
@@ -3520,16 +3619,18 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The activation function representing the leaky relu operation.
+
- The {{MLGraphBuilder/leakyRelu(options)}} method steps are: + The leakyRelu(|options|) method steps are: -
+
1. Let |op| be the result of creating an MLActivation given [=this=], `"leakyRelu"` and |options|. 1. If that [=exception/throws=] an error, re-[=exception/throw=] the error. 1. Return |op|.
+
### The linear() method ### {#api-mlgraphbuilder-linear} Calculate a linear function `y = alpha * x + beta` on the input tensor. @@ -3584,11 +3685,12 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *input*.
+
- The {{MLGraphBuilder/linear(input, options)}} method steps are: + The linear(|input|, |options|) method steps are: -
+
1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: @@ -3601,6 +3703,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
#### The {{MLGraphBuilder/linear(options)}} method #### {#api-mlgraphbuilder-linear-options}
@@ -3611,16 +3714,18 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The activation function representing the linear operation.
+
- The {{MLGraphBuilder/linear(options)}} method steps are: + The linear(|options|) method steps are: -
+
1. Let |op| be the result of creating an MLActivation given [=this=], `"linear"` and |options|. 1. If that [=exception/throws=] an error, re-[=exception/throw=] the error. 1. Return |op|.
+
### The lstm() method ### {#api-mlgraphbuilder-lstm} Long Short-Term Memory [[LSTM]] recurrent network uses an input, output, forget, and cell gate to compute the output state that rolls into the output across the temporal sequence of the network. @@ -3701,11 +3806,12 @@ partial interface MLGraphBuilder { **Returns:** a sequence of {{MLOperand}}. The first element of the sequence is a 3-D tensor of shape [num_directions, batch_size, hidden_size], the output hidden state from the last time step of the network. The second element is a 3-D tensor of shape [num_directions, batch_size, hidden_size], the output cell state from the last time step of the network. Additionally, if |options|.{{MLLstmOptions/returnSequence}} is set to true, the third element is the 4-D output tensor of shape [steps, num_directions, batch_size, hidden_size] containing every output from each time step in the temporal sequence.
+
- The {{MLGraphBuilder/lstm(input, weight, recurrentWeight, steps, hiddenSize, options)}} steps are: + The lstm(|input|, |weight|, |recurrentWeight|, |steps|, |hiddenSize|, |options|) method steps are: -
+
1. If |options|.{{MLLstmOptions/direction}} is not one of {{MLRecurrentNetworkDirection}}, then [=exception/throw=] a {{TypeError}}. 1. Let |num_directions| be `1` if |options|.{{MLLstmOptions/direction}} is `"forward"`, or otherwise let it be `2`. 1. [=Assert=]: the type of |input|, |weight| and |recurrentWeight| is {{MLOperand}}. @@ -3766,6 +3872,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
@@ -3903,11 +4010,12 @@ partial interface MLGraphBuilder { **Returns:** a sequence of {{MLOperand}}. The first element of the sequence is the output hidden state of the current time step of the recurrent network. The following element is the output cell state. Both elements are 2-D tensors of shape [batch_size, hidden_size].
+
- The {{MLGraphBuilder/lstmCell(input, weight, recurrentWeight, hiddenState, cellState, hiddenSize, options)}} steps are: + The lstmCell(|input|, |weight|, |recurrentWeight|, |hiddenState|, |cellState|, |hiddenSize|, |options|) method steps are: -
+
1. [=Assert=]: the type of |input|, |weight|, |recurrentWeight|, |hiddenState| and |cellState| is {{MLOperand}}. 1. If the [=rank=] of |input|, |weight|, |recurrentWeight|, |hiddenState| or |cellState| is not `2`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. Let |batch_size| be |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0]. @@ -3944,6 +4052,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
@@ -4107,11 +4216,12 @@ partial interface MLGraphBuilder {
+
- The {{MLGraphBuilder/matmul(a, b)}} steps are: + The matmul(|a|, |b|) method steps are: -
+
 1. [=Assert=]: the type of |a| and |b| is {{MLOperand}}.
 1. Let |desc| be a new {{MLOperandDescriptor}}.
 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to the result of invoking the calculate matmul output sizes steps given |a| and |b|.
@@ -4128,6 +4238,7 @@ partial interface MLGraphBuilder {
 1. Return |output|.
 
+
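
A non-normative sketch of a 2-D matrix multiplication; the shapes are illustrative only:

```js
const context = await navigator.ml.createContext();
const builder = new MLGraphBuilder(context);
const a = builder.input('a', {type: 'float32', dimensions: [2, 3]});
const b = builder.input('b', {type: 'float32', dimensions: [3, 4]});
const c = builder.matmul(a, b); // output dimensions: [2, 4]
```
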
### The pad() method ### {#api-mlgraphbuilder-pad} Inflate the tensor with constant or mirrored values on the edges. @@ -4192,11 +4303,12 @@ partial interface MLGraphBuilder {
+
- The {{MLGraphBuilder/pad(input, beginningPadding, endingPadding, options)}} steps are: + The pad(|input|, |beginningPadding|, |endingPadding|, |options|) method steps are: -
+
1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If |options|.{{MLPadOptions/mode}} is not one of {{MLPaddingMode}}, then [=exception/throw=] a {{TypeError}}. 1. Let |desc| be a copy of |input|.{{MLOperand/[[descriptor]]}}. @@ -4213,6 +4325,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
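
A non-normative sketch that pads a 2-D tensor with the constant `0` (the default mode) on every edge; the shape and padding amounts are illustrative only:

```js
const context = await navigator.ml.createContext();
const builder = new MLGraphBuilder(context);
const x = builder.input('x', {type: 'float32', dimensions: [2, 3]});
// One element of padding at the beginning and at the end of each dimension.
const padded = builder.pad(x, [1, 1], [1, 1]); // output dimensions: [4, 5]
```
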
@@ -4371,11 +4484,12 @@ partial interface MLGraphBuilder { +
To create pooling operation given |op|, |input| and |options|, run the following steps: -
+
1. [=Assert=]: |op| is one of "averagePool2d", "l2Pool2d", "maxPool2d". 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If the [=list/size=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. @@ -4407,27 +4521,28 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
The following pooling algorithms are supported. -
- The {{MLGraphBuilder/averagePool2d(input, options)}} steps are: +
+ The averagePool2d(|input|, |options|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/pooling-op | create pooling operation=] given `"averagePool2d"`, |input| and |options|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|.
-
- The {{MLGraphBuilder/l2Pool2d(input, options)}} steps are: +
+ The l2Pool2d(|input|, |options|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/pooling-op | create pooling operation=] given `"l2Pool2d"`, |input| and |options|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|.
-
- The {{MLGraphBuilder/maxPool2d(input, options)}} steps are: +
+ The maxPool2d(|input|, |options|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/pooling-op | create pooling operation=] given `"maxPool2d"`, |input| and |options|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. @@ -4451,11 +4566,12 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *input*.
+
- The {{MLGraphBuilder/prelu(input, slope)}} steps are: + The prelu(|input|, |slope|) method steps are: -
+
1. [=Assert=]: the type of |input| and |slope| is {{MLOperand}}. 1. Let |descriptor| be a new {{MLOperandDescriptor}}. 1. Set |descriptor|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. @@ -4473,6 +4589,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
@@ -4537,11 +4654,12 @@ partial interface MLGraphBuilder { - *SumSquare*: Compute the sum of the square of all the input values along the axes.
+
To create reduce operation given |op|, |input| and |options|, run the following steps: -
+
1. [=Assert=]: |op| is one of "reduceL1", "reduceL2", "reduceLogSum", "reduceLogSumExp", "reduceMax", "reduceMean", "reduceMin", "reduceProduct", "reduceSum", "reduceSumSquare". 1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. @@ -4556,60 +4674,81 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
The following reduce algorithms are supported. - The {{MLGraphBuilder/reduceL1(input, options)}} steps are: +
+ The reduceL1(|input|, |options|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation=] given "reduceL1", |input| and |options|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/reduceL2(input, options)}} steps are: +
+ The reduceL2(|input|, |options|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation=] given "reduceL2", |input| and |options|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/reduceLogSum(input, options)}} steps are: +
+ The reduceLogSum(|input|, |options|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation=] given "reduceLogSum", |input| and |options|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/reduceLogSumExp(input, options)}} steps are: +
+ The reduceLogSumExp(|input|, |options|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation=] given "reduceLogSumExp", |input| and |options|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/reduceMax(input, options)}} steps are: +
+ The reduceMax(|input|, |options|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation=] given "reduceMax", |input| and |options|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/reduceMean(input, options)}} steps are: +
+ The reduceMean(|input|, |options|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation=] given "reduceMean", |input| and |options|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/reduceMin(input, options)}} steps are: +
+ The reduceMin(|input|, |options|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation=] given "reduceMin", |input| and |options|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/reduceProduct(input, options)}} steps are: +
+ The reduceProduct(|input|, |options|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation=] given "reduceProduct", |input| and |options|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/reduceSum(input, options)}} steps are: +
+ The reduceSum(|input|, |options|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation=] given "reduceSum", |input| and |options|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
- The {{MLGraphBuilder/reduceSumSquare(input, options)}} steps are: +
+ The reduceSumSquare(|input|, |options|) method steps are: 1. Let |output| be the result of running the [=MLGraphBuilder/reduce-op | create reduce operation=] given "reduceSumSquare", |input| and |options|. 1. If that [=exception/throws=] an error, then re-[=exception/throw=] the error. 1. Return |output|. +
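
A non-normative sketch of one of the reduction methods; the shape and axes are illustrative only:

```js
const context = await navigator.ml.createContext();
const builder = new MLGraphBuilder(context);
const x = builder.input('x', {type: 'float32', dimensions: [2, 3]});
// Sum along axis 1; keepDimensions defaults to false, so the output dimensions are [2].
const rowSums = builder.reduceSum(x, {axes: [1]});
```
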
### The relu() method ### {#api-mlgraphbuilder-relu-method} @@ -4645,11 +4784,12 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *input*.
+
- The {{MLGraphBuilder/relu(input)}} steps are: + The relu(|input|) method steps are: -
+
1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of copying an MLOperand given |input|. @@ -4663,6 +4803,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
#### The {{MLGraphBuilder/relu()}} method #### {#api-mlgraphbuilder-relu}
@@ -4673,16 +4814,18 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The activation function representing the relu operation.
+
- The {{MLGraphBuilder/relu()}} method steps are: + The relu() method steps are: -
+
1. Let |op| be the result of creating an MLActivation given [=this=] and `"relu"`. 1. If that [=exception/throws=] an error, re-[=exception/throw=] the error. 1. Return |op|.
+
### The resample2d() method ### {#api-mlgraphbuilder-resample2d-method} Resample the tensor values from the source to the destination spatial dimensions according to the scaling factors. @@ -4737,11 +4880,12 @@ partial interface MLGraphBuilder { The default value is [2, 3]. +
To check resample options given |options|, run the following steps: -
+
 1. If |options|.{{MLResample2dOptions/mode}} [=map/exists=], and if its value is not one of `"nearest-neighbor"` or `"linear"`, return `false`.
 1. If |options|.{{MLResample2dOptions/scales}} does not [=map/exist=], set it to `« 1.0, 1.0 »`.
 1. Otherwise, if any of its values is not greater than `0`, return `false`.
@@ -4751,12 +4895,14 @@ partial interface MLGraphBuilder {
 1. Return `true`.
 
+
+
To resample output sizes given |input| and |options|, run the following steps: -
+
 1. Let |desc| be an {{MLOperandDescriptor}} initialized to |input|.{{MLOperand/[[descriptor]]}}.
 1. If |options|.{{MLResample2dOptions/sizes}} [=map/exists=], then set |desc|.{{MLOperandDescriptor/dimensions}} to |options|.{{MLResample2dOptions/sizes}} and return |desc|.
 1. For |index| in [=the range=] 0 to the [=rank=] of |desc|.{{MLOperandDescriptor/dimensions}}, exclusive:
@@ -4767,12 +4913,14 @@ partial interface MLGraphBuilder {
 1. Return |desc|.
 
+
+
- The {{MLGraphBuilder/resample2d(input, options)}} steps are: + The resample2d(|input|, |options|) method steps are: -
+
1. Check if the input is a 4-dimensional tensor: if the [=list/size=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not `4`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If running the check resample options steps given |options| returns `false`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. Let |desc| be the result of running the resample output sizes steps given |options|. @@ -4789,6 +4937,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
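
A non-normative sketch that doubles the spatial dimensions of an *"nchw"* tensor; the shape, scales and interpolation mode are illustrative only:

```js
const context = await navigator.ml.createContext();
const builder = new MLGraphBuilder(context);
const x = builder.input('x', {type: 'float32', dimensions: [1, 1, 2, 2]});
// axes defaults to [2, 3], the spatial dimensions of an "nchw" tensor.
const upscaled = builder.resample2d(x, {scales: [2.0, 2.0], mode: 'linear'}); // [1, 1, 4, 4]
```
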
### The reshape() method ### {#api-mlgraphbuilder-reshape-method} Alter the shape of a tensor to a new shape. Reshape does not copy or change the content of the tensor. It just changes the tensor's logical dimensions for the subsequent operations. @@ -4812,11 +4961,12 @@ partial interface MLGraphBuilder { tensor is specified by the *newShape* argument.
+
- The {{MLGraphBuilder/reshape(input, newShape)}} steps are: + The reshape(|input|, |newShape|) method steps are: -
+
1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. Let |outputShape| be an empty array of {{unsigned long}}. 1. If |newShape| is a scalar [=number=], set |outputShape| to `« 1 »`. @@ -4842,6 +4992,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
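
A non-normative sketch of reshaping; the shapes are illustrative only and preserve the element count (12):

```js
const context = await navigator.ml.createContext();
const builder = new MLGraphBuilder(context);
const x = builder.input('x', {type: 'float32', dimensions: [3, 4]});
const y = builder.reshape(x, [2, 6]); // same 12 elements, new logical dimensions
```
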
### The sigmoid() method ### {#api-mlgraphbuilder-sigmoid-method} Compute the sigmoid function of the input tensor. The calculation follows the expression `1 / (exp(-x) + 1)`. @@ -4879,11 +5030,12 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *input*.
+
- The {{MLGraphBuilder/sigmoid(input)}} steps are: + The sigmoid(|input|) method steps are: -
+
1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of copying an MLOperand given |input|. @@ -4897,6 +5049,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
#### The {{MLGraphBuilder/sigmoid()}} method #### {#api-mlgraphbuilder-sigmoid}
@@ -4907,16 +5060,18 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The activation function representing the sigmoid operation.
+
- The {{MLGraphBuilder/sigmoid()}} method steps are: + The sigmoid() method steps are: -
+
1. Let |op| be the result of creating an MLActivation given [=this=] and `"sigmoid"`. 1. If that [=exception/throws=] an error, re-[=exception/throw=] the error. 1. Return |op|.
+
### The slice() method ### {#api-mlgraphbuilder-slice} Produce a slice of the input tensor. @@ -4934,11 +5089,12 @@ partial interface MLGraphBuilder { **Returns:** an {{MLOperand}}. The output tensor of the same rank as the input tensor with tensor values stripped to the specified starting and ending indices in each dimension.
+
- The {{MLGraphBuilder/slice(input, starts, sizes)}} steps are: + The slice(|input|, |starts|, |sizes|) method steps are: -
+
1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If |sizes|.size is 0, then [=exception/throw=] a {{TypeError}}. 1. If the [=list/size=] of |starts| and |sizes| is not equal to the rank of |input|, then [=exception/throw=] a {{TypeError}}. @@ -4954,6 +5110,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
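
A non-normative sketch that extracts a 2×2 window from a 4×4 tensor; the starting indices and sizes are illustrative only:

```js
const context = await navigator.ml.createContext();
const builder = new MLGraphBuilder(context);
const x = builder.input('x', {type: 'float32', dimensions: [4, 4]});
// starts = [1, 1] and sizes = [2, 2]: rows 1-2 and columns 1-2 of x.
const window = builder.slice(x, [1, 1], [2, 2]); // output dimensions: [2, 2]
```
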
### The softmax() method ### {#api-mlgraphbuilder-softmax-method} Compute the [softmax](https://en.wikipedia.org/wiki/Softmax_function) values of @@ -4995,11 +5152,12 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output 2-D tensor that contains the softmax results, of the same shape as *input*.
+
- The {{MLGraphBuilder/softmax(input)}} steps are: + The softmax(|input|) method steps are: -
+
1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If the [=list/size=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 2, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. @@ -5014,6 +5172,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
#### The {{MLGraphBuilder/softmax()}} method #### {#api-mlgraphbuilder-softmax}
@@ -5023,16 +5182,19 @@ partial interface MLGraphBuilder { **Returns:** - an {{MLActivation}}. The activation function representing the softmax operation.
+ +
- The {{MLGraphBuilder/softmax()}} method steps are: + The softmax() method steps are: -
+
 1. Let |op| be the result of creating an MLActivation given [=this=] and `"softmax"`.
 1. If that [=exception/throws=] an error, re-[=exception/throw=] the error.
 1. Return |op|.
 
+
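
A non-normative sketch that converts a 2-D tensor of logits into per-row probabilities; the shape is illustrative only:

```js
const context = await navigator.ml.createContext();
const builder = new MLGraphBuilder(context);
const logits = builder.input('logits', {type: 'float32', dimensions: [1, 10]});
const probabilities = builder.softmax(logits); // each row sums to 1
```
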
### The softplus() method ### {#api-mlgraphbuilder-softplus-method} Compute the softplus function of the input tensor. The calculation follows the expression `ln(1 + exp(steepness * x)) / steepness`. @@ -5084,11 +5246,12 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *input*.
+
- The {{MLGraphBuilder/softplus(input, options)}} method steps are: + The softplus(|input|, |options|) method steps are: -
+
1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: @@ -5101,6 +5264,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
#### The {{MLGraphBuilder/softplus(options)}} method #### {#api-mlgraphbuilder-softplus-options}
@@ -5111,16 +5275,18 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The activation function representing the softplus operation.
+
- The {{MLGraphBuilder/softplus(options)}} method steps are: + The softplus(|options|) method steps are: -
+
1. Let |op| be the result of creating an MLActivation given [=this=], `"softplus"` and |options|. 1. If that [=exception/throws=] an error, re-[=exception/throw=] the error. 1. Return |op|.
+
### The softsign() method ### {#api-mlgraphbuilder-softsign-method} Compute the softsign function of the input tensor. The calculation follows the expression `x / (1 + |x|)`. @@ -5154,11 +5320,12 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *input*.
+
- The {{MLGraphBuilder/softsign(input)}} steps are: + The softsign(|input|) method steps are: -
+
1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of copying an MLOperand given |input|. @@ -5172,6 +5339,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
#### The {{MLGraphBuilder/softsign()}} method #### {#api-mlgraphbuilder-softsign}
@@ -5181,16 +5349,19 @@ partial interface MLGraphBuilder { **Returns:** - an {{MLActivation}}. The activation function representing the softsign operation.
+ +
- The {{MLGraphBuilder/softsign()}} method steps are: + The softsign() method steps are: -
+
1. Let |op| be the result of creating an MLActivation given [=this=] and `"softsign"`. 1. If that [=exception/throws=] an error, re-[=exception/throw=] the error. 1. Return |op|.
+
### The split() method ### {#api-mlgraphbuilder-split} Split the input tensor into a number of sub tensors along the given axis. @@ -5223,11 +5394,12 @@ partial interface MLGraphBuilder { The default value is `0`. +
- The {{MLGraphBuilder/split(input, splits, options)}} steps are: + The split(|input|, |splits|, |options|) method steps are: -
+
1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If |splits| is not a non-zero {{unsigned long}} or a sequence of {{unsigned long}}, then [=exception/throw=] a {{TypeError}}. 1. If |splits| is an {{unsigned long}}, and |input|.{{MLOperandDescriptor/dimensions}}[|options|.{{MLSplitOptions/axis}}] % |splits| is not 0, then [=exception/throw=] a {{TypeError}}. @@ -5244,6 +5416,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
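
A non-normative sketch that splits a tensor into two equal pieces along the default axis `0`; the shapes are illustrative only:

```js
const context = await navigator.ml.createContext();
const builder = new MLGraphBuilder(context);
const x = builder.input('x', {type: 'float32', dimensions: [4, 2]});
const [first, second] = builder.split(x, 2); // two operands, each of dimensions [2, 2]
```
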


@@ -5299,11 +5472,12 @@ partial interface MLGraphBuilder {
     When not specified, every shape dimension of size 1 in the tensor is eliminated.
 
+
- The {{MLGraphBuilder/squeeze(input, options)}} steps are: + The squeeze(|input|, |options|) method steps are: -
+
1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If |options|.{{MLSqueezeOptions/axes}} [=map/exists=], then: 1. Let |dimensions| be |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. @@ -5324,6 +5498,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
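
A non-normative sketch of squeezing with and without explicit axes; the shapes are illustrative only:

```js
const context = await navigator.ml.createContext();
const builder = new MLGraphBuilder(context);
const x = builder.input('x', {type: 'float32', dimensions: [1, 3, 1, 2]});
const all = builder.squeeze(x);                // every size-1 dimension removed: [3, 2]
const axis0 = builder.squeeze(x, {axes: [0]}); // only axis 0 removed: [3, 1, 2]
```
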
### The tanh() method ### {#api-mlgraphbuilder-tanh-method} Compute the hyperbolic tangent function of the input tensor. The calculation follows the expression `(exp(2 * x) - 1) / (exp(2 * x) + 1)`. @@ -5359,11 +5534,12 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *input*.
+
- The {{MLGraphBuilder/tanh(input)}} steps are: + The tanh(|input|) method steps are: -
+
1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of copying an MLOperand given |input|. @@ -5377,6 +5553,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
#### The {{MLGraphBuilder/tanh()}} method #### {#api-mlgraphbuilder-tanh}
@@ -5387,16 +5564,18 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The activation function representing the tanh operation.
+
- The {{MLGraphBuilder/tanh()}} method steps are: + The tanh()
method steps are: -
+
1. Let |op| be the result of creating an MLActivation given [=this=] and `"tanh"`. 1. If that [=exception/throws=] an error, re-[=exception/throw=] the error. 1. Return |op|.
+
### The transpose() method ### {#api-mlgraphbuilder-transpose} Permute the dimensions of the input tensor according to the *permutation* argument. @@ -5428,11 +5607,12 @@ partial interface MLGraphBuilder { These default values cause the output to become a transposed tensor of the input. When specified, the number of values in the sequence must be the same as the [=rank=] of the input tensor, and the values in the sequence must be within the range from 0 to N-1 with no two or more same values found in the sequence. +
- The {{MLGraphBuilder/transpose(input, options)}} steps are: + The transpose(|input|, |options|) method steps are: -
+
1. [=Assert=]: the type of |input| is {{MLOperand}}. 1. If |options|.{{MLTransposeOptions/permutation}} does not [=map/exist=], let |options|.{{MLTransposeOptions/permutation}} be the reversed sequence of all indices for |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. Otherwise if |options|.{{MLTransposeOptions/permutation}} [=map/exists=]: @@ -5451,6 +5631,7 @@ partial interface MLGraphBuilder { 1. Return |output|.
+
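
A non-normative sketch of the default and an explicit permutation; the shapes are illustrative only:

```js
const context = await navigator.ml.createContext();
const builder = new MLGraphBuilder(context);
const x = builder.input('x', {type: 'float32', dimensions: [1, 2, 3]});
const reversed = builder.transpose(x);                          // [3, 2, 1]
const swapped = builder.transpose(x, {permutation: [0, 2, 1]}); // [1, 3, 2]
```
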
Examples {#examples} ===================== From 5d1cb671da7bb7056858d1d6cdcb96b80781cae8 Mon Sep 17 00:00:00 2001 From: Joshua Bell Date: Tue, 22 Aug 2023 12:43:08 -0700 Subject: [PATCH 097/112] Improve algorithm linting and method linking Tag algorithms with the appropriate attribute, which lets Bikeshed lint the algorithm against the steps to catch bad variable references, etc. Other changes: * Collapse
and
into one tag where they had been introduced. * Eliminate duplicate ID warnings - see [1] * Ignore unused var for intializeGraph(). --- index.bs | 291 +++++++++++++++++++++++++------------------------------ 1 file changed, 131 insertions(+), 160 deletions(-) diff --git a/index.bs b/index.bs index 9886518d..976db8cd 100644 --- a/index.bs +++ b/index.bs @@ -810,8 +810,7 @@ Its default allowlist is 'self'. ### The {{ML/createContext()}} method ### {#api-ml-createcontext} -
-
+
To create a context given |options|, run these steps: @@ -828,10 +827,8 @@ Its default allowlist is 'self'. 1. Return |context|.
-
-
-
+
The createContext(|options|) steps are: @@ -844,10 +841,8 @@ Its default allowlist is 'self'. 1. [=Resolve=] |promise| with |context|.
-
-
-
+
The createContext(|gpuDevice|) method steps are: @@ -860,12 +855,10 @@ Its default allowlist is 'self'. 1. [=Resolve=] |promise| with |context|.
-
### The {{ML/createContextSync()}} method ### {#api-ml-createcontextsync} -
-
+
The createContextSync(|options|) method steps are: @@ -876,10 +869,8 @@ Its default allowlist is 'self'. 1. Return |context|.
-
-
-
+
The createContextSync(|gpuDevice|) method steps are: @@ -890,7 +881,6 @@ Its default allowlist is 'self'. 1. Return |context|.
-
## The MLGraph interface ## {#api-mlgraph} The {{MLGraph}} interface represents a compiled computational graph. A compiled graph once constructed is immutable and cannot be subsequently changed. @@ -946,7 +936,7 @@ dictionary MLOperandDescriptor { }; -
+
The byte length of an {{MLOperandDescriptor}} |desc| is the value returned by the following steps: @@ -1007,7 +997,7 @@ Since the {{MLOperand/[[builder]]}} object is bound by the {{MLGraphBuilder/cons #### Creating {{MLOperand}} #### {#api-mloperand-create} The {{MLOperand}} objects are created by the methods of {{MLGraphBuilder}}, internally using the following algorithms. -
+
To create an MLOperand given |builder| and |desc|, run the following steps: @@ -1021,7 +1011,7 @@ The {{MLOperand}} objects are created by the methods of {{MLGraphBuilder}}, inte
-
+
To copy an MLOperand given |operand|, run the following steps: @@ -1035,7 +1025,7 @@ The {{MLOperand}} objects are created by the methods of {{MLGraphBuilder}}, inte
-
+
To check dimensions given |dimensions| and |type|, run the following steps: @@ -1048,7 +1038,7 @@ The {{MLOperand}} objects are created by the methods of {{MLGraphBuilder}}, inte
-
+
To validate MLOperand given |operand| and |builder|, run the following steps: @@ -1098,7 +1088,7 @@ These activations function types are used to create other operations. One such u The {{MLActivation}} objects (including the ones passed as input to methods) are created by the methods of {{MLGraphBuilder}} and are identified by their name. The |options| dictionary is defined by those methods. The actual creation of the activation function e.g. a [[#api-mlgraphbuilder-sigmoid-method]] or [[#api-mlgraphbuilder-relu-method]] can then be deferred until when the rest of the graph is ready to connect with it such as during the construction of [[#api-mlgraphbuilder-conv2d]] for example.
-
+
To create an MLActivation given |builder|, |name|, |options| and |init-steps|, run the following steps: @@ -1175,7 +1165,8 @@ When the {{[[contextType]]}} is set to [=default-context|default=] with the {{ML
### The {{MLContext}} validation algorithm ### {#api-mlcontext-validate} -
+ +
To validate MLContext, given |context|, run these steps: @@ -1208,8 +1199,7 @@ partial interface MLContext { **Returns:** {{undefined}}.
-
-
+
The computeSync(|graph|, |inputs|, |outputs|) method steps are: @@ -1224,7 +1214,7 @@ partial interface MLContext {
-
+
To validate graph resources, given |resources| and |descriptors|, run the following steps: @@ -1238,7 +1228,7 @@ partial interface MLContext {
-
+
To validate buffer with descriptor given |bufferView| and |descriptor|, run the following steps: @@ -1249,7 +1239,7 @@ partial interface MLContext {
-
+
To execute graph, given |graph|, |inputs| and |outputs|, run the following steps: @@ -1317,7 +1307,8 @@ partial interface MLContext {
### The {{MLNamedArrayBufferViews}} transfer algorithm ### {#mlnamedarraybufferviews-transfer-alg} -
+ +
To transfer an {{MLNamedArrayBufferViews}} |views|: @@ -1360,8 +1351,7 @@ partial interface MLContext { **Returns:** Promise<{{MLComputeResult}}>.
-
-
+
The compute(|graph|, |inputs|, |outputs|) method steps are: @@ -1471,10 +1461,9 @@ partial interface MLCommandEncoder { **Returns:** {{undefined}}.
-
-
+
- The initializeGraph(|graph|) method steps are: + The initializeGraph(graph) method steps are:
@@ -1502,8 +1491,7 @@ partial interface MLCommandEncoder { **Returns:** {{undefined}}.
-
-
+
The dispatch(|graph|, |inputs|, |outputs|) method steps are: @@ -1549,8 +1537,7 @@ partial interface MLCommandEncoder { **Returns:** {{GPUCommandBuffer}}.
-
-
+
The finish(|descriptor|) method steps are: @@ -1633,8 +1620,7 @@ Both {{MLGraphBuilder}}.{{MLGraphBuilder/build()}} and {{MLGraphBuilder}}.{{MLGr ### The {{MLGraphBuilder}} constructor ### {#api-mlgraphbuilder-constructor} -
-
+
The [=new=] MLGraphBuilder(context) constructor steps are: @@ -1656,8 +1642,7 @@ Create a named {{MLOperand}} based on a descriptor, that can be used as an input **Returns:**: an {{MLOperand}} object.
-
-
+
The input(|name|, |descriptor|) method steps are: @@ -1688,8 +1673,7 @@ Build a composed graph up to a given output operand into a computational graph, #### The {{MLGraphBuilder/build(outputs)}} method #### {#api-mlgraphbuilder-build-outputs} -
-
+
The build(|outputs|) method steps are: @@ -1707,8 +1691,7 @@ Build a composed graph up to a given output operand into a computational graph, #### The {{MLGraphBuilder/buildSync(outputs)}} method #### {#api-mlgraphbuilder-buildsync-outputs} -
-
+
The buildSync(|outputs|) method steps are: @@ -1754,8 +1737,7 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. **Returns:**: an {{MLOperand}} object.
-
-
+
The constant(|descriptor|, |bufferView|) method steps are: @@ -1788,8 +1770,7 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. **Returns:**: an {{MLOperand}} object.
-
-
+
The constant(|value|, |type|) method steps are: @@ -1867,8 +1848,7 @@ partial interface MLGraphBuilder { **Returns:** an {{MLOperand}}. The batch-normalized N-D tensor of the same shape as *input*.
-
-
+
The batchNormalization(|input|, |mean|, |variance|, |options|) method steps are: @@ -1953,7 +1933,7 @@ partial interface MLGraphBuilder {
-
+
To check clamp options given |options|, run the following steps: @@ -1974,8 +1954,8 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *operand*.
-
-
+
+ The clamp(|operand|, |options|) method steps are: @@ -2006,8 +1986,8 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The operator representing the clamp operation.
-
-
+
+ The clamp(|options|) method steps are: @@ -2039,8 +2019,8 @@ partial interface MLGraphBuilder { computed as the sum of all the input sizes of the same dimension.
-
-
+
+ The concat(|inputs|, |axis|) method steps are: @@ -2199,8 +2179,8 @@ partial interface MLGraphBuilder { for *"oihw"* layout, [height, width, 1, options.groups] for *"hwio"* layout, [options.groups, height, width, 1] for *"ohwi"* layout and [1, height, width, options.groups] for *"ihwo"* layout.
-
-
+
+ The conv2d(|input|, |filter|, |options|) method steps are: @@ -2376,8 +2356,8 @@ partial interface MLGraphBuilder { *output_size = (input_size - 1) ** *stride + (filter_size - 1) ** *dilation + 1 - beginning_padding - ending_padding + output_padding*
-
-
+
+ The convTranspose2d(|input|, |filter|, |options|) method steps are: @@ -2467,8 +2447,8 @@ partial interface MLGraphBuilder { - *pow*: Compute the values of the values of the first input tensor to the power of the values of the second input tensor, element-wise.
-
-
+
+ To create element-wise binary operation given |op|, |a| and |b|, run the following steps: @@ -2494,8 +2474,8 @@ partial interface MLGraphBuilder {
-
-
+
+ To broadcast-shapes given |shape1| and |shape2|, run the following steps: @@ -2604,8 +2584,8 @@ partial interface MLGraphBuilder { - *tan*: Compute the tangent of the input tensor, element-wise.
-
-
+
+ To create element-wise unary operation given |op| and |input|, run the following steps: @@ -2741,8 +2721,8 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *input*.
-
-
+
+ The elu(|input|, |options|) method steps are: @@ -2771,8 +2751,8 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The activation function representing the elu operation.
-
-
+
+ The elu(|options|) method steps are: @@ -2832,8 +2812,8 @@ partial interface MLGraphBuilder { **Returns:** an {{MLOperand}}. The output 2-D tensor of shape [M, N] that contains the calculated product of all the inputs.
-
-
+
+ The gemm(|a|, |b|, |options|) method steps are: @@ -2965,8 +2945,8 @@ partial interface MLGraphBuilder { **Returns:** a sequence of {{MLOperand}}. The first element of the sequence is a 3-D tensor of shape [num_directions, batch_size, hidden_size], the cell output from the last time step of the network. Additionally, if |options|.{{MLGruOptions/returnSequence}} is set to `true`, the second element is the 4-D output tensor of shape [steps, num_directions, batch_size, hidden_size] containing every cell outputs from each time step in the temporal sequence.
-
-
+
+ The gru(|input|, |weight|, |recurrentWeight|, |steps|, |hiddenSize|, |options|) method steps are: @@ -3117,8 +3097,8 @@ partial interface MLGraphBuilder { **Returns:** an {{MLOperand}}. The 2-D tensor of shape [batch_size, hidden_size], the cell output hidden state of a single time step of the recurrent network.
-
-
+
+ The gruCell(|input|, |weight|, |recurrentWeight|, |hiddenState|, |hiddenSize|, |options|) method steps are: @@ -3308,8 +3288,8 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *input*.
-
-
+
+ The hardSigmoid(|input|, |options|) method steps are: @@ -3338,8 +3318,8 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The activation function representing the hard sigmoid operation.
-
-
+
+ The hardSigmoid(|options|) method steps are: @@ -3391,8 +3371,8 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *input*.
-
-
+
+ The hardSwish(|input|) method steps are: @@ -3420,10 +3400,9 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The activation function representing the hard-swish operation.
-
-
+
- The hardSwish() method steps are: + The hardSwish() method steps are:
1. Let |op| be the result of creating an MLActivation given [=this=] and `"hardSwish"`. @@ -3478,8 +3457,8 @@ The {{MLInstanceNormalizationOptions}} members are: **Returns:** an {{MLOperand}}. The instance-normalized 4-D tensor of the same shape as *input*.
-
-
+
+ The instanceNormalization(|input|, |options|) method steps are: @@ -3590,8 +3569,8 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *input*.
-
-
+
+ The leakyRelu(|input|, |options|) method steps are: @@ -3619,8 +3598,8 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The activation function representing the leaky relu operation.
-
-
+
+ The leakyRelu(|options|) method steps are: @@ -3685,8 +3664,8 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *input*.
-
-
+
+ The linear(|input|, |options|) method steps are: @@ -3714,8 +3693,8 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The activation function representing the linear operation.
-
-
+
+ The linear(|options|) method steps are: @@ -3806,8 +3785,8 @@ partial interface MLGraphBuilder { **Returns:** a sequence of {{MLOperand}}. The first element of the sequence is a 3-D tensor of shape [num_directions, batch_size, hidden_size], the output hidden state from the last time step of the network. The second element is a 3-D tensor of shape [num_directions, batch_size, hidden_size], the output cell state from the last time step of the network. Additionally, if |options|.{{MLLstmOptions/returnSequence}} is set to true, the third element is the 4-D output tensor of shape [steps, num_directions, batch_size, hidden_size] containing every output from each time step in the temporal sequence.
-
-
+
+ The lstm(|input|, |weight|, |recurrentWeight|, |steps|, |hiddenSize|, |options|) method steps are: @@ -4010,8 +3989,8 @@ partial interface MLGraphBuilder { **Returns:** a sequence of {{MLOperand}}. The first element of the sequence is the output hidden state of the current time step of the recurrent network. The following element is the output cell state. Both elements are 2-D tensors of shape [batch_size, hidden_size].
-
-
+
+ The lstmCell(|input|, |weight|, |recurrentWeight|, |hiddenState|, |cellState|, |hiddenSize|, |options|) method steps are: @@ -4197,8 +4176,7 @@ partial interface MLGraphBuilder { - If both *a* and *b* are 1-dimensional, the operation is a vector dot-product, which produces a scalar output.
-
-
+
To calculate matmul output sizes, given |a| and |b| run the following steps: @@ -4214,10 +4192,9 @@ partial interface MLGraphBuilder { 1. Return |shape|.
-
-
-
+
+ The matmul(|a|, |b|) method steps are: @@ -4290,7 +4267,7 @@ partial interface MLGraphBuilder { *output size = beginning padding + input size + ending padding*
-
+
To calculate padding output sizes, given |input|, |beginningPadding| and |endingPadding|, run the following steps: @@ -4303,8 +4280,8 @@ partial interface MLGraphBuilder {
-
-
+
+ The pad(|input|, |beginningPadding|, |endingPadding|, |options|) method steps are: @@ -4484,8 +4461,8 @@ partial interface MLGraphBuilder { -
-
+
+ To create pooling operation given |op|, |input| and |options|, run the following steps: @@ -4566,8 +4543,8 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *input*.
-
-
+
+ The prelu(|input|, |slope|) method steps are: @@ -4654,8 +4631,8 @@ partial interface MLGraphBuilder { - *SumSquare*: Compute the sum of the square of all the input values along the axes.
-
-
+
+ To create reduce operation given |op|, |input| and |options|, run the following steps: @@ -4784,8 +4761,8 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *input*.
-
-
+
+ The relu(|input|) method steps are: @@ -4814,10 +4791,9 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The activation function representing the relu operation.
-
-
+
- The relu() method steps are: + The relu() method steps are:
1. Let |op| be the result of creating an MLActivation given [=this=] and `"relu"`. @@ -4880,8 +4856,8 @@ partial interface MLGraphBuilder { The default value is [2, 3]. -
-
+
+ To check resample options given |options|, run the following steps: @@ -4897,8 +4873,8 @@ partial interface MLGraphBuilder {
-
-
+
+ To resample output sizes given |input| and |options|, run the following steps: @@ -4915,8 +4891,8 @@ partial interface MLGraphBuilder {
-
-
+
+ The resample2d(|input|, |options|) method steps are: @@ -4961,8 +4937,8 @@ partial interface MLGraphBuilder { tensor is specified by the *newShape* argument.
-
-
+
+ The reshape(|input|, |newShape|) method steps are: @@ -5030,8 +5006,8 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *input*.
-
-
+
+ The sigmoid(|input|) method steps are: @@ -5060,10 +5036,9 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The activation function representing the sigmoid operation.
-
-
+
- The sigmoid() method steps are: + The sigmoid() method steps are:
1. Let |op| be the result of creating an MLActivation given [=this=] and `"sigmoid"`. @@ -5089,8 +5064,8 @@ partial interface MLGraphBuilder { **Returns:** an {{MLOperand}}. The output tensor of the same rank as the input tensor with tensor values stripped to the specified starting and ending indices in each dimension.
-
-
+
+ The slice(|input|, |starts|, |sizes|) method steps are: @@ -5152,8 +5127,8 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output 2-D tensor that contains the softmax results, of the same shape as *input*.
-
-
+
+ The softmax(|input|) method steps are: @@ -5183,10 +5158,9 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The activation function representing the softmax operation.
-
-
+
- The softmax() method steps are: + The softmax() method steps are:
 1. Let |op| be the result of creating an MLActivation given [=this=] and `"softmax"`.
@@ -5246,8 +5220,8 @@ partial interface MLGraphBuilder {
 - an {{MLOperand}}. The output tensor of the same shape as *input*.
 
-
-
+
+ The softplus(|input|, |options|) method steps are: @@ -5275,8 +5249,8 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The activation function representing the softplus operation.
-
-
+
+ The softplus(|options|) method steps are: @@ -5320,8 +5294,8 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *input*.
-
-
+
+ The softsign(|input|) method steps are: @@ -5350,10 +5324,9 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The activation function representing the softsign operation.
-
-
+
- The softsign() method steps are: + The softsign() method steps are:
1. Let |op| be the result of creating an MLActivation given [=this=] and `"softsign"`. @@ -5394,8 +5367,8 @@ partial interface MLGraphBuilder { The default value is `0`. -
-
+

+
    The split(|input|, |splits|, |options|) method steps are:

@@ -5472,8 +5445,8 @@ partial interface MLGraphBuilder {
     When not specified, every shape dimension of size 1 in the tensor is eliminated.
 
-
-
+
+ The squeeze(|input|, |options|) method steps are: @@ -5534,8 +5507,8 @@ partial interface MLGraphBuilder { - an {{MLOperand}}. The output tensor of the same shape as *input*.
-
-
+
+ The tanh(|input|) method steps are: @@ -5564,10 +5537,9 @@ partial interface MLGraphBuilder { - an {{MLActivation}}. The activation function representing the tanh operation.
-
-
+
- The tanh()
method steps are: + The tanh() method steps are:
1. Let |op| be the result of creating an MLActivation given [=this=] and `"tanh"`. @@ -5607,8 +5579,7 @@ partial interface MLGraphBuilder { These default values cause the output to become a transposed tensor of the input. When specified, the number of values in the sequence must be the same as the [=rank=] of the input tensor, and the values in the sequence must be within the range from 0 to N-1 with no two or more same values found in the sequence. -
-
+
The transpose(|input|, |options|) method steps are: From 5ccd7b09c9298fd592e90daa4b27a18159439db8 Mon Sep 17 00:00:00 2001 From: Joshua Bell Date: Tue, 22 Aug 2023 13:55:49 -0700 Subject: [PATCH 098/112] Remove orphaned
s --- index.bs | 60 -------------------------------------------------------- 1 file changed, 60 deletions(-) diff --git a/index.bs b/index.bs index 976db8cd..5c354940 100644 --- a/index.bs +++ b/index.bs @@ -1212,7 +1212,6 @@ partial interface MLContext { 1. Return {{undefined}}.
-
@@ -1372,7 +1371,6 @@ partial interface MLContext { 1. [=Resolve=] |promise| with |result|.
-
#### Examples #### {#api-mlcontext-async-execution-examples}
@@ -1471,7 +1469,6 @@ partial interface MLCommandEncoder {
-
### Dispatch Execution Commands ### {#api-mlcommandencoder-dispatch-commands} Record the {{MLGraph}} execution with the inputs {{MLNamedGPUResources}} and outputs {{MLNamedGPUResources}}. @@ -1519,7 +1516,6 @@ partial interface MLCommandEncoder { 1. Return {{undefined}}.
-
### Generate GPU Command Buffer ### {#api-mlcommandencoder-generate-gpu-command-buffer} Complete the recording of ML workload and return a WebGPU-compatible {{GPUCommandBuffer}} containing the recorded workload. @@ -1550,7 +1546,6 @@ partial interface MLCommandEncoder { 1. Return a {{GPUCommandBuffer}} containing the recorded workload.
-
## The MLGraphBuilder interface ## {#api-mlgraphbuilder} @@ -1630,7 +1625,6 @@ Both {{MLGraphBuilder}}.{{MLGraphBuilder/build()}} and {{MLGraphBuilder}}.{{MLGr 1. Set {{MLGraphBuilder/[[context]]}} to |context|.
-
### The {{MLGraphBuilder/input()}} method ### {#api-mlgraphbuilder-input} Create a named {{MLOperand}} based on a descriptor, that can be used as an input. @@ -1666,7 +1660,6 @@ Create a named {{MLOperand}} based on a descriptor, that can be used as an input 1. Return |operand|.
-
### The build() method ### {#api-mlgraphbuilder-build} Build a composed graph up to a given output operand into a computational graph, asynchronously or synchronously. @@ -1687,7 +1680,6 @@ Build a composed graph up to a given output operand into a computational graph, 1. If that [=exception/throws=], re-[=exception/throw=] the error.
-
#### The {{MLGraphBuilder/buildSync(outputs)}} method #### {#api-mlgraphbuilder-buildsync-outputs} @@ -1724,7 +1716,6 @@ Build a composed graph up to a given output operand into a computational graph, 1. Return |graph|.
-
### The constant() method ### {#api-mlgraphbuilder-constant-method} Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. @@ -1759,7 +1750,6 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. 1. Return |operand|.
-
#### The {{MLGraphBuilder/constant(value, type)}} method #### {#api-mlgraphbuilder-constant-value-type} @@ -1795,7 +1785,6 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods. 1. Return |operand|.
-
### The batchNormalization() method ### {#api-mlgraphbuilder-batchnorm} Normalize the tensor values of input features across the batch dimension using [[Batch-Normalization]]. For each input feature, the mean and variance values of that feature supplied in this calculation as parameters are previously computed across the batch dimension of the input during the model training phase of this operation. @@ -1868,7 +1857,6 @@ partial interface MLGraphBuilder { 1. Return |output|.
-
@@ -1974,7 +1962,6 @@ partial interface MLGraphBuilder { 1. Return |output|.
-
#### The {{MLGraphBuilder/clamp(options)}} method #### {#api-mlgraphbuilder-clamp-options}
@@ -1998,7 +1985,6 @@ partial interface MLGraphBuilder { 1. Return |op|.
-
### The concat() method ### {#api-mlgraphbuilder-concat} Concatenates the input tensors along a given axis. @@ -2057,7 +2043,6 @@ partial interface MLGraphBuilder { 1. Return |output|.
-
### The conv2d() method ### {#api-mlgraphbuilder-conv2d} Compute a 2-D convolution given 4-D input and filter tensors @@ -2226,7 +2211,6 @@ partial interface MLGraphBuilder { 1. Return |output|.
-
### The convTranspose2d() method ### {#api-mlgraphbuilder-convtranspose2d} Compute a 2-D transposed convolution given 4-D input and filter tensors @@ -2406,7 +2390,6 @@ partial interface MLGraphBuilder { 1. Return |output|.
-
### Element-wise binary operations ### {#api-mlgraphbuilder-binary} Compute the element-wise binary addition, subtraction, multiplication, division, power, maximum and minimum of the two input tensors. @@ -2472,7 +2455,6 @@ partial interface MLGraphBuilder { 1. Return |output|.
-
@@ -2489,7 +2471,6 @@ partial interface MLGraphBuilder {
-
@@ -2604,7 +2585,6 @@ partial interface MLGraphBuilder { 1. Return |output|.
-
@@ -2739,7 +2719,6 @@ partial interface MLGraphBuilder { 1. Return |output|.
- #### The {{MLGraphBuilder/elu(options)}} method #### {#api-mlgraphbuilder-elu-options}
@@ -2761,7 +2740,6 @@ partial interface MLGraphBuilder { 1. Return |op|.
- ### The gemm() method ### {#api-mlgraphbuilder-gemm} Calculate the [general matrix multiplication of the Basic Linear Algebra Subprograms](https://en.wikipedia.org/wiki/Basic_Linear_Algebra_Subprograms#Level_3). The calculation follows the expression `alpha * A * B + beta * C`, where `A` is a 2-D tensor with shape [M, K] or [K, M], `B` is a 2-D tensor with shape [K, N] or [N, K], and `C` is broadcastable to the shape [M, N]. `A` and `B` may optionally be transposed prior to the calculation. @@ -2844,7 +2822,6 @@ partial interface MLGraphBuilder { 1. Return |output|. -
@@ -2976,7 +2953,6 @@ partial interface MLGraphBuilder { 1. Return |output|.
-
@@ -3129,7 +3105,6 @@ partial interface MLGraphBuilder { 1. Return |output|.
-
@@ -3307,7 +3282,6 @@ partial interface MLGraphBuilder { 1. Return |output|.
- #### The {{MLGraphBuilder/hardSigmoid(options)}} method #### {#api-mlgraphbuilder-hardsigmoid-options}
@@ -3329,7 +3303,6 @@ partial interface MLGraphBuilder { 1. Return |op|.
- ### The hardSwish() method ### {#api-mlgraphbuilder-hard-swish} Computes the nonlinear function `y = x * max(0, min(6, (x + 3))) / 6` that is introduced by [[MobileNetV3]] on the input tensor element-wise. @@ -3389,7 +3362,6 @@ partial interface MLGraphBuilder { 1. Return |output|. - #### The {{MLGraphBuilder/hardSwish()}} method #### {#api-mlgraphbuilder-hardswish}
@@ -3482,7 +3454,6 @@ The {{MLInstanceNormalizationOptions}} members are: 1. Return |output|.
-
@@ -3587,7 +3558,6 @@ partial interface MLGraphBuilder { 1. Return |output|.
- #### The {{MLGraphBuilder/leakyRelu(options)}} method #### {#api-mlgraphbuilder-leaky-relu-options}
@@ -3609,7 +3579,6 @@ partial interface MLGraphBuilder { 1. Return |op|.
- ### The linear() method ### {#api-mlgraphbuilder-linear} Calculate a linear function `y = alpha * x + beta` on the input tensor. @@ -3682,7 +3651,6 @@ partial interface MLGraphBuilder { 1. Return |output|. - #### The {{MLGraphBuilder/linear(options)}} method #### {#api-mlgraphbuilder-linear-options}
@@ -3704,7 +3672,6 @@ partial interface MLGraphBuilder { 1. Return |op|.
- ### The lstm() method ### {#api-mlgraphbuilder-lstm} Long Short-Term Memory [[LSTM]] recurrent network uses an input, output, forget, and cell gate to compute the output state that rolls into the output across the temporal sequence of the network. @@ -3851,7 +3818,6 @@ partial interface MLGraphBuilder { 1. Return |output|. -
@@ -4031,7 +3997,6 @@ partial interface MLGraphBuilder { 1. Return |output|.
-
@@ -4215,7 +4180,6 @@ partial interface MLGraphBuilder { 1. Return |output|.
- ### The pad() method ### {#api-mlgraphbuilder-pad} Inflate the tensor with constant or mirrored values on the edges. @@ -4302,7 +4266,6 @@ partial interface MLGraphBuilder { 1. Return |output|. -
@@ -4498,7 +4461,6 @@ partial interface MLGraphBuilder { 1. Return |output|.
-
@@ -4566,7 +4528,6 @@ partial interface MLGraphBuilder { 1. Return |output|.
-
@@ -4651,7 +4612,6 @@ partial interface MLGraphBuilder { 1. Return |output|.
-
@@ -4780,7 +4740,6 @@ partial interface MLGraphBuilder { 1. Return |output|.
- #### The {{MLGraphBuilder/relu()}} method #### {#api-mlgraphbuilder-relu}
@@ -4801,7 +4760,6 @@ partial interface MLGraphBuilder { 1. Return |op|.
- ### The resample2d() method ### {#api-mlgraphbuilder-resample2d-method} Resample the tensor values from the source to the destination spatial dimensions according to the scaling factors. @@ -4871,7 +4829,6 @@ partial interface MLGraphBuilder { 1. Return `true`. -
@@ -4889,7 +4846,6 @@ partial interface MLGraphBuilder { 1. Return |desc|.
-
@@ -4913,7 +4869,6 @@ partial interface MLGraphBuilder { 1. Return |output|.
- ### The reshape() method ### {#api-mlgraphbuilder-reshape-method} Alter the shape of a tensor to a new shape. Reshape does not copy or change the content of the tensor. It just changes the tensor's logical dimensions for the subsequent operations. @@ -4968,7 +4923,6 @@ partial interface MLGraphBuilder { 1. Return |output|. - ### The sigmoid() method ### {#api-mlgraphbuilder-sigmoid-method} Compute the sigmoid function of the input tensor. The calculation follows the expression `1 / (exp(-x) + 1)`. @@ -5025,7 +4979,6 @@ partial interface MLGraphBuilder { 1. Return |output|. - #### The {{MLGraphBuilder/sigmoid()}} method #### {#api-mlgraphbuilder-sigmoid}
@@ -5046,7 +4999,6 @@ partial interface MLGraphBuilder { 1. Return |op|.
- ### The slice() method ### {#api-mlgraphbuilder-slice} Produce a slice of the input tensor. @@ -5085,7 +5037,6 @@ partial interface MLGraphBuilder { 1. Return |output|. - ### The softmax() method ### {#api-mlgraphbuilder-softmax-method} Compute the [softmax](https://en.wikipedia.org/wiki/Softmax_function) values of @@ -5147,7 +5098,6 @@ partial interface MLGraphBuilder { 1. Return |output|. - #### The {{MLGraphBuilder/softmax()}} method #### {#api-mlgraphbuilder-softmax}
@@ -5168,7 +5118,6 @@ partial interface MLGraphBuilder { 1. Return |op|.
- ### The softplus() method ### {#api-mlgraphbuilder-softplus-method} Compute the softplus function of the input tensor. The calculation follows the expression `ln(1 + exp(steepness * x)) / steepness`. @@ -5238,7 +5187,6 @@ partial interface MLGraphBuilder { 1. Return |output|. - #### The {{MLGraphBuilder/softplus(options)}} method #### {#api-mlgraphbuilder-softplus-options}
@@ -5260,7 +5208,6 @@ partial interface MLGraphBuilder { 1. Return |op|.
- ### The softsign() method ### {#api-mlgraphbuilder-softsign-method} Compute the softsign function of the input tensor. The calculation follows the expression `x / (1 + |x|)`. @@ -5313,7 +5260,6 @@ partial interface MLGraphBuilder { 1. Return |output|. - #### The {{MLGraphBuilder/softsign()}} method #### {#api-mlgraphbuilder-softsign}
@@ -5334,7 +5280,6 @@ partial interface MLGraphBuilder { 1. Return |op|.
- ### The split() method ### {#api-mlgraphbuilder-split} Split the input tensor into a number of sub tensors along the given axis. @@ -5389,7 +5334,6 @@ partial interface MLGraphBuilder { 1. Return |output|. -
@@ -5471,7 +5415,6 @@ partial interface MLGraphBuilder { 1. Return |output|.
- ### The tanh() method ### {#api-mlgraphbuilder-tanh-method} Compute the hyperbolic tangent function of the input tensor. The calculation follows the expression `(exp(2 * x) - 1) / (exp(2 * x) + 1)`. @@ -5526,7 +5469,6 @@ partial interface MLGraphBuilder { 1. Return |output|. - #### The {{MLGraphBuilder/tanh()}} method #### {#api-mlgraphbuilder-tanh}
@@ -5547,7 +5489,6 @@ partial interface MLGraphBuilder { 1. Return |op|.
- ### The transpose() method ### {#api-mlgraphbuilder-transpose} Permute the dimensions of the input tensor according to the *permutation* argument. @@ -5602,7 +5543,6 @@ partial interface MLGraphBuilder { 1. Return |output|. - Examples {#examples} ===================== From 305de15e983ed31497c731ef7fb2276cd0522c49 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Wed, 23 Aug 2023 10:40:01 +0300 Subject: [PATCH 099/112] Replace snake_case with camelCase for variables in algorithms. Signed-off-by: Zoltan Kis --- index.bs | 258 +++++++++++++++++++++++++++---------------------------- 1 file changed, 129 insertions(+), 129 deletions(-) diff --git a/index.bs b/index.bs index 5c354940..d2253baa 100644 --- a/index.bs +++ b/index.bs @@ -2081,19 +2081,19 @@ partial interface MLGraphBuilder {
: padding :: - A sequence of {{unsigned long}} of length 4: [beginning_height, ending_height, beginning_width, ending_width]. + A sequence of {{unsigned long}} of length 4: [beginningHeight, endingHeight, beginningWidth, endingWidth]. Specifies the additional rows and columns added to the beginning and ending of each spatial dimension of the convolution input. The default value is [0, 0, 0, 0]. : strides :: - A sequence of {{unsigned long}} of length 2: [stride_height, stride_width]. + A sequence of {{unsigned long}} of length 2: [strideHeight, strideWidth]. Specifies the stride of the sliding window for each spatial dimension of the convolution input. The default value is [1, 1]. : dilations :: - A sequence of {{unsigned long}} of length 2: [dilation_height, dilation_width]. Specifies the dilation factor for each spatial dimension applied on the convolution filter (kernel). + A sequence of {{unsigned long}} of length 2: [dilationHeight, dilationWidth]. Specifies the dilation factor for each spatial dimension applied on the convolution filter (kernel). The default value is [1, 1]. : autoPad @@ -2118,27 +2118,27 @@ partial interface MLGraphBuilder { An {{MLInputOperandLayout}} [=string=]. Specifies the layout format of the input and output tensor as follows: - **"nchw"** - - input tensor: *[batches, input_channels, height, width]* - - output tensor: *[batches, output_channels, height, width]* + - input tensor: *[batches, inputChannels, height, width]* + - output tensor: *[batches, outputChannels, height, width]* - **"nhwc"**: - - input tensor: *[batches, height, width, input_channels]* - - output tensor: *[batches, height, width, output_channels]* + - input tensor: *[batches, height, width, inputChannels]* + - output tensor: *[batches, height, width, outputChannels]* The default value is *"nchw"*. : filterLayout :: An {{MLConv2dFilterOperandLayout}} [=string=]. Specifies the layout format of the filter tensor as follow: - - **"oihw"**: *[output_channels, input_channels/groups, height, width]* - - **"hwio"**: *[height, width, input_channels/groups, output_channels]* - - **"ohwi"**: *[output_channels, height, width, input_channels/groups]* - - **"ihwo"**: *[input_channels/groups, height, width, output_channels]* + - **"oihw"**: *[outputChannels, inputChannels/groups, height, width]* + - **"hwio"**: *[height, width, inputChannels/groups, outputChannels]* + - **"ohwi"**: *[outputChannels, height, width, inputChannels/groups]* + - **"ihwo"**: *[inputChannels/groups, height, width, outputChannels]* The default value is *"oihw"*. : bias :: An {{MLOperand}} object. - Specifies the additional 1-D tensor with the shape of *[output_channels]* whose values are to be added to the convolution result. + Specifies the additional 1-D tensor with the shape of *[outputChannels]* whose values are to be added to the convolution result. : activation :: @@ -2156,11 +2156,11 @@ partial interface MLGraphBuilder { **Returns:** an {{MLOperand}}. The output 4-D tensor that contains the convolution result. The output shape is interpreted according to the *options*.{{MLConv2dOptions/inputLayout}} value. More specifically, the spatial dimensions or the sizes of the last two dimensions of the output tensor for the *nchw* input layout can be calculated as follow: - *output_size = 1 + (input_size - (filter_size - 1) ** *dilation - 1 + beginning_padding + ending_padding) / stride* + *outputSize = 1 + (inputSize - (filterSize - 1) ** *dilation - 1 + beginningPadding + endingPadding) / stride*
- A *depthwise* conv2d operation is a variant of grouped convolution, used in models like the MobileNet, where the *options.groups* = input_channels = output_channels and the shape of filter tensor is [options.groups, 1, height, width] + A *depthwise* conv2d operation is a variant of grouped convolution, used in models like the MobileNet, where the *options.groups* = inputChannels = outputChannels and the shape of filter tensor is [options.groups, 1, height, width] for *"oihw"* layout, [height, width, 1, options.groups] for *"hwio"* layout, [options.groups, height, width, 1] for *"ohwi"* layout and [1, height, width, options.groups] for *"ihwo"* layout.
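<div class="note">
    As a non-normative illustration of the output-size formula in the *Returns* section above, the following sketch restates it in JavaScript. The helper name is made up for this note, the *"explicit"* autoPad case is assumed, and the division is assumed to truncate (round down), which the formula itself leaves implicit.
    <pre highlight="js">
    // Non-normative sketch of:
    // outputSize = 1 + (inputSize - (filterSize - 1) * dilation - 1
    //                   + beginningPadding + endingPadding) / stride
    function conv2dOutputSize(
        inputSize, filterSize, stride, dilation, beginningPadding, endingPadding) {
      const dilatedFilterSize = (filterSize - 1) * dilation + 1;
      return 1 + Math.floor(
          (inputSize - dilatedFilterSize + beginningPadding + endingPadding) / stride);
    }

    // Example: a 224x224 input, a 3x3 filter, stride 2, dilation 1 and padding 1
    // on both sides of each spatial dimension gives a 112x112 output.
    console.log(conv2dOutputSize(224, 3, 2, 1, 1, 1)); // 112
    </pre>
</div>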
@@ -2171,10 +2171,10 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input| and |filter| is {{MLOperand}}. - 1. Let |input_size| be the [=list/size=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. - 1. Let |filter_size| be the [=list/size=] of |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. - 1. If |input_size| is not `4`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If |filter_size| is not `4`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. Let |inputSize| be the [=list/size=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. + 1. Let |filterSize| be the [=list/size=] of |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. + 1. If |inputSize| is not `4`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |filterSize| is not `4`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If the type of |input| and |filter| is not the same, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLConv2dOptions/padding}} does not [=map/exist=], set it to `« 0, 0, 0, 0 »`. 1. Else if the [=list/size=] of |options|.{{MLConv2dOptions/padding}} is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. @@ -2185,19 +2185,19 @@ partial interface MLGraphBuilder { 1. Else if the [=list/size=] of |options|.{{MLConv2dOptions/dilations}} is not `2`, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLConv2dOptions/autoPad}} does not [=map/exist=], set it to `"explicit"`. 1. If |options|.{{MLConv2dOptions/groups}} is 0, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If |input_size| / |options|.{{MLConv2dOptions/groups}} is not equal to |filter_size|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. Else if |input_size| % |options|.{{MLConv2dOptions/groups}} is not 0, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |inputSize| / |options|.{{MLConv2dOptions/groups}} is not equal to |filterSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. Else if |inputSize| % |options|.{{MLConv2dOptions/groups}} is not 0, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLConv2dOptions/bias}} [=map/exists=]: 1. [=Assert=]: the type of |options|.{{MLConv2dOptions/bias}} is {{MLOperand}}. 1. If the [=list/size=] of |options|.{{MLConv2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 1, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLConv2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}} is not the same as |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLConv2dOptions/activation}} [=map/exists=]: 1. [=Assert=]: the type of |options|.{{MLConv2dOptions/activation}} is {{MLActivation}}. - 1. Let |output_shape| be the result of invoking the underlying implementation for calculating output dimensions, given |options|. - 1. If |output_shape| is not the same as the shape of |options|.{{MLConv2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. Let |outputShape| be the result of invoking the underlying implementation for calculating output dimensions, given |options|. + 1. 
If |outputShape| is not the same as the shape of |options|.{{MLConv2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. Let |desc| a new {{MLOperandDescriptor}}. 1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. - 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to |output_shape|. + 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to |outputShape|. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of creating an MLOperand given [=this=] and |desc|. 1. Make a request to the underlying platform to: @@ -2246,19 +2246,19 @@ partial interface MLGraphBuilder {
: padding :: - A sequence of {{unsigned long}} of length 4: [beginning_height, ending_height, beginning_width, ending_width]. + A sequence of {{unsigned long}} of length 4: [beginningHeight, endingHeight, beginningWidth, endingWidth]. Specifies the additional rows and columns added to the beginning and ending of each spatial dimension of the convolution input. The default value is [0, 0, 0, 0]. : strides :: - A sequence of {{unsigned long}} of length 2: [stride_height, stride_width]. + A sequence of {{unsigned long}} of length 2: [strideHeight, strideWidth]. Specifies the stride of the sliding window for each spatial dimension of the convolution input. The default value is [1, 1]. : dilations :: - A sequence of {{unsigned long}} of length 2: [dilation_height, dilation_width]. Specifies the dilation factor for each spatial dimension applied on the convolution filter (kernel). + A sequence of {{unsigned long}} of length 2: [dilationHeight, dilationWidth]. Specifies the dilation factor for each spatial dimension applied on the convolution filter (kernel). The default value is [1, 1]. : outputPadding @@ -2300,26 +2300,26 @@ partial interface MLGraphBuilder { An {{MLInputOperandLayout}} [=string=]. Specifies the layout format of the input and output tensor as follows: - **"nchw"** - - input tensor: *[batches, input_channels, height, width]* - - output tensor: *[batches, output_channels, height, width]* + - input tensor: *[batches, inputChannels, height, width]* + - output tensor: *[batches, outputChannels, height, width]* - **"nhwc"**: - - input tensor: *[batches, height, width, input_channels]* - - output tensor: *[batches, height, width, output_channels]* + - input tensor: *[batches, height, width, inputChannels]* + - output tensor: *[batches, height, width, outputChannels]* The default value is *"nchw"*. : filterLayout :: An {{MLConvTranspose2dFilterOperandLayout}} [=string=]. Specifies the layout format of the filter tensor as follow: - - **"iohw"**: [input_channels, output_channels/groups, height, width] - - **"hwoi"**: [height, width, output_channels/groups, input_channels] - - **"ohwi"**: [output_channels/groups, height, width, input_channels] + - **"iohw"**: [inputChannels, outputChannels/groups, height, width] + - **"hwoi"**: [height, width, outputChannels/groups, inputChannels] + - **"ohwi"**: [outputChannels/groups, height, width, inputChannels] The default value is *"iohw"*. : bias :: An {{MLOperand}} object. - Specifies the additional 1-D tensor with the shape of *[output_channels]* whose values are to be added to the convolution result. + Specifies the additional 1-D tensor with the shape of *[outputChannels]* whose values are to be added to the convolution result. : activation :: @@ -2337,7 +2337,7 @@ partial interface MLGraphBuilder { **Returns:** an {{MLOperand}}. The output 4-D tensor that contains the transposed convolution result. The output shape is interpreted according to the *options*.{{MLConvTranspose2dOptions/inputLayout}} value. More specifically, unless the *options*.{{MLConvTranspose2dOptions/outputSizes}} values are explicitly specified, the *options*.{{MLConvTranspose2dOptions/outputPadding}} may be needed to compute the spatial dimension values of the output tensor as follow: - *output_size = (input_size - 1) ** *stride + (filter_size - 1) ** *dilation + 1 - beginning_padding - ending_padding + output_padding* + *outputSize = (inputSize - 1) ** *stride + (filterSize - 1) ** *dilation + 1 - beginningPadding - endingPadding + outputPadding*
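<div class="note">
    Similarly to conv2d, the transposed-convolution output size in the *Returns* section above can be restated as a short non-normative sketch. The helper name is made up for this note, and it assumes {{MLConvTranspose2dOptions/outputSizes}} is not specified.
    <pre highlight="js">
    // Non-normative sketch of:
    // outputSize = (inputSize - 1) * stride + (filterSize - 1) * dilation + 1
    //              - beginningPadding - endingPadding + outputPadding
    function convTranspose2dOutputSize(
        inputSize, filterSize, stride, dilation,
        beginningPadding, endingPadding, outputPadding) {
      return (inputSize - 1) * stride + (filterSize - 1) * dilation + 1 -
          beginningPadding - endingPadding + outputPadding;
    }

    // Example: upsampling a 56x56 feature map with a 2x2 filter and stride 2,
    // with no padding, yields a 112x112 output.
    console.log(convTranspose2dOutputSize(56, 2, 2, 1, 0, 0, 0)); // 112
    </pre>
</div>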
@@ -2347,10 +2347,10 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input| and |filter| is {{MLOperand}}. - 1. Let |input_size| be the [=list/size=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. - 1. Let |filter_size| be the [=list/size=] of |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. - 1. If |input_size| is not `4`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If |filter_size| is not `4`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. Let |inputSize| be the [=list/size=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. + 1. Let |filterSize| be the [=list/size=] of |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. + 1. If |inputSize| is not `4`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |filterSize| is not `4`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If the type of |input| and |filter| is not the same, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLConvTranspose2dOptions/padding}} does not [=map/exist=], set it to `« 0, 0, 0, 0 »`. 1. Else if the [=list/size=] of |options|.{{MLConvTranspose2dOptions/padding}} is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. @@ -2364,19 +2364,19 @@ partial interface MLGraphBuilder { 1. If |options|.{{MLConvTranspose2dOptions/outputSizes}} [=map/exists=]: 1. If the [=list/size=] of |options|.{{MLConvTranspose2dOptions/outputSizes}} is not `2`, then [=exception/throw=] a {{TypeError}}. 1. If the elements of |options|.{{MLConvTranspose2dOptions/outputSizes}} are not smaller than the elements at the same dimension (index) for |options|.{{MLConvTranspose2dOptions/strides}}, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If |input_size| / |options|.{{MLConvTranspose2dOptions/groups}} is not equal to |filter_size|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. Else if |input_size| % |options|.{{MLConvTranspose2dOptions/groups}} is not 0, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |inputSize| / |options|.{{MLConvTranspose2dOptions/groups}} is not equal to |filterSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. Else if |inputSize| % |options|.{{MLConvTranspose2dOptions/groups}} is not 0, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLConvTranspose2dOptions/bias}} [=map/exists=]: 1. [=Assert=]: the type of |options|.{{MLConvTranspose2dOptions/bias}} is {{MLOperand}}. 1. If the [=list/size=] of |options|.{{MLConvTranspose2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 1, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLConvTranspose2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}} is not the same as |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLConvTranspose2dOptions/activation}} [=map/exists=]: 1. [=Assert=]: the type of |options|.{{MLConvTranspose2dOptions/activation}} is {{MLActivation}}. - 1. Let |output_shape| be the result of invoking the underlying implementation for calculating output dimensions, given |options|. - 1. If |output_shape| is not the same as the shape of |options|.{{MLConvTranspose2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. 
Let |outputShape| be the result of invoking the underlying implementation for calculating output dimensions, given |options|. + 1. If |outputShape| is not the same as the shape of |options|.{{MLConvTranspose2dOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. Let |desc| a new {{MLOperandDescriptor}}. 1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. - 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to |output_shape|. + 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to |outputShape|. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of creating an MLOperand given [=this=] and |desc|. 1. Make a request to the underlying platform to: @@ -2877,15 +2877,15 @@ partial interface MLGraphBuilder {
: bias :: - An {{MLOperand}}. Specifies the 2-D input bias tensor of shape [num_directions, 3 * hidden_size]. The ordering of the bias vectors in the second dimension of the tensor shape is specified according to the {{MLGruOptions/layout}} argument. + An {{MLOperand}}. Specifies the 2-D input bias tensor of shape [numDirections, 3 * hiddenSize]. The ordering of the bias vectors in the second dimension of the tensor shape is specified according to the {{MLGruOptions/layout}} argument. : recurrentBias :: - An {{MLOperand}}. Specifies the 2-D recurrent bias tensor of shape [num_directions, 3 * hidden_size]. The ordering of the bias vectors in the second dimension of the tensor shape is specified according to the {{MLGruOptions/layout}} argument. + An {{MLOperand}}. Specifies the 2-D recurrent bias tensor of shape [numDirections, 3 * hiddenSize]. The ordering of the bias vectors in the second dimension of the tensor shape is specified according to the {{MLGruOptions/layout}} argument. : initialHiddenState :: - An {{MLOperand}}. The 3-D initial hidden state tensor of shape [num_directions, batch_size, hidden_size]. + An {{MLOperand}}. The 3-D initial hidden state tensor of shape [numDirections, batchSize, hiddenSize]. When not specified, implementations SHOULD use a tensor filled with zero. : resetAfter @@ -2912,14 +2912,14 @@ partial interface MLGraphBuilder {
**Arguments:** - - *input*: an {{MLOperand}}. The input 3-D tensor of shape [steps, batch_size, input_size]. - - *weight*: an {{MLOperand}}. The 3-D input weight tensor of shape [num_directions, 3 * hidden_size, input_size]. The ordering of the weight vectors in the second dimension of the tensor shape is specified according to the |options|.{{MLGruOptions/layout}} argument. - - *recurrentWeight*: an {{MLOperand}}. The 3-D recurrent weight tensor of shape [num_directions, 3 * hidden_size, hidden_size]. The ordering of the weight vectors in the second dimension of the tensor shape is specified according to the |options|.{{MLGruOptions/layout}} argument. + - *input*: an {{MLOperand}}. The input 3-D tensor of shape [steps, batchSize, inputSize]. + - *weight*: an {{MLOperand}}. The 3-D input weight tensor of shape [numDirections, 3 * hiddenSize, inputSize]. The ordering of the weight vectors in the second dimension of the tensor shape is specified according to the |options|.{{MLGruOptions/layout}} argument. + - *recurrentWeight*: an {{MLOperand}}. The 3-D recurrent weight tensor of shape [numDirections, 3 * hiddenSize, hiddenSize]. The ordering of the weight vectors in the second dimension of the tensor shape is specified according to the |options|.{{MLGruOptions/layout}} argument. - *steps*: an {{unsigned long}} scalar. The number of time steps in the recurrent network. The value must be greater than 0. - *hiddenSize*: an {{unsigned long}} scalar. The value of the third dimension of the cell output tensor shape. It indicates the number of features in the hidden state. - *options*: an optional {{MLGruOptions}}. The optional parameters of the operation. - **Returns:** a sequence of {{MLOperand}}. The first element of the sequence is a 3-D tensor of shape [num_directions, batch_size, hidden_size], the cell output from the last time step of the network. Additionally, if |options|.{{MLGruOptions/returnSequence}} is set to `true`, the second element is the 4-D output tensor of shape [steps, num_directions, batch_size, hidden_size] containing every cell outputs from each time step in the temporal sequence. + **Returns:** a sequence of {{MLOperand}}. The first element of the sequence is a 3-D tensor of shape [numDirections, batchSize, hiddenSize], the cell output from the last time step of the network. Additionally, if |options|.{{MLGruOptions/returnSequence}} is set to `true`, the second element is the 4-D output tensor of shape [steps, numDirections, batchSize, hiddenSize] containing every cell outputs from each time step in the temporal sequence.
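<div class="note">
    The tensor shapes listed above can be illustrated with a non-normative snippet. The dimension values are hypothetical, `builder` is assumed to be an existing {{MLGraphBuilder}}, and the weights are declared as graph inputs only for brevity; with the default *"forward"* direction, numDirections is 1.
    <pre highlight="js">
    const steps = 16, batchSize = 1, inputSize = 64, hiddenSize = 128;
    const input = builder.input(
        'input', {type: 'float32', dimensions: [steps, batchSize, inputSize]});
    const weight = builder.input(
        'weight', {type: 'float32', dimensions: [1, 3 * hiddenSize, inputSize]});
    const recurrentWeight = builder.input(
        'recurrentWeight', {type: 'float32', dimensions: [1, 3 * hiddenSize, hiddenSize]});

    // The first (and, by default, only) returned operand has shape
    // [1, batchSize, hiddenSize]. Passing {returnSequence: true} would add a
    // second operand of shape [steps, 1, batchSize, hiddenSize].
    const [lastHiddenState] =
        builder.gru(input, weight, recurrentWeight, steps, hiddenSize);
    </pre>
</div>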
@@ -2976,11 +2976,11 @@ partial interface MLGraphBuilder { let currentRecurrentBias = []; for (let dir = 0; dir < numDirections; ++dir) { - currentWeight.push(builder.squeeze(builder.slice(weight, [dir, 0, 0], [1, 3 * hidden_size, input_size]), { axes: [0] })); - currentRecurrentWeight.push(builder.squeeze(builder.slice(recurrentWeight, [dir, 0, 0], [1, 3 * hidden_size, hidden_size]), { axes: [0] })); - currentBias.push(options.bias ? (builder.squeeze(builder.slice(options.bias, [dir, 0], [1, 3 * hidden_size]), { axes: [0] })) : null); + currentWeight.push(builder.squeeze(builder.slice(weight, [dir, 0, 0], [1, 3 * hiddenSize, inputSize]), { axes: [0] })); + currentRecurrentWeight.push(builder.squeeze(builder.slice(recurrentWeight, [dir, 0, 0], [1, 3 * hiddenSize, hiddenSize]), { axes: [0] })); + currentBias.push(options.bias ? (builder.squeeze(builder.slice(options.bias, [dir, 0], [1, 3 * hiddenSize]), { axes: [0] })) : null); currentRecurrentBias.push(options.recurrentBias ? - (builder.squeeze(builder.slice(options.recurrentBias, [dir, 0], [1, 3 * hidden_size]), { axes: [0] })) : null); + (builder.squeeze(builder.slice(options.recurrentBias, [dir, 0], [1, 3 * hiddenSize]), { axes: [0] })) : null); } for (let step = 0; step < steps; ++step) { @@ -2988,12 +2988,12 @@ partial interface MLGraphBuilder { let currentOutput = null; for (let dir = 0; dir < numDirections; ++dir) { - currentHidden.push(builder.squeeze(builder.slice(hiddenState, [dir, 0, 0], [1, batch_size, hidden_size]), { axes: [0] })); + currentHidden.push(builder.squeeze(builder.slice(hiddenState, [dir, 0, 0], [1, batchSize, hiddenSize]), { axes: [0] })); } for (let dir = 0; dir < numDirections; ++dir) { let slice = (dir == 1 || options.direction == "backward" ? steps - step - 1 : step); - let currentInput = builder.squeeze(builder.slice(input, [slice, 0, 0], [1, batch_size, input_size]), { axes: [0] }); + let currentInput = builder.squeeze(builder.slice(input, [slice, 0, 0], [1, batchSize, inputSize]), { axes: [0] }); let result = builder.reshape( builder.gruCell( @@ -3042,11 +3042,11 @@ partial interface MLGraphBuilder {
: bias :: - An {{MLOperand}}. Specifies the 1-D input bias tensor of shape [3 * hidden_size]. The ordering of the bias vectors in the second dimension of the tensor shape is specified according to the {{MLGruOptions/layout}} argument. + An {{MLOperand}}. Specifies the 1-D input bias tensor of shape [3 * hiddenSize]. The ordering of the bias vectors in the second dimension of the tensor shape is specified according to the {{MLGruOptions/layout}} argument. : recurrentBias :: - An {{MLOperand}}. Specifies the 1-D recurrent bias tensor of shape [3 * hidden_size]. The ordering of the bias vectors in the second dimension of the tensor shape is specified according to the {{MLGruOptions/layout}} argument. + An {{MLOperand}}. Specifies the 1-D recurrent bias tensor of shape [3 * hiddenSize]. The ordering of the bias vectors in the second dimension of the tensor shape is specified according to the {{MLGruOptions/layout}} argument. : resetAfter :: @@ -3063,14 +3063,14 @@ partial interface MLGraphBuilder {
**Arguments:** - - *input*: an {{MLOperand}}. The input 2-D tensor of shape [batch_size, input_size]. - - *weight*: an {{MLOperand}}. The 2-D input weight tensor of shape [3 * hidden_size, input_size]. The ordering of the weight vectors in the first dimension of the tensor shape is specified according to the *options.layout* argument. - - *recurrentWeight*: an {{MLOperand}}. The 2-D recurrent weight tensor of shape [3 * hidden_size, hidden_size]. The ordering of the weight vectors in the first dimension of the tensor shape is specified according to the *options.layout* argument. - - *hiddenState*: an {{MLOperand}}. The 2-D input hidden state tensor of shape [batch_size, hidden_size]. + - *input*: an {{MLOperand}}. The input 2-D tensor of shape [batchSize, inputSize]. + - *weight*: an {{MLOperand}}. The 2-D input weight tensor of shape [3 * hiddenSize, inputSize]. The ordering of the weight vectors in the first dimension of the tensor shape is specified according to the *options.layout* argument. + - *recurrentWeight*: an {{MLOperand}}. The 2-D recurrent weight tensor of shape [3 * hiddenSize, hiddenSize]. The ordering of the weight vectors in the first dimension of the tensor shape is specified according to the *options.layout* argument. + - *hiddenState*: an {{MLOperand}}. The 2-D input hidden state tensor of shape [batchSize, hiddenSize]. - *hiddenSize*: an {{unsigned long}} scalar. The value of the second dimension of the output tensor shape. It indicates the number of features in the hidden state. - *options*: an optional {{MLGruCellOptions}}. The optional parameters of the operation. - **Returns:** an {{MLOperand}}. The 2-D tensor of shape [batch_size, hidden_size], the cell output hidden state of a single time step of the recurrent network. + **Returns:** an {{MLOperand}}. The 2-D tensor of shape [batchSize, hiddenSize], the cell output hidden state of a single time step of the recurrent network.
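<div class="note">
    A corresponding non-normative single-step sketch, using the same hypothetical dimension values as the gru() snippet above and an assumed `builder`:
    <pre highlight="js">
    const batchSize = 1, inputSize = 64, hiddenSize = 128;
    const input = builder.input(
        'input', {type: 'float32', dimensions: [batchSize, inputSize]});
    const weight = builder.input(
        'weight', {type: 'float32', dimensions: [3 * hiddenSize, inputSize]});
    const recurrentWeight = builder.input(
        'recurrentWeight', {type: 'float32', dimensions: [3 * hiddenSize, hiddenSize]});
    const hiddenState = builder.input(
        'hiddenState', {type: 'float32', dimensions: [batchSize, hiddenSize]});

    // The returned operand has shape [batchSize, hiddenSize].
    const newHiddenState =
        builder.gruCell(input, weight, recurrentWeight, hiddenState, hiddenSize);
    </pre>
</div>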
@@ -3125,11 +3125,11 @@ partial interface MLGraphBuilder { builder.add( builder.matmul( input, - builder.transpose(builder.slice(weight, [0, 0], [hiddenSize, input_size])) + builder.transpose(builder.slice(weight, [0, 0], [hiddenSize, inputSize])) ), builder.matmul( hiddenState, - builder.transpose(builder.slice(recurrentWeight, [0, 0], [hiddenSize, hidden_size])) + builder.transpose(builder.slice(recurrentWeight, [0, 0], [hiddenSize, hiddenSize])) ) ) ) @@ -3145,11 +3145,11 @@ partial interface MLGraphBuilder { builder.add( builder.matmul( input, - builder.transpose(builder.slice(weight, [hiddenSize, 0], [hiddenSize, input_size])) + builder.transpose(builder.slice(weight, [hiddenSize, 0], [hiddenSize, inputSize])) ), builder.matmul( hiddenState, - builder.transpose(builder.slice(recurrentWeight, [hiddenSize, 0], [hiddenSize, hidden_size])) + builder.transpose(builder.slice(recurrentWeight, [hiddenSize, 0], [hiddenSize, hiddenSize])) ) ) ) @@ -3164,7 +3164,7 @@ partial interface MLGraphBuilder { builder.add( builder.matmul( input, - builder.transpose(builder.slice(weight, [2 * hiddenSize, 0], [hiddenSize, input_size])) + builder.transpose(builder.slice(weight, [2 * hiddenSize, 0], [hiddenSize, inputSize])) ), builder.mul( r, @@ -3172,7 +3172,7 @@ partial interface MLGraphBuilder { (options.recurrentBias ? builder.slice(options.recurrentBias, [2 * hiddenSize], [hiddenSize]) : zero), builder.matmul( hiddenState, - builder.transpose(builder.slice(recurrentWeight, [2 * hiddenSize, 0], [hiddenSize, hidden_size])) + builder.transpose(builder.slice(recurrentWeight, [2 * hiddenSize, 0], [hiddenSize, hiddenSize])) ) ) ) @@ -3190,11 +3190,11 @@ partial interface MLGraphBuilder { builder.add( builder.matmul( input, - builder.transpose(builder.slice(weight, [2 * hiddenSize, 0], [hiddenSize, input_size])) + builder.transpose(builder.slice(weight, [2 * hiddenSize, 0], [hiddenSize, inputSize])) ), builder.matmul( builder.mul(r, hiddenState), - builder.transpose(builder.slice(recurrentWeight, [2 * hiddenSize, 0], [hiddenSize, hidden_size])) + builder.transpose(builder.slice(recurrentWeight, [2 * hiddenSize, 0], [hiddenSize, hiddenSize])) ) ) ) @@ -3705,23 +3705,23 @@ partial interface MLGraphBuilder {
: bias :: - An {{MLOperand}}. Specifies the 2-D input bias tensor of shape [num_directions, 4 * hidden_size]. The ordering of the bias vectors in the second dimension of the tensor shape is specified according to {{MLLstmOptions/layout}}. + An {{MLOperand}}. Specifies the 2-D input bias tensor of shape [numDirections, 4 * hiddenSize]. The ordering of the bias vectors in the second dimension of the tensor shape is specified according to {{MLLstmOptions/layout}}. : recurrentBias :: - An {{MLOperand}}. Specifies the 2-D recurrent bias tensor of shape [num_directions, 4 * hidden_size]. The ordering of the bias vectors in the first dimension of the tensor shape is specified according to {{MLLstmOptions/layout}}. + An {{MLOperand}}. Specifies the 2-D recurrent bias tensor of shape [numDirections, 4 * hiddenSize]. The ordering of the bias vectors in the first dimension of the tensor shape is specified according to {{MLLstmOptions/layout}}. : peepholeWeight :: - An {{MLOperand}}. Specifies the 2-D weight tensor for peepholes of shape [num_directions, 3 * hidden_size]. The pack ordering of the weight vectors is for the `input (i)`, `output (o)`, and `forget (f)` gate, respectively. + An {{MLOperand}}. Specifies the 2-D weight tensor for peepholes of shape [numDirections, 3 * hiddenSize]. The pack ordering of the weight vectors is for the `input (i)`, `output (o)`, and `forget (f)` gate, respectively. : initialHiddenState :: - An {{MLOperand}}. Specifies the 3-D initial hidden state tensor of shape [num_directions, batch_size, hidden_size]. When not specified, implementations SHOULD use a tensor filled with zero. + An {{MLOperand}}. Specifies the 3-D initial hidden state tensor of shape [numDirections, batchSize, hiddenSize]. When not specified, implementations SHOULD use a tensor filled with zero. : initialCellState :: - An {{MLOperand}}. Specifies the 3-D initial hidden state tensor of shape [num_directions, batch_size, hidden_size]. When not specified, implementations SHOULD use a tensor filled with zero. + An {{MLOperand}}. Specifies the 3-D initial hidden state tensor of shape [numDirections, batchSize, hiddenSize]. When not specified, implementations SHOULD use a tensor filled with zero. : returnSequence :: @@ -3742,14 +3742,14 @@ partial interface MLGraphBuilder {
**Arguments:** - - *input*: an {{MLOperand}}. The input 3-D tensor of shape [steps, batch_size, input_size]. - - *weight*: an {{MLOperand}}. The 3-D input weight tensor of shape [num_directions, 4 * hidden_size, input_size]. The ordering of the weight vectors in the second dimension of the tensor shape is specified according to the |options|.{{MLLstmOptions/layout}}. - - *recurrentWeight*: an {{MLOperand}}. The 3-D recurrent weight tensor of shape [num_directions, 4 * hidden_size, hidden_size]. The ordering of the weight vectors in the second dimension of the tensor shape is specified according to the |options|.{{MLLstmOptions/layout}} argument. + - *input*: an {{MLOperand}}. The input 3-D tensor of shape [steps, batchSize, inputSize]. + - *weight*: an {{MLOperand}}. The 3-D input weight tensor of shape [numDirections, 4 * hiddenSize, inputSize]. The ordering of the weight vectors in the second dimension of the tensor shape is specified according to the |options|.{{MLLstmOptions/layout}}. + - *recurrentWeight*: an {{MLOperand}}. The 3-D recurrent weight tensor of shape [numDirections, 4 * hiddenSize, hiddenSize]. The ordering of the weight vectors in the second dimension of the tensor shape is specified according to the |options|.{{MLLstmOptions/layout}} argument. - *steps*: an {{unsigned long}} scalar. The number of time steps in the recurrent network. The value must be greater than 0. - *hiddenSize*: an {{unsigned long}} scalar. The value of the third dimension of the cell output tensor shape. It indicates the number of features in the hidden state. - *options*: an optional {{MLLstmOptions}}. The optional parameters of the operation. - **Returns:** a sequence of {{MLOperand}}. The first element of the sequence is a 3-D tensor of shape [num_directions, batch_size, hidden_size], the output hidden state from the last time step of the network. The second element is a 3-D tensor of shape [num_directions, batch_size, hidden_size], the output cell state from the last time step of the network. Additionally, if |options|.{{MLLstmOptions/returnSequence}} is set to true, the third element is the 4-D output tensor of shape [steps, num_directions, batch_size, hidden_size] containing every output from each time step in the temporal sequence. + **Returns:** a sequence of {{MLOperand}}. The first element of the sequence is a 3-D tensor of shape [numDirections, batchSize, hiddenSize], the output hidden state from the last time step of the network. The second element is a 3-D tensor of shape [numDirections, batchSize, hiddenSize], the output cell state from the last time step of the network. Additionally, if |options|.{{MLLstmOptions/returnSequence}} is set to true, the third element is the 4-D output tensor of shape [steps, numDirections, batchSize, hiddenSize] containing every output from each time step in the temporal sequence.
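<div class="note">
    A non-normative shape illustration for lstm(), with hypothetical dimension values and an assumed `builder`; with the default *"forward"* direction, numDirections is 1.
    <pre highlight="js">
    const steps = 16, batchSize = 1, inputSize = 64, hiddenSize = 128;
    const input = builder.input(
        'input', {type: 'float32', dimensions: [steps, batchSize, inputSize]});
    const weight = builder.input(
        'weight', {type: 'float32', dimensions: [1, 4 * hiddenSize, inputSize]});
    const recurrentWeight = builder.input(
        'recurrentWeight', {type: 'float32', dimensions: [1, 4 * hiddenSize, hiddenSize]});

    // Both returned operands have shape [1, batchSize, hiddenSize]. With
    // {returnSequence: true} a third operand of shape
    // [steps, 1, batchSize, hiddenSize] would also be returned.
    const [hiddenState, cellState] =
        builder.lstm(input, weight, recurrentWeight, steps, hiddenSize);
    </pre>
</div>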
@@ -3759,39 +3759,39 @@ partial interface MLGraphBuilder {
1. If |options|.{{MLLstmOptions/direction}} is not one of {{MLRecurrentNetworkDirection}}, then [=exception/throw=] a {{TypeError}}. - 1. Let |num_directions| be `1` if |options|.{{MLLstmOptions/direction}} is `"forward"`, or otherwise let it be `2`. + 1. Let |numDirections| be `1` if |options|.{{MLLstmOptions/direction}} is `"forward"`, or otherwise let it be `2`. 1. [=Assert=]: the type of |input|, |weight| and |recurrentWeight| is {{MLOperand}}.
The shape of |input|, |weight| or |recurrentWeight| could be also checked here.
1. If |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not equal to |steps|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. Let |batch_size| be |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1]. + 1. Let |batchSize| be |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1]. 1. If |options|.{{MLLstmOptions/bias}} [=map/exists=]: 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `2`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If |options|.{{MLLstmOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |num_directions|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |options|.{{MLLstmOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |numDirections|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not 4 * |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/recurrentBias}} [=map/exists=]: 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `2`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If |options|.{{MLLstmOptions/recurrentBias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |num_directions|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |options|.{{MLLstmOptions/recurrentBias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |numDirections|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/recurrentBias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not 4 * |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/peepholeWeight}} [=map/exists=]: 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `2`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If |options|.{{MLLstmOptions/peepholeWeight}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |num_directions|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |options|.{{MLLstmOptions/peepholeWeight}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |numDirections|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/peepholeWeight}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not 4 * |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/initialHiddenState}} [=map/exists=]: 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `3`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If |options|.{{MLLstmOptions/initialHiddenState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |num_directions|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If |options|.{{MLLstmOptions/initialHiddenState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not equal to |batch_size|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. 
If |options|.{{MLLstmOptions/initialHiddenState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |numDirections|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |options|.{{MLLstmOptions/initialHiddenState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not equal to |batchSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/initialHiddenState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[2] is not |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/initialCellState}} [=map/exists=]: 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `3`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If |options|.{{MLLstmOptions/initialCellState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |num_directions|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If |options|.{{MLLstmOptions/initialCellState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not equal to |batch_size|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |options|.{{MLLstmOptions/initialCellState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |numDirections|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |options|.{{MLLstmOptions/initialCellState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not equal to |batchSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/initialCellState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[2] is not |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/layout}} is not one of {{MLLstmWeightLayout}}, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLLstmOptions/activations}} [=map/exists=]: @@ -3799,11 +3799,11 @@ partial interface MLGraphBuilder { 1. [=Assert=]: the type of its elements is {{MLActivation}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |desc| a new {{MLOperandDescriptor}}. - 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |num_directions|, |batch_size|, |hiddenSize| ]. + 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |numDirections|, |batchSize|, |hiddenSize| ]. 1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. 1. Let |output0| be the result of creating an MLOperand given [=this=] and |desc|. 1. Let |output1| be the result of creating an MLOperand given [=this=] and |desc|. - 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |steps|, |num_directions|, |batch_size|, |hiddenSize| ]. + 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |steps|, |numDirections|, |batchSize|, |hiddenSize| ]. 1. If |options|.{{MLLstmOptions/returnSequence}} is set to true: 1. Let |output2| be the result of creating an MLOperand given [=this=] and |desc|. 1. Let |output| be the array [ |output0|, |output1|, |output2| ]. 
@@ -3849,13 +3849,13 @@ partial interface MLGraphBuilder { let currentPeepholeWeight = []; for (let dir = 0; dir < numDirections; ++dir) { - currentWeight.push(builder.squeeze(builder.slice(weight, [dir, 0, 0], [1, 4 * hidden_size, input_size]), { axes: [0] })); - currentRecurrentWeight.push(builder.squeeze(builder.slice(recurrentWeight, [dir, 0, 0], [1, 4 * hidden_size, hidden_size]), { axes: [0] })); - currentBias.push(options.bias ? (builder.squeeze(builder.slice(options.bias, [dir, 0], [1, 4 * hidden_size]), { axes: [0] })) : null); + currentWeight.push(builder.squeeze(builder.slice(weight, [dir, 0, 0], [1, 4 * hiddenSize, inputSize]), { axes: [0] })); + currentRecurrentWeight.push(builder.squeeze(builder.slice(recurrentWeight, [dir, 0, 0], [1, 4 * hiddenSize, hiddenSize]), { axes: [0] })); + currentBias.push(options.bias ? (builder.squeeze(builder.slice(options.bias, [dir, 0], [1, 4 * hiddenSize]), { axes: [0] })) : null); currentRecurrentBias.push(options.recurrentBias ? - (builder.squeeze(builder.slice(options.recurrentBias, [dir, 0], [1, 4 * hidden_size]), { axes: [0] })) : null); + (builder.squeeze(builder.slice(options.recurrentBias, [dir, 0], [1, 4 * hiddenSize]), { axes: [0] })) : null); currentPeepholeWeight.push(options.peepholeWeight ? - (builder.squeeze(builder.slice(options.peepholeWeight, [dir, 0], [1, 3 * hidden_size]), { axes: [0] })) : null); + (builder.squeeze(builder.slice(options.peepholeWeight, [dir, 0], [1, 3 * hiddenSize]), { axes: [0] })) : null); } for (let step = 0; step < steps; ++step) { @@ -3865,13 +3865,13 @@ partial interface MLGraphBuilder { let nextCell = null; for (let dir = 0; dir < numDirections; ++dir) { - currentHidden.push(builder.squeeze(builder.slice(hiddenState, [dir, 0, 0], [1, batch_size, hidden_size]), { axes: [0] })); - currentCell.push(builder.squeeze(builder.slice(cellState, [dir, 0, 0], [1, batch_size, hidden_size]), { axes: [0] })); + currentHidden.push(builder.squeeze(builder.slice(hiddenState, [dir, 0, 0], [1, batchSize, hiddenSize]), { axes: [0] })); + currentCell.push(builder.squeeze(builder.slice(cellState, [dir, 0, 0], [1, batchSize, hiddenSize]), { axes: [0] })); } for (let dir = 0; dir < numDirections; ++dir) { let slice = (dir == 1 || options.direction == "backward" ? steps - step - 1 : step); - let currentInput = builder.squeeze(builder.slice(input, [slice, 0, 0], [1, batch_size, input_size]), { axes: [0] }); + let currentInput = builder.squeeze(builder.slice(input, [slice, 0, 0], [1, batchSize, inputSize]), { axes: [0] }); let results = builder.lstmCell( currentInput, currentWeight[dir], currentRecurrentWeight[dir], @@ -3923,15 +3923,15 @@ partial interface MLGraphBuilder {
: bias :: - An {{MLOperand}}. The 1-D input bias tensor of shape [4 * hidden_size]. The ordering of the bias vectors in the first dimension of the tensor shape is specified according to the {{MLLstmCellOptions/layout}} argument. + An {{MLOperand}}. The 1-D input bias tensor of shape [4 * hiddenSize]. The ordering of the bias vectors in the first dimension of the tensor shape is specified according to the {{MLLstmCellOptions/layout}} argument. : recurrentBias :: - An {{MLOperand}}. The 1-D recurrent bias tensor of shape [4 * hidden_size]. The ordering of the bias vectors in the first dimension of the tensor shape is specified according to the {{MLLstmCellOptions/layout}} argument. + An {{MLOperand}}. The 1-D recurrent bias tensor of shape [4 * hiddenSize]. The ordering of the bias vectors in the first dimension of the tensor shape is specified according to the {{MLLstmCellOptions/layout}} argument. : peepholeWeight :: - An {{MLOperand}}. The 1-D weight tensor for peepholes of shape [3 * hidden_size]. The pack ordering of the weight vectors is for the `input (i)`, `output (o)`, and `forget (f)` gate, respectively. + An {{MLOperand}}. The 1-D weight tensor for peepholes of shape [3 * hiddenSize]. The pack ordering of the weight vectors is for the `input (i)`, `output (o)`, and `forget (f)` gate, respectively. : layout :: @@ -3944,15 +3944,15 @@ partial interface MLGraphBuilder {
**Arguments:** - - *input*: an {{MLOperand}}. The input 2-D tensor of shape [batch_size, input_size]. - - *weight*: an {{MLOperand}}. The 2-D input weight tensor of shape [4 * hidden_size, input_size]. The ordering of the weight vectors in the first dimension of the tensor shape is specified according to the *options.layout* argument. - - *recurrentWeight*: an {{MLOperand}}. The 2-D recurrent weight tensor of shape [4 * hidden_size, hidden_size]. The ordering of the weight vectors in the first dimension of the tensor shape is specified according to the *options.layout* argument. - - *hiddenState*: an {{MLOperand}}. The 2-D input hidden state tensor of shape [batch_size, hidden_size]. - - *cellState*: an {{MLOperand}}. The 2-D input cell state tensor of shape [batch_size, hidden_size]. + - *input*: an {{MLOperand}}. The input 2-D tensor of shape [batchSize, inputSize]. + - *weight*: an {{MLOperand}}. The 2-D input weight tensor of shape [4 * hiddenSize, inputSize]. The ordering of the weight vectors in the first dimension of the tensor shape is specified according to the *options.layout* argument. + - *recurrentWeight*: an {{MLOperand}}. The 2-D recurrent weight tensor of shape [4 * hiddenSize, hiddenSize]. The ordering of the weight vectors in the first dimension of the tensor shape is specified according to the *options.layout* argument. + - *hiddenState*: an {{MLOperand}}. The 2-D input hidden state tensor of shape [batchSize, hiddenSize]. + - *cellState*: an {{MLOperand}}. The 2-D input cell state tensor of shape [batchSize, hiddenSize]. - *hiddenSize*: an {{unsigned long}} scalar. The value of the second dimension of the output tensor shape. It indicates the number of features in the hidden state. - *options*: an optional {{MLLstmCellOptions}}. The optional parameters of the operation. - **Returns:** a sequence of {{MLOperand}}. The first element of the sequence is the output hidden state of the current time step of the recurrent network. The following element is the output cell state. Both elements are 2-D tensors of shape [batch_size, hidden_size]. + **Returns:** a sequence of {{MLOperand}}. The first element of the sequence is the output hidden state of the current time step of the recurrent network. The following element is the output cell state. Both elements are 2-D tensors of shape [batchSize, hiddenSize].
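<div class="note">
    A corresponding non-normative single-step sketch for lstmCell(), again with hypothetical dimension values and an assumed `builder`:
    <pre highlight="js">
    const batchSize = 1, inputSize = 64, hiddenSize = 128;
    const input = builder.input(
        'input', {type: 'float32', dimensions: [batchSize, inputSize]});
    const weight = builder.input(
        'weight', {type: 'float32', dimensions: [4 * hiddenSize, inputSize]});
    const recurrentWeight = builder.input(
        'recurrentWeight', {type: 'float32', dimensions: [4 * hiddenSize, hiddenSize]});
    const hiddenState = builder.input(
        'hiddenState', {type: 'float32', dimensions: [batchSize, hiddenSize]});
    const cellState = builder.input(
        'cellState', {type: 'float32', dimensions: [batchSize, hiddenSize]});

    // Both returned operands have shape [batchSize, hiddenSize].
    const [nextHiddenState, nextCellState] = builder.lstmCell(
        input, weight, recurrentWeight, hiddenState, cellState, hiddenSize);
    </pre>
</div>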
@@ -3963,7 +3963,7 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input|, |weight|, |recurrentWeight|, |hiddenState| and |cellState| is {{MLOperand}}. 1. If the [=rank=] of |input|, |weight|, |recurrentWeight|, |hiddenState| or |cellState| is not `2`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. Let |batch_size| be |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0]. + 1. Let |batchSize| be |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0]. 1. If |options|.{{MLLstmCellOptions/bias}} [=map/exists=]: 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `1`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. @@ -3981,7 +3981,7 @@ partial interface MLGraphBuilder { 1. If it is not an array of size `3`, then [=exception/throw=] a {{TypeError}}. 1. [=Assert=]: the type of its elements is {{MLActivation}}. 1. Let |desc| a new {{MLOperandDescriptor}}. - 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |batch_size|, |hiddenSize| ]. + 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |batchSize|, |hiddenSize| ]. 1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output0| be the result of creating an MLOperand given [=this=] and |desc|. @@ -4021,11 +4021,11 @@ partial interface MLGraphBuilder { builder.add( builder.matmul( input, - builder.transpose(builder.slice(weight, [0, 0], [hiddenSize, input_size])) + builder.transpose(builder.slice(weight, [0, 0], [hiddenSize, inputSize])) ), builder.matmul( hiddenState, - builder.transpose(builder.slice(recurrentWeight, [0, 0], [hiddenSize, hidden_size])) + builder.transpose(builder.slice(recurrentWeight, [0, 0], [hiddenSize, hiddenSize])) ) ) ) @@ -4047,11 +4047,11 @@ partial interface MLGraphBuilder { builder.add( builder.matmul( input, - builder.transpose(builder.slice(weight, [2 * hiddenSize, 0], [hiddenSize, input_size])) + builder.transpose(builder.slice(weight, [2 * hiddenSize, 0], [hiddenSize, inputSize])) ), builder.matmul( hiddenState, - builder.transpose(builder.slice(recurrentWeight, [2 * hiddenSize, 0], [hiddenSize, hidden_size])) + builder.transpose(builder.slice(recurrentWeight, [2 * hiddenSize, 0], [hiddenSize, hiddenSize])) ) ) ) @@ -4068,11 +4068,11 @@ partial interface MLGraphBuilder { builder.add( builder.matmul( input, - builder.transpose(builder.slice(weight, [3 * hiddenSize, 0], [hiddenSize, input_size])) + builder.transpose(builder.slice(weight, [3 * hiddenSize, 0], [hiddenSize, inputSize])) ), builder.matmul( hiddenState, - builder.transpose(builder.slice(recurrentWeight, [3 * hiddenSize, 0], [hiddenSize, hidden_size])) + builder.transpose(builder.slice(recurrentWeight, [3 * hiddenSize, 0], [hiddenSize, hiddenSize])) ) ) ) @@ -4093,11 +4093,11 @@ partial interface MLGraphBuilder { builder.add( builder.matmul( input, - builder.transpose(builder.slice(weight, [hiddenSize, 0], [hiddenSize, input_size])) + builder.transpose(builder.slice(weight, [hiddenSize, 0], [hiddenSize, inputSize])) ), builder.matmul( hiddenState, - builder.transpose(builder.slice(recurrentWeight, [hiddenSize, 0], [hiddenSize, hidden_size])) + builder.transpose(builder.slice(recurrentWeight, [hiddenSize, 0], [hiddenSize, hiddenSize])) ) ) ) @@ -4366,25 +4366,25 @@ partial interface MLGraphBuilder {
: windowDimensions :: - A sequence of {{unsigned long}} of length 2: [window_height, window_width]. + A sequence of {{unsigned long}} of length 2: [windowHeight, windowWidth]. Specifies the dimensions of the sliding window. The default value for the window dimensions are the height and width dimensions of the input shape. : padding :: - A sequence of {{unsigned long}} of length 4: [beginning_height, ending_height, beginning_width, ending_width]. + A sequence of {{unsigned long}} of length 4: [beginningHeight, endingHeight, beginningWidth, endingWidth]. Specifies the additional rows and columns added to the beginning and ending of each spatial dimension of the convolution input. The default value is [0,0,0,0]. : strides :: - A sequence of {{unsigned long}} of length 2: [stride_height, stride_width]. + A sequence of {{unsigned long}} of length 2: [strideHeight, strideWidth]. Specifies the stride of the sliding window for each spatial dimension of the convolution input. The default value is [1,1]. : dilations :: - A sequence of {{unsigned long}} of length 2: [dilation_height, dilation_width]. Specifies the dilation factor for each spatial dimension applied on the convolution filter (kernel). + A sequence of {{unsigned long}} of length 2: [dilationHeight, dilationWidth]. Specifies the dilation factor for each spatial dimension applied on the convolution filter (kernel). The default value is [1,1]. : autoPad @@ -4403,11 +4403,11 @@ partial interface MLGraphBuilder { An {{MLInputOperandLayout}} [=string=]. Specifies the layout format of the input and output tensor as follows: - **"nchw"** - - input tensor: *[batches, input_channels, height, width]* - - output tensor: *[batches, output_channels, height, width]* + - input tensor: *[batches, inputChannels, height, width]* + - output tensor: *[batches, outputChannels, height, width]* - **"nhwc"**: - - input tensor: *[batches, height, width, input_channels]* - - output tensor: *[batches, height, width, output_channels]* + - input tensor: *[batches, height, width, inputChannels]* + - output tensor: *[batches, height, width, outputChannels]* The default value is *"nchw"*. : roundingType @@ -4799,13 +4799,13 @@ partial interface MLGraphBuilder { : scales :: A sequence of {{float}} of length 2. - Specifies the scaling factor in each spatial dimensions of the input: [scale_height, scale_width]. + Specifies the scaling factor in each spatial dimensions of the input: [scaleHeight, scaleWidth]. The default value is [1.0, 1.0]. : sizes :: A sequence of {{unsigned long}} of length 2. - Specifies the target sizes for each spatial dimensions of the input: [size_height, size_width]. When the target sizes are specified, the {{MLResample2dOptions/scales}} argument is ignored, since the scaling factor values are derived from the target sizes of each spatial dimension of the input. + Specifies the target sizes for each spatial dimensions of the input: [sizeHeight, sizeWidth]. When the target sizes are specified, the {{MLResample2dOptions/scales}} argument is ignored, since the scaling factor values are derived from the target sizes of each spatial dimension of the input. 
: axes :: From 2c4f00a76b307ab1ba5fd9eaf721458e708bb59e Mon Sep 17 00:00:00 2001 From: Dominique Hazael-Massieux Date: Wed, 23 Aug 2023 10:42:23 +0200 Subject: [PATCH 100/112] Remove unneeded hardcoded references to specs and definitions --- index.bs | 40 +++------------------------------------- 1 file changed, 3 insertions(+), 37 deletions(-) diff --git a/index.bs b/index.bs index d2253baa..b2c9b6e2 100644 --- a/index.bs +++ b/index.bs @@ -39,49 +39,15 @@ Status Text:

-urlPrefix: https://gpuweb.github.io/gpuweb/; spec: WEBGPU
-    type: interface
-        text: GPUDevice; url: gpu-device
-        text: GPUBuffer; url: buffer-interface
-        for: GPUBuffer; text: size; url: dom-gpubuffer-size
-        text: GPUTexture; url: texture-interface
-        text: GPUQueue; url: queues
-        text: GPUCommandBuffer; url: command-buffers
-        text: GPUCommandBufferDescriptor; url: dictdef-gpucommandbufferdescriptor
 urlPrefix: https://tc39.es/ecma262/; spec: ECMA-262
     type: dfn
         text: element size; url: table-the-typedarray-constructors
         text: element type; url: table-the-typedarray-constructors
         text: view constructor; url: table-the-typedarray-constructors
-        text: Construct; url: sec-construct
-urlPrefix: https://webidl.spec.whatwg.org/; spec: WEBIDL
-    type: interface
-        text: Promise; url: idl-promise
-        text: nullable; url: idl-nullable-type
-    type: dfn
-        text: underlying buffer; url: buffersource-underlying-buffer
-        text: buffer byte length; url: buffersource-byte-length
 urlPrefix: https://tc39.es/proposal-float16array/; spec: float16array
     type: interface
         text: Float16Array; url: sec-float16array
 
-
-{
-    "WEBGPU": {
-        "authors": [
-            "Dzmitry Malyshau",
-            "Kai Ninomiya"
-        ],
-        "href": "https://gpuweb.github.io/gpuweb/",
-        "title": "WebGPU",
-        "status": "ED",
-        "publisher": "W3C",
-        "deliveredBy": [
-            "https://www.w3.org/2020/gpu/"
-        ]
-    }
-}
-
@@ -4880,7 +4846,7 @@ partial interface MLGraphBuilder {
**Arguments:** - *input*: an {{MLOperand}}. The input tensor. - - *newShape*: a sequence of {{nullable}} {{unsigned long}}. The shape of the output tensor. + - *newShape*: a sequence of [=nullable type|nullable=] {{unsigned long}}. The shape of the output tensor. The number of elements implied by *newShape* must be the same as the number of elements in the input tensor. Only one component of *newShape* can be the special value of `null`. The size of the dimension From 7831c9611a71cbf4bb64b8d4d2b4dfda7f7c7dde Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Wed, 23 Aug 2023 23:17:41 +0300 Subject: [PATCH 101/112] Fix review comments for matmul(), gru() and gruCell() Signed-off-by: Zoltan Kis --- index.bs | 28 ++++++++++++++++------------ 1 file changed, 16 insertions(+), 12 deletions(-) diff --git a/index.bs b/index.bs index b2c9b6e2..539d2ad5 100644 --- a/index.bs +++ b/index.bs @@ -2869,7 +2869,7 @@ partial interface MLGraphBuilder { : layout :: - An {{MLGruWeightLayout}}. The ordering of the weight and bias vectors for the internal gates of GRU, specifically the `update (z)`, `reset (r)`, and `new (n)` gate, as indicated in the second dimension of the weight and bias tensor shape. When not specified, the default layout is `"zrn"`. + An {{MLGruWeightLayout}}. The ordering of the weight and bias vectors for the internal gates of GRU, specifically the `update (z)`, `reset (r)`, and `new (n)` gate, as indicated in the second dimension of the weight and bias tensor shape. When not specified, the default layout is `"zrn"`.bias : activations :: @@ -2895,21 +2895,20 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input|, |weight| and |recurrentWeight| is {{MLOperand}}. - 1. If the [=rank=] of |input| or |weight| is not `3`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If the [=rank=] of |weight| or |recurrentWeight| is not `2`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If the [=rank=] of |input| or |weight| or |recurrentWeight| is not `3`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLGruOptions/bias}} [=map/exists=]. 1. [=Assert=]: its type is {{MLOperand}}. - 1. If its rank is not `2`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |options|.{{MLGruOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not equal to `3 * hiddenSize`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLGruOptions/recurrentBias}} [=map/exists=]. 1. [=Assert=]: its type is {{MLOperand}}. - 1. If its rank is not `2`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |options|.{{MLGruOptions/recurrentBias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not equal to `3 * hiddenSize`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLGruOptions/initialHiddenState}} [=map/exists=]. 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `3`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLGruOptions/direction}} is not one of {{MLRecurrentNetworkDirection}}, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLGruOptions/layout}} is not one of {{MLGruWeightLayout}}, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLGruOptions/activations}} [=map/exists=] and is not an array of {{MLActivation}} objects with size `2`, then [=exception/throw=] a {{TypeError}}. - 1. If |steps| is not a [=number=] or it is `0`, then [=exception/throw=] a {{TypeError}}. + 1. If |steps| is not equal to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0], then [=exception/throw=] a {{TypeError}}. 1. Let |output| be an empty sequence of {{MLOperand}} objects. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Make a request to the underlying platform to: @@ -3046,14 +3045,15 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input|, |weight| and |recurrentWeight| is {{MLOperand}}. - 1. If the [=rank=] of |input| or |weight| is not `3`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If the [=rank=] of |weight| or |recurrentWeight| is not `2`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If the [=rank=] of |input| or |weight| or |recurrentWeight| or |hiddenState| is not `2`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |weight|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not equal to 3 * |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |recurrentWeight|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not equal to `3 * hiddenSize`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLGruOptions/bias}} [=map/exists=]: 1. [=Assert=]: its type is {{MLOperand}}. - 1. If its rank is not `1`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If its rank is not equal to 3 * |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLGruOptions/recurrentBias}} [=map/exists=]: 1. [=Assert=]: its type is {{MLOperand}}. - 1. If its rank is not `1`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If its rank is not equal to 3 * |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLGruOptions/layout}} is not one of {{MLGruWeightLayout}}, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLGruOptions/activations}} [=map/exists=] and is not an array of {{MLActivation}} objects with size `2`, then [=exception/throw=] a {{TypeError}}. 1. Let |desc| a new {{MLOperandDescriptor}}. @@ -4115,8 +4115,11 @@ partial interface MLGraphBuilder { 1. Let |shapeA| be |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeA| the [=list/size=] of |shapeA|. 1. Let |shapeB| be |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeB| the [=list/size=] of |shapeB|. 1. If |sizeA| and |sizeB| is `1`, return `« 1 »`. - 1. If | sizeA| is `1` and |sizeB| is not, then insert `1` in the front of |shapeA| to become [ 1 | |shapeA| ] and let |sizeA| be `2`. - 1. If | sizeB| is `1` and |sizeA| is not, then insert `1` in the front of |shapeB| to become [ 1 | |shapeB| ] and let |sizeB| be `2`. + 1. If |sizeA| is `1` and |sizeB| is not, then insert `1` in the front of |shapeA| to become [ 1 | |shapeA| ] and let |sizeA| be `2`. + 1. If |shapeA|[0] is not equal to |shapeB|[|sizeB| - 2], then [=exception/throw=] an "{{OperationError}}" {{DOMException}}. + 1. If |sizeB| is `1` and |sizeA| is not, then insert `1` in the front of |shapeB| to become [ 1 | |shapeB| ] and let |sizeB| be `2`. + 1. If |shapeA|[|sizeA| - 1] is not equal to |shapeB|[0], then [=exception/throw=] an "{{OperationError}}" {{DOMException}}. + 1. If |shapeA| is not equal to |shapeB|, then [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |shape| be an array whose size |size| is the maximum of |sizeA| and |sizeB|. 1. [=map/For each=] |index| in [=the range=] 0 to |size|, exclusive: 1. Set |shape|[|index|] to the maximum of |shapeA|[|index|] and |shapeB|[|index|]. @@ -4133,6 +4136,7 @@ partial interface MLGraphBuilder { 1. [=Assert=]: the type of |a| and |b| is {{MLOperand}}. 1. Let |desc| a new {{MLOperandDescriptor}}. 1. 
Set |desc|.{{MLOperandDescriptor/dimensions}} to the result of invoking the calculate matmul output sizes steps given |a| and |b|. + 1. If that throws an error, re-[=exception/throw=] the error. 1. Set |desc|.{{MLOperandDescriptor/type}} to |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of creating an MLOperand given [=this=] and |desc|. From 3e4e9d0120a6c031126349a05b538f17e23c79b1 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Wed, 23 Aug 2023 23:38:24 +0300 Subject: [PATCH 102/112] Remove more unnecessary type checks for #446. Fix linting errors. Signed-off-by: Zoltan Kis --- index.bs | 25 ++++++------------------- 1 file changed, 6 insertions(+), 19 deletions(-) diff --git a/index.bs b/index.bs index 539d2ad5..f79dbc86 100644 --- a/index.bs +++ b/index.bs @@ -996,7 +996,6 @@ The {{MLOperand}} objects are created by the methods of {{MLGraphBuilder}}, inte To check dimensions given |dimensions| and |type|, run the following steps:
- 1. If |dimensions| is not an array of positive numbers, return `false`; 1. If the [=list/size=] of |dimensions| is 0, return `false`. 1. If the [=list/size=] of |dimensions| is too large to be supported by the implementation, return `false`. 1. If any element of |dimensions| is not a positive number, or it is too large to be supported by the implementation given |type|, return `false`. @@ -1734,8 +1733,6 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods.
The permissions and context validity have been checked by [[#api-mlgraphbuilder-constructor]] steps.
- 1. If |value| is not a [=number=], then [=exception/throw=] a {{TypeError}}. - 1. Otherwise, if |type| is not one of {{MLOperandType}}, then [=exception/throw=] a {{TypeError}}. 1. Let |descriptor| be a new {{MLOperandDescriptor}}. 1. Set |descriptor|.{{MLOperandDescriptor/type}} to |type|. 1. Set |descriptor|.{{MLOperandDescriptor/dimensions}} to `undefined`. @@ -1809,7 +1806,7 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input|, |mean| and |variance| is {{MLOperand}}. - 1. If |options|.axis is not a number in [=the range=] 0 to the [=rank=] of |input|, exclusive, then [=exception/throw=] a {{TypeError}}. + 1. If |options|.axis is not in [=the range=] 0 to the [=rank=] of |input|, exclusive, then [=exception/throw=] a {{TypeError}}. 1. If the [=list/size=] of |mean|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not equal with |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|options|.{{MLBatchNormalizationOptions/axis}}], then [=exception/throw=] a {{TypeError}}. 1. If the [=list/size=] of |variance|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not equal with |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|options|.{{MLBatchNormalizationOptions/axis}}], then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLBatchNormalizationOptions/scale}} [=map/exists=] and its [=list/size=] is not equal with |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|options|.{{MLBatchNormalizationOptions/axis}}], then [=exception/throw=] a {{TypeError}}. @@ -1885,6 +1882,7 @@ partial interface MLGraphBuilder { } } +
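The following non-normative JavaScript sketch shows a call that satisfies the batchNormalization() validation steps above. The tensor shapes, axis and epsilon values are illustrative assumptions, not requirements of this changeset.

```js
// Assumes `builder` is an MLGraphBuilder created from a valid MLContext.
// The input is an "nchw" tensor with 3 channels; axis 1 is the channel dimension.
const input = builder.input('input', { type: 'float32', dimensions: [1, 3, 4, 4] });

// mean and variance are 1-D tensors whose size equals input.dimensions[axis] (= 3).
const mean = builder.constant(
    { type: 'float32', dimensions: [3] }, new Float32Array([0.1, 0.2, 0.3]));
const variance = builder.constant(
    { type: 'float32', dimensions: [3] }, new Float32Array([1.0, 1.1, 0.9]));

// scale and bias, when present, must have that same size.
const scale = builder.constant(
    { type: 'float32', dimensions: [3] }, new Float32Array([1, 1, 1]));
const bias = builder.constant(
    { type: 'float32', dimensions: [3] }, new Float32Array([0, 0, 0]));

const normalized = builder.batchNormalization(input, mean, variance,
    { scale, bias, axis: 1, epsilon: 1e-5 });
```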
@@ -2905,9 +2903,7 @@ partial interface MLGraphBuilder { 1. If |options|.{{MLGruOptions/initialHiddenState}} [=map/exists=]. 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `3`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If |options|.{{MLGruOptions/direction}} is not one of {{MLRecurrentNetworkDirection}}, then [=exception/throw=] a {{TypeError}}. - 1. If |options|.{{MLGruOptions/layout}} is not one of {{MLGruWeightLayout}}, then [=exception/throw=] a {{TypeError}}. - 1. If |options|.{{MLGruOptions/activations}} [=map/exists=] and is not an array of {{MLActivation}} objects with size `2`, then [=exception/throw=] a {{TypeError}}. + 1. If |options|.{{MLGruOptions/activations}} [=map/exists=] and its [=list/size=] is not 2, then [=exception/throw=] a {{TypeError}}. 1. If |steps| is not equal to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0], then [=exception/throw=] a {{TypeError}}. 1. Let |output| be an empty sequence of {{MLOperand}} objects. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. @@ -3054,8 +3050,7 @@ partial interface MLGraphBuilder { 1. If |options|.{{MLGruOptions/recurrentBias}} [=map/exists=]: 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not equal to 3 * |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If |options|.{{MLGruOptions/layout}} is not one of {{MLGruWeightLayout}}, then [=exception/throw=] a {{TypeError}}. - 1. If |options|.{{MLGruOptions/activations}} [=map/exists=] and is not an array of {{MLActivation}} objects with size `2`, then [=exception/throw=] a {{TypeError}}. + 1. If |options|.{{MLGruOptions/activations}} [=map/exists=] and its [=list/size=] is not 2, then [=exception/throw=] a {{TypeError}}. 1. Let |desc| a new {{MLOperandDescriptor}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |input|.{{MLOperandDescriptor/dimensions}}[0], |hiddenSize| ]. 1. Set |desc|.{{MLOperandDescriptor/type}} to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}. @@ -3348,7 +3343,6 @@ partial interface MLGraphBuilder { 1. Return |op|.
- ### The instanceNormalization() method ### {#api-mlgraphbuilder-instancenorm} Normalize the input features using [[Instance-Normalization]]. Unlike [[#api-mlgraphbuilder-batchnorm]] where the mean and variance values used in the calculation are previously computed across the batch dimension during the model training phase, the mean and variance values used in the calculation of an instance normalization are computed internally on the fly per input feature. @@ -3407,7 +3401,6 @@ The {{MLInstanceNormalizationOptions}} members are: 1. If the [=rank=] of |options|.{{MLInstanceNormalizationOptions/scale}} is not equal to the [=list/size=] of the channel dimension of |input|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. [=Assert=]: the type of |options|.{{MLInstanceNormalizationOptions/bias}} is {{MLOperand}}. 1. If the [=rank=] of |options|.{{MLInstanceNormalizationOptions/bias}} is not equal to the [=list/size=] of the channel dimension of |input|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. Otherwise if |options|.{{MLInstanceNormalizationOptions/layout}} is not one of {{MLInputOperandLayout}}, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: @@ -3724,7 +3717,6 @@ partial interface MLGraphBuilder { The lstm(|input|, |weight|, |recurrentWeight|, |steps|, |hiddenSize|, |options|) method steps are:
- 1. If |options|.{{MLLstmOptions/direction}} is not one of {{MLRecurrentNetworkDirection}}, then [=exception/throw=] a {{TypeError}}. 1. Let |numDirections| be `1` if |options|.{{MLLstmOptions/direction}} is `"forward"`, or otherwise let it be `2`. 1. [=Assert=]: the type of |input|, |weight| and |recurrentWeight| is {{MLOperand}}.
@@ -3759,9 +3751,8 @@ partial interface MLGraphBuilder { 1. If |options|.{{MLLstmOptions/initialCellState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |numDirections|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/initialCellState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not equal to |batchSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/initialCellState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[2] is not |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If |options|.{{MLLstmOptions/layout}} is not one of {{MLLstmWeightLayout}}, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLLstmOptions/activations}} [=map/exists=]: - 1. If it is not an array of size `3`, then [=exception/throw=] a {{TypeError}}. + 1. If its [=list/size=] is not 3, then [=exception/throw=] a {{TypeError}}. 1. [=Assert=]: the type of its elements is {{MLActivation}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |desc| a new {{MLOperandDescriptor}}. @@ -3942,9 +3933,8 @@ partial interface MLGraphBuilder { 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not `1`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmCellOptions/peepholeWeight}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not 3 * |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If |options|.{{MLLstmCellOptions/layout}} is not one of {{MLLstmWeightLayout}}, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLLstmCellOptions/activations}} [=map/exists=]: - 1. If it is not an array of size `3`, then [=exception/throw=] a {{TypeError}}. + 1. If its [=list/size=] is not 3, then [=exception/throw=] a {{TypeError}}. 1. [=Assert=]: the type of its elements is {{MLActivation}}. 1. Let |desc| a new {{MLOperandDescriptor}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to [ |batchSize|, |hiddenSize| ]. @@ -4221,7 +4211,6 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input| is {{MLOperand}}. - 1. If |options|.{{MLPadOptions/mode}} is not one of {{MLPaddingMode}}, then [=exception/throw=] a {{TypeError}}. 1. Let |desc| be a copy of |input|.{{MLOperand/[[descriptor]]}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to the result of invoking the calculate padding output sizes steps given |input|, |beginningPadding| and |endingPadding|. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. @@ -4790,7 +4779,6 @@ partial interface MLGraphBuilder { To check resample options given |options|, run the following steps:
- 1. If |options|.{{MLResample2dOptions/mode}} [=map/exists=], and if its value is not one of `"nearest-neighbor"` or `"linear"`, return `false`. 1. If |options|.{{MLResample2dOptions/scales}} does not [=map/exist=], set it to to `« 1.0, 1.0 »`. 1. Otherwise, if any of its values is not greater than `0`, return `false`. 1. If |options|.{{MLResample2dOptions/sizes}} [=map/exists=], and if its size is not `2`, or if any of its values is not greater than `0`, return `false`. @@ -5289,7 +5277,6 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input| is {{MLOperand}}. - 1. If |splits| is not a non-zero {{unsigned long}} or a sequence of {{unsigned long}}, then [=exception/throw=] a {{TypeError}}. 1. If |splits| is an {{unsigned long}}, and |input|.{{MLOperandDescriptor/dimensions}}[|options|.{{MLSplitOptions/axis}}] % |splits| is not 0, then [=exception/throw=] a {{TypeError}}. 1. If |splits| is a sequence of {{unsigned long}}, and the sum of its elements is not equal to |input|.{{MLOperandDescriptor/dimensions}}[|options|.{{MLSplitOptions/axis}}], then [=exception/throw=] a {{TypeError}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. From 4ca7e8eb0d29d9fb2a0a0f7e3b88123fa202d16e Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Wed, 23 Aug 2023 23:48:13 +0300 Subject: [PATCH 103/112] Remove code style from true, false and scalars Signed-off-by: Zoltan Kis --- index.bs | 190 +++++++++++++++++++++++++++---------------------------- 1 file changed, 95 insertions(+), 95 deletions(-) diff --git a/index.bs b/index.bs index f79dbc86..7ed95ff2 100644 --- a/index.bs +++ b/index.bs @@ -803,7 +803,7 @@ Its default allowlist is 'self'. 1. Let |promise| be [=a new promise=]. 1. Return |promise| and run the following steps [=in parallel=]. 1. Let |context| be the result of [=creating a context=] given |options|. - 1. If validating MLContext given |context| returns `false`, [=reject=] |promise| with a "{{NotSupportedError}}" {{DOMException}}. + 1. If validating MLContext given |context| returns false, [=reject=] |promise| with a "{{NotSupportedError}}" {{DOMException}}. 1. [=Resolve=] |promise| with |context|.
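As an illustration of the asynchronous context creation steps above, a non-normative sketch follows; the particular option values are arbitrary, and the promise rejects with "NotSupportedError" when the requested configuration cannot be served.

```js
// Assumes the document is allowed to use the "webnn" feature.
async function getContext() {
  const context = await navigator.ml.createContext({
    deviceType: 'gpu',            // "cpu" or "gpu"
    powerPreference: 'low-power'  // "default", "high-performance" or "low-power"
  });
  return context;
}
```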
@@ -817,7 +817,7 @@ Its default allowlist is 'self'. 1. Let |promise| be [=a new promise=]. 1. Return |promise| and run the following steps [=in parallel=]. 1. Let |context| be the result of [=creating a context=] given |gpuDevice|. - 1. If validating MLContext given |context| returns `false`, [=reject=] |promise| with a "{{NotSupportedError}}" {{DOMException}}. + 1. If validating MLContext given |context| returns false, [=reject=] |promise| with a "{{NotSupportedError}}" {{DOMException}}. 1. [=Resolve=] |promise| with |context|.
@@ -831,7 +831,7 @@ Its default allowlist is 'self'.
1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, then [=exception/throw=] a "{{SecurityError}}" {{DOMException}}. 1. Let |context| be the result [=creating a context=] |options|. - 1. If validating MLContext given |context| return `false`, then [=exception/throw=] a "{{NotSupportedError}}" {{DOMException}}. + 1. If validating MLContext given |context| return false, then [=exception/throw=] a "{{NotSupportedError}}" {{DOMException}}. 1. Return |context|.
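A non-normative sketch of the synchronous variant covered by the steps above; in this revision the sync methods are intended for dedicated workers, and the device type shown is an illustrative assumption.

```js
// worker.js
// Throws "SecurityError" if WebNN is not allowed, and "NotSupportedError"
// if the resulting context does not validate.
const context = navigator.ml.createContextSync({ deviceType: 'cpu' });
// ... build graphs and call computeSync() against `context` here ...
```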
@@ -843,7 +843,7 @@ Its default allowlist is 'self'.
1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, then [=exception/throw=] a "{{SecurityError}}" {{DOMException}}. 1. Let |context| be the result [=creating a context=] with |gpuDevice|. - 1. If validating MLContext given |context| return `false`, then [=exception/throw=] a "{{NotSupportedError}}" {{DOMException}}. + 1. If validating MLContext given |context| return false, then [=exception/throw=] a "{{NotSupportedError}}" {{DOMException}}. 1. Return |context|.
@@ -996,10 +996,10 @@ The {{MLOperand}} objects are created by the methods of {{MLGraphBuilder}}, inte To check dimensions given |dimensions| and |type|, run the following steps:
- 1. If the [=list/size=] of |dimensions| is 0, return `false`. - 1. If the [=list/size=] of |dimensions| is too large to be supported by the implementation, return `false`. - 1. If any element of |dimensions| is not a positive number, or it is too large to be supported by the implementation given |type|, return `false`. - 1. Return `true`. + 1. If the [=list/size=] of |dimensions| is 0, return false. + 1. If the [=list/size=] of |dimensions| is too large to be supported by the implementation, return false. + 1. If any element of |dimensions| is not a positive number, or it is too large to be supported by the implementation given |type|, return false. + 1. Return true.
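A non-normative JavaScript approximation of the check dimensions steps above; the size limits are placeholders, since the actual bounds are implementation-defined and may also depend on the operand type.

```js
// Placeholder limits; a real implementation chooses its own bounds,
// possibly per operand type.
const MAX_RANK = 8;
const MAX_DIMENSION = 2 ** 31 - 1;

function checkDimensions(dimensions, type) {
  if (dimensions.length === 0) return false;
  if (dimensions.length > MAX_RANK) return false;
  // `type` would factor into the per-element limit; ignored in this sketch.
  if (dimensions.some((d) => !Number.isInteger(d) || d <= 0 || d > MAX_DIMENSION))
    return false;
  return true;
}
```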
@@ -1009,10 +1009,10 @@ The {{MLOperand}} objects are created by the methods of {{MLGraphBuilder}}, inte
1. [=Assert=]: the type of |operand|.{{MLOperand/[[builder]]}} is {{MLGraphBuilder}}. - 1. If |builder| is not equal to |operand|.{{MLOperand/[[builder]]}}, return `false`. + 1. If |builder| is not equal to |operand|.{{MLOperand/[[builder]]}}, return false. 1. Let |desc| be |operand|.{{MLOperand/[[descriptor]]}}. - 1. If |desc|.{{MLOperandDescriptor/dimensions}} [=map/exists=] and invoking check dimensions given |desc|.{{MLOperandDescriptor/dimensions}} and |desc|.{{MLOperandDescriptor/type}} returns `false`, then return `false`. - 1. Return `true`. + 1. If |desc|.{{MLOperandDescriptor/dimensions}} [=map/exists=] and invoking check dimensions given |desc|.{{MLOperandDescriptor/dimensions}} and |desc|.{{MLOperandDescriptor/type}} returns false, then return false. + 1. Return true.
@@ -1136,11 +1136,11 @@ When the {{[[contextType]]}} is set to [=default-context|default=] with the {{ML To validate MLContext, given |context|, run these steps:
- 1. If |context|.{{[[contextType]]}} is not "[=webgpu-context|webgpu=]" or "[=default-context|default=]", return `false`. - 1. If |context|.{{[[deviceType]]}} is not "[=device-type-cpu|cpu=]" or "[=device-type-gpu|gpu=]", return `false`. - 1. If |context|.{{[[powerPreference]]}} is not "[=power-preference-default|default=]" or "[=power-preference-high-performance|high-performance=]" or "[=power-preference-low-power|low-power=]", return `false`. - 1. If the user agent cannot support |context|.{{[[contextType]]}}, |context|.{{[[deviceType]]}} and |context|.{{[[powerPreference]]}}, return `false`. - 1. Return `true`; + 1. If |context|.{{[[contextType]]}} is not "[=webgpu-context|webgpu=]" or "[=default-context|default=]", return false. + 1. If |context|.{{[[deviceType]]}} is not "[=device-type-cpu|cpu=]" or "[=device-type-gpu|gpu=]", return false. + 1. If |context|.{{[[powerPreference]]}} is not "[=power-preference-default|default=]" or "[=power-preference-high-performance|high-performance=]" or "[=power-preference-low-power|low-power=]", return false. + 1. If the user agent cannot support |context|.{{[[contextType]]}}, |context|.{{[[deviceType]]}} and |context|.{{[[powerPreference]]}}, return false. + 1. Return true;
@@ -1170,8 +1170,8 @@ partial interface MLContext {
1. If |graph|.{{MLGraph/[[context]]}}.{{MLContext/[[contextType]]}} is not "[=default-context|default=]", [=exception/throw=] an "{{OperationError}}" {{DOMException}}. - 1. If validating graph resources given |inputs| and |graph|.{{MLGraph/[[inputDescriptors]]}} returns `false`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If validating graph resources given |outputs| and |graph|.{{MLGraph/[[outputDescriptors]]}} returns `false`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If validating graph resources given |inputs| and |graph|.{{MLGraph/[[inputDescriptors]]}} returns false, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If validating graph resources given |outputs| and |graph|.{{MLGraph/[[outputDescriptors]]}} returns false, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. Invoke execute graph given |graph|, |inputs| and |outputs|. 1. If that [=exception/throws=] an error, re-[=exception/throw=] the error. 1. Return {{undefined}}. @@ -1185,10 +1185,10 @@ partial interface MLContext {
1. [=Assert=]: the type of |resources| is {{MLNamedArrayBufferViews}}. 1. [=map/For each=] [=record=] <|key|, |value|> of |resources|: - 1. If |descriptors|[|key|] does not [=map/exist=], return `false`. + 1. If |descriptors|[|key|] does not [=map/exist=], return false. 1. [=Assert=]: the type of |value| is {{ArrayBufferView}}. - 1. If validating buffer with descriptor given |value| and |descriptors|[|key|] returns `false`, then return `false`. - 1. Return `true`. + 1. If validating buffer with descriptor given |value| and |descriptors|[|key|] returns false, then return false. + 1. Return true.
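A non-normative sketch of supplying named buffers that pass the graph-resource validation above. The graph, operand names and shapes are illustrative assumptions; each view's element type and byte length must match the corresponding descriptor.

```js
// Assumes `graph` was built with a float32 input "x" of shape [2, 2] and
// a float32 output "y" of the same shape.
const inputs  = { x: new Float32Array([1, 2, 3, 4]) }; // 4 elements = 16 bytes
const outputs = { y: new Float32Array(4) };            // byte length must match

// In this revision compute() transfers the views and resolves with fresh ones.
const results = await context.compute(graph, inputs, outputs);
console.log(results.outputs.y);
```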
@@ -1197,9 +1197,9 @@ partial interface MLContext { To validate buffer with descriptor given |bufferView| and |descriptor|, run the following steps:
- 1. If |bufferView| is not an {{MLBufferView}}, return `false`. - 1. If |bufferView|'s [=element type=] does not match to |descriptor|.{{MLOperandDescriptor/type}} according to [this table](#appendices-mloperandtype-arraybufferview-compatibility), return `false`. - 1. If |bufferView|.\[[ByteLength]] is not equal to the [=byte length=] of |descriptor|, return `false`. + 1. If |bufferView| is not an {{MLBufferView}}, return false. + 1. If |bufferView|'s [=element type=] does not match to |descriptor|.{{MLOperandDescriptor/type}} according to [this table](#appendices-mloperandtype-arraybufferview-compatibility), return false. + 1. If |bufferView|.\[[ByteLength]] is not equal to the [=byte length=] of |descriptor|, return false.
@@ -1323,8 +1323,8 @@ partial interface MLContext { 1. Let |promise| be [=a new promise=]. 1. Return |promise| and run the following steps [=in parallel=]: 1. If |graph|.{{MLGraph/[[context]]}}.{{MLContext/[[contextType]]}} is not "[=default-context|default=]", [=reject=] |promise| with an "{{OperationError}}" {{DOMException}}. - 1. If validating graph resources given |inputs| and |graph|.{{MLGraph/[[inputDescriptors]]}} returns `false`, then [=reject=] |promise| with a "{{DataError}}" {{DOMException}}. - 1. If validating graph resources given |outputs| and |graph|.{{MLGraph/[[outputDescriptors]]}} returns `false`, then [=reject=] |promise| with a "{{DataError}}" {{DOMException}}. + 1. If validating graph resources given |inputs| and |graph|.{{MLGraph/[[inputDescriptors]]}} returns false, then [=reject=] |promise| with a "{{DataError}}" {{DOMException}}. + 1. If validating graph resources given |outputs| and |graph|.{{MLGraph/[[outputDescriptors]]}} returns false, then [=reject=] |promise| with a "{{DataError}}" {{DOMException}}. 1. Let |transferredInputs| be the result of [=MLNamedArrayBufferViews/transfer|transferring=] {{MLNamedArrayBufferViews}} |inputs|. 1. Let |transferredOutputs| be the result of [=MLNamedArrayBufferViews/transfer|transferring=] {{MLNamedArrayBufferViews}} |outputs|. 1. Invoke execute graph given |graph|, |transferredInputs| and |transferredOutputs|. @@ -1586,7 +1586,7 @@ Both {{MLGraphBuilder}}.{{MLGraphBuilder/build()}} and {{MLGraphBuilder}}.{{MLGr
1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, then [=exception/throw=] a "{{SecurityError}}" {{DOMException}}. - 1. If validating MLContext given |context| returns `false`, then [=exception/throw=] a "{{TypeError}}" and abort these steps. + 1. If validating MLContext given |context| returns false, then [=exception/throw=] a "{{TypeError}}" and abort these steps. 1. Set {{MLGraphBuilder/[[context]]}} to |context|.
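A short non-normative sketch of the constructor steps above, together with a first operand; the input name and shape are arbitrary.

```js
// Throws "SecurityError" if the document may not use WebNN, and TypeError
// if the supplied context does not validate.
const builder = new MLGraphBuilder(context);

// Descriptors without `dimensions` define scalar operands in this revision.
const x = builder.input('x', { type: 'float32', dimensions: [2, 3] });
```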
@@ -1613,7 +1613,7 @@ Create a named {{MLOperand}} based on a descriptor, that can be used as an input 1. [=Assert=]: the type of |descriptor| is {{MLOperandDescriptor}}. 1. [=Assert=]: If |descriptor|.{{MLOperandDescriptor/dimensions}} does not [=map/exist=], then |descriptor| defines a scalar input. 1. If |descriptor|.{{MLOperandDescriptor/dimensions}} [=map/exists=]: - 1. If the [=check dimensions=] steps given |descriptor|.{{MLOperandDescriptor/type}} and |descriptor|.{{MLOperandDescriptor/dimensions}} return `false`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If the [=check dimensions=] steps given |descriptor|.{{MLOperandDescriptor/type}} and |descriptor|.{{MLOperandDescriptor/dimensions}} return false, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If the [=byte length=] of |descriptor| is not supported by the underlying platform, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |operand| be the result of creating an MLOperand given [=this=] and |descriptor|. @@ -1670,7 +1670,7 @@ Build a composed graph up to a given output operand into a computational graph, 1. Store a reference to |graphImpl| in |graph|.{{MLGraph/[[implementation]]}}. 1. Make a request to the underlying platform to initialize the graph: 1. [=map/For each=] |operand| in |outputs|: - 1. If validating MLOperand given |operand| and [=this=] returns `false`, then [=exception/throw=] a {{TypeError}}. + 1. If validating MLOperand given |operand| and [=this=] returns false, then [=exception/throw=] a {{TypeError}}. 1. If |operand| was created as an input by the underlying platform: 1. If |operand|.{{MLOperand/[[name]]}}] is not unique for |graphImpl|, then [=exception/throw=] a {{TypeError}}. 1. Add |operand|.{{MLOperand/[[descriptor]]}} to |graph|.{{MLGraph/[[inputDescriptors]]}}[|operand|.{{MLOperand/[[name]]}}]. @@ -1703,8 +1703,8 @@ Create a constant {{MLOperand}} that can be used in {{MLGraphBuilder}} methods.
1. [=Assert=]: the type of |descriptor| is {{MLOperandDescriptor}}. 1. If the [=byte length=] of |descriptor| is not supported by the underlying platform, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If the [=check dimensions=] steps given |descriptor|.{{MLOperandDescriptor/type}} and |descriptor|.{{MLOperandDescriptor/dimensions}} return `false`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If validating buffer with descriptor given |bufferView| and |descriptor| returns `false`, then [=exception/throw=] a {{TypeError}}. + 1. If the [=check dimensions=] steps given |descriptor|.{{MLOperandDescriptor/type}} and |descriptor|.{{MLOperandDescriptor/dimensions}} return false, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If validating buffer with descriptor given |bufferView| and |descriptor| returns false, then [=exception/throw=] a {{TypeError}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |operand| be the result of creating an MLOperand given [=this=] and |descriptor|. 1. Let |bytes| be the result of invoking the [=get a copy of the bytes held by the buffer source=] steps given |bufferView|. @@ -1890,8 +1890,8 @@ partial interface MLGraphBuilder { To check clamp options given |options|, run the following steps:
- 1. If |options|.{{MLClampOptions/minValue}} is greater than |options|.{{MLClampOptions/maxValue}}, then return `false`. - 1. Return `true`. + 1. If |options|.{{MLClampOptions/minValue}} is greater than |options|.{{MLClampOptions/maxValue}}, then return false. + 1. Return true.
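A non-normative sketch of the two clamp() forms whose option check is defined above; the bounds are illustrative and must satisfy minValue <= maxValue or a TypeError is thrown.

```js
// Assumes `x` is an MLOperand from the same builder.
// Immediate form: clamps an existing operand.
const clamped = builder.clamp(x, { minValue: 0, maxValue: 6 });

// Activation form: produces an MLActivation for use as a fused activation.
const relu6 = builder.clamp({ minValue: 0, maxValue: 6 });

// Expected to throw a TypeError, since minValue > maxValue:
// builder.clamp(x, { minValue: 1, maxValue: 0 });
```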
@@ -1913,7 +1913,7 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |operand| is {{MLOperand}}. - 1. If running the check clamp options steps given |options| returns `false`, then [=exception/throw=] a {{TypeError}}. + 1. If running the check clamp options steps given |options| returns false, then [=exception/throw=] a {{TypeError}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of copying an MLOperand given |operand|. 1. Make a request to the underlying platform to: @@ -1943,7 +1943,7 @@ partial interface MLGraphBuilder { The clamp(|options|) method steps are:
- 1. If running the check clamp options steps given |options| returns `false`, then [=exception/throw=] a {{TypeError}}. + 1. If running the check clamp options steps given |options| returns false, then [=exception/throw=] a {{TypeError}}. 1. Let |op| be the result of creating an MLActivation given [=this=], `"clamp"` and |options|. 1. If that [=exception/throws=] an error, re-[=exception/throw=] the error. 1. Return |op|. @@ -1985,9 +1985,9 @@ partial interface MLGraphBuilder { 1. If any of the following steps fail, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. Let |desc| be |inputs|[0].{{MLOperand/[[descriptor]]}}. 1. If |axis| is greater than or equal to the [=rank=] of |desc|, fail. - 1. Let |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] be `0`. + 1. Let |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] be 0. 1. [=map/For each=] |index| in [=the range=] 0 to the [=rank=] of |inputs|, exclusive: - 1. If validating MLOperand given |inputs|[|index|] and [=this=] returns `false`, then fail. + 1. If validating MLOperand given |inputs|[|index|] and [=this=] returns false, then fail. 1. [=map/For each=] |dim| in [=the range=] 0 to the [=rank=] of |inputs|[|index|].{{MLOperandDescriptor/dimensions}}, exclusive:
If the shape of each corresponding dimension and type of the operands, except for those of the dimension given by |axis|, is not the same, fail. @@ -2137,16 +2137,16 @@ partial interface MLGraphBuilder { 1. [=Assert=]: the type of |input| and |filter| is {{MLOperand}}. 1. Let |inputSize| be the [=list/size=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. Let |filterSize| be the [=list/size=] of |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. - 1. If |inputSize| is not `4`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If |filterSize| is not `4`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |inputSize| is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |filterSize| is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If the type of |input| and |filter| is not the same, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLConv2dOptions/padding}} does not [=map/exist=], set it to `« 0, 0, 0, 0 »`. 1. Else if the [=list/size=] of |options|.{{MLConv2dOptions/padding}} is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLConv2dOptions/strides}} does not [=map/exist=], set it to `« 1, 1 »`. - 1. Else if the [=list/size=] of |options|.{{MLConv2dOptions/strides}} is not `2`, then [=exception/throw=] a {{TypeError}}. + 1. Else if the [=list/size=] of |options|.{{MLConv2dOptions/strides}} is not 2, then [=exception/throw=] a {{TypeError}}. 1. If any element in |options|.{{MLConv2dOptions/strides}} is equal to 0, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLConv2dOptions/dilations}} does not [=map/exist=], set it to `« 1, 1 »`. - 1. Else if the [=list/size=] of |options|.{{MLConv2dOptions/dilations}} is not `2`, then [=exception/throw=] a {{TypeError}}. + 1. Else if the [=list/size=] of |options|.{{MLConv2dOptions/dilations}} is not 2, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLConv2dOptions/autoPad}} does not [=map/exist=], set it to `"explicit"`. 1. If |options|.{{MLConv2dOptions/groups}} is 0, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |inputSize| / |options|.{{MLConv2dOptions/groups}} is not equal to |filterSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. @@ -2313,20 +2313,20 @@ partial interface MLGraphBuilder { 1. [=Assert=]: the type of |input| and |filter| is {{MLOperand}}. 1. Let |inputSize| be the [=list/size=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. Let |filterSize| be the [=list/size=] of |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. - 1. If |inputSize| is not `4`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If |filterSize| is not `4`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |inputSize| is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |filterSize| is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If the type of |input| and |filter| is not the same, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLConvTranspose2dOptions/padding}} does not [=map/exist=], set it to `« 0, 0, 0, 0 »`. 1. Else if the [=list/size=] of |options|.{{MLConvTranspose2dOptions/padding}} is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLConvTranspose2dOptions/strides}} does not [=map/exist=], set it to `« 1, 1 »`. - 1. 
Else if the [=list/size=] of |options|.{{MLConvTranspose2dOptions/strides}} is not `2`, then [=exception/throw=] a {{TypeError}}. + 1. Else if the [=list/size=] of |options|.{{MLConvTranspose2dOptions/strides}} is not 2, then [=exception/throw=] a {{TypeError}}. 1. If any element in |options|.{{MLConv2dOptions/strides}} is equal to 0, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLConvTranspose2dOptions/dilations}} does not [=map/exist=], set it to `« 1, 1 »`. - 1. Else if the [=list/size=] of |options|.{{MLConvTranspose2dOptions/dilations}} is not `2`, then [=exception/throw=] a {{TypeError}}. + 1. Else if the [=list/size=] of |options|.{{MLConvTranspose2dOptions/dilations}} is not 2, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLConvTranspose2dOptions/outputPadding}} does not [=map/exist=], set it to `« 0, 0 »`. - 1. Else if the [=list/size=] of |options|.{{MLConvTranspose2dOptions/outputPadding}} is not `2`, then [=exception/throw=] a {{TypeError}}. + 1. Else if the [=list/size=] of |options|.{{MLConvTranspose2dOptions/outputPadding}} is not 2, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLConvTranspose2dOptions/outputSizes}} [=map/exists=]: - 1. If the [=list/size=] of |options|.{{MLConvTranspose2dOptions/outputSizes}} is not `2`, then [=exception/throw=] a {{TypeError}}. + 1. If the [=list/size=] of |options|.{{MLConvTranspose2dOptions/outputSizes}} is not 2, then [=exception/throw=] a {{TypeError}}. 1. If the elements of |options|.{{MLConvTranspose2dOptions/outputSizes}} are not smaller than the elements at the same dimension (index) for |options|.{{MLConvTranspose2dOptions/strides}}, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |inputSize| / |options|.{{MLConvTranspose2dOptions/groups}} is not equal to |filterSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. Else if |inputSize| % |options|.{{MLConvTranspose2dOptions/groups}} is not 0, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. @@ -2726,7 +2726,7 @@ partial interface MLGraphBuilder {
: c :: - An {{MLOperand}}. Specifies the third input tensor. It is either a scalar, or of the shape that is unidirectionally broadcastable to the shape [M, N] according to [[!numpy-broadcasting-rule]]. When it is not specified, the computation is done as if *c* is a scalar `0.0`. + An {{MLOperand}}. Specifies the third input tensor. It is either a scalar, or of the shape that is unidirectionally broadcastable to the shape [M, N] according to [[!numpy-broadcasting-rule]]. When it is not specified, the computation is done as if *c* is a scalar 0.0. : alpha :: @@ -2763,9 +2763,9 @@ partial interface MLGraphBuilder { 1. [=Assert=]: the type of |a| and |b| is {{MLOperand}}. 1. Let |shapeA| be |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeA| the [=list/size=] of |shapeA|. 1. Let |shapeB| be |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeB| the [=list/size=] of |shapeB|. - 1. If |sizeA| is not `2` or |sizeB| is not `2`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If |options|.{{MLGemmOptions/aTranspose}} is `true`, then let |shapeA| be the reverse array of |shapeA|. - 1. If |options|.{{MLGemmOptions/bTranspose}} is `true`, then let |shapeB| be the reverse array of |shapeB|. + 1. If |sizeA| is not 2 or |sizeB| is not 2, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |options|.{{MLGemmOptions/aTranspose}} is true, then let |shapeA| be the reverse array of |shapeA|. + 1. If |options|.{{MLGemmOptions/bTranspose}} is true, then let |shapeB| be the reverse array of |shapeB|. 1. If |shapeA|[1] is not equal to |shapeB|[0], then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLGemmOptions/c}} [=map/exists=] and is not unidirectionally broadcastable to the shape [|shapeA|[0], |shapeB|[1]] according to the [[!numpy-broadcasting-rule]], then [=exception/throw=] a "{{DataError}}" {{DOMException}}.
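A non-normative gemm() sketch consistent with the shape checks above; the dimensions and scalar multipliers are illustrative assumptions.

```js
// a is [M, K] = [2, 3] and b is [K, N] = [3, 4]; c broadcasts to [M, N].
const a = builder.input('a', { type: 'float32', dimensions: [2, 3] });
const b = builder.input('b', { type: 'float32', dimensions: [3, 4] });
const c = builder.constant(
    { type: 'float32', dimensions: [1, 4] }, new Float32Array(4));

// Computes alpha * a * b + beta * c, with the scalars spelled out explicitly.
const y = builder.gemm(a, b, { c, alpha: 1.0, beta: 1.0 });
```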
@@ -2854,16 +2854,16 @@ partial interface MLGraphBuilder { : resetAfter :: - A {{boolean}} indicating whether to apply the reset gate after or before matrix multiplication. The default value is `true`. + A {{boolean}} indicating whether to apply the reset gate after or before matrix multiplication. The default value is true. : returnSequence :: A {{boolean}} indicating whether to also return the entire sequence with every output from each time step in it in addition to the output of the last time step. - The default value is `false`. + The default value is false. : direction :: - An {{MLRecurrentNetworkDirection}}. Specifies the processing direction of the input sequence. When set to `"both"`, the size of the first dimension of the weight and the bias tensor shapes must be `2`, and the input is processed in both directions. + An {{MLRecurrentNetworkDirection}}. Specifies the processing direction of the input sequence. When set to `"both"`, the size of the first dimension of the weight and the bias tensor shapes must be 2, and the input is processed in both directions. : layout :: @@ -2883,7 +2883,7 @@ partial interface MLGraphBuilder { - *hiddenSize*: an {{unsigned long}} scalar. The value of the third dimension of the cell output tensor shape. It indicates the number of features in the hidden state. - *options*: an optional {{MLGruOptions}}. The optional parameters of the operation. - **Returns:** a sequence of {{MLOperand}}. The first element of the sequence is a 3-D tensor of shape [numDirections, batchSize, hiddenSize], the cell output from the last time step of the network. Additionally, if |options|.{{MLGruOptions/returnSequence}} is set to `true`, the second element is the 4-D output tensor of shape [steps, numDirections, batchSize, hiddenSize] containing every cell outputs from each time step in the temporal sequence. + **Returns:** a sequence of {{MLOperand}}. The first element of the sequence is a 3-D tensor of shape [numDirections, batchSize, hiddenSize], the cell output from the last time step of the network. Additionally, if |options|.{{MLGruOptions/returnSequence}} is set to true, the second element is the 4-D output tensor of shape [steps, numDirections, batchSize, hiddenSize] containing every cell outputs from each time step in the temporal sequence.
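A non-normative gru() sketch using the default "zrn" layout and a forward direction; every size below is an illustrative assumption.

```js
const steps = 2, batchSize = 1, inputSize = 4, hiddenSize = 8, numDirections = 1;

const input = builder.input('input',
    { type: 'float32', dimensions: [steps, batchSize, inputSize] });
const weight = builder.constant(
    { type: 'float32', dimensions: [numDirections, 3 * hiddenSize, inputSize] },
    new Float32Array(numDirections * 3 * hiddenSize * inputSize));
const recurrentWeight = builder.constant(
    { type: 'float32', dimensions: [numDirections, 3 * hiddenSize, hiddenSize] },
    new Float32Array(numDirections * 3 * hiddenSize * hiddenSize));

// Returns [finalHiddenState]; with returnSequence it also returns the
// per-step sequence as a second element.
const [hiddenState] = builder.gru(
    input, weight, recurrentWeight, steps, hiddenSize, { returnSequence: false });
```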
@@ -2893,16 +2893,16 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input|, |weight| and |recurrentWeight| is {{MLOperand}}. - 1. If the [=rank=] of |input| or |weight| or |recurrentWeight| is not `3`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If the [=rank=] of |input| or |weight| or |recurrentWeight| is not 3, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLGruOptions/bias}} [=map/exists=]. 1. [=Assert=]: its type is {{MLOperand}}. - 1. If |options|.{{MLGruOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not equal to `3 * hiddenSize`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |options|.{{MLGruOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not equal to 3 * |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLGruOptions/recurrentBias}} [=map/exists=]. 1. [=Assert=]: its type is {{MLOperand}}. - 1. If |options|.{{MLGruOptions/recurrentBias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not equal to `3 * hiddenSize`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |options|.{{MLGruOptions/recurrentBias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not equal to 3 * |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLGruOptions/initialHiddenState}} [=map/exists=]. 1. [=Assert=]: its type is {{MLOperand}}. - 1. If its rank is not `3`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If its rank is not 3, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLGruOptions/activations}} [=map/exists=] and its [=list/size=] is not 2, then [=exception/throw=] a {{TypeError}}. 1. If |steps| is not equal to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0], then [=exception/throw=] a {{TypeError}}. 1. Let |output| be an empty sequence of {{MLOperand}} objects. @@ -3011,7 +3011,7 @@ partial interface MLGraphBuilder { : resetAfter :: - A {{boolean}} indicating whether to apply the reset gate after or before matrix multiplication. The default value is `true`. + A {{boolean}} indicating whether to apply the reset gate after or before matrix multiplication. The default value is true. : layout :: @@ -3041,9 +3041,9 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input|, |weight| and |recurrentWeight| is {{MLOperand}}. - 1. If the [=rank=] of |input| or |weight| or |recurrentWeight| or |hiddenState| is not `2`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If the [=rank=] of |input| or |weight| or |recurrentWeight| or |hiddenState| is not 2, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |weight|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not equal to 3 * |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If |recurrentWeight|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not equal to `3 * hiddenSize`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |recurrentWeight|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not equal to 3 * |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLGruOptions/bias}} [=map/exists=]: 1. [=Assert=]: its type is {{MLOperand}}. 1. If its rank is not equal to 3 * |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. @@ -3207,11 +3207,11 @@ partial interface MLGraphBuilder { : alpha :: A {{float}} scalar multiplier. - The default value is `0.2`. + The default value is 0.2. : beta :: A {{float}} scalar addition. - The default value is `0.5`. + The default value is 0.5.
#### The {{MLGraphBuilder/hardSigmoid(input, options)}} method #### {#api-mlgraphbuilder-hardsigmoid-input-options} @@ -3396,7 +3396,7 @@ The {{MLInstanceNormalizationOptions}} members are:
1. [=Assert=]: the type of |input| is {{MLOperand}}. - 1. If the [=rank=] of |input| is not `4`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If the [=rank=] of |input| is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. [=Assert=]: the type of |options|.{{MLInstanceNormalizationOptions/scale}} is {{MLOperand}}. 1. If the [=rank=] of |options|.{{MLInstanceNormalizationOptions/scale}} is not equal to the [=list/size=] of the channel dimension of |input|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. [=Assert=]: the type of |options|.{{MLInstanceNormalizationOptions/bias}} is {{MLOperand}}. @@ -3486,7 +3486,7 @@ partial interface MLGraphBuilder { : alpha :: A {{float}} scalar multiplier. - The default value is `0.01`. + The default value is 0.01.
#### The {{MLGraphBuilder/leakyRelu(input, options)}} method #### {#api-mlgraphbuilder-leaky-relu-input-options} @@ -3575,11 +3575,11 @@ partial interface MLGraphBuilder { : alpha :: A {{float}} scalar multiplier. - The default value is `1`. + The default value is 1. : beta :: A {{float}} scalar addition. - The default value is `0`. + The default value is 0. #### The {{MLGraphBuilder/linear(input, options)}} method #### {#api-mlgraphbuilder-linear-input-options} @@ -3688,7 +3688,7 @@ partial interface MLGraphBuilder { : direction :: - An {{MLRecurrentNetworkDirection}}. Specifies the processing direction of the input sequence. When set to `"both"`, the size of the first dimension of the weight and the bias tensor shapes must be `2`, and the input is processed in both directions. + An {{MLRecurrentNetworkDirection}}. Specifies the processing direction of the input sequence. When set to `"both"`, the size of the first dimension of the weight and the bias tensor shapes must be 2, and the input is processed in both directions. : layout :: @@ -3717,7 +3717,7 @@ partial interface MLGraphBuilder { The lstm(|input|, |weight|, |recurrentWeight|, |steps|, |hiddenSize|, |options|) method steps are:
- 1. Let |numDirections| be `1` if |options|.{{MLLstmOptions/direction}} is `"forward"`, or otherwise let it be `2`. + 1. Let |numDirections| be 1 if |options|.{{MLLstmOptions/direction}} is `"forward"`, or otherwise let it be 2. 1. [=Assert=]: the type of |input|, |weight| and |recurrentWeight| is {{MLOperand}}.
The shape of |input|, |weight| or |recurrentWeight| could be also checked here. @@ -3726,28 +3726,28 @@ partial interface MLGraphBuilder { 1. Let |batchSize| be |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1]. 1. If |options|.{{MLLstmOptions/bias}} [=map/exists=]: 1. [=Assert=]: its type is {{MLOperand}}. - 1. If its rank is not `2`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If its rank is not 2, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |numDirections|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not 4 * |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/recurrentBias}} [=map/exists=]: 1. [=Assert=]: its type is {{MLOperand}}. - 1. If its rank is not `2`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If its rank is not 2, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/recurrentBias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |numDirections|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/recurrentBias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not 4 * |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/peepholeWeight}} [=map/exists=]: 1. [=Assert=]: its type is {{MLOperand}}. - 1. If its rank is not `2`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If its rank is not 2, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/peepholeWeight}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |numDirections|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/peepholeWeight}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not 4 * |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/initialHiddenState}} [=map/exists=]: 1. [=Assert=]: its type is {{MLOperand}}. - 1. If its rank is not `3`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If its rank is not 3, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/initialHiddenState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |numDirections|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/initialHiddenState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not equal to |batchSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/initialHiddenState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[2] is not |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/initialCellState}} [=map/exists=]: 1. [=Assert=]: its type is {{MLOperand}}. - 1. If its rank is not `3`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If its rank is not 3, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. 
If |options|.{{MLLstmOptions/initialCellState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not |numDirections|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/initialCellState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[1] is not equal to |batchSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmOptions/initialCellState}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[2] is not |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. @@ -3919,19 +3919,19 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input|, |weight|, |recurrentWeight|, |hiddenState| and |cellState| is {{MLOperand}}. - 1. If the [=rank=] of |input|, |weight|, |recurrentWeight|, |hiddenState| or |cellState| is not `2`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If the [=rank=] of |input|, |weight|, |recurrentWeight|, |hiddenState| or |cellState| is not 2, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. Let |batchSize| be |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0]. 1. If |options|.{{MLLstmCellOptions/bias}} [=map/exists=]: 1. [=Assert=]: its type is {{MLOperand}}. - 1. If its rank is not `1`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If its rank is not 1, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmCellOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not 4 * |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmCellOptions/recurrentBias}} [=map/exists=]: 1. [=Assert=]: its type is {{MLOperand}}. - 1. If its rank is not `1`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If its rank is not 1, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmCellOptions/recurrentBias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not 4 * |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmCellOptions/peepholeWeight}} [=map/exists=]: 1. [=Assert=]: its type is {{MLOperand}}. - 1. If its rank is not `1`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If its rank is not 1, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmCellOptions/peepholeWeight}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not 3 * |hiddenSize|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLLstmCellOptions/activations}} [=map/exists=]: 1. If its [=list/size=] is not 3, then [=exception/throw=] a {{TypeError}}. @@ -4104,10 +4104,10 @@ partial interface MLGraphBuilder {
1. Let |shapeA| be |a|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeA| the [=list/size=] of |shapeA|. 1. Let |shapeB| be |b|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} and |sizeB| the [=list/size=] of |shapeB|. - 1. If |sizeA| and |sizeB| is `1`, return `« 1 »`. - 1. If |sizeA| is `1` and |sizeB| is not, then insert `1` in the front of |shapeA| to become [ 1 | |shapeA| ] and let |sizeA| be `2`. + 1. If |sizeA| and |sizeB| is 1, return `« 1 »`. + 1. If |sizeA| is 1 and |sizeB| is not, then insert 1 in the front of |shapeA| to become [ 1 | |shapeA| ] and let |sizeA| be 2. 1. If |shapeA|[0] is not equal to |shapeB|[|sizeB| - 2], then [=exception/throw=] an "{{OperationError}}" {{DOMException}}. - 1. If |sizeB| is `1` and |sizeA| is not, then insert `1` in the front of |shapeB| to become [ 1 | |shapeB| ] and let |sizeB| be `2`. + 1. If |sizeB| is 1 and |sizeA| is not, then insert 1 in the front of |shapeB| to become [ 1 | |shapeB| ] and let |sizeB| be 2. 1. If |shapeA|[|sizeA| - 1] is not equal to |shapeB|[0], then [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. If |shapeA| is not equal to |shapeB|, then [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |shape| be an array whose size |size| is the maximum of |sizeA| and |sizeB|. @@ -4176,7 +4176,7 @@ partial interface MLGraphBuilder { :: A {{float}}. Specifies the padding value when {{MLPadOptions/mode}} is set to `"constant"`. - The default value is `0`. + The default value is 0.
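A non-normative pad() sketch; the padding amounts and fill value are illustrative.

```js
const x = builder.input('x', { type: 'float32', dimensions: [2, 3] });

// Pad one element at the start and end of each dimension, filling with 0
// in "constant" mode; the result has shape [4, 5].
const padded = builder.pad(x, [1, 1], [1, 1], { mode: 'constant', value: 0 });
```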
@@ -4400,7 +4400,7 @@ partial interface MLGraphBuilder { 1. If the [=list/size=] of |options|.{{MLPool2dOptions/strides}} is not 2, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If any value in |options|.{{MLPool2dOptions/strides}} is not greater than 0, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLPool2dOptions/outputSizes}} [=map/exists=]: - 1. If the [=list/size=] of |options|.{{MLPool2dOptions/outputSizes}} is not `2`, then [=exception/throw=] a {{TypeError}}. + 1. If the [=list/size=] of |options|.{{MLPool2dOptions/outputSizes}} is not 2, then [=exception/throw=] a {{TypeError}}. 1. If the elements of |options|.{{MLPool2dOptions/outputSizes}} are not smaller than the elements at the same dimension (index) for |options|.{{MLPool2dOptions/strides}}, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLPool2dOptions/dilations}} does not [=map/exist=], set |options|.{{MLPool2dOptions/dilations}} to `« 1, 1 »`. 1. If the [=list/size=] of |options|.{{MLPool2dOptions/dilations}} is not 2, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. @@ -4780,11 +4780,11 @@ partial interface MLGraphBuilder {
1. If |options|.{{MLResample2dOptions/scales}} does not [=map/exist=], set it to to `« 1.0, 1.0 »`. - 1. Otherwise, if any of its values is not greater than `0`, return `false`. - 1. If |options|.{{MLResample2dOptions/sizes}} [=map/exists=], and if its size is not `2`, or if any of its values is not greater than `0`, return `false`. + 1. Otherwise, if any of its values is not greater than 0, return false. + 1. If |options|.{{MLResample2dOptions/sizes}} [=map/exists=], and if its size is not 2, or if any of its values is not greater than 0, return false. 1. If |options|.{{MLResample2dOptions/axes}} does not [=map/exists=], set it to `« 2, 3 »`. - 1. Otherwise, if its value is not one of `« 0, 1», « 1, 2», « 2, 3 »`, return `false`. - 1. Return `true`. + 1. Otherwise, if its value is not one of `« 0, 1», « 1, 2», « 2, 3 »`, return false. + 1. Return true.
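A non-normative resample2d() sketch that passes the option checks above; the mode, scales, sizes and axes values are illustrative.

```js
const image = builder.input('image', { type: 'float32', dimensions: [1, 1, 8, 8] });

// Scales apply to the dimensions named by axes, which default to [2, 3].
const upscaled = builder.resample2d(image,
    { mode: 'linear', scales: [2.0, 2.0], axes: [2, 3] });

// The same request expressed with explicit output sizes.
const resized = builder.resample2d(image, { mode: 'linear', sizes: [16, 16] });
```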
@@ -4811,8 +4811,8 @@ partial interface MLGraphBuilder { The resample2d(|input|, |options|) method steps are:
- 1. Check if the input is a 4-dimensional tensor: if the [=list/size=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not `4`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If running the check resample options steps given |options| returns `false`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. Check if the input is a 4-dimensional tensor: if the [=list/size=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If running the check resample options steps given |options| returns false, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. Let |desc| be the result of running the resample output sizes steps given |options|. 1. If that [=exception/throws=] an error, re-[=exception/throw=] the error. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. @@ -4860,9 +4860,9 @@ partial interface MLGraphBuilder { 1. Let |outputShape| be an empty array of {{unsigned long}}. 1. If |newShape| is a scalar [=number=], set |outputShape| to `« 1 »`. 1. Otherwise, if |newShape| is an array of {{unsigned long}}: - 1. If the [=list/size=] of |newShape| is `0`, set |outputShape| to `« 1 »` (reshaping to scalar). + 1. If the [=list/size=] of |newShape| is 0, set |outputShape| to `« 1 »` (reshaping to scalar). 1. If |newShape| contains more than one `null` value, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If any value in |newShape| is `0`, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If any value in |newShape| is 0, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. Let |inputElementCount| be the product of all elements in |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. If |newShape| contains a `null` value, set that value to |inputElementCount| divided by the product of all other values in |newShape|. 1. If that value is too large for {{unsigned long}}, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. @@ -5114,7 +5114,7 @@ partial interface MLGraphBuilder { : steepness :: A {{float}} scalar parameter. - The default value is `1`. + The default value is 1. #### The {{MLGraphBuilder/softplus(input, options)}} method #### {#api-mlgraphbuilder-softplus-input-options} @@ -5267,7 +5267,7 @@ partial interface MLGraphBuilder { : axis :: An {{unsigned long}} scalar. The dimension along which to split. Its value must be in the range [0, N-1] where N is the [=rank=] of the input tensor. - The default value is `0`. + The default value is 0.
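A non-normative split() sketch matching the axis description above; the shapes and split specifications are illustrative.

```js
const x = builder.input('x', { type: 'float32', dimensions: [2, 6] });

// An unsigned long count must divide the split dimension evenly: 6 / 3 = 2.
const [a, b, c] = builder.split(x, 3, { axis: 1 });   // three [2, 2] outputs

// A sequence of sizes must sum to the size of the split dimension.
const [d, e] = builder.split(x, [2, 4], { axis: 1 }); // [2, 2] and [2, 4]
```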
@@ -5359,7 +5359,7 @@ partial interface MLGraphBuilder { 1. If |axesLength| is not smaller than the rank of |dimensions|, 1. For |index| in [=the range=] 0 to |axesLength|, exclusive: 1. Let |oneDimIndex| be |options|.{{MLSqueezeOptions/axes}}[|index|]. - 1. If |dimensions|[|oneDimIndex|] is not `1`, then [=exception/throw=] a {{TypeError}}. + 1. If |dimensions|[|oneDimIndex|] is not 1, then [=exception/throw=] a {{TypeError}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: From 4700e742d4b44cf338e906c3407718c79fe7d03f Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Thu, 24 Aug 2023 10:39:34 +0300 Subject: [PATCH 104/112] Fix matmul step, fix more typos Signed-off-by: Zoltan Kis --- index.bs | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/index.bs b/index.bs index 7ed95ff2..35283a9a 100644 --- a/index.bs +++ b/index.bs @@ -1980,7 +1980,7 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |inputs| is sequence of {{MLOperand}} objects. 1. [=Assert=]: the type of |axis| is `unsigned long`. - 1. [=Assert=]: the shape, i.e. {{MLOperandDescriptor/dimensions}}) of each operand in |inputs| is the same, except on the dimension given by |axis| on which they are concatenated. + 1. [=Assert=]: the shape, i.e. {{MLOperandDescriptor/dimensions}} of each operand in |inputs| is the same, except on the dimension given by |axis| on which they are concatenated. 1. [=Assert=]: the {{MLOperandDescriptor/type}} of each operand in |inputs| is the same. 1. If any of the following steps fail, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. Let |desc| be |inputs|[0].{{MLOperand/[[descriptor]]}}. @@ -2139,7 +2139,7 @@ partial interface MLGraphBuilder { 1. Let |filterSize| be the [=list/size=] of |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. If |inputSize| is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |filterSize| is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If the type of |input| and |filter| is not the same, then [=exception/throw=] a {{TypeError}}. + 1. If |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}} is not the same as {{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLConv2dOptions/padding}} does not [=map/exist=], set it to `« 0, 0, 0, 0 »`. 1. Else if the [=list/size=] of |options|.{{MLConv2dOptions/padding}} is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLConv2dOptions/strides}} does not [=map/exist=], set it to `« 1, 1 »`. @@ -2315,7 +2315,7 @@ partial interface MLGraphBuilder { 1. Let |filterSize| be the [=list/size=] of |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. If |inputSize| is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |filterSize| is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If the type of |input| and |filter| is not the same, then [=exception/throw=] a {{TypeError}}. + 1. If |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}} is not the same as {{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLConvTranspose2dOptions/padding}} does not [=map/exist=], set it to `« 0, 0, 0, 0 »`. 1. Else if the [=list/size=] of |options|.{{MLConvTranspose2dOptions/padding}} is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLConvTranspose2dOptions/strides}} does not [=map/exist=], set it to `« 1, 1 »`. @@ -4107,7 +4107,7 @@ partial interface MLGraphBuilder { 1. If |sizeA| and |sizeB| is 1, return `« 1 »`. 1. If |sizeA| is 1 and |sizeB| is not, then insert 1 in the front of |shapeA| to become [ 1 | |shapeA| ] and let |sizeA| be 2. 1. If |shapeA|[0] is not equal to |shapeB|[|sizeB| - 2], then [=exception/throw=] an "{{OperationError}}" {{DOMException}}. - 1. If |sizeB| is 1 and |sizeA| is not, then insert 1 in the front of |shapeB| to become [ 1 | |shapeB| ] and let |sizeB| be 2. + 1. If |sizeB| is 1 and |sizeA| is not, then append 1 to |shapeB| to become [ |shapeB| | 1 ] and let |sizeB| be 2. 1. If |shapeA|[|sizeA| - 1] is not equal to |shapeB|[0], then [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. If |shapeA| is not equal to |shapeB|, then [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. 
Let |shape| be an array whose size |size| is the maximum of |sizeA| and |sizeB|. From 4db98944e8bf31386996fe7040ac57d1dc790936 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Thu, 24 Aug 2023 12:04:37 +0300 Subject: [PATCH 105/112] Fix batchnorm validation steps for mean, variance, scale, bias Signed-off-by: Zoltan Kis --- index.bs | 14 ++++++++++---- 1 file changed, 10 insertions(+), 4 deletions(-) diff --git a/index.bs b/index.bs index 35283a9a..36984dec 100644 --- a/index.bs +++ b/index.bs @@ -1807,10 +1807,16 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input|, |mean| and |variance| is {{MLOperand}}. 1. If |options|.axis is not in [=the range=] 0 to the [=rank=] of |input|, exclusive, then [=exception/throw=] a {{TypeError}}. - 1. If the [=list/size=] of |mean|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not equal with |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|options|.{{MLBatchNormalizationOptions/axis}}], then [=exception/throw=] a {{TypeError}}. - 1. If the [=list/size=] of |variance|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not equal with |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|options|.{{MLBatchNormalizationOptions/axis}}], then [=exception/throw=] a {{TypeError}}. - 1. If |options|.{{MLBatchNormalizationOptions/scale}} [=map/exists=] and its [=list/size=] is not equal with |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|options|.{{MLBatchNormalizationOptions/axis}}], then [=exception/throw=] a {{TypeError}}. - 1. If |options|.{{MLBatchNormalizationOptions/bias}} [=map/exists=] and its [=list/size=] is not equal with |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|options|.{{MLBatchNormalizationOptions/axis}}], then [=exception/throw=] a {{TypeError}}. + 1. If the [=list/size=] of |mean|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 1, then [=exception/throw=] a {{TypeError}}. + 1. If |mean|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not equal to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|options|.{{MLBatchNormalizationOptions/axis}}], then [=exception/throw=] a {{TypeError}}. + 1. If the [=list/size=] of |variance|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 1, then [=exception/throw=] a {{TypeError}}. + 1. If |variance|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not equal to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|options|.{{MLBatchNormalizationOptions/axis}}], then [=exception/throw=] a {{TypeError}}. + 1. If |options|.{{MLBatchNormalizationOptions/scale}} [=map/exists=]: + 1. If its [=list/size=] is not 1, then [=exception/throw=] a {{TypeError}}. + 1. If |options|.{{MLBatchNormalizationOptions/scale}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not equal to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|options|.{{MLBatchNormalizationOptions/axis}}], then [=exception/throw=] a {{TypeError}}. + 1. If |options|.{{MLBatchNormalizationOptions/bias}} [=map/exists=]: + 1. If its [=list/size=] is not 1, then [=exception/throw=] a {{TypeError}}. + 1. If |options|.{{MLBatchNormalizationOptions/bias}}.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[0] is not equal to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|options|.{{MLBatchNormalizationOptions/axis}}], then [=exception/throw=] a {{TypeError}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of creating an MLOperand given [=this=] and |input|.{{MLOperand/[[descriptor]]}}, that may use the same underlying data as |input|. 1. 
Make a request to the underlying platform to initialize the batch normalization: From f937b88f547cd9c76cb79dc8886c419d4945d24c Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Thu, 24 Aug 2023 14:33:54 +0300 Subject: [PATCH 106/112] Fix resample2d() steps Signed-off-by: Zoltan Kis --- index.bs | 15 +++++++-------- 1 file changed, 7 insertions(+), 8 deletions(-) diff --git a/index.bs b/index.bs index 36984dec..8725a5c8 100644 --- a/index.bs +++ b/index.bs @@ -1059,7 +1059,7 @@ The {{MLActivation}} objects (including the ones passed as input to methods) are
1. [=Assert=]: the type of |builder| is {{MLGraphBuilder}}. - 1. If |name| is empty, then [=exception/throw=] a "{{TypeError}}" and abort these steps. + 1. If |name| is empty, then [=exception/throw=] a "{{TypeError}}". 1. Let |activation| be a new [=object=]. 1. Set |activation|.{{MLActivation/[[builder]]}} to |builder|. 1. Set |activation|.{{MLActivation/[[name]]}} to |name|. @@ -1586,7 +1586,7 @@ Both {{MLGraphBuilder}}.{{MLGraphBuilder/build()}} and {{MLGraphBuilder}}.{{MLGr
1. If [=this=]'s [=relevant global object=]'s [=associated Document=] is not [=allowed to use=] the [=webnn-feature|webnn=] feature, then [=exception/throw=] a "{{SecurityError}}" {{DOMException}}. - 1. If validating MLContext given |context| returns false, then [=exception/throw=] a "{{TypeError}}" and abort these steps. + 1. If validating MLContext given |context| returns false, then [=exception/throw=] a "{{TypeError}}". 1. Set {{MLGraphBuilder/[[context]]}} to |context|.
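Looking back at the batchNormalization() validation steps reworked in PATCH 105 above: mean, variance and the optional scale and bias operands must be 1-D tensors whose single dimension equals the input dimension selected by axis. The following non-normative TypeScript sketch restates those checks, with plain number[] shapes standing in for MLOperand descriptors; all names are illustrative.

    // Non-normative sketch of the batchNormalization() shape checks.
    interface BatchNormShapes {
      input: number[];
      mean: number[];
      variance: number[];
      scale?: number[];   // optional, like MLBatchNormalizationOptions.scale
      bias?: number[];    // optional, like MLBatchNormalizationOptions.bias
      axis?: number;      // defaults to 1, the channel dimension in "nchw"
    }

    function validateBatchNormShapes(s: BatchNormShapes): void {
      const axis = s.axis ?? 1;
      if (axis < 0 || axis >= s.input.length)
        throw new TypeError("axis must be in [0, rank - 1]");
      const featureSize = s.input[axis];
      const operands = { mean: s.mean, variance: s.variance, scale: s.scale, bias: s.bias };
      for (const [name, dims] of Object.entries(operands)) {
        if (dims === undefined) continue;             // scale and bias are optional
        if (dims.length !== 1)
          throw new TypeError(`${name} must be a 1-D tensor`);
        if (dims[0] !== featureSize)
          throw new TypeError(`${name} length must equal input dimensions[axis]`);
      }
    }

    // Example: validateBatchNormShapes({ input: [2, 3, 4, 4], mean: [3], variance: [3] })
    // passes, while a mean of shape [4] would throw a TypeError.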
@@ -4217,6 +4217,7 @@ partial interface MLGraphBuilder {
1. [=Assert=]: the type of |input| is {{MLOperand}}.
+    1. If the [=list/size=] of |beginningPadding| and |endingPadding| is not equal to the [=rank=] of |input|, then [=exception/throw=] a "{{TypeError}}".
     1. Let |desc| be a copy of |input|.{{MLOperand/[[descriptor]]}}.
     1. Set |desc|.{{MLOperandDescriptor/dimensions}} to the result of invoking the calculate padding output sizes steps given |input|, |beginningPadding| and |endingPadding|.
     1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}.
@@ -4786,7 +4787,7 @@ partial interface MLGraphBuilder {
1. If |options|.{{MLResample2dOptions/scales}} does not [=map/exist=], set it to to `« 1.0, 1.0 »`. - 1. Otherwise, if any of its values is not greater than 0, return false. + 1. Otherwise, if any of its values is not greater than 0, or if its [=list/size=] is not 2, return false. 1. If |options|.{{MLResample2dOptions/sizes}} [=map/exists=], and if its size is not 2, or if any of its values is not greater than 0, return false. 1. If |options|.{{MLResample2dOptions/axes}} does not [=map/exists=], set it to `« 2, 3 »`. 1. Otherwise, if its value is not one of `« 0, 1», « 1, 2», « 2, 3 »`, return false. @@ -4802,11 +4803,9 @@ partial interface MLGraphBuilder {
1. Let |desc| be an {{MLOperandDescriptor}} initialized to |input|.{{MLOperand/[[descriptor]]}}. 1. If |options|.{{MLResample2dOptions/sizes}} [=map/exists=], then set |desc|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} to |options|.{{MLResample2dOptions/sizes}} and return |desc|. - 1. For |index| in [=the range=] 0 to the [=rank=] of |desc|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}, exclusive: + 1. For |index| in [=the range=] 0 to the [=list/size=] of |options|.{{MLResample2dOptions/axes}}, exclusive: 1. Let |inputSize| be the [=list/size=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|index|]. - 1. Let |outputSize| be |inputSize| multiplied by |options|.{{MLResample2dOptions/scales}}. - 1. If that fails or |outputSize| is not a positive [=number=], then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. Set |desc|.{{MLOperandDescriptor/dimensions}}[|index|] to |outputSize|. + 1. Set |desc|.{{MLOperandDescriptor/dimensions}}[|index|] to |inputSize| multiplied by |options|.{{MLResample2dOptions/scales}}. 1. Return |desc|.
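To illustrate the check resample options steps amended just above in PATCH 106 (which adds the size-2 requirement on scales), here is a non-normative TypeScript sketch. The spec steps write the defaults back into the options dictionary; the sketch uses locals instead, and all names are illustrative.

    // Non-normative sketch of the "check resample options" steps: fill in
    // defaults, then validate scales, sizes and axes as described above.
    interface Resample2dOptions {
      scales?: number[];   // defaults to [1.0, 1.0]
      sizes?: number[];
      axes?: number[];     // defaults to [2, 3]
    }

    function checkResampleOptions(options: Resample2dOptions): boolean {
      const scales = options.scales ?? [1.0, 1.0];
      if (scales.length !== 2 || scales.some((v) => v <= 0)) return false;
      if (options.sizes !== undefined &&
          (options.sizes.length !== 2 || options.sizes.some((v) => v <= 0)))
        return false;
      const axes = options.axes ?? [2, 3];
      const allowed = [[0, 1], [1, 2], [2, 3]];
      return allowed.some(
        (a) => axes.length === 2 && a[0] === axes[0] && a[1] === axes[1]);
    }

    // Example: checkResampleOptions({ scales: [2.0] }) returns false;
    // checkResampleOptions({ sizes: [8, 8] }) returns true.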
@@ -4817,7 +4816,7 @@ partial interface MLGraphBuilder { The resample2d(|input|, |options|) method steps are:
- 1. Check if the input is a 4-dimensional tensor: if the [=list/size=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If the [=list/size=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If running the check resample options steps given |options| returns false, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. Let |desc| be the result of running the resample output sizes steps given |options|. 1. If that [=exception/throws=] an error, re-[=exception/throw=] the error. From 18442ad610854dc236078ff18462f74b5d21f3b1 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Thu, 24 Aug 2023 15:52:24 +0300 Subject: [PATCH 107/112] Add acks for #446 and related work Signed-off-by: Zoltan Kis --- index.bs | 2 ++ 1 file changed, 2 insertions(+) diff --git a/index.bs b/index.bs index 8725a5c8..72f19485 100644 --- a/index.bs +++ b/index.bs @@ -5645,6 +5645,8 @@ Benjamin Poulain for their contributions to the API specification. Thanks to Sangwhan Moon and the W3C Technical Architecture Group for review of this specification for web architecture fit, design consistency and developer ergonomics. +Thanks to Zoltan Kis for adding algorithms and making navigating this specification a delightful experience. Thanks to Joshua Bell for aligning the specification with modern editorial conventions. Thanks to Ningxin Hu, Lisha Guo, Shiyi Zou, Mingming Xu, Junwei Fu, Bruce Dai and Bin Miao for careful review and comments. + Thanks to W3C Privacy Interest Group for privacy and security review and feedback. Thanks to Alex Gough and the Chrome Security team for security review and questions. From 3f9f3b31270fa95772ab4d1634221e848d0fbd76 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Fri, 25 Aug 2023 09:33:17 +0300 Subject: [PATCH 108/112] Fix steps in resample2d(). Fix typo in conv2d(). Signed-off-by: Zoltan Kis --- index.bs | 7 +++---- 1 file changed, 3 insertions(+), 4 deletions(-) diff --git a/index.bs b/index.bs index 72f19485..b73fb35b 100644 --- a/index.bs +++ b/index.bs @@ -2145,7 +2145,7 @@ partial interface MLGraphBuilder { 1. Let |filterSize| be the [=list/size=] of |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}. 1. If |inputSize| is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |filterSize| is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}} is not the same as {{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}, then [=exception/throw=] a {{TypeError}}. + 1. If |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}} is not the same as |filter|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLConv2dOptions/padding}} does not [=map/exist=], set it to `« 0, 0, 0, 0 »`. 1. Else if the [=list/size=] of |options|.{{MLConv2dOptions/padding}} is not 4, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |options|.{{MLConv2dOptions/strides}} does not [=map/exist=], set it to `« 1, 1 »`. @@ -4802,10 +4802,9 @@ partial interface MLGraphBuilder {
1. Let |desc| be an {{MLOperandDescriptor}} initialized to |input|.{{MLOperand/[[descriptor]]}}. - 1. If |options|.{{MLResample2dOptions/sizes}} [=map/exists=], then set |desc|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}} to |options|.{{MLResample2dOptions/sizes}} and return |desc|. 1. For |index| in [=the range=] 0 to the [=list/size=] of |options|.{{MLResample2dOptions/axes}}, exclusive: - 1. Let |inputSize| be the [=list/size=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|index|]. - 1. Set |desc|.{{MLOperandDescriptor/dimensions}}[|index|] to |inputSize| multiplied by |options|.{{MLResample2dOptions/scales}}. + 1. If |options|.{{MLResample2dOptions/sizes}} [=map/exists=], set |desc|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|options|.{{MLResample2dOptions/axes}}[|index|]] to |options|.{{MLResample2dOptions/sizes}}[|index|] and return |desc|. + 1. Otherwise, set |desc|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|options|.{{MLResample2dOptions/axes}}[|index|]] to |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}[|index|] multiplied by |options|.{{MLResample2dOptions/scales}}. 1. Return |desc|.
From 6818a1ca77966e4b809d8af0fa0415d2d37801f5 Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Fri, 25 Aug 2023 10:40:41 +0300 Subject: [PATCH 109/112] Fix #459: correct the concat() steps Signed-off-by: Zoltan Kis --- index.bs | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/index.bs b/index.bs index b73fb35b..c23a5649 100644 --- a/index.bs +++ b/index.bs @@ -1990,16 +1990,16 @@ partial interface MLGraphBuilder { 1. [=Assert=]: the {{MLOperandDescriptor/type}} of each operand in |inputs| is the same. 1. If any of the following steps fail, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. Let |desc| be |inputs|[0].{{MLOperand/[[descriptor]]}}. - 1. If |axis| is greater than or equal to the [=rank=] of |desc|, fail. + 1. If |axis| is greater than or equal to the [=rank=] of |desc|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. Let |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] be 0. 1. [=map/For each=] |index| in [=the range=] 0 to the [=rank=] of |inputs|, exclusive: - 1. If validating MLOperand given |inputs|[|index|] and [=this=] returns false, then fail. + 1. If validating MLOperand given |inputs|[|index|] and [=this=] returns false, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. [=map/For each=] |dim| in [=the range=] 0 to the [=rank=] of |inputs|[|index|].{{MLOperandDescriptor/dimensions}}, exclusive:
If the shape of each corresponding dimension and type of the operands, except for those of the dimension given by |axis|, is not the same, fail.
- 1. If |dim| is not equal to |axis| and if |inputs|[|index|].{{MLOperandDescriptor/dimensions}}[|dim|] is not equal to |inputs|[0].{{MLOperandDescriptor/dimensions}}[|dim|], fail. - 1. If |inputs|[|dim|].{{MLOperandDescriptor/type}} is not equal to |inputs|[0].{{MLOperandDescriptor/type}}. + 1. If |inputs|[|index|].{{MLOperandDescriptor/type}} is not equal to |inputs|[0].{{MLOperandDescriptor/type}}. + 1. If |dim| is not equal to |axis| and if |inputs|[|index|].{{MLOperandDescriptor/dimensions}}[|dim|] is not equal to |inputs|[0].{{MLOperandDescriptor/dimensions}}[|dim|], then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If |dim| is equal to |axis|, add to |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] the value of |inputs|[|index|].{{MLOperandDescriptor/dimensions}}[|dim|]. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of creating an MLOperand given [=this=] and |desc|. From ffce2d65ca433288ec039cb19fccb551c8d9949b Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Fri, 25 Aug 2023 11:02:39 +0300 Subject: [PATCH 110/112] Fix #451: check options.axes in reduce operations Signed-off-by: Zoltan Kis --- index.bs | 1 + 1 file changed, 1 insertion(+) diff --git a/index.bs b/index.bs index c23a5649..d15d807d 100644 --- a/index.bs +++ b/index.bs @@ -4566,6 +4566,7 @@ partial interface MLGraphBuilder {
1. [=Assert=]: |op| is one of "reduceL1", "reduceL2", "reduceLogSum", "reduceLogSumExp", "reduceMax", "reduceMean", "reduceMin", "reduceProduct", "reduceSum", "reduceSumSquare". 1. [=Assert=]: the type of |input| is {{MLOperand}}. + 1. If |options|.{{MLReduceOptions/axes}} [=map/exists=], if any of its elements is not in [=the range=] 0 to the [=rank=] of |input|, exclusive, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of copying an MLOperand given |input|. 1. Make a request to the underlying platform to: From b5b95d4b59555241c0dbe07d88d617e919d9090b Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Fri, 25 Aug 2023 11:05:40 +0300 Subject: [PATCH 111/112] Remove spurious step from matmul() Signed-off-by: Zoltan Kis --- index.bs | 1 - 1 file changed, 1 deletion(-) diff --git a/index.bs b/index.bs index d15d807d..37226533 100644 --- a/index.bs +++ b/index.bs @@ -4115,7 +4115,6 @@ partial interface MLGraphBuilder { 1. If |shapeA|[0] is not equal to |shapeB|[|sizeB| - 2], then [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. If |sizeB| is 1 and |sizeA| is not, then append 1 to |shapeB| to become [ |shapeB| | 1 ] and let |sizeB| be 2. 1. If |shapeA|[|sizeA| - 1] is not equal to |shapeB|[0], then [=exception/throw=] an "{{OperationError}}" {{DOMException}}. - 1. If |shapeA| is not equal to |shapeB|, then [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |shape| be an array whose size |size| is the maximum of |sizeA| and |sizeB|. 1. [=map/For each=] |index| in [=the range=] 0 to |size|, exclusive: 1. Set |shape|[|index|] to the maximum of |shapeA|[|index|] and |shapeB|[|index|]. From c067b22ed1c9f59cddf688bc378b247c2aafb27e Mon Sep 17 00:00:00 2001 From: Zoltan Kis Date: Fri, 25 Aug 2023 11:44:24 +0300 Subject: [PATCH 112/112] Improve concat() steps Signed-off-by: Zoltan Kis --- index.bs | 11 ++++++----- 1 file changed, 6 insertions(+), 5 deletions(-) diff --git a/index.bs b/index.bs index 37226533..4714113a 100644 --- a/index.bs +++ b/index.bs @@ -1993,14 +1993,15 @@ partial interface MLGraphBuilder { 1. If |axis| is greater than or equal to the [=rank=] of |desc|, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. 1. Let |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] be 0. 1. [=map/For each=] |index| in [=the range=] 0 to the [=rank=] of |inputs|, exclusive: - 1. If validating MLOperand given |inputs|[|index|] and [=this=] returns false, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. [=map/For each=] |dim| in [=the range=] 0 to the [=rank=] of |inputs|[|index|].{{MLOperandDescriptor/dimensions}}, exclusive: + 1. Let |input| be |inputs|[|index|]. + 1. If validating MLOperand given |input| and [=this=] returns false, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}} is not equal to |inputs|[0].{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/type}}, then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. [=map/For each=] |dim| in [=the range=] 0 to the [=rank=] of |input|.{{MLOperand/[[descriptor]]}}.{{MLOperandDescriptor/dimensions}}, exclusive:
If the shape of each corresponding dimension and type of the operands, except for those of the dimension given by |axis|, is not the same, fail.
- 1. If |inputs|[|index|].{{MLOperandDescriptor/type}} is not equal to |inputs|[0].{{MLOperandDescriptor/type}}. - 1. If |dim| is not equal to |axis| and if |inputs|[|index|].{{MLOperandDescriptor/dimensions}}[|dim|] is not equal to |inputs|[0].{{MLOperandDescriptor/dimensions}}[|dim|], then [=exception/throw=] a "{{DataError}}" {{DOMException}}. - 1. If |dim| is equal to |axis|, add to |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] the value of |inputs|[|index|].{{MLOperandDescriptor/dimensions}}[|dim|]. + 1. If |dim| is not equal to |axis| and if |input|.{{MLOperandDescriptor/dimensions}}[|dim|] is not equal to |inputs|[0].{{MLOperandDescriptor/dimensions}}[|dim|], then [=exception/throw=] a "{{DataError}}" {{DOMException}}. + 1. If |dim| is equal to |axis|, add to |desc|.{{MLOperandDescriptor/dimensions}}[|axis|] the value of |input|.{{MLOperandDescriptor/dimensions}}[|dim|]. 1. If any of the following sub-steps fail, [=exception/throw=] an "{{OperationError}}" {{DOMException}}. 1. Let |output| be the result of creating an MLOperand given [=this=] and |desc|. 1. Make a request to the underlying platform to: