Remove MLActivations definitely not usable with recurrent ops #703

Merged
index.bs: 51 changes (2 additions, 49 deletions)

@@ -640,7 +640,7 @@ The {{MLGraphBuilder}} interface serves as a builder (factory) to construct a [=

In WebNN, a [=computational graph=] is composed of <dfn>operators</dfn> which act on data, and are the nodes of the graph. {{MLOperand}}s are a representation of data that flows within the computational graph, and are the edges of the graph. {{MLOperand}}s include a [=computational graph=]'s <dfn for="computational graph">input</dfn> values for inference, <dfn for="computational graph">constants</dfn> (including trained weights) used for inference, intermediate values (often referred to as activations) computed during inference, as well as the output values of inference. An [=operator=]'s <dfn for=operator>input</dfn> is one or more {{MLOperand}}s. An [=operator=]'s <dfn for=operator>output</dfn> is one or more {{MLOperand}}s. [=Operators=] have operator-specific parameters that control their behavior, which can include zero or more <dfn for=operator lt="activation|activation function">activation functions</dfn>, which are {{MLActivation}}s.
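The node/edge relationship described above can be sketched with plain objects. This is an illustrative model only, not the WebNN API: `makeOperand` and `makeOperator` are hypothetical helpers.

```javascript
// Minimal model of a computational graph: operators are nodes,
// operands are edges. Illustrative names, not WebNN API.
function makeOperand(kind, producer = null) {
  return { kind, producer }; // kind: "input" | "constant" | "output"
}

function makeOperator(name, inputs) {
  const op = { name, inputs, outputs: [] };
  // The operator's output operand is an intermediate value
  // (often referred to as an activation).
  const out = makeOperand("output", op);
  op.outputs.push(out);
  return out;
}

// input --(gemm with constant weights)--> intermediate --(relu)--> output
const input = makeOperand("input");
const weights = makeOperand("constant");
const gemmOut = makeOperator("gemm", [input, weights]);
const reluOut = makeOperator("relu", [gemmOut]);
```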

-A key part of the {{MLGraphBuilder}} interface are methods such as {{MLGraphBuilder/gemm()}} and {{MLGraphBuilder/softmax(axis)|softmax()}} which create an [=operator=] which represents the actual operation to perform on the input data when the computation is run, and return a new {{MLOperand}} or {{MLActivation}} holding the operator. Methods that create an {{MLOperand}} connect any [=operator/inputs=] and [=operator/activations=] to the operator. Each method invocation returns a distinct new value, without changing the value of any other {{MLOperand}}.
+A key part of the {{MLGraphBuilder}} interface are methods such as {{MLGraphBuilder/gemm()}} and {{MLGraphBuilder/relu()}} which create an [=operator=] which represents the actual operation to perform on the input data when the computation is run, and return a new {{MLOperand}} or {{MLActivation}} holding the operator. Methods that create an {{MLOperand}} connect any [=operator/inputs=] and [=operator/activations=] to the operator. Each method invocation returns a distinct new value, without changing the value of any other {{MLOperand}}.

At inference time, every {{MLOperand}} will be bound to a tensor (the actual data), which are essentially multidimensional arrays. The representation of the tensors is implementation dependent, but it typically includes the array data stored in some buffer (memory) and some metadata describing the array data (such as its shape).
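A tensor in that implementation-dependent sense can be sketched as a flat buffer plus shape metadata; `makeTensor` is a hypothetical helper, not part of WebNN.

```javascript
// A tensor, implementation-style: array data in a flat buffer,
// plus metadata describing it (here, just the shape).
function makeTensor(shape) {
  const size = shape.reduce((acc, dim) => acc * dim, 1);
  return { shape, data: new Float32Array(size) };
}

const t = makeTensor([2, 3, 4]); // a 2x3x4 multidimensional array
```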

@@ -1603,7 +1603,6 @@ dictionary MLClampOptions {

partial interface MLGraphBuilder {
MLOperand clamp(MLOperand input, optional MLClampOptions options = {});
-  MLActivation clamp(optional MLClampOptions options = {});
};
</script>

@@ -1642,14 +1641,6 @@ partial interface MLGraphBuilder {
</details>
</div>

-<details open algorithm>
-<summary>
-To <dfn>check clamp options</dfn> given {{MLClampOptions}} |options|, run the following steps:
-</summary>
-1. If |options|.{{MLClampOptions/minValue}} is greater than |options|.{{MLClampOptions/maxValue}}, then return false.
-1. Return true.
-</details>

#### {{MLGraphBuilder/clamp(input, options)}} #### {#api-mlgraphbuilder-clamp-operand-options}
<div>
**Arguments:**
@@ -1664,7 +1655,7 @@ partial interface MLGraphBuilder {
The <dfn method for=MLGraphBuilder>clamp(|input|, |options|)</dfn> method steps are:
</summary>
1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
-1. If [=checking clamp options=] given |options| returns false, then [=exception/throw=] a {{TypeError}}.
+1. If |options|.{{MLClampOptions/minValue}} is greater than |options|.{{MLClampOptions/maxValue}}, then [=exception/throw=] a {{TypeError}}.
1. *Make graph connections:*
1. Let |output| be the result of [=copying an MLOperand=] given |input|.
1. Let |operator| be an [=operator=] for the "clamp" operation, given |options|.{{MLClampOptions/minValue}} and |options|.{{MLClampOptions/maxValue}}.
@@ -1674,23 +1665,6 @@ partial interface MLGraphBuilder {
1. Return |output|.
</details>
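The validation and copy steps above can be sketched as an element-wise function; `clamp` here is a hypothetical standalone helper operating on a plain array, not the {{MLGraphBuilder}} method.

```javascript
// Sketch of clamp semantics: validate the options, then clamp every
// element into [minValue, maxValue]. Standalone helper, not WebNN API.
function clamp(input, options = {}) {
  const { minValue = -Infinity, maxValue = Infinity } = options;
  // Mirrors the spec's validation step: minValue must not exceed maxValue.
  if (minValue > maxValue) {
    throw new TypeError("minValue must not exceed maxValue");
  }
  return input.map((x) => Math.min(Math.max(x, minValue), maxValue));
}

clamp([-2, 0.5, 3], { minValue: 0, maxValue: 1 }); // [0, 0.5, 1]
```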

-#### {{MLGraphBuilder/clamp(options)}} #### {#api-mlgraphbuilder-clamp-options}
-<div>
-**Arguments:**
-- *options*: an optional {{MLClampOptions}}. The optional parameters of the operation.
-**Returns:**
-- an {{MLActivation}}. The operator representing the clamp operation.
-</div>
-
-<details open algorithm>
-<summary>
-The <dfn method for=MLGraphBuilder>clamp(|options|)</dfn> method steps are:
-</summary>
-1. If [=checking clamp options=] given |options| returns false, then [=exception/throw=] a {{TypeError}}.
-1. Let |op| be the result of [=creating an MLActivation=] given [=this=], "clamp" and |options|.
-1. Return |op|.
-</details>

### concat ### {#api-mlgraphbuilder-concat}
Concatenates the input tensors along a given axis.
<script type=idl>
@@ -5334,7 +5308,6 @@ the N-D input tensor along the given axis.
<script type=idl>
partial interface MLGraphBuilder {
MLOperand softmax(MLOperand input, unsigned long axis);
-  MLActivation softmax(unsigned long axis);
};
</script>

@@ -5382,26 +5355,6 @@ partial interface MLGraphBuilder {
1. Return |output|.
</details>
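The softmax computation can be sketched for a single 1-D slice; the real operation applies this along the given axis of an N-D tensor. `softmax1d` is a hypothetical standalone helper, not the builder method.

```javascript
// Softmax over one 1-D slice: exponentiate, then normalize so the
// results sum to 1. Subtracting the max keeps exp() from overflowing
// without changing the result. Illustrative only, not WebNN API.
function softmax1d(xs) {
  const m = Math.max(...xs);
  const exps = xs.map((x) => Math.exp(x - m));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}
```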

-#### {{MLGraphBuilder/softmax(axis)}} #### {#api-mlgraphbuilder-softmax-axis}
-<div>
-**Arguments:**
-- None.
-
-**Returns:**
-- an {{MLActivation}}. The activation function representing the softmax operation.
-</div>
-
-<details open algorithm>
-<summary>
-The <dfn method for=MLGraphBuilder>softmax(|axis|)</dfn> method steps are:
-</summary>
-1. Let |validationSteps| given {{MLOperandDescriptor}} |descriptor| be these steps:
-1. If |axis| is greater than or equal to |descriptor|.{{MLOperandDescriptor/dimensions}}'s [=list/size=], then return false;
-1. Otherwise, return true.
-1. Let |op| be the result of [=creating an MLActivation=] given [=this=], "softmax", «[ "axis" → |axis| ]», and |validationSteps|.
-1. Return |op|.
-</details>

### softplus ### {#api-mlgraphbuilder-softplus-method}
Compute the <a href="https://en.wikipedia.org/wiki/Rectifier_(neural_networks)#Softplus">softplus function</a> of the input tensor. The calculation follows the expression `ln(1 + exp(x))`.
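The expression `ln(1 + exp(x))` can be sketched as a plain scalar function; `softplus` here is a hypothetical helper, not the builder method.

```javascript
// Softplus, ln(1 + exp(x)), element-wise in the real op but shown
// here as a scalar function. A smooth approximation of relu: for
// large x it approaches x; for very negative x it approaches 0.
const softplus = (x) => Math.log(1 + Math.exp(x));

softplus(0); // ln(2), about 0.6931
```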
<script type=idl>