Commit

Fixed the dead link bug in the API documentation (#48969)
* first pr

* Revise nn.py

* Revise nn.py 2.0

* Revise rnn.py;test=document_fix

* test=document_fix

Co-authored-by: Ligoml <39876205+Ligoml@users.noreply.github.com>
jjyaoao and Ligoml authored Dec 13, 2022
1 parent b0e7226 commit 3c81d0c
Showing 4 changed files with 9 additions and 15 deletions.
2 changes: 1 addition & 1 deletion python/paddle/distributed/launch/main.py
@@ -36,7 +36,7 @@ def launch():
Base Parameters:
- - ``--master``: The master/rendezvous server, support http:// and etcd://, default with http://. e.g., ``--master=127.0.0.1:8080``. Default ``--master=None``.
+ - ``--master``: The master/rendezvous server, support ``http://`` and ``etcd://``, default with ``http://``. e.g., ``--master=127.0.0.1:8080``. Default ``--master=None``.
- ``--rank``: The rank of the node, can be auto assigned by master. Default ``--rank=-1``.
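A minimal invocation sketch for the flags above (the master address and `train.py` are hypothetical placeholders; `paddle.distributed.launch` is the module this file implements). This is a usage fragment, not a runnable example, since it needs a Paddle install and a training script:

```shell
# Hypothetical example: HTTP rendezvous master, rank auto-assigned (-1).
python -m paddle.distributed.launch \
    --master=http://127.0.0.1:8080 \
    --rank=-1 \
    train.py
```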
1 change: 1 addition & 0 deletions python/paddle/fluid/dygraph/nn.py
@@ -37,6 +37,7 @@
check_type,
check_dtype,
)

from ..param_attr import ParamAttr
from ..initializer import Normal, Constant, NumpyArrayInitializer
from .. import unique_name
12 changes: 6 additions & 6 deletions python/paddle/fluid/framework.py
@@ -1384,7 +1384,7 @@ class Variable(metaclass=VariableMetaClass):
shape=[-1, 23, 48],
dtype='float32')
- In `Dygraph <../../user_guides/howto/dygraph/DyGraph.html>`_ Mode:
+ In Dygraph Mode:
.. code-block:: python
@@ -1861,7 +1861,7 @@ def stop_gradient(self):
"""
Indicating if we stop gradient from current Variable
- **Notes: This Property has default value as** ``True`` **in** `Dygraph <../../user_guides/howto/dygraph/DyGraph.html>`_ **mode, while Parameter's default value is False. However, in Static Graph Mode all Variable's default stop_gradient value is** ``False``
+ **Notes: This Property has default value as** ``True`` **in** Dygraph **mode, while Parameter's default value is False. However, in Static Graph Mode all Variable's default stop_gradient value is** ``False``
Examples:
.. code-block:: python
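The defaults described in the note can be condensed into a tiny truth-table sketch (plain Python modeling only the documented behavior, not Paddle's actual classes):

```python
def default_stop_gradient(is_parameter: bool, dygraph_mode: bool) -> bool:
    """Default stop_gradient per the note above: True only for a plain
    Variable in dygraph mode; Parameters, and every variable in
    static-graph mode, default to False."""
    return dygraph_mode and not is_parameter


# Dygraph Variable defaults to True; everything else to False.
assert default_stop_gradient(is_parameter=False, dygraph_mode=True) is True
assert default_stop_gradient(is_parameter=True, dygraph_mode=True) is False
assert default_stop_gradient(is_parameter=False, dygraph_mode=False) is False
```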
@@ -1903,7 +1903,7 @@ def persistable(self):
**1. All Variable's persistable is** ``False`` **except Parameters.**
- **2. In** `Dygraph <../../user_guides/howto/dygraph/DyGraph.html>`_ **mode, this property should not be changed**
+ **2. In** Dygraph **mode, this property should not be changed**
Examples:
.. code-block:: python
@@ -1952,7 +1952,7 @@ def name(self):
"""
Indicating name of current Variable
- **Notes: If it has two or more Varaible share the same name in the same** :ref:`api_guide_Block_en` **, it means these Variable will share content in no-** `Dygraph <../../user_guides/howto/dygraph/DyGraph.html>`_ **mode. This is how we achieve Parameter sharing**
+ **Notes: If it has two or more Varaible share the same name in the same** :ref:`api_guide_Block_en` **, it means these Variable will share content in no-** Dygraph **mode. This is how we achieve Parameter sharing**
Examples:
.. code-block:: python
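The parameter-sharing behavior the note describes can be pictured with a toy registry (an illustrative sketch only; `Block` here is a stand-in, not Paddle's class):

```python
class Block:
    """Toy stand-in: variables created under the same name resolve to the
    same underlying storage, which is how name-based parameter sharing
    works in static-graph mode per the note above."""

    def __init__(self):
        self._vars = {}

    def var(self, name):
        # Reuse the existing entry when the name repeats.
        if name not in self._vars:
            self._vars[name] = {"name": name, "data": None}
        return self._vars[name]


block = Block()
w1 = block.var("fc_w")
w2 = block.var("fc_w")
assert w1 is w2                   # same name -> shared content
assert block.var("fc_b") is not w1
```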
@@ -1982,7 +1982,7 @@ def grad_name(self):
import paddle.fluid as fluid
x = fluid.data(name="x", shape=[-1, 23, 48], dtype='float32')
- print(x.grad_name) # output is "x@GRAD"
+ print(x.grad_name) # output is ``x@GRAD``
"""
return self.name + "@GRAD"
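As the one-liner above shows, the gradient variable's name is just the forward name with an ``@GRAD`` suffix:

```python
def grad_name(name: str) -> str:
    # Mirrors `return self.name + "@GRAD"` from the diff above.
    return name + "@GRAD"


print(grad_name("x"))  # x@GRAD
```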
@@ -2043,7 +2043,7 @@ def lod_level(self):
**1. This is a read-only property**
- **2. Don't support this property in** `Dygraph <../../user_guides/howto/dygraph/DyGraph.html>`_ **mode, it's value should be** ``0(int)``
+ **2. Don't support this property in** Dygraph **mode, it's value should be** ``0(int)``
Examples:
.. code-block:: python
9 changes: 1 addition & 8 deletions python/paddle/fluid/layers/rnn.py
@@ -494,7 +494,6 @@ def dynamic_lstm(
name=None,
):
r"""
- :api_attr: Static Graph
**Note**:
1. This OP only supports LoDTensor as inputs. If you need to deal with Tensor, please use :ref:`api_fluid_layers_lstm` .
@@ -684,12 +683,11 @@ def lstm(
seed=-1,
):
r"""
- :api_attr: Static Graph
**Note**:
This OP only supports running on GPU devices.
- This OP implements LSTM operation - `Hochreiter, S., & Schmidhuber, J. (1997) <http://deeplearning.cs.cmu.edu/pdfs/Hochreiter97_lstm.pdf>`_ .
+ This OP implements LSTM operation - `Hochreiter, S., & Schmidhuber, J. (1997) <https://blog.xpgreat.com/file/lstm.pdf>`_ .
The implementation of this OP does not include diagonal/peephole connections.
Please refer to `Gers, F. A., & Schmidhuber, J. (2000) <ftp://ftp.idsia.ch/pub/juergen/TimeCount-IJCNN2000.pdf>`_ .
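For reference, an LSTM step without diagonal/peephole connections (the variant this docstring describes) is conventionally written as follows; the symbols are the standard textbook ones, not necessarily Paddle's parameter names:

```latex
\begin{aligned}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) \\
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
```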
@@ -742,7 +740,6 @@ def lstm(
If set None, default initializer will be used. Default: None.
seed(int, optional): Seed for dropout in LSTM, If it's -1, dropout will use random seed. Default: 1.
Returns:
tuple ( :ref:`api_guide_Variable_en` , :ref:`api_guide_Variable_en` , :ref:`api_guide_Variable_en` ) :
@@ -757,7 +754,6 @@
shape is :math:`[num\_layers, batch\_size, hidden\_size]` \
if is_bidirec set to True, shape will be :math:`[num\_layers*2, batch\_size, hidden\_size]`
Examples:
.. code-block:: python
@@ -875,7 +871,6 @@ def dynamic_lstmp(
proj_clip=None,
):
r"""
- :api_attr: Static Graph
**Note**:
1. In order to improve efficiency, users must first map the input of dimension [T, hidden_size] to input of [T, 4 * hidden_size], and then pass it to this OP.
@@ -1100,7 +1095,6 @@ def dynamic_gru(
origin_mode=False,
):
r"""
- :api_attr: Static Graph
**Note: The input type of this must be LoDTensor. If the input type to be
processed is Tensor, use** :ref:`api_fluid_layers_StaticRNN` .
@@ -1270,7 +1264,6 @@ def gru_unit(
origin_mode=False,
):
r"""
- :api_attr: Static Graph
Gated Recurrent Unit (GRU) RNN cell. This operator performs GRU calculations for
one time step and it supports these two modes:
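For context, a GRU step (Cho et al., 2014) computes an update gate, a reset gate, and a candidate state; Paddle's two modes differ only in whether `u_t` or `1 - u_t` weights the previous hidden state. This is a standard-notation sketch, not Paddle's exact formula:

```latex
\begin{aligned}
u_t &= \sigma(W_u x_t + U_u h_{t-1} + b_u) \\
r_t &= \sigma(W_r x_t + U_r h_{t-1} + b_r) \\
\tilde{h}_t &= \tanh(W_c x_t + U_c (r_t \odot h_{t-1}) + b_c) \\
h_t &= u_t \odot h_{t-1} + (1 - u_t) \odot \tilde{h}_t \\
\text{or}\quad h_t &= (1 - u_t) \odot h_{t-1} + u_t \odot \tilde{h}_t
\end{aligned}
```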
