
Commit

improve the doc
Jackwaterveg committed Jun 21, 2022
1 parent 44ee108 commit fc56cd3
Showing 2 changed files with 9 additions and 6 deletions.
5 changes: 3 additions & 2 deletions python/paddle/fluid/initializer.py
@@ -690,8 +690,9 @@ class MSRAInitializer(Initializer):
Args:
uniform (bool): whether to use uniform or normal distribution
- fan_in (float32|None): fan_in for MSRAInitializer. If None, it is\
-     inferred from the variable. default is None.
+ fan_in (float32|None): fan_in (in_features) of the trainable Tensor.\
+     If None, it will be inferred automatically. If you don't want to use the in_features of the Tensor,\
+     you can set the value of 'fan_in' yourself. default is None.
seed (int32): random seed
negative_slope (float, optional): negative_slope (only used with leaky_relu). default is 0.0.
nonlinearity(str, optional): the non-linear function. default is relu.
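The fan_in behaviour described in the docstring above can be illustrated with a minimal sketch. It goes through the public paddle.nn.initializer.KaimingNormal subclass of MSRAInitializer and paddle.ParamAttr; that wiring is an assumption for illustration, not part of this diff.

    import paddle
    from paddle.nn.initializer import KaimingNormal

    # fan_in=None (the default): fan_in is inferred from the parameter itself,
    # i.e. in_features=4 for this Linear layer's weight.
    linear_auto = paddle.nn.Linear(
        4, 8,
        weight_attr=paddle.ParamAttr(initializer=KaimingNormal()))

    # fan_in given explicitly: overrides the value inferred from the Tensor shape.
    linear_manual = paddle.nn.Linear(
        4, 8,
        weight_attr=paddle.ParamAttr(initializer=KaimingNormal(fan_in=16)))
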
10 changes: 6 additions & 4 deletions python/paddle/nn/initializer/kaiming.py
@@ -36,8 +36,9 @@ class KaimingNormal(MSRAInitializer):
\frac{gain}{\sqrt{{fan\_in}}}
Args:
- fan_in (float32|None): fan_in for Kaiming normal Initializer. If None, it is\
-     inferred from the variable. default is None.
+ fan_in (float32|None): fan_in (in_features) of the trainable Tensor.\
+     If None, it will be inferred automatically. If you don't want to use the in_features of the Tensor,\
+     you can set the value of 'fan_in' yourself. default is None.
negative_slope (float, optional): negative_slope (only used with leaky_relu). default is 0.0.
nonlinearity(str, optional): the non-linear function. default is relu.
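As a quick numeric check of the formula above, a sketch assuming the standard Kaiming gain of sqrt(2) for nonlinearity='relu' (the gain value is an assumption, not stated in this diff):

    import math

    fan_in = 64
    gain = math.sqrt(2.0)              # assumed gain for nonlinearity='relu'
    std = gain / math.sqrt(fan_in)     # KaimingNormal draws from N(0, std^2)
    print(std)                         # ~0.1768
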
@@ -83,8 +84,9 @@ class KaimingUniform(MSRAInitializer):
x = gain \times \sqrt{\frac{3}{fan\_in}}
Args:
- fan_in (float32|None): fan_in for Kaiming uniform Initializer. If None, it is\
-     inferred from the variable. default is None.
+ fan_in (float32|None): fan_in (in_features) of the trainable Tensor.\
+     If None, it will be inferred automatically. If you don't want to use the in_features of the Tensor,\
+     you can set the value of 'fan_in' yourself. default is None.
negative_slope (float, optional): negative_slope (only used with leaky_relu). default is 0.0.
nonlinearity(str, optional): the non-linear function. default is relu.
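Likewise for KaimingUniform, the expression above gives the half-width of the sampling range; a sketch under the same assumed gain convention, here sqrt(2 / (1 + negative_slope^2)) for leaky_relu:

    import math

    fan_in = 64
    negative_slope = 0.1
    gain = math.sqrt(2.0 / (1.0 + negative_slope ** 2))  # assumed leaky_relu gain
    bound = gain * math.sqrt(3.0 / fan_in)               # samples come from U(-bound, bound)
    print(bound)                                         # ~0.3047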

1 comment on commit fc56cd3

@paddle-bot-old


Congratulations! Your pull request passed all required CI. You can ask the reviewer(s) to approve and merge. 🎉
