
Commit

Fix the rendering of latex equation for adamax op (#6294)
* Using latex fraction syntax in sigmoid and logsigmoid op

* Fixing the rendering of the latex equations in adamax operator
abhinavarora authored Dec 5, 2017
1 parent 161128b commit 1d04b19
Showing 2 changed files with 10 additions and 8 deletions.
8 changes: 4 additions & 4 deletions paddle/operators/activation_op.cc
@@ -44,9 +44,9 @@ class SigmoidOpMaker : public framework::OpProtoAndCheckerMaker {
AddInput("X", "Input of Sigmoid operator");
AddOutput("Y", "Output of Sigmoid operator");
AddComment(R"DOC(
-Sigmoid Activation Operator.
+Sigmoid Activation Operator
-$y = 1 / (1 + e^{-x})$
+$$y = \frac{1}{1 + e^{-x}}$$
)DOC");
}
@@ -60,9 +60,9 @@ class LogSigmoidOpMaker : public framework::OpProtoAndCheckerMaker {
AddInput("X", "Input of LogSigmoid operator");
AddOutput("Y", "Output of LogSigmoid operator");
AddComment(R"DOC(
-Logsigmoid Activation Operator.
+Logsigmoid Activation Operator
-$y = \log(1 / (1 + e^{-x}))$
+$$y = \log \frac{1}{1 + e^{-x}}$$
)DOC");
}
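
For reference, the two documented formulas compute the following element-wise values. The sketch below is a standalone C++ illustration for sanity-checking the equations numerically; it is not the operator's actual kernel code.

```cpp
#include <cmath>
#include <cstdio>

// Standalone sketch of the documented formulas (not the Paddle kernels):
//   sigmoid:    y = 1 / (1 + e^{-x})
//   logsigmoid: y = log(1 / (1 + e^{-x}))
double Sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }
double LogSigmoid(double x) { return std::log(Sigmoid(x)); }

int main() {
  const double xs[] = {-2.0, 0.0, 2.0};
  for (double x : xs) {
    std::printf("x=%5.1f  sigmoid=%.6f  logsigmoid=%.6f\n", x, Sigmoid(x),
                LogSigmoid(x));
  }
  return 0;
}
```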
10 changes: 6 additions & 4 deletions paddle/operators/adamax_op.cc
@@ -107,10 +107,12 @@ Adam algorithm based on the infinity norm.
Adamax updates:
-$$momentOut = \beta_1 * moment + (1 - \beta_1) * grad \break
-infNormOut = max(\beta_2 * infNorm + \epsilon, |grad|) \break
-learningRate = learningRate /(1 - \beta_1_{pow}) \break
-paramOut = param - learningRate * momentPut / infNormOut$$
+$$
+momentOut = \beta_{1} * moment + (1 - \beta_{1}) * grad \\
+infNormOut = max(\beta_{2} * infNorm + \epsilon, |grad|) \\
+learningRate = \frac{learningRate}{1 - \beta_{1}^{Beta1Pow}} \\
+paramOut = param - learningRate * \frac{momentOut}{infNormOut}
+$$
The original paper does not have an epsilon attribute.
However, it is added here for numerical stability to prevent the
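
The four documented update equations can be read as a per-element computation. The sketch below is a minimal standalone C++ illustration assuming scalar parameters; variable names mirror the documentation (beta1_pow corresponds to the Beta1Pow input), and it is not the Paddle operator kernel itself.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Single scalar Adamax step following the documented equations.
// beta1_pow stands for beta1^t at step t (the Beta1Pow input in the doc).
void AdamaxStep(double grad, double lr, double beta1, double beta2,
                double epsilon, double beta1_pow, double* param,
                double* moment, double* inf_norm) {
  *moment = beta1 * (*moment) + (1.0 - beta1) * grad;
  *inf_norm = std::max(beta2 * (*inf_norm) + epsilon, std::fabs(grad));
  const double lr_t = lr / (1.0 - beta1_pow);  // bias-corrected learning rate
  *param -= lr_t * (*moment) / (*inf_norm);
}

int main() {
  double param = 1.0, moment = 0.0, inf_norm = 0.0;
  const double lr = 0.002, beta1 = 0.9, beta2 = 0.999, epsilon = 1e-8;
  double beta1_pow = 1.0;
  for (int t = 1; t <= 3; ++t) {
    beta1_pow *= beta1;               // beta1^t
    const double grad = 2.0 * param;  // gradient of f(p) = p^2
    AdamaxStep(grad, lr, beta1, beta2, epsilon, beta1_pow, &param, &moment,
               &inf_norm);
    std::printf("step %d: param = %.6f\n", t, param);
  }
  return 0;
}
```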
