With the scripting interface, it is now possible to take the gradient of a gradient:

```python
import numpy as np
import tensortrax as tr
import tensortrax.math as tm

# a slightly perturbed identity matrix
x = (np.random.rand(3, 3) - 0.5) / 10 + np.eye(3)

# init a Tensor with hessian=True
F = tr.Tensor(x)
F.init(hessian=True)

# perform some math operations
C = F.T() @ F
J = tm.linalg.det(F)
W = tm.trace(J ** (-2 / 3) * C) - 3
eta = 1 - 1 / 3 * tm.tanh(W / 8)


def dual2real(A, like=None):
    "Return a Tensor with old-dual data as new-real values."
    ndual = like.ndual - len(like.shape)
    return tr.Tensor(
        x=A.δx,
        δx=A.Δδx,
        Δx=A.Δδx,
        ndual=ndual,
        ntrax=A.ntrax - ndual,
    )


P = dual2real(W, like=F)

# perform some more math with a derived Tensor involved
Q = eta * P

# take the gradient
A = tr.δ(Q)
```
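The trick above works because each Tensor carries dual parts alongside its real values, so a second derivative falls out of one forward evaluation. A minimal, self-contained sketch of that hyper-dual idea for a scalar function is shown below; the `HyperDual` class and its field names are illustrative assumptions, not part of the tensortrax API:

```python
# Sketch of the hyper-dual idea behind nested gradients (illustration
# only; HyperDual is NOT the tensortrax API). A hyper-dual number
# carries two first-order parts (d1, d2) and one mixed second-order
# part (d12), so evaluating f once yields f, f' and f''.

class HyperDual:
    def __init__(self, x, d1=0.0, d2=0.0, d12=0.0):
        self.x, self.d1, self.d2, self.d12 = x, d1, d2, d12

    def __mul__(self, other):
        if not isinstance(other, HyperDual):
            other = HyperDual(other)
        return HyperDual(
            self.x * other.x,
            # product rule for each first-order direction
            self.d1 * other.x + self.x * other.d1,
            self.d2 * other.x + self.x * other.d2,
            # mixed second-order term of the product rule
            self.d12 * other.x + self.d1 * other.d2
            + self.d2 * other.d1 + self.x * other.d12,
        )

    __rmul__ = __mul__


def f(t):
    "Example: f(t) = t**3 via repeated multiplication."
    return t * t * t


t = HyperDual(2.0, d1=1.0, d2=1.0)  # seed both dual directions
y = f(t)
print(y.x, y.d1, y.d12)  # → 8.0 12.0 12.0  (f, f', f'' at t=2)
```

At `t = 2` this reproduces `f = 8`, `f' = 3t² = 12`, and `f'' = 6t = 12` from a single pass, which is the scalar analogue of initializing a Tensor with `hessian=True`.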
This is a first conception of taking arbitrary gradients: