Allowing more flexible cost functions for optimizers #959

Merged · 49 commits · Jan 6, 2021

The diff shown below covers changes from 5 of the 49 commits.

Commits
fe79cc6: allowing more parameters to cost function in gradient descent (albi3ro, Dec 9, 2020)
9362b82: multiple args and kwargs support for most optimizers (albi3ro, Dec 11, 2020)
5609f5d: improved structure, return format (albi3ro, Dec 14, 2020)
2c14788: edit changelog (albi3ro, Dec 14, 2020)
24d30e1: near finished version of the operators (albi3ro, Dec 15, 2020)
21e9e01: update doc about provided gradient form (albi3ro, Dec 16, 2020)
a820b55: update test_optimize for new gradient form (albi3ro, Dec 16, 2020)
ad0906a: testing multiple arguments, non-training args, keywords (albi3ro, Dec 16, 2020)
ed4ac91: improved changelog (albi3ro, Dec 16, 2020)
b30a552: linting (albi3ro, Dec 16, 2020)
cb2b328: linting (albi3ro, Dec 16, 2020)
5c03e65: Merge remote-tracking branch 'origin/optimize_more_parameters' into o… (albi3ro, Dec 16, 2020)
df0a7a8: Merge branch 'master' into optimize_more_parameters (albi3ro, Dec 16, 2020)
4cf358c: black formatting (albi3ro, Dec 16, 2020)
b9a447d: Merge remote-tracking branch 'origin/optimize_more_parameters' into o… (albi3ro, Dec 16, 2020)
7624f24: different black parameters (albi3ro, Dec 16, 2020)
5e0c40b: Merge branch 'master' into optimize_more_parameters (josh146, Dec 17, 2020)
12598c8: Update .github/CHANGELOG.md (albi3ro, Dec 17, 2020)
4f35668: changelog conform to black (albi3ro, Dec 17, 2020)
193cb53: wording change (albi3ro, Dec 17, 2020)
095de9e: wording change (albi3ro, Dec 17, 2020)
9f189f9: comments on code example (albi3ro, Dec 17, 2020)
b4b0d71: wording change (albi3ro, Dec 17, 2020)
83dcf94: Update pennylane/optimize/gradient_descent.py (albi3ro, Dec 17, 2020)
a3c2534: Update pennylane/optimize/gradient_descent.py (albi3ro, Dec 17, 2020)
b4ed411: Update pennylane/optimize/momentum.py (albi3ro, Dec 17, 2020)
b3c3857: docs string wording (albi3ro, Dec 17, 2020)
29416d2: Update pennylane/optimize/rotosolve.py (albi3ro, Dec 17, 2020)
dfc0092: Update pennylane/optimize/nesterov_momentum.py (albi3ro, Dec 17, 2020)
6cc787c: Update pennylane/optimize/rotosolve.py (albi3ro, Dec 17, 2020)
703942d: Update pennylane/optimize/adam.py (albi3ro, Dec 17, 2020)
6ca34cb: fix rotosolve (albi3ro, Dec 17, 2020)
4854c6b: improve docstrings (albi3ro, Dec 17, 2020)
16c6bde: Apply simple, local suggestions from code review (albi3ro, Dec 18, 2020)
7ddbcbc: Most code review comments implemented (albi3ro, Dec 18, 2020)
c53adb7: black on new tests (albi3ro, Dec 18, 2020)
d9d03a9: fix nesterov momentum (albi3ro, Dec 18, 2020)
afd0cfa: Merge branch 'master' into optimize_more_parameters (antalszava, Dec 18, 2020)
751a030: Merge remote-tracking branch 'origin/optimize_more_parameters' into o… (albi3ro, Dec 18, 2020)
90257ba: actually add rotoselect kwargs this time. nesterov test (albi3ro, Dec 18, 2020)
a35d782: ran black on rotoselect (albi3ro, Dec 21, 2020)
033af1b: minor docstring fixes (albi3ro, Dec 22, 2020)
3c58644: Merge branch 'master' into optimize_more_parameters (albi3ro, Dec 22, 2020)
f39a839: name on changelog, tests in progress changing (albi3ro, Dec 28, 2020)
0eb4133: black (albi3ro, Jan 4, 2021)
bfe0a4b: test rotosolve, fix rotosolve (albi3ro, Jan 4, 2021)
c3d1e49: Merge branch 'master' into optimize_more_parameters (albi3ro, Jan 4, 2021)
761bfed: Merge branch 'master' into optimize_more_parameters (albi3ro, Jan 6, 2021)
f7e9d67: remove import of mocker (albi3ro, Jan 6, 2021)
29 changes: 28 additions & 1 deletion .github/CHANGELOG.md
@@ -40,6 +40,33 @@
params = (x, y, data)
params = opt.step(cost, *params)
```
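The hunk above cuts off the top of this PR's changelog example, showing only its last lines. As a self-contained sketch of the new interface, with an illustrative cost function, values, and stepsize rather than the changelog's exact example:

```python
import pennylane as qml
from pennylane import numpy as np

def cost(x, y, data, c=1.0):
    # x and y are trainable; data is non-trainable; c is a keyword argument
    return c * ((x - data[0]) ** 2 + (y - data[1]) ** 2)

x = np.array(0.5, requires_grad=True)
y = np.array(0.1, requires_grad=True)
data = np.array([1.0, 2.0], requires_grad=False)

opt = qml.GradientDescentOptimizer(stepsize=0.1)

# all positional arguments are returned in order; only the trainable
# ones are updated, and keyword arguments are forwarded to the cost
(x, y, data), value = opt.step_and_cost(cost, x, y, data, c=0.5)
```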
Review comment (Member):
Don't forget to add your name to contributors! (unless you have done that already, and I missed it)


* Support added for calculating the Hessian of quantum tapes using the second-order
parameter shift formula.
[(#961)](https://github.com/PennyLaneAI/pennylane/pull/961)

The following example shows the calculation of the Hessian of a quantum tape:

```python
qml.enable_tape()
n_wires = 5
weights = [2.73943676, 0.16289932, 3.4536312, 2.73521126, 2.6412488]

dev = qml.device("default.qubit", wires=n_wires)

with qml.tape.QubitParamShiftTape() as tape:
    for i in range(n_wires):
        qml.RX(weights[i], wires=i)

    qml.CNOT(wires=[0, 1])
    qml.CNOT(wires=[2, 1])
    qml.CNOT(wires=[3, 1])
    qml.CNOT(wires=[4, 3])

    qml.expval(qml.PauliZ(1))

print(tape.hessian(dev))
```

* A new `qml.draw` function is available, allowing QNodes to be easily
drawn without execution by providing example input.
@@ -175,7 +202,7 @@

This release contains contributions from (in alphabetical order):

Olivia Di Matteo, Josh Izaac, Alejandro Montanez, Steven Oud, Chase Roberts, Maria Schuld, David Wierichs, Jiahao Yao.
Olivia Di Matteo, Josh Izaac, Christina Lee, Alejandro Montanez, Steven Oud, Chase Roberts, Maria Schuld, David Wierichs, Jiahao Yao.

# Release 0.13.0 (current release)

2 changes: 1 addition & 1 deletion pennylane/optimize/rotosolve.py
@@ -140,7 +140,7 @@ def step(self, objective_fn, *args, **kwargs):
args_new[index] = unflatten(x_flat, arg)

# updating before_args for next loop
before_args.append(arg)
before_args.append(args_new[index])

# unwrap arguments if only one, backward compatible and cleaner
if len(args_new) == 1:
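For intuition on the one-line fix above: Rotosolve steps through the arguments one at a time, so later updates must see the already-updated values of earlier arguments rather than their stale pre-step values. A minimal sketch of the pattern, where `update_one` is a hypothetical stand-in for the per-argument Rotosolve update (this is not PennyLane's actual implementation):

```python
def sequential_update(cost, args, update_one):
    """Optimize each argument in turn, feeding the updated values of
    earlier arguments into the updates of later ones."""
    args_new = list(args)
    before_args = []
    for index, arg in enumerate(args):
        after_args = args_new[index + 1 :]
        args_new[index] = update_one(cost, before_args, arg, after_args)
        # append the *updated* value; the bug fixed above appended the
        # stale pre-update `arg` instead
        before_args.append(args_new[index])
    return args_new
```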
168 changes: 80 additions & 88 deletions tests/test_optimize.py
@@ -19,6 +19,7 @@

import numpy as onp
import pytest
from pytest_mock import mocker
Review comment (Member):
No need to import mocker here; mocker is a fixture, so pytest will automatically perform 'dependency injection' so that Python can find it in required test functions. Not a very pythonic system!

Suggested change: remove the `from pytest_mock import mocker` line.
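To illustrate the reviewer's point, here is a minimal sketch of the fixture mechanism: pytest injects `mocker` by parameter name with no import required, and `mocker.spy` wraps a callable while recording its calls. The test and class names are illustrative:

```python
# no pytest_mock import needed; pytest injects `mocker` by parameter name
def test_spy_records_calls(mocker):
    class Wrapper:
        @staticmethod
        def func(x, c=1.0):
            return (x - c) ** 2

    wrapper = Wrapper()
    spy = mocker.spy(wrapper, "func")  # wraps func and records each call

    wrapper.func(1.0, c=2.0)

    args, kwargs = spy.call_args_list[-1]
    assert args == (1.0,)
    assert kwargs == {"c": 2.0}
```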


import pennylane as qml
from pennylane import numpy as np
@@ -720,6 +721,11 @@ def test_update_stepsize(self):
assert opt._stepsize == eta2


def reset(opt):
    # stateful optimizers (e.g. Adam and Momentum) accumulate history
    # between steps; clear it so each test step starts fresh
    if getattr(opt, "reset", None):
        opt.reset()


@pytest.mark.parametrize(
"opt, opt_name",
[
@@ -735,56 +741,76 @@
class TestOverOpts:
"""Tests keywords, multiple arguements, and non-training arguments in relevent optimizers"""

def test_kwargs(self, opt, opt_name, tol):
def test_kwargs(self, mocker, opt, opt_name, tol):
"""Test that the keywords get passed and alter the function"""

def func(x, c=1.0):
return (x - c) ** 2
class func_wrapper:
@staticmethod
def func(x, c=1.0):
return (x - c) ** 2

x = 1.0

x_new_one = opt.step(func, x, c=1.0)
x_new_two = opt.step(func, x, c=2.0)
wrapper = func_wrapper()
spy = mocker.spy(wrapper, "func")

x_new_one_wc, cost_one = opt.step_and_cost(func, x, c=1.0)
x_new_two_wc, cost_two = opt.step_and_cost(func, x, c=2.0)
x_new_two = opt.step(wrapper.func, x, c=2.0)
reset(opt)

if getattr(opt, "reset", None):
opt.reset()
args2, kwargs2 = spy.call_args_list[-1]

assert x_new_one != pytest.approx(x_new_two, abs=tol)
assert x_new_one_wc != pytest.approx(x_new_two_wc, abs=tol)
x_new_three_wc, cost_three = opt.step_and_cost(wrapper.func, x, c=3.0)
reset(opt)

if opt_name != "nest":
assert cost_one == pytest.approx(func(x, c=1.0), abs=tol)
assert cost_two == pytest.approx(func(x, c=2.0), abs=tol)
args3, kwargs3 = spy.call_args_list[-1]

@pytest.mark.parametrize(
"func, args",
[
(lambda x, y: x * y, (1.0, 1.0)),
(lambda x, y: x[0] * y[0], (np.array([1.0]), np.array([1.0]))),
],
)
def test_multi_args(self, opt, opt_name, func, args, tol):
"""Test multiple arguments to function"""
x_new, y_new = opt.step(func, *args)
x_new2, y_new2 = opt.step(func, x_new, y_new)
if opt_name != "roto":
assert args2 == (x,)
assert args3 == (x,)
else:
assert x_new_two != pytest.approx(x, abs=tol)
assert x_new_three_wc != pytest.approx(x, abs=tol)

assert kwargs2 == {"c": 2.0}
assert kwargs3 == {"c": 3.0}

assert cost_three == pytest.approx(wrapper.func(x, c=3.0), abs=tol)
Review comment (Member) on lines +773 to +776: 💯

def test_multi_args(self, mocker, opt, opt_name, tol):
"""Test passing multiple arguments to function"""

class func_wrapper:
@staticmethod
def func(x, y, z):
return x[0] * y[0] + z[0]

wrapper = func_wrapper()
spy = mocker.spy(wrapper, "func")

x = np.array([1.0])
y = np.array([2.0])
z = np.array([3.0])

(x_new_wc, y_new_wc), cost = opt.step_and_cost(func, *args)
(x_new2_wx, y_new2_wc), cost2 = opt.step_and_cost(func, x_new_wc, y_new_wc)
(x_new, y_new, z_new), cost = opt.step_and_cost(wrapper.func, x, y, z)
reset(opt)
args_called1, kwargs1 = spy.call_args_list[-1] # just take last call

if getattr(opt, "reset", None):
opt.reset()
x_new2, y_new2, z_new2 = opt.step(wrapper.func, x_new, y_new, z_new)
reset(opt)
args_called2, kwargs2 = spy.call_args_list[-1] # just take last call

assert x_new != pytest.approx(args[0], abs=tol)
assert y_new != pytest.approx(args[1], abs=tol)
if opt_name != "roto":
assert args_called1 == (x, y, z)
assert args_called2 == (x_new, y_new, z_new)
else:
assert x_new != pytest.approx(x, abs=tol)
assert y_new != pytest.approx(y, abs=tol)
assert z_new != pytest.approx(z, abs=tol)

assert x_new_wc != pytest.approx(args[0], abs=tol)
assert y_new_wc != pytest.approx(args[1], abs=tol)
assert kwargs1 == {}
assert kwargs2 == {}

if opt_name != "nest":
assert cost == pytest.approx(func(*args), abs=tol)
assert cost == pytest.approx(wrapper.func(x, y, z), abs=tol)

def test_nontrainable_data(self, opt, opt_name, tol):
"""Check non-trainable argument does not get updated"""
@@ -796,71 +822,37 @@ def func(x, data):
data = np.array([1.0], requires_grad=False)

args_new = opt.step(func, x, data)
reset(opt)
args_new_wc, cost = opt.step_and_cost(func, *args_new)

if getattr(opt, "reset", None):
opt.reset()
reset(opt)

assert len(args_new) == pytest.approx(2, abs=tol)
assert args_new[0] != pytest.approx(x, abs=tol)
assert args_new[1] == pytest.approx(data, abs=tol)

if opt_name != "nest":
assert cost == pytest.approx(func(args_new[0], data), abs=tol)

def test_multiargs_data_kwargs(self, opt, opt_name, tol):
""" Check all multiargs, non-trainable data, and keywords at the same time."""
assert cost == pytest.approx(func(*args_new), abs=tol)

def func(x, data, y, c=1.0):
return c * (x[0] + y[0] - data[0]) ** 2

x = np.array([1.0], requires_grad=True)
y = np.array([1.0])
data = np.array([1.0], requires_grad=False)

args_new, cost = opt.step_and_cost(func, x, data, y, c=0.5)
args_new2 = opt.step(func, *args_new, c=0.5)

if getattr(opt, "reset", None):
opt.reset()
def test_steps_the_same(self, opt, opt_name, tol):
"""Tests whether separating the args into different inputs affects their
optimization step. Assumes single argument optimization is correct, as tested elsewhere."""

assert args_new[0] != pytest.approx(x, abs=tol)
assert args_new[1] == pytest.approx(data, abs=tol)
assert args_new[2] != pytest.approx(y, abs=tol)
def func1(x, y, z):
return x[0] * y[0] * z[0]

if opt_name != "nest":
assert cost == pytest.approx(func(x, data, y, c=0.5), abs=tol)
def func2(args):
return args[0][0] * args[1][0] * args[2][0]

def test_steps_the_same(self, opt, opt_name, tol):
"""Tests optimizing single parameter same as with several at a time"""
x = np.array([1.0])
y = np.array([2.0])
z = np.array([3.0])
args = (x, y, z)

def func(x, y, z):
return x[0] * y[0] * z[0]
x_separate, y_separate, z_separate = opt.step(func1, x, y, z)
reset(opt)

args_new = opt.step(func2, args)
reset(opt)

fx = lambda xp: func(xp, y, z)
fy = lambda yp: func(x, yp, z)
fz = lambda zp: func(x, y, zp)

if getattr(opt, "reset", None):
opt.reset()

x_full, y_full, z_full = opt.step(func, x, y, z)
if getattr(opt, "reset", None):
opt.reset()

x_part = opt.step(fx, x)
if getattr(opt, "reset", None):
opt.reset()
y_part = opt.step(fy, y)
if getattr(opt, "reset", None):
opt.reset()
z_part = opt.step(fz, z)
if getattr(opt, "reset", None):
opt.reset()

assert x_full == pytest.approx(x_part, abs=tol)
assert y_full == pytest.approx(y_part, abs=tol)
assert z_full == pytest.approx(z_part, abs=tol)
assert x_separate == pytest.approx(args_new[0], abs=tol)
assert y_separate == pytest.approx(args_new[1], abs=tol)
assert z_separate == pytest.approx(args_new[2], abs=tol)