iRprop- local optimiser. #1459

Merged
MichaelClerx merged 6 commits into master from 1105-irpropmin on Aug 17, 2022
Conversation

@MichaelClerx (Member) commented Aug 16, 2022

See #1105

A local, gradient-using optimiser with adaptive step sizes.
Seems to work pretty well.

https://doi.org/10.1016/S0925-2312(01)00700-7

The easiest-to-read pseudocode for this algorithm is in a figure in the paper above.
Note that the < 0 / > 0 comparison (on the product of the current and previous gradient components) is deliberate: the step size is not updated when the product equals 0.
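
For orientation, here is a minimal NumPy sketch of the iRprop- update rule from the paper. This is an illustration only, not the PINTS implementation; the function name, defaults, and fixed iteration count are assumptions, although eta+ = 1.2 and eta- = 0.5 are the usual Rprop choices.

```python
import numpy as np

def irprop_minus(gradient, x0, n_iters=100, eta_plus=1.2, eta_minus=0.5,
                 step_min=1e-6, step_max=1.0, step_init=0.1):
    """Minimise an objective, given a callable ``gradient(x)``, with iRprop-.

    Hypothetical sketch: names and defaults are not from the PINTS code.
    """
    x = np.asarray(x0, dtype=float).copy()
    steps = np.full(x.shape, step_init)
    g_prev = np.zeros_like(x)
    for _ in range(n_iters):
        g = np.asarray(gradient(x), dtype=float)
        prod = g * g_prev
        # Sign kept: grow the step. Sign flipped: shrink it. A product of
        # exactly zero (hence the deliberate < 0 / > 0) leaves it unchanged.
        steps[prod > 0] = np.minimum(steps[prod > 0] * eta_plus, step_max)
        steps[prod < 0] = np.maximum(steps[prod < 0] * eta_minus, step_min)
        # The iRprop- twist: zero the gradient where the sign flipped, so
        # that dimension takes no step now and its product is 0 next time.
        g = np.where(prod < 0, 0.0, g)
        # Move each dimension by its own step, using only the gradient sign.
        x -= np.sign(g) * steps
        g_prev = g
    return x

# e.g. a convex quadratic, gradient 2x: converges towards [0, 0]
print(irprop_minus(lambda x: 2 * x, x0=[3.0, -2.0]))
```

Only the sign of each gradient component is used, never its magnitude, which is part of why the method is reported to be robust to its hyperparameters.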

@MichaelClerx MichaelClerx marked this pull request as ready for review August 16, 2022 15:09

codecov bot commented Aug 16, 2022

Codecov Report

Merging #1459 (edff89e) into master (27d8d26) will not change coverage.
The diff coverage is 100.00%.

@@            Coverage Diff            @@
##            master     #1459   +/-   ##
=========================================
  Coverage   100.00%   100.00%           
=========================================
  Files           95        96    +1     
  Lines         9300      9370   +70     
=========================================
+ Hits          9300      9370   +70     
Impacted Files                           Coverage Δ
pints/_optimisers/__init__.py            100.00% <ø> (ø)
pints/_optimisers/_gradient_descent.py   100.00% <ø> (ø)
pints/__init__.py                        100.00% <100.00%> (ø)
pints/_optimisers/_irpropmin.py          100.00% <100.00%> (ø)


@MichaelClerx MichaelClerx requested a review from DavAug August 16, 2022 15:49
@DavAug (Member) left a comment


Looks very good to me, @MichaelClerx
I agree with not exposing the hyperparameters for now: according to the literature, the algorithm appears to be quite robust to changes in them.

I just have minor comments. Happy for this to be merged ☺️

pints/_optimisers/_irpropmin.py (review thread, outdated, resolved)
pints/_optimisers/_irpropmin.py (review thread, resolved)
@MichaelClerx (Member, Author) commented

Thanks @DavAug!

@MichaelClerx MichaelClerx merged commit 9acb238 into master Aug 17, 2022
@MichaelClerx MichaelClerx deleted the 1105-irpropmin branch August 17, 2022 10:28