
Fix unexpected recommendation #951

Merged
jrapin merged 8 commits into master from bug on Dec 14, 2020
Conversation

jrapin (Contributor) commented Dec 14, 2020

Types of changes

  • Docs change / refactoring / dependency upgrade
  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)

Motivation and Context / Related issue

#947
What happens:

  • the (1+1) algorithm converges to very small sigma/steps very quickly
  • once machine precision is reached, it considers that it is re-evaluating the same point
  • points with several evaluations get a lower pessimistic loss, even though they are not as good
  • the best pessimistic point is therefore returned, which is not the best point in this case (a reproduction sketch follows this list).
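
A minimal reproduction sketch, assuming nevergrad's public OnePlusOne API; the budget and exact convergence behavior are illustrative and depend on version and seed:

```python
import nevergrad as ng

def square(x: float) -> float:
    # deterministic objective: re-evaluating a point yields the same loss
    return (x - 1.0) ** 2

optimizer = ng.optimizers.OnePlusOne(parametrization=ng.p.Scalar(), budget=2000)
recommendation = optimizer.minimize(square)
# Before this fix, once sigma reached machine precision the optimizer kept
# "re-estimating" the same point; its pessimistic loss then beat points with
# strictly lower loss, so the recommendation could miss the true best point.
print(recommendation.value)
```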

How Has This Been Tested (if it applies)

Checklist

  • The documentation is up-to-date with the changes I made.
  • I have read the CONTRIBUTING document and completed the CLA (see CLA).
  • All tests passed, and additional code has been covered with new tests.

@facebook-github-bot added the CLA Signed label Dec 14, 2020
jrapin (Contributor, Author) commented Dec 14, 2020

@teytaud I'm not sure how to solve this; maybe use the average as the default returned value? That seems to be the expected behavior. Returning the pessimistic best as a default even for non-noisy optimization is confusing.

teytaud (Contributor) commented Dec 14, 2020

> @teytaud I'm not sure how to solve this; maybe use the average as the default returned value? That seems to be the expected behavior. Returning the pessimistic best as a default even for non-noisy optimization is confusing.

In the deterministic setting, yes, we should return the average.
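
For context, a minimal sketch of the mechanism being discussed. The penalty formula and helper names are illustrative assumptions, not nevergrad's actual internals; the point is only that a pessimistic score rewards repeated evaluations:

```python
# Illustrative sketch only: the penalty formula is an assumption,
# not nevergrad's actual implementation.
def average_loss(losses: list) -> float:
    return sum(losses) / len(losses)

def pessimistic_loss(losses: list) -> float:
    # mean plus an uncertainty term that shrinks as evaluations accumulate,
    # so a point evaluated many times looks "safer" than one evaluated once
    return average_loss(losses) + 1.0 / len(losses) ** 0.5

# deterministic function: point A is strictly better but evaluated once,
# point B is worse but re-evaluated four times at machine precision
a, b = [0.5], [0.6, 0.6, 0.6, 0.6]
assert pessimistic_loss(b) < pessimistic_loss(a)  # B wins pessimistically...
assert average_loss(a) < average_loss(b)          # ...but A wins on average
```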

jrapin changed the title from "[WIP] Solve unexpected recommendation" to "Fix unexpected recommendation" on Dec 14, 2020
Comment on lines +491 to +492:

```python
name = "minimum" if self.parametrization.descriptors.deterministic_function else "pessimistic"
return self.current_bests[name].parameter
```
jrapin (Author):
@teytaud this is a drastic change in my opinion, you may want to crosscheck it in the xps

teytaud:

I think it is safe, but yes, I'll check the xps... it's complicated though, as the absolute indicators are just new.

jrapin (Author):

I guess we can wait a bit to merge if it is safer

jrapin (Author):

Or I can provide a fix for the noisy variants.

```diff
@@ -395,7 +395,7 @@ def _update_archive_and_bests(self, candidate: p.Parameter, loss: tp.FloatLoss)
         # update current best records
         # this may have to be improved if we want to keep more kinds of best losses
-        for name in ["optimistic", "pessimistic", "average"]:
+        for name in self.current_bests:
```
teytaud:

ah yes cool as a shortcut :-)
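
For readers unfamiliar with the shortcut: iterating a dict yields its keys, so the loop automatically stays in sync with whatever entries current_bests holds, including the new "minimum" entry this PR relies on. A tiny sketch with placeholder values:

```python
# Placeholder values only; the real dict maps names to best-point records.
current_bests = {"optimistic": ..., "pessimistic": ..., "average": ..., "minimum": ...}
for name in current_bests:  # same as: for name in current_bests.keys()
    print(name)
```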

Comment on the added test:

```python
    return score


def test_recommendation_correct() -> None:
```
teytaud:

A possible test would be the example in the issue.
That said, I find the fix quite cool and convincing.

jrapin (Author):

Actually this is the same test as in the issue, just simplified a bit; it does exactly the same thing.
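
A hypothetical shape for such a test, assuming nevergrad's public API; the exact assertion relies on the recommendation being an evaluated point, which is what this fix guarantees for deterministic functions:

```python
import nevergrad as ng

def test_recommendation_correct() -> None:
    losses = {}  # record every evaluated point and its loss

    def func(x: float) -> float:
        loss = (x - 2.0) ** 2  # deterministic toy objective
        losses[x] = loss
        return loss

    optimizer = ng.optimizers.OnePlusOne(parametrization=ng.p.Scalar(), budget=100)
    recommendation = optimizer.minimize(func)
    # the recommended point must carry the smallest loss ever observed
    assert losses[recommendation.value] == min(losses.values())
```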

jrapin merged commit cdee1f8 into master on Dec 14, 2020
jrapin deleted the bug branch on December 14, 2020 at 18:48