
Fix#180 #184

Merged 8 commits into develop on Jun 21, 2023
Conversation

sqbl (Collaborator) commented on Jun 20, 2023

This is an attempt to let users add or remove the modelled experimental noise. I'm balancing between not wanting to change the "inner mechanics" of our fitting and wanting to expose the knowledge of the experimental noise to the user. Currently, the experimental noise is set to zero between iterations, following chapter 2 of Rasmussen and Williams (link).
With this "fix", the workflow would now be:

  • Set up the optimizer.
  • Experiment / make data / fit the model (i.e. tell()).
  • Decide whether the user is interested in looking at the expectation values of the model (like looking at a linear regression line) or in the possible outcomes of a new experiment.
  • If interested in "normal regression": continue.
  • If interested in plotting, predicting, or sampling with the expected noise: call a helper function to change the noise level AND THEN call the plotting function.
  • Lastly: if the user wants to keep on experimenting, then ask() (a.k.a. 'next_x') has already been stored and is ready to be served to the user.

So in total: the user will need to call opt.add_modelled_noise() to go from regression to prediction, and will need to call opt.remove_modelled_noise() to return to the old functionality. Alternatively, the user can just add one additional data point to return to the old functionality/behaviour.
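
A minimal sketch of that workflow, for illustration only: the Optimizer arguments, the import path, and run_experiment are placeholder assumptions, and only add_modelled_noise()/remove_modelled_noise() are the helpers this PR introduces.

from ProcessOptimizer import Optimizer, expected_minimum  # import path assumed

# Set up the optimizer (dimensions and settings are placeholders).
opt = Optimizer(dimensions=[(0.0, 1.0), (0.0, 1.0)])

# Experiment / make data / fit the model.
x = opt.ask()
y = run_experiment(x)  # hypothetical, user-supplied experiment
opt.tell(x, y)

# "Normal regression": look at the expectation values of the model.
res = opt.get_result()
res_x, res_y = expected_minimum(res)

# Prediction of a new experiment: add the modelled noise, then plot/predict/
# sample, then remove it again to restore the old behaviour.
opt.add_modelled_noise()
res_noisy = opt.get_result()
# ... plotting / predicting / sampling on res_noisy ...
opt.remove_modelled_noise()

# Keep experimenting: the next suggestion is already stored and ready.
next_x = opt.ask()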

sqbl linked an issue on Jun 20, 2023 that may be closed by this pull request
sqbl (Collaborator, Author) commented on Jun 20, 2023

When/if this has been merged, we might wanna look at issue #95 again.

dk-teknologisk-mon (Collaborator) left a review comment

Good addition overall; it appears to work as intended (I have also tested it on the Branin-Hoo system at varying degrees of noise). I'd mainly like to see the changes related to the pass-by-reference comment I made in the test code; everything else is just formatting nice-to-haves.

res_x, [res_y, res_std_no_white] = expected_minimum(res, return_std=True)
# Add modelled experimental noise
opt.add_modelled_noise()
res_noise = opt.get_result()
dk-teknologisk-mon (Collaborator) commented on this snippet:

Remember that Python effectively passes objects by reference, and with the way you have implemented add_modelled_noise, your original res (line 374) is changed when you call it. In this test, res and res_noise point to the same underlying object, and you will get the same result in line 387 by calling expected_minimum(res, return_std=True).

I think it would be better if the test made this clear by removing line 378 and explicitly calling expected_minimum on the same optimize result in both calls.
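
A plain-Python analogue of the aliasing (no ProcessOptimizer objects involved, just to show why res and res_noise see the same change):

# Two names bound to the same object: a mutation made through one name is
# visible through the other, because no copy is ever made.
res = {"noise_level": 0.0}        # stands in for the object returned by get_result()
res_noise = res                   # same object, not a copy

res_noise["noise_level"] = 0.3    # stands in for add_modelled_noise() mutating the model

assert res is res_noise           # still the very same object
assert res["noise_level"] == 0.3  # the "original" result sees the change too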

sqbl (Collaborator, Author) replied:

Would it make sense to make a copy of the entire optimizer in this case?
In detail: train the optimizer on the data, make a copy of the optimizer (call it opt_noisy), add the noise to opt_noisy, and then pull the std from the expected minimum.
And now that I write this... shouldn't we check that the minima are identical in x and y (?)
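
A sketch of that alternative, assuming the optimizer can be deep-copied cleanly (an untested assumption); the unpacking mirrors the existing test:

import copy

# After training opt on the data (opt.tell(...) calls not shown):
opt_noisy = copy.deepcopy(opt)      # independent copy; opt itself stays noise-free
opt_noisy.add_modelled_noise()

res = opt.get_result()              # noise-free model
res_noisy = opt_noisy.get_result()  # model including the white noise

res_x, [res_y, res_std_no_white] = expected_minimum(res, return_std=True)
res_x_n, [res_y_n, res_std_noise] = expected_minimum(res_noisy, return_std=True)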

dk-teknologisk-mon (Collaborator) replied on Jun 21, 2023:

> And now that I write this... shouldn't we check that the minima are identical in x and y (?)

Well, the res_x location in your test is probably quite sensitive to the specific simulated data points (as there isn't a clear optimum), so I wouldn't want to add it for that reason alone.

More importantly, with the way our model is defined, the WhiteNoise is simply a constant increase of the uncertainty, with the same value across the whole parameter space. By definition it therefore cannot change the location of the expected minimum of the mean values. If the WhiteNoise is very large it may change the location in practice, as a consequence of using sampling to find the expected minimum, but it doesn't add any information and so should be left out.
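
Stated compactly, with $\mu(x)$ and $\sigma^2(x)$ the posterior mean and variance of the noise-free model and $\sigma_w^2$ the fitted WhiteNoise level (notation assumed here, not taken from the code):

$$\mu_{\text{noisy}}(x) = \mu(x), \qquad \sigma^2_{\text{noisy}}(x) = \sigma^2(x) + \sigma_w^2 \;\Longrightarrow\; \operatorname*{arg\,min}_x \mu_{\text{noisy}}(x) = \operatorname*{arg\,min}_x \mu(x)$$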

Further review comments on ProcessOptimizer/tests/test_optimizer.py were resolved.
dk-teknologisk-mon (Collaborator) commented:

> When/if this has been merged, we might wanna look at issue #95 again.

I absolutely agree. The noise level bounds refer to normalized y-values, and on this scale the present lower limit of 1E-5 is an absurdly unrealistic lower limit for noise in real data. Still, whether to change it is not so black-and-white: if your model ends up at that value for the noise, it is at least very easy to have the code flag that it is an unrealistic scenario...
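
A sketch of such a flag, assuming the fitted model is a scikit-learn GaussianProcessRegressor whose kernel contains a WhiteKernel term (this is not the actual ProcessOptimizer API, just an illustration):

import warnings
from sklearn.gaussian_process.kernels import WhiteKernel

NOISE_LOWER_BOUND = 1e-5  # the present lower limit, on normalized y-values

def warn_if_noise_at_bound(gp, tol=1.05):
    """Warn when the fitted white-noise level sits at (or very near) the lower bound."""
    # gp.kernel_ is the fitted kernel; get_params() exposes its nested sub-kernels.
    for value in gp.kernel_.get_params().values():
        if isinstance(value, WhiteKernel) and value.noise_level <= NOISE_LOWER_BOUND * tol:
            warnings.warn(
                "Fitted noise level is at its lower bound; the estimated "
                "experimental noise is probably not meaningful."
            )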

dk-teknologisk-mon merged commit 7fb8913 into develop on Jun 21, 2023
sqbl deleted the Fix#180 branch on Jun 22, 2023
Linked issue that may be closed by this pull request: Noise estimation is completely broken