
metadynamics as out of equilibrium method #5

Closed
roitberg opened this issue Feb 24, 2022 · 4 comments


@roitberg

Hi,

Very nice review !

I think one should extend the discussion of metadynamics (in all of its incarnations) as an intrinsically out-of-equilibrium method, which should then be treated with some Jarzynski-type reweighting.

As you properly described it, MTD works by changing the initial surface, applying a time-dependent bias. Whatever you get at the end of such a calculation can never really be considered a free energy; rather, it is 'work' done on the system.

So, there are a small number of solutions. Either one adds the Gaussians VERY slowly (which most people do not check), or one applies some reweighting at the end. However, the reweighting formulas shown in the paper can only work if one has not moved too far from the "correct" phase space. In other words, if you pull/add Gaussians too fast, no amount of reweighting is going to save you.
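The time-dependent bias at issue here is easy to see in a toy model. Below is a minimal sketch (my own illustration, not from the paper under review) of plain metadynamics on a 1D double well with overdamped Langevin dynamics: Gaussian hills are deposited every `stride` steps, so the potential the particle feels keeps changing, and the faster hills are added, the further the trajectory is driven from equilibrium. All parameter values are arbitrary.

```python
import numpy as np

def run_metadynamics(n_steps=20000, stride=500, height=0.5, sigma=0.2,
                     kT=1.0, dt=5e-3, gamma=1.0, seed=0):
    """Overdamped Langevin dynamics on the double well U(x) = (x^2 - 1)^2,
    with Gaussian hills deposited every `stride` steps (plain metadynamics)."""
    rng = np.random.default_rng(seed)
    centers = []           # positions where hills have been deposited so far
    x = -1.0               # start in the left well

    def bias_force(x):
        # -dV/dx for V(x, t) = sum over deposited hills of
        # height * exp(-(x - c)^2 / (2 sigma^2))
        if not centers:
            return 0.0
        c = np.asarray(centers)
        g = height * np.exp(-(x - c) ** 2 / (2 * sigma ** 2))
        return (g * (x - c) / sigma ** 2).sum()

    noise = np.sqrt(2 * kT * dt / gamma)
    for step in range(n_steps):
        f = -4 * x * (x ** 2 - 1)                  # physical force -dU/dx
        x += dt / gamma * (f + bias_force(x)) + noise * rng.normal()
        if step % stride == stride - 1:
            centers.append(x)                      # drop a new hill
    return x, centers

x_final, centers = run_metadynamics()
```

The accumulated bias fills the wells and drives barrier crossings; because V(x, t) depends on the deposition history, the final bias is the "work" done on the system rather than a clean free energy, which is exactly the point above.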

@mrshirts
Collaborator

This is a great question/comment. I think that in most cases, history-dependent methods can best be thought of as stochastic approximations to the free energy surface, with the caveats that implies. There's some theory on convergence of this stochastic approximation under certain conditions, but more certainly needs to be done.

Question: when talking about, say, steered MD to calculate a single free energy, it's very clear how Jarzynski-type reweighting should be done, as the initial ensemble is the NVT/NPT ensemble of configurations you start the pulling from. With metadynamics, you don't have a clearly defined distribution to reweight from, only a family of related distributions that evolves over time, so I don't know that you can really define a clear reweighting scheme.

My paranoid way of dealing with metadynamics-type approaches when I need quantitative results is to run them for a while, then freeze the biases, run some more with fixed biases, and analyze for thermodynamics only the data collected after bias freezing. Then there are no real theoretical issues, at the cost of some extra run time (and problems that occur when you stop too soon).
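Once the bias is frozen, the post-freeze trajectory samples a well-defined stationary ensemble with the static bias V(s), and standard importance reweighting applies. A minimal sketch of that analysis step (my own illustration of the idea, with hypothetical function and argument names):

```python
import numpy as np

def reweight_frozen_bias(obs, bias_values, kT=1.0):
    """Unbiased equilibrium average from a run with a *fixed* bias V(s):
        <A> = <A exp(V/kT)>_V / <exp(V/kT)>_V
    `obs` are observable values A(x_i) along the post-freeze trajectory;
    `bias_values` are the static bias V(s(x_i)) at the same frames."""
    bias_values = np.asarray(bias_values, dtype=float)
    # subtract the max before exponentiating to avoid overflow;
    # the constant shift cancels in the weighted average
    w = np.exp((bias_values - bias_values.max()) / kT)
    return np.average(np.asarray(obs, dtype=float), weights=w)
```

For instance, frames with bias kT·ln 3 carry three times the weight of unbiased frames: `reweight_frozen_bias([0.0, 1.0], [0.0, np.log(3.0)])` gives 0.75. This is the sense in which "there are no real theoretical issues" after freezing: only the static-bias ensemble enters, never the deposition history.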

@GiovanniBussi

I agree with @roitberg that metadynamics simulations are out of equilibrium. However, I am not aware of formalisms to analyze them with a Jarzynski-like equation. And indeed most people analyze them as if they were done at equilibrium, thus resulting in a (large, small, or negligible?) systematic error. So, I think there is some arbitrariness in the classification.

There is a nice cancellation property that makes it so that at least part of this systematic error cancels out. You can see this in this old paper for a Langevin system. My way of understanding this is that (a) if the biased CV is out of equilibrium there's no problem, since the error will cancel out; however (b) if some other slow variable is out of equilibrium you will have a systematic error. This is what you should look for. You certainly need some experience, but the typical tests to assess how much you were out of equilibrium are not too difficult to do (I am not saying that most papers are reporting them).

Regarding the frozen bias (which I also informally call "paranoia mode" ;-) @mrshirts ), let me add one comment. Whereas by running out of equilibrium you will add a systematic error, by running at equilibrium with fewer transitions you will increase the statistical error. Finding the sweet spot is not trivial. My experience is that well-tempered metadynamics is a good compromise, and indeed it clearly (if you keep your eyes open) does not work when you have bad CVs. Several other methods that are formally at equilibrium are more difficult to test against problems in the CV (e.g., multiple-window umbrella sampling without replica exchange).
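For readers unfamiliar with why well-tempered metadynamics sits at that sweet spot: each new hill is damped by the bias already deposited at the current CV value, w = w0 · exp(-V(s,t) / (kB·ΔT)) with ΔT = (γ-1)·T, so deposition slows down as the run progresses and the dynamics tends toward a quasi-equilibrium. A one-line sketch of that damping rule (a standard formula, stated here for illustration):

```python
import numpy as np

def wt_hill_height(w0, V_at_s, kT, bias_factor):
    """Well-tempered metadynamics hill height:
        w = w0 * exp(-V(s, t) / (kT * (bias_factor - 1)))
    where V_at_s is the bias already accumulated at the current CV value
    and bias_factor = gamma = (T + dT) / T > 1. As V grows, new hills
    shrink, so the bias converges instead of growing without bound."""
    return w0 * np.exp(-V_at_s / (kT * (bias_factor - 1)))
```

With gamma = 10 and kT = 1, a point already carrying 9 kT of bias receives hills damped by exp(-1) ≈ 0.37 relative to an unvisited point; in the gamma → ∞ limit the rule reduces to plain (non-tempered) metadynamics with constant hill height.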

PS Thanks @jhenin for hosting this!

@jhenin
Owner

jhenin commented Jun 17, 2022

Thank you for the comments @roitberg and @GiovanniBussi . The point about orthogonal CVs being out of equilibrium is a major one, I would call it a defining characteristic of nonequilibrium simulation. If all non-driven degrees of freedom are fully relaxed with respect to the dynamics of the driven CVs, then we are in a time-separated / adiabatic regime - that is, the ideal regime of TAMD, or the quasi-static "slow growth" limit of SMD. I doubt this applies to many real-world adaptive simulations.

In cases (ABF, wtMTD...) where the biases converge and the dynamics tends towards equilibrium, trajectory averages converge towards equilibrium averages as well, as the weight of early samples decays. In that sense, those methods are only transiently out of equilibrium, as opposed to methods like SMD with a defined nonequilibrium schedule. Does this distinction seem good enough for a classification?

@jhenin
Owner

jhenin commented Aug 25, 2022

Thank you for your contributions, we have updated the text accordingly.

@jhenin jhenin closed this as completed Aug 25, 2022