Relative entropy has been shown to provide a principled framework for the selection of coarse-grained potentials. Despite its intellectual appeal, its application has been limited by the fact that it requires solving an optimization problem with noisy gradients. When using deterministic optimization schemes, one is forced either to reduce the noise by adequate sampling or to resort to ad hoc modifications in order to avoid instabilities. The former increases the computational demand of the method, while the latter is of questionable validity. To address these issues and make relative entropy widely applicable, we propose alternative schemes for the solution of the optimization problem based on stochastic algorithms.
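To illustrate the kind of stochastic scheme meant here, consider Robbins–Monro-style stochastic approximation, which converges under noisy gradients by using a decaying step size. The sketch below is purely illustrative and assumes a toy quadratic objective with additive Gaussian gradient noise; it is not the paper's actual relative-entropy objective, whose gradients would be estimated from molecular samples.

```python
import random

def noisy_grad(theta, sigma=1.0):
    # Gradient of f(theta) = 0.5 * theta**2, corrupted by Gaussian noise
    # (a stand-in for a gradient estimated from finite sampling).
    return theta + random.gauss(0.0, sigma)

def robbins_monro(theta0, steps=5000, a=1.0):
    # Step sizes a/(k+1) satisfy the Robbins-Monro conditions
    # (their sum diverges, the sum of their squares converges),
    # so the iterates converge to the minimizer despite the noise.
    theta = theta0
    for k in range(steps):
        theta -= a / (k + 1) * noisy_grad(theta)
    return theta

random.seed(0)
print(robbins_monro(5.0))  # approaches the minimizer theta = 0
```

A fixed step size with these noisy gradients would keep the iterate fluctuating around the minimum (or diverging for a stiff objective), which is the instability the decaying schedule avoids.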