This repository has been archived by the owner on Jul 16, 2020. It is now read-only.

Stochastic reconfiguration and Linear Method as pytorch optimizers #15

Open
NicoRenaud opened this issue Dec 28, 2019 · 1 comment

@NicoRenaud
Collaborator

Both SR and LM require the calculation of some gradients w.r.t. the optimization parameters over the sampling points. We could read those from the .grad attribute of the parameters after evaluating the wave function.

We should implement SR and LM as pytorch optimizers so that we can switch freely between traditional ML optimizers and dedicated QMC optimizers.
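A minimal sketch of what such an optimizer could look like, assuming the caller passes in the per-sample gradients of ln ψ and the energy gradient explicitly (the class name, `step` signature, and regularization parameter `eps` are illustrative choices, not the API of this repository). SR preconditions the gradient with the inverse of the overlap matrix S_ij = ⟨O_i O_j⟩ − ⟨O_i⟩⟨O_j⟩, where O_i = ∂ ln ψ / ∂θ_i:

```python
import torch

class StochasticReconfiguration(torch.optim.Optimizer):
    """Sketch of SR as a pytorch optimizer (hypothetical interface).

    step() expects:
      grad_log_psi : per-sample gradients O_k = d ln psi / d theta,
                     shape [nsamples, nparams]
      energy_grad  : gradient of the energy w.r.t. theta, shape [nparams]
    """

    def __init__(self, params, lr=1e-2, eps=1e-4):
        defaults = dict(lr=lr, eps=eps)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self, grad_log_psi, energy_grad):
        group = self.param_groups[0]

        # S_ij = <O_i O_j> - <O_i><O_j>, averaged over the samples
        o_mean = grad_log_psi.mean(dim=0)
        centered = grad_log_psi - o_mean
        S = centered.t() @ centered / grad_log_psi.shape[0]

        # Tikhonov regularization keeps S invertible on few samples
        S += group['eps'] * torch.eye(S.shape[0])

        # natural-gradient-like direction: solve S * delta = g
        delta = torch.linalg.solve(S, energy_grad)

        # distribute the flat update back onto the parameter tensors
        offset = 0
        for p in group['params']:
            n = p.numel()
            p -= group['lr'] * delta[offset:offset + n].view_as(p)
            offset += n
```

Because it subclasses `torch.optim.Optimizer`, it can be swapped in wherever `Adam` or `SGD` is used today; only the call to `step()` differs, since SR needs the per-sample O_k matrix rather than the accumulated `.grad` alone.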

@NicoRenaud
Collaborator Author

SR is in master now (not really tested though ...)
