TY - JOUR
T1 - Backflow Transformations via Neural Networks for Quantum Many-Body Wave Functions
AU - Luo, Di
AU - Clark, Bryan K.
N1 - Publisher Copyright: © 2019 American Physical Society.
PY - 2019/6/4
Y1 - 2019/6/4
AB - Obtaining an accurate ground state wave function is one of the great challenges in the quantum many-body problem. In this Letter, we propose a new class of wave functions, neural network backflow (NNB). The backflow approach, originally pioneered by Feynman and Cohen [Phys. Rev. 102, 1189 (1956); 10.1103/PhysRev.102.1189], adds correlation to a mean-field ground state by transforming the single-particle orbitals in a configuration-dependent way. NNB uses a feed-forward neural network to learn the optimal transformation via variational Monte Carlo calculations. NNB directly dresses a mean-field state, can be systematically improved, and directly alters the sign structure of the wave function. It generalizes the standard backflow [L. F. Tocchio et al., Phys. Rev. B 78, 041101(R) (2008); 10.1103/PhysRevB.78.041101], which we show can be explicitly represented as an NNB. We benchmark the NNB on Hubbard models at intermediate doping, finding that it significantly decreases the relative error, restores the symmetry of both observables and single-particle orbitals, and decreases the double-occupancy density. Finally, we illustrate interesting patterns in the weights and biases of the optimized neural network.
UR - http://www.scopus.com/inward/record.url?scp=85066961403&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85066961403&partnerID=8YFLogxK
U2 - 10.1103/PhysRevLett.122.226401
DO - 10.1103/PhysRevLett.122.226401
M3 - Article
C2 - 31283262
AN - SCOPUS:85066961403
SN - 0031-9007
VL - 122
JO - Physical Review Letters
JF - Physical Review Letters
IS - 22
M1 - 226401
ER -