TY - JOUR
T1 - Approximate contraction of arbitrary tensor networks with a flexible and efficient density matrix algorithm
AU - Ma, Linjian
AU - Fishman, Matthew T.
AU - Stoudenmire, E. M.
AU - Solomonik, Edgar
N1 - M.F. and M.S. are grateful for ongoing support through the Flatiron Institute, a division of the Simons Foundation. L.M. acknowledges support through the Flatiron Institute internship program through which this work was initiated. The work of L.M. and E.S. was also supported by the US National Science Foundation (NSF) grant #1942995 and the Department of Energy (DOE) Advanced Scientific Computing Research program via award DE-SC0023483.
PY - 2024
Y1 - 2024
AB - Tensor network contractions are widely used in statistical physics, quantum computing, and computer science. We introduce a method to efficiently approximate tensor network contractions using low-rank approximations, where each intermediate tensor generated during the contractions is approximated as a low-rank binary tree tensor network. The proposed algorithm has the flexibility to incorporate a large portion of the environment when performing low-rank approximations, which can lead to high accuracy for a given rank. Here, the environment refers to the remaining set of tensors in the network, and low-rank approximations with larger environments can generally provide higher accuracy. For contracting tensor networks defined on lattices, the proposed algorithm can be viewed as a generalization of the standard boundary-based algorithms. In addition, the algorithm includes a cost-efficient density matrix algorithm for approximating a tensor network with a general graph structure into a tree structure, whose computational cost is asymptotically upper-bounded by that of the standard algorithm that uses canonicalization. Experimental results indicate that the proposed technique outperforms previously proposed approximate tensor network contraction algorithms for multiple problems in terms of both accuracy and efficiency.
UR - http://www.scopus.com/inward/record.url?scp=85214091314&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85214091314&partnerID=8YFLogxK
U2 - 10.22331/q-2024-12-27-1580
DO - 10.22331/q-2024-12-27-1580
M3 - Article
AN - SCOPUS:85214091314
SN - 2521-327X
VL - 8
JO - Quantum
JF - Quantum
ER -