Bulletin of the American Physical Society
APS March Meeting 2019
Volume 64, Number 2
Monday–Friday, March 4–8, 2019; Boston, Massachusetts
Session C18: Machine Learning Quantum Many-body Models (Focus Session)

Sponsoring Units: DCOMP, DCMP, DAMOP | Chair: Yi Zhang, Cornell University | Room: BCEC 156B 
Monday, March 4, 2019, 2:30 PM–3:06 PM 
C18.00001: Quantum Loop Topography for Machine Learning Transport Invited Speaker: Yi Zhang Despite the rapidly growing interest in harnessing machine learning in the study of quantum many-body systems, a central challenge remains: efficiently extracting information from model simulations of quantum states and turning that information into formats compatible with a machine learning architecture, such as an artificial neural network. Here we introduce quantum loop topography (QLT): a feature-selection machine learning procedure that evaluates loop products of correlators of the microscopic models at independent Monte Carlo steps. Guided by the contribution of the current-current correlations, we demonstrate that QLT can probe the distinctive transport properties of diverse states of matter, which are sometimes challenging to access directly. To showcase this approach, we study emergent superconducting fluctuations as well as topological phases with quantized Hall transport. We find that the QLT approach detects a change in transport in very good agreement with the established phase diagrams. We also demonstrate that our pre-selection of features relevant to transport allows us to work with a simple neural network, and we then offer an interpretation of such a neural network in terms of analytical decision criteria. The high fidelity and numerical efficiency of our machine learning algorithm also point to a way to identify hitherto elusive transport phenomena such as non-Fermi liquids. 
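The loop-product feature construction described in the abstract can be sketched in a few lines. This is a hypothetical minimal version: the function name, the random stand-in "correlator" matrix, and the triangle list are illustrative, not the talk's actual implementation, which builds features from Monte Carlo samples of correlators at independent steps.

```python
import random

def loop_features(G, triangles):
    """Toy loop-product features: for each triangle (j, k, l), multiply
    the two-point functions around the loop, G[j][k]*G[k][l]*G[l][j]."""
    return [G[j][k] * G[k][l] * G[l][j] for (j, k, l) in triangles]

random.seed(0)
n = 4
# stand-in "correlator" matrix for a 4-site system (random numbers here)
G = [[random.uniform(-1.0, 1.0) for _ in range(n)] for _ in range(n)]
triangles = [(0, 1, 2), (1, 2, 3)]
feats = loop_features(G, triangles)
print(len(feats))  # one feature per loop: 2
```

The resulting feature vector, one entry per loop, is what would be fed into a simple classifier network.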
Monday, March 4, 2019, 3:06 PM–3:18 PM 
C18.00002: Recent advances in the study of frustrated magnetism with Neural-Network quantum states Kenny Choo, Titus Neupert, Giuseppe Carleo Neural-network quantum states are an actively explored route to solving challenging interacting quantum problems. 
Monday, March 4, 2019, 3:18 PM–3:30 PM 
C18.00003: Symmetries and Many-Body Excitations with Neural-Network Quantum States Kenny Jing Choo, Giuseppe Carleo, Nicolas Regnault, Titus Neupert Artificial neural networks have recently been introduced as a variational ansatz for representing many-body wave functions. While previous efforts have focused on obtaining ground states, in this work we extend the method to the study of excited states, an important task for many condensed matter applications. First, we give a prescription that allows us to target the lowest-energy state within a given symmetry sector of the Hamiltonian. Second, we give a simple algorithm to compute the low-lying states without symmetries. We demonstrate this approach on the one-dimensional spin-1/2 Heisenberg model and the one-dimensional Bose-Hubbard model and find good agreement where exact results are available. We apply our approach using both the restricted Boltzmann machine (RBM) and the feed-forward neural network (FFNN). Interestingly, we obtain more accurate results using a deeper FFNN than with a shallower RBM with a comparable number of variational parameters. 
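One concrete ingredient in targeting the lowest state within a symmetry sector, as in the abstract's first prescription, is to restrict the variational Monte Carlo sampling to configurations in that sector. The sketch below is a hypothetical minimal version (the function name and update rule are illustrative): it conserves total S^z by only proposing up-down spin exchanges.

```python
import random

def propose_in_sector(state):
    """Exchange one up spin with one down spin, so the total S^z of the
    configuration is conserved and sampling stays inside a single
    magnetization sector."""
    ups = [i for i, s in enumerate(state) if s == 1]
    downs = [i for i, s in enumerate(state) if s == -1]
    i, j = random.choice(ups), random.choice(downs)
    new = list(state)
    new[i], new[j] = new[j], new[i]
    return new

random.seed(1)
state = [1, 1, -1, -1, 1, -1]        # start in the S^z = 0 sector
for _ in range(100):
    state = propose_in_sector(state)
print(sum(state))  # total S^z stays 0
```

A sampler built from such moves never leaves the chosen sector, so the variational optimization converges to the lowest state within it.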
Monday, March 4, 2019, 3:30 PM–3:42 PM 
C18.00004: Learning Quantum Models from Symmetries Eli Chertkov, Benjamin Villalonga, Bryan Clark Inverse-method algorithms that learn models from data, such as machine learning algorithms, have been successful at solving complicated engineering tasks and are increasingly being applied to study quantum systems. Moreover, inverse methods have the potential to automate the discovery of quantum materials with desired properties. With this goal in mind, we present an inverse-method algorithm for learning quantum models, i.e., Hamiltonians, from symmetries or integrals of motion. The forward problem of starting from a given Hamiltonian and finding its symmetries is generically difficult both analytically and numerically. Yet, despite the difficulty of the forward problem, we show that this method can efficiently solve the inverse problem of starting from a desired set of symmetries and finding Hamiltonians obeying those symmetries. In this talk, we describe this inverse method and give examples of its application. 
Monday, March 4, 2019, 3:42 PM–3:54 PM 
C18.00005: Parent Hamiltonians of restricted Boltzmann machine wavefunctions Samuel Lederer, Eun-Ah Kim Wavefunctions based on the restricted Boltzmann machine (RBM) architecture have recently been found to be highly effective in variational calculations on interacting spin models. The promise of RBMs for compactly encoding quantum states motivates study of the structure underlying RBM wavefunctions. Here we tackle the question of which Hamiltonians take RBM wavefunctions as eigenstates. Using a recently developed framework for identifying local Hamiltonians from wavefunctions, we consider a select set of RBM states and comment on the features of their parent Hamiltonians. 
Monday, March 4, 2019, 3:54 PM–4:06 PM 
C18.00006: Learning a local Hamiltonian from local measurements Eyal Bairey, Itai Arad, Netanel Lindner Recovering an unknown Hamiltonian from measurements is an increasingly important task for the certification of noisy quantum devices and simulators. Recent works have succeeded in recovering the Hamiltonian of an isolated quantum system with local interactions from long-ranged correlators of a single eigenstate. Here, we show that such Hamiltonians can be recovered from local observables alone, using computational and measurement resources scaling linearly with the system size. In fact, to recover the Hamiltonian acting on each finite spatial domain, only observables within that domain are required. The observables can be measured in a Gibbs state as well as in a single eigenstate; furthermore, they can be measured in a state evolved by the Hamiltonian for a long time, allowing us to recover a large family of time-dependent Hamiltonians. We derive an estimate for the statistical recovery error due to the approximation of expectation values using a finite number of samples, which agrees well with numerical simulations. 
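The core constraint behind this kind of recovery is that ⟨[H, A]⟩ vanishes in any eigenstate of H for every observable A, so each measured observable gives one linear constraint on the unknown Hamiltonian coefficients. The single-qubit toy below (all names and the specific Hamiltonian are illustrative, not the authors' code) verifies these constraints for H = aX + bZ; in the full scheme, the null space of the constraint matrix recovers the coefficients.

```python
import math

# Pauli matrices as 2x2 complex lists
X = [[0, 1], [1, 0]]
Y = [[0, -1j], [1j, 0]]
Z = [[1, 0], [0, -1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def expval(psi, A):
    """<psi|A|psi> for a two-component state."""
    Apsi = [sum(A[i][j] * psi[j] for j in range(2)) for i in range(2)]
    return sum(psi[i].conjugate() * Apsi[i] for i in range(2))

def comm_exp(psi, A, B):
    """<psi| i[A, B] |psi>; vanishes when |psi> is an eigenstate of A."""
    AB, BA = matmul(A, B), matmul(B, A)
    C = [[1j * (AB[i][j] - BA[i][j]) for j in range(2)] for i in range(2)]
    return expval(psi, C)

# "Unknown" Hamiltonian H = a*X + b*Z and one of its eigenstates
a, b = 0.6, 0.8
E = math.hypot(a, b)
norm = math.hypot(b + E, a)
psi = [(b + E) / norm, a / norm]

# Each observable A gives one linear constraint:
# a*<i[X,A]> + b*<i[Z,A]> = <i[H,A]> = 0.
for A in (X, Y, Z):
    c = a * comm_exp(psi, X, A) + b * comm_exp(psi, Z, A)
    assert abs(c) < 1e-10
print("constraints satisfied")
```

For a local lattice Hamiltonian, the same construction restricted to a finite domain only involves observables within that domain, which is why the measurement cost stays local.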
Monday, March 4, 2019, 4:06 PM–4:18 PM 
C18.00007: Accelerating Density Matrix Renormalization Group Computations with Machine Learning Jacob Marks, Hong-Chen Jiang, Thomas Devereaux The density matrix renormalization group (DMRG) has achieved great success as a technique for simulating one-dimensional and quasi-two-dimensional quantum systems. One major bottleneck in these computations is the variational procedure for ground state approximation. We investigate the application of machine learning methods to this problem and improve convergence times for various classes of strongly correlated systems. 
Monday, March 4, 2019, 4:18 PM–4:30 PM 
C18.00008: Observation of topological phenomena in a programmable lattice of 1,800 qubits Juan Carrasquilla Here we demonstrate a large-scale quantum simulation of topological phenomena in a network of 1,800 in situ programmable superconducting flux qubits arranged in a fully frustrated square-octagonal lattice. Essential to the critical behavior, we observe the emergence of a complex order parameter with continuous rotational symmetry and the onset of quasi-long-range order as the system approaches a critical temperature. We use a simple but previously undemonstrated approach to statistical estimation with an annealing-based quantum processor, performing Monte Carlo sampling in a chain of reverse quantum annealing protocols. Observations are consistent with classical simulations across a range of Hamiltonian parameters. We anticipate that our approach of using a quantum processor as a programmable magnetic lattice will find widespread use in the simulation and development of exotic materials. 
Monday, March 4, 2019, 4:30 PM–4:42 PM 
C18.00009: Self-learning with neural networks in determinant quantum Monte Carlo studies of the Holstein model Shaozhi Li, Philip Dee, Ehsan Khatami, Steven Johnston Machine learning techniques have recently attracted the attention of many investigators in computational many-body physics. In particular, some practitioners of quantum Monte Carlo have examined the efficacy of various "self-learning" techniques, which aim to reduce the CPU time associated with updates and autocorrelation times. We have used artificial neural networks (NNs) within determinant quantum Monte Carlo to improve the scaling of CPU time with typical system parameters. This work focuses on a single-band Holstein Hamiltonian, which models Einstein phonons coupled to the on-site electron density. We have implemented both fully connected and convolutional NNs and used them to study the metallic and insulating phases of this model. To close, we will assess the generality of this approach for other model systems. 
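The generic self-learning Monte Carlo pattern behind such work can be sketched as follows. This is a schematic, hypothetical version: the toy Ising-like energies stand in for the expensive determinant-QMC weight and the learned neural-network surrogate; the exactness-preserving step is the cumulative accept/reject with the true energy.

```python
import math, random

def self_learning_step(state, E_exact, E_eff, beta, flips=5):
    """One self-learning update: run several cheap moves with a learned
    effective energy E_eff, then accept/reject the whole trajectory with
    the exact energy so the exact distribution is preserved."""
    trial = list(state)
    for _ in range(flips):
        i = random.randrange(len(trial))
        new = list(trial)
        new[i] *= -1
        if random.random() < math.exp(-beta * (E_eff(new) - E_eff(trial))):
            trial = new
    # global correction step with the exact (expensive) energy
    dE_exact = E_exact(trial) - E_exact(state)
    dE_eff = E_eff(trial) - E_eff(state)
    if random.random() < math.exp(-beta * (dE_exact - dE_eff)):
        return trial
    return state

# toy Ising-chain energies: the surrogate slightly misestimates the coupling
E_exact = lambda s: -sum(s[i] * s[i + 1] for i in range(len(s) - 1))
E_eff = lambda s: -0.9 * sum(s[i] * s[i + 1] for i in range(len(s) - 1))
random.seed(4)
state = [1] * 10
for _ in range(200):
    state = self_learning_step(state, E_exact, E_eff, beta=0.5)
print(all(s in (1, -1) for s in state))
```

The speed-up comes from evaluating the expensive energy once per trajectory instead of once per local move, while the correction step keeps the sampling unbiased.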
Monday, March 4, 2019, 4:42 PM–4:54 PM 
C18.00010: Unsupervised manifold learning of ground state wave functions Michael Matty, Yi Zhang, Senthil Todadri, Eun-Ah Kim Quantum many-body wave functions are complex objects that encode a great deal of information, but extracting that information can be challenging. In particular, there is no good way to assess whether a given wave function can be the ground state of some local Hamiltonian. Here we employ an unsupervised machine learning algorithm well-suited to discovering trends in high-dimensional spaces: manifold learning. We apply our approach to a band insulator and the toric code and demonstrate that it can separate ground state wave functions from excited state wave functions without any prior knowledge. 
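The premise of manifold learning here is that samples drawn from distinct states form separate clusters in a high-dimensional space, so neighborhood structure alone can distinguish them. The toy below uses fabricated stand-in data (not the talk's wavefunctions) to show the nearest-neighbor separation that methods like Isomap or t-SNE build on.

```python
import random

def dist2(u, v):
    """Squared Euclidean distance between two vectors."""
    return sum((a - b) ** 2 for a, b in zip(u, v))

random.seed(2)
def sample(center, spread, dim=50):
    """Stand-in for a wavefunction snapshot near a given 'state'."""
    return [center + random.gauss(0.0, spread) for _ in range(dim)]

ground = [sample(0.0, 0.1) for _ in range(20)]
excited = [sample(1.0, 0.1) for _ in range(20)]

# Nearest-neighbor structure already separates the two families:
within = min(dist2(ground[0], g) for g in ground[1:])
between = min(dist2(ground[0], e) for e in excited)
print(within < between)  # True: nearest neighbors stay within one family
```

Manifold-learning algorithms turn exactly this neighborhood graph into a low-dimensional embedding in which the families appear as distinct clusters, with no labels required.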
Monday, March 4, 2019, 4:54 PM–5:06 PM 
C18.00011: Machinery representation of physics models via structured selfattention network Junwei Liu, Yang Zhang, Yujun Zhao Recently, machine learning techniques, especially deep neural networks, have been widely used to identify phases and phase transitions and to speed up simulations in a variety of physical models. However, deep neural networks are often required to model the energy or partition function, which incurs heavy computational cost and limits applications to large systems. In this work, we propose a structured self-attention neural network approach, inspired by mean-field theory, to represent the original Hamiltonian via well-structured neural layers. We experiment with the ring-exchange Ising model and the double-exchange model, and show that these two models can be well represented by pseudo-spin layers and local-field layers. Quite different from a sequential neural network, which stores all the information in every single layer, this separation strategy achieves significantly shorter training time, higher accuracy, and straightforward application to large systems. We therefore believe the new structured network will also be highly efficient for identifying different phases and accelerating numerical simulations of even more complex models. 
Monday, March 4, 2019, 5:06 PM–5:18 PM 
C18.00012: Neural Network Renormalization Group Shuo-Hui Li, Lei Wang We present a variational renormalization group (RG) approach using a deep generative model based on normalizing flows. The model performs hierarchical change-of-variables transformations from the physical space to a latent space with reduced mutual information. Conversely, the neural net directly maps independent Gaussian noise to physical configurations following the inverse RG flow. The model has an exact and tractable likelihood, which allows unbiased training and direct access to the renormalized energy function of the latent variables. To train the model, we employ probability density distillation with the bare energy function of the physical problem, in which the training loss provides a variational upper bound on the physical free energy. We demonstrate practical usage of the approach by identifying mutually independent collective variables of the Ising model and performing accelerated hybrid Monte Carlo sampling in the latent space. 
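The "exact and tractable likelihood" of a normalizing flow comes from the change-of-variables formula: the log-density of a physical configuration is the base-Gaussian log-density of its latent image plus the log-Jacobian of the inverse map. A one-variable toy (hypothetical; the talk's model stacks hierarchical invertible layers) makes the bookkeeping explicit.

```python
import math, random

def flow_logprob(x, mu, log_sigma):
    """Exact log-likelihood of a one-variable normalizing flow
    x = mu + sigma*z with z ~ N(0, 1):
    log p(x) = log N(z; 0, 1) + log|dz/dx| = log N(z) - log sigma."""
    sigma = math.exp(log_sigma)
    z = (x - mu) / sigma                 # inverse map: physical -> latent
    log_base = -0.5 * z * z - 0.5 * math.log(2 * math.pi)
    return log_base - log_sigma          # change-of-variables Jacobian

random.seed(3)
mu, log_sigma = 1.5, math.log(2.0)
z = random.gauss(0.0, 1.0)               # independent Gaussian noise
x = mu + math.exp(log_sigma) * z         # generative (inverse-RG) direction
lp = flow_logprob(x, mu, log_sigma)
print(lp < 0.0)  # log-density of a broad Gaussian is negative: True
```

Because this likelihood is exact, the expectation of (bare energy + log p) over model samples gives the variational free-energy bound used for training.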
Monday, March 4, 2019, 5:18 PM–5:30 PM 
C18.00013: Learning density functional theory mappings with extensive deep neural networks and deep convolutional inverse graphics networks Kevin Ryczko, David Strubbe, Isaac Tamblyn In this work, we show that deep neural networks (DNNs) can be used in conjunction with Kohn-Sham density functional theory (KS-DFT) for two-dimensional electron gases in simple harmonic oscillator and random potentials. Using calculations from the Octopus real-space DFT code, we show that extensive DNNs (EDNNs) can learn the mappings between the electron density and the exchange, correlation, external, kinetic, and total energies simultaneously. Our results hold for local, semi-local, and hybrid exchange-correlation functionals. We then show that the external potential can also be used as the input to an EDNN when predicting the aforementioned energy functionals, bypassing the KS scheme. Additionally, we show that EDNNs can be used to map an electron density calculated with a local exchange-correlation functional to energies calculated with a semi-local exchange-correlation functional. Lastly, we show that deep convolutional inverse graphics networks can be used to map external potentials to their respective self-consistent electron densities. This work shows that EDNNs are generalizable and transferable, given the variability of the potentials and the ability to scale to an arbitrary system size with an O(N) computational cost. 
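The O(N) scaling and transferability claimed for EDNNs follow from their structure: the same local network is applied to each tile of the input density and the outputs are summed, so the prediction is extensive by construction. A toy version (the hand-written local function stands in for the learned network; all names are illustrative) shows that doubling the system exactly doubles the predicted energy.

```python
def ednn_energy(density, f, patch=3):
    """Extensive-DNN idea: tile the density into non-overlapping local
    patches, apply the same local map f to each, and sum the results,
    so the prediction is extensive and the cost is O(N)."""
    n = len(density)
    return sum(f(density[i:i + patch]) for i in range(0, n - patch + 1, patch))

f = lambda p: sum(x * x for x in p)   # stand-in for the learned local network
rho = [0.1, 0.2, 0.3] * 4             # toy periodic density on 12 grid points
E1 = ednn_energy(rho, f)
E2 = ednn_energy(rho * 2, f)          # same density pattern, doubled system
print(abs(E2 - 2 * E1) < 1e-12)       # extensivity: doubling doubles E
```

Since each patch is processed independently, the evaluation cost grows linearly with the number of grid points, matching the abstract's O(N) claim.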
© 2021 American Physical Society. All rights reserved.