Probabilistic Computing for Quantum Simulations
Analysis
This paper addresses the computational bottleneck in simulating quantum many-body systems with neural-network wavefunctions. By combining sparse Boltzmann machines with FPGA-based probabilistic computing hardware, the authors achieve significant improvements in scaling and sampling efficiency. Key contributions are a custom multi-FPGA cluster and a novel dual-sampling algorithm for training deep Boltzmann machines, which together enable simulations of larger systems and deeper variational architectures. The work is significant because it offers a potential path around the sampling limitations of traditional Monte Carlo methods in variational quantum simulation.
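The probabilistic-computing approach the paper builds on replaces pseudorandom Monte Carlo updates with hardware "p-bits" that fluctuate according to their local field. A minimal software sketch of this kind of sampler, for a small sparse Boltzmann machine with hypothetical couplings (all sizes and parameters here are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sparse Boltzmann machine: 16 bipolar units, ~2 couplings each.
# J (symmetric couplings) and h (biases) are hypothetical placeholders.
n = 16
J = np.zeros((n, n))
for _ in range(2 * n):
    i, j = rng.integers(0, n, size=2)
    if i != j:
        w = rng.normal(scale=0.5)
        J[i, j] += w
        J[j, i] += w
h = rng.normal(scale=0.1, size=n)

s = rng.choice([-1, 1], size=n)  # current p-bit states

def pbit_sweep(s, beta=1.0):
    """One asynchronous Gibbs sweep: each unit is set to +1 with
    probability sigmoid(2*beta*field), mimicking a p-bit update."""
    for i in rng.permutation(n):
        field = J[i] @ s + h[i]
        p_up = 0.5 * (1.0 + np.tanh(beta * field))
        s[i] = 1 if rng.random() < p_up else -1
    return s

def energy(s):
    """Ising-form energy of the Boltzmann machine."""
    return -0.5 * s @ J @ s - h @ s

for _ in range(200):  # relax toward low-energy configurations
    pbit_sweep(s, beta=2.0)
```

On the FPGA cluster described in the paper, many such updates run in parallel in hardware; this sequential sketch only illustrates the update rule itself.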
Key Takeaways
- Combines sparse Boltzmann machines with FPGA-based probabilistic computing hardware to improve quantum simulation efficiency.
- Achieves accurate ground-state energies for large lattices (up to 80 × 80).
- Introduces a dual-sampling algorithm for training deep Boltzmann machines, improving parameter efficiency.
- Demonstrates a path to overcome sampling bottlenecks in variational quantum simulations.
“The authors obtain accurate ground-state energies for lattices up to 80 x 80 (6400 spins) and train deep Boltzmann machines for a system with 35 x 35 (1225 spins).”