Sebastian Sanokowski, Wilhelm Berghammer, Johannes Kofler, Sepp Hochreiter, and Sebastian Lehner

An illustration of the proposed architecture.

For a wide range of combinatorial optimization problems, finding the optimal solutions is equivalent to finding the ground states of corresponding Ising Hamiltonians. Recent work shows that these ground states are found more efficiently by variational approaches using autoregressive models than by traditional methods. In contrast to previous works, where a new model has to be trained for every problem instance, we aim at a single model that approximates the ground states for a whole family of Hamiltonians. We demonstrate that autoregressive neural networks can be trained to achieve this goal and are able to generalize across a class of problems. We iteratively approximate the ground state based on a representation of the Hamiltonian that is provided by a graph neural network. Our experiments show that solving a large number of related problem instances by a single model can be considerably more efficient than solving them individually.
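To make the variational idea concrete, the sketch below shows the core loop in miniature: an Ising energy function and a toy autoregressive sampler trained with a REINFORCE-style gradient on the variational free energy. This is a hedged illustration only, not the paper's method; it omits the graph neural network conditioning and the LSTM-based architecture that are central to the work, and all names (`ARSampler`, `train_step`, the linear conditioning weights) are hypothetical.

```python
import numpy as np

def ising_energy(spins, J, h):
    """E(s) = -1/2 * s^T J s - h^T s for spins s_i in {-1, +1}."""
    return -0.5 * spins @ J @ spins - h @ spins

class ARSampler:
    """Toy autoregressive model: p(s_i = +1 | s_<i) = sigmoid(W[i, :i] @ s_<i + b_i).
    (A stand-in for the LSTM used in the paper.)"""
    def __init__(self, n, seed=0):
        self.n = n
        self.W = np.zeros((n, n))   # strictly lower-triangular conditioning weights
        self.b = np.zeros(n)
        self.rng = np.random.default_rng(seed)

    def sample(self):
        """Draw one spin configuration site by site; return spins and per-site p(+1)."""
        s, p = np.zeros(self.n), np.zeros(self.n)
        for i in range(self.n):
            logit = self.W[i, :i] @ s[:i] + self.b[i]
            p[i] = 1.0 / (1.0 + np.exp(-logit))
            s[i] = 1.0 if self.rng.random() < p[i] else -1.0
        return s, p

    def log_prob(self, s, p):
        """Log probability of a sampled configuration under the stored per-site probabilities."""
        return float(np.sum(np.where(s > 0, np.log(p), np.log(1.0 - p))))

def train_step(model, J, h, T=0.1, lr=0.05, batch=64):
    """One REINFORCE step on the variational free energy E_p[E(s) + T log p(s)]."""
    samples = [model.sample() for _ in range(batch)]
    rewards = np.array([ising_energy(s, J, h) + T * model.log_prob(s, p)
                        for s, p in samples])
    baseline = rewards.mean()   # variance-reduction baseline
    gb = np.zeros_like(model.b)
    gW = np.zeros_like(model.W)
    for (s, p), r in zip(samples, rewards):
        score_b = (s + 1.0) / 2.0 - p   # d log p / d b_i
        gb += (r - baseline) * score_b
        # d log p / d W[i, j] = score_b[i] * s[j] for j < i
        gW += (r - baseline) * np.tril(np.outer(score_b, s), k=-1)
    model.b -= lr * gb / batch
    model.W -= lr * gW / batch
    return rewards.mean()
```

A minimal usage example on a small ferromagnet, where the ground state is all spins aligned: build `J = np.ones((n, n)) - np.eye(n)`, `h = np.zeros(n)`, and call `train_step(model, J, h)` repeatedly; the sampled energies decrease toward the ground-state energy. In the paper, the per-instance parameters are instead produced from a GNN embedding of the Hamiltonian, which is what allows one model to cover a whole family of instances.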

Machine Learning and the Physical Sciences - NeurIPS 2022, 2022-12-03.

IARAI Authors
Dr Sepp Hochreiter, Sebastian Lehner
Research
Physics
Keywords
Graph Neural Networks, Hamiltonian, Ising Model, Long Short-Term Memory, Optimization

©2023 IARAI - INSTITUTE OF ADVANCED RESEARCH IN ARTIFICIAL INTELLIGENCE
