
VariationalGENN

Variational Monte Carlo with gauge-equivariant neural-network quantum states for $\mathbb{Z}_N$ lattice gauge theories in 2+1 dimensions. The network architecture enforces local gauge invariance exactly, so sampling stays on the physical (gauge-invariant) subspace throughout training.

Research output: Deep learning lattice gauge theories, Phys. Rev. B 110, 165133 (2024) — A. Apte, C. Córdova, T.-C. Huang, A. Ashmore.


What the code does

  • Builds gauge-invariant neural-network ansätze for $\mathbb{Z}_N$ pure gauge theory on a 2D spatial lattice in the Hamiltonian formulation.
  • Optimizes the ansätze via variational Monte Carlo (NetKet-based) to find ground states as a function of the gauge coupling.
  • Uses transfer learning across coupling values to track the ground state through the confinement / deconfinement transition, rather than retraining from scratch at each point.
  • Extracts critical behavior: data collapse and finite-size scaling to recover critical exponents (continuous transition for $\mathbb{Z}_2$ in the Ising universality class; weakly first-order for $\mathbb{Z}_3$).
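To make the gauge-invariance constraint concrete (this is an illustration, not code from this repository): in additive notation for $\mathbb{Z}_N$ link variables, the plaquette fluxes that a gauge-invariant ansatz may depend on are unchanged under arbitrary local gauge transformations. A minimal NumPy check:

```python
import numpy as np

N, L = 3, 4  # Z_3 gauge group on a 4x4 periodic lattice (illustrative values)
rng = np.random.default_rng(0)

# Link variables links[mu, x, y] in {0, ..., N-1}; mu=0 is the x-direction link
# leaving site (x, y), mu=1 the y-direction link.
links = rng.integers(0, N, size=(2, L, L))

def plaquettes(links):
    """Z_N flux through each plaquette: signed sum of link phases around it."""
    ux, uy = links
    return (ux + np.roll(uy, -1, axis=0) - np.roll(ux, -1, axis=1) - uy) % N

def gauge_transform(links, g):
    """Local Z_N gauge transformation g[x, y]: U_mu(x) -> g(x) + U_mu(x) - g(x+mu)."""
    ux, uy = links
    new_ux = (g + ux - np.roll(g, -1, axis=0)) % N
    new_uy = (g + uy - np.roll(g, -1, axis=1)) % N
    return np.stack([new_ux, new_uy])

g = rng.integers(0, N, size=(L, L))  # an arbitrary gauge transformation
assert np.array_equal(plaquettes(links), plaquettes(gauge_transform(links, g)))
```

Because the plaquette sum picks up each `g` value once with a plus sign and once with a minus sign, the fluxes cancel exactly; a network built from such invariants never leaves the physical subspace.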

Tech stack

| Layer | Tool |
| --- | --- |
| VMC & NNQS machinery | NetKet |
| Autodiff / accelerator | JAX (via NetKet) |
| Experiments | Jupyter notebooks |
| Cluster execution | SLURM (`gpu.sbatch`, `cluster.py`) |
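The repository's actual `gpu.sbatch` is not reproduced here; a generic SLURM array template for the kind of sweep driven by `cluster.py` might look like the following sketch (partition name, time limit, array bounds, and the `--task-id` flag are all placeholders/assumptions):

```shell
#!/bin/bash
#SBATCH --job-name=vgenn-sweep
#SBATCH --partition=gpu          # placeholder partition name
#SBATCH --gres=gpu:1
#SBATCH --time=12:00:00
#SBATCH --array=0-47             # e.g. 4 sizes x 6 couplings x 2 seeds

# cluster.py is assumed to map SLURM_ARRAY_TASK_ID to one
# (lattice size, coupling, seed) combination of the sweep.
python cluster.py --task-id "${SLURM_ARRAY_TASK_ID}"
```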

Running experiments

Most experiments live in Jupyter notebooks and were launched on a GPU SLURM cluster. The typical flow:

  1. Prototype the architecture and optimization in a notebook against a small lattice (where exact diagonalization is available for sanity-checking).
  2. Scale up via cluster.py + gpu.sbatch — the SLURM template launches a parameter sweep (lattice size × coupling × seed).
  3. Post-process results (energy convergence, observables, scaling fits) back in notebooks.
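The warm-started sweep over couplings (step 2, plus the transfer-learning strategy described above) can be illustrated on a toy two-level problem — this is an illustration, not the repository's code. For $H(g) = \begin{pmatrix} -1 & g \\ g & 1 \end{pmatrix}$ and the variational state $(\cos\theta, \sin\theta)$, each coupling's optimization starts from the previous converged angle:

```python
import numpy as np

def energy(theta, g):
    # <psi(theta)| H(g) |psi(theta)> for |psi> = (cos t, sin t)
    return -np.cos(2 * theta) + g * np.sin(2 * theta)

def grad(theta, g):
    return 2 * np.sin(2 * theta) + 2 * g * np.cos(2 * theta)

theta = 0.3                             # initial guess for the first coupling
energies = {}
for g in np.linspace(0.0, 2.0, 9):      # sweep the coupling
    for _ in range(500):                # plain gradient descent
        theta -= 0.05 * grad(theta, g)  # warm-started from the previous g
    energies[g] = energy(theta, g)

# Each converged energy matches the exact ground state -sqrt(1 + g^2)
for g, e in energies.items():
    assert abs(e - (-np.sqrt(1 + g**2))) < 1e-8
```

Because the optimum moves continuously with `g`, each warm start lands near the new minimum, which is exactly why transfer learning lets the real VMC runs track the ground state across the transition instead of retraining from scratch.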

Dependencies

No pinned environment is checked in. The core stack is:

  • Python ≥ 3.9
  • NetKet ≥ 3.x
  • JAX with CUDA support (for GPU runs)
  • NumPy, SciPy, Matplotlib, Jupyter
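Since no pinned environment is checked in, one way to assemble the stack above with pip (the version pins and CUDA extra are suggestions, not tested against this repository):

```shell
pip install "netket>=3" numpy scipy matplotlib jupyter
# For GPU runs, install JAX with CUDA support, e.g. for CUDA 12:
pip install -U "jax[cuda12]"
```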

Citing

@article{ApteCordovaHuangAshmore2024,
  title   = {Deep learning lattice gauge theories},
  author  = {Apte, Anuj and C\'ordova, Clay and Huang, Tzu-Chen and Ashmore, Anthony},
  journal = {Phys. Rev. B},
  volume  = {110},
  number  = {16},
  pages   = {165133},
  year    = {2024},
  doi     = {10.1103/PhysRevB.110.165133},
}
