hpsmultidomain

Hierarchical Poincaré-Steklov Solver for PDEs

Written by Joseph Kump and Anna Yesypenko

The Hierarchical Poincaré-Steklov (HPS) Solver is a high-performance computing package for solving Partial Differential Equations (PDEs) on multidomain geometries. It combines advanced numerical methods with efficient parallel processing, including GPU acceleration, to handle complex PDE problems with high accuracy and computational efficiency.

Features

  • Multidomain Discretization: Supports solving PDEs on complex geometries divided into multiple domains.
  • Batch Processing: Utilizes batch operations for efficient computation on both CPU and GPU.
  • Sparse Matrix Representation: Employs sparse matrices to optimize memory usage and computation time.
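The batch-processing feature can be illustrated with a batched linear solve over many subdomains at once. This is a minimal sketch using NumPy's batched `np.linalg.solve` with made-up sizes; hpsmultidomain itself uses PyTorch tensors, which follow the same leading-batch-dimension convention and can live on a GPU.

```python
import numpy as np

# Hypothetical sizes: 64 subdomains, each with a 100x100 local operator.
n_boxes, n_loc = 64, 100
rng = np.random.default_rng(0)

# One well-conditioned local operator per subdomain, stacked along axis 0.
A = np.eye(n_loc) + 0.01 * rng.standard_normal((n_boxes, n_loc, n_loc))
b = rng.standard_normal((n_boxes, n_loc))

# A single batched call solves all subdomain systems at once; the analogous
# PyTorch call is torch.linalg.solve on a (possibly GPU-resident) tensor.
x = np.linalg.solve(A, b[..., None])[..., 0]
print(x.shape)  # (64, 100)
```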

Figures 1 and 2: Plots of solutions to the gravity Helmholtz equation, $\Delta u + \kappa^2 (1-x_3) u = -1$ on $\Omega = [1.1, 2.1] \times [-1, 0] \times [-1.2, -0.2]$ with a zero Dirichlet boundary condition (DBC). A cross-section has been taken through the $x$-axis. We see consistent results across different $p$ and $h$.
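For reference, the variable coefficient $\kappa^2(1-x_3)$ in the gravity Helmholtz equation above can be evaluated directly; this is a small sketch with an illustrative wavenumber value (not one taken from the figures).

```python
import numpy as np

kappa = 20.0  # illustrative wavenumber, chosen arbitrarily for this sketch

def gravity_coeff(x3, kappa=kappa):
    """Variable coefficient kappa^2 * (1 - x3) from the gravity Helmholtz PDE."""
    return kappa**2 * (1.0 - x3)

# On the domain's x3-range [-1.2, -0.2] the coefficient stays strictly positive.
x3 = np.linspace(-1.2, -0.2, 5)
print(gravity_coeff(x3))
```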

Figures 3 and 4: Plots of solutions to a constant-coefficient Helmholtz equation on different domain geometries.

Dependencies

  • PyTorch: For tensor operations and GPU acceleration.
  • NumPy: For numerical operations.
  • SciPy: For sparse matrix operations and linear algebra.
  • petsc4py (Optional): To use PETSc for sparse matrix operations. The solver can fall back to SciPy if petsc4py is not available. However, PETSc (particularly using the MUMPS direct solver) makes the code much faster.
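The optional-petsc4py fallback described above follows a standard try/except import pattern. This is a sketch of the general idea under assumed names (`factor_and_solve` is hypothetical), not the solver's actual code; the SciPy path uses SuperLU via `splu`.

```python
import numpy as np
import scipy.sparse
import scipy.sparse.linalg

try:
    from petsc4py import PETSc  # optional: enables MUMPS through PETSc
    HAVE_PETSC = True
except ImportError:
    HAVE_PETSC = False

def factor_and_solve(A_csr, b):
    """Solve A x = b directly: PETSc LU when available, SciPy SuperLU otherwise."""
    if HAVE_PETSC:
        A = PETSc.Mat().createAIJ(size=A_csr.shape,
                                  csr=(A_csr.indptr, A_csr.indices, A_csr.data))
        A.assemble()
        ksp = PETSc.KSP().create()
        ksp.setOperators(A)
        ksp.setType("preonly")     # no Krylov iteration: direct solve only
        ksp.getPC().setType("lu")  # LU factorization (MUMPS if configured)
        x, rhs = A.createVecRight(), A.createVecRight()
        rhs.setArray(b)
        ksp.solve(rhs, x)
        return x.getArray().copy()
    return scipy.sparse.linalg.splu(A_csr.tocsc()).solve(b)

A = scipy.sparse.identity(5, format="csr") * 2.0
print(factor_and_solve(A, np.ones(5)))  # all entries 0.5
```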

Installation

  1. Download the repository to your machine.
  2. (Recommended) Create a clean Conda environment, or switch to the environment you wish to incorporate hpsmultidomain into.
  3. Navigate to the repository directory and run pip install . (note the trailing dot). This installs hpsmultidomain itself as well as its dependencies.
  4. (Optional but recommended) Install mpi4py and petsc4py through either Conda or pip, since PETSc's direct solvers generally give better performance than SuperLU.

(Optional but recommended) MUMPS via petsc4py:

  1. Install a Fortran compiler into your environment if one is not already present: conda install -c conda-forge gfortran
  2. Set the PETSc configure options: export PETSC_CONFIGURE_OPTIONS="--with-64-bit-indices --download-mumps --with-debugging=0 --with-mpi=0 --with-mumps-serial --with-fortran-bindings=0 --download-metis --download-scotch --download-bison --download-flex" This downloads MUMPS, along with METIS for improved pivoting. The configuration does not provide MPI support (but still offers OpenMP, which is adequate for many workstations), and it disables debugging and Fortran bindings (but not compiled Fortran) for performance.
  3. Install PETSc: python -m pip install -v --no-binary=:all: --no-cache-dir petsc
  4. Install petsc4py: python -m pip install -v --no-build-isolation --no-binary=:all: --no-cache-dir petsc4py

Your mileage may vary depending on your machine. You may need to restrict the versions of pip and setuptools (pip install -U "pip<26" "setuptools<75" wheel) and/or update Cython (python -m pip install --upgrade cython).

Example usage

For a 2D problem:

python hpsmultidomain/argparse_driver.py --pde poisson --domain square --bc log_dist --n 1000 --p 12 --d 2 --solver superLU

And for a 3D problem:

python hpsmultidomain/argparse_driver.py --pde poisson --domain square --bc log_dist --n 50 --p 12 --d 3 --solver MUMPS

You can also test preconfigured basic problems by calling python test/test_multidomain.py and python test/test_multidomain_curved.py.

Notes

The full set of command-line arguments is defined in argparse_driver.py. These include:

  • pde to specify the partial differential equation to solve, such as poisson or bfield_constant (i.e., the constant-coefficient Helmholtz equation)
  • domain to specify the domain shape. square is the standard choice for rectangular domains; curved (a sinusoidal mapping) and annulus are alternatives.
  • bc to supply a Dirichlet BC for the problem. Some of these combined with certain PDEs have analytic solutions; others do not.
  • d for the domain dimension, either 2 or 3
  • p for the discretization's polynomial order on each subdomain. There will be $p^d$ points per subdomain.
  • n for the total number of discretization points along each axis. This can be either a single number (assuming a square discretization) or two/three numbers n0, n1, n2 for non-square discretizations. For now, the number of points along each axis must be a multiple of (p-2)
  • solver for the solver of choice for the sparse matrix factorization, such as MUMPS or superLU
  • box_xlim/box_ylim/box_zlim to set the physical dimensions of the 2D or 3D domain (defaults to [0,1]^d)
  • kh for the wavenumber in PDEs with wavenumbers such as bfield_constant
  • ppw to specify the number of discretization points per wavelength in PDEs with wavenumbers such as bfield_constant. This automatically sets the wavenumber, so it is incompatible with kh
  • sparse_assembly to specify whether operations with PyTorch tensors are done on GPUs or CPUs. If not specified then the code will check if a GPU is available and otherwise resort to CPU.
  • pickle to specify a filepath to store results of the run
  • visualize to plot the PDE solution
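The constraints relating n, p, and d can be checked up front. This is a small sketch (the helper name `check_discretization` is hypothetical); the multiple-of-(p-2) rule and the $p^d$ points-per-subdomain count are taken from the list above.

```python
def check_discretization(n, p, d):
    """Validate that each axis's point count is a multiple of p - 2."""
    axes = [n] * d if isinstance(n, int) else list(n)
    assert len(axes) == d, "give one n per axis, or a single n for all axes"
    for ni in axes:
        if ni % (p - 2) != 0:
            raise ValueError(f"n={ni} must be a multiple of p-2={p - 2}")
    # Subdomains along each axis, and local points per subdomain (p^d).
    boxes = [ni // (p - 2) for ni in axes]
    return boxes, p**d

print(check_discretization(1000, 12, 2))        # ([100, 100], 144)
print(check_discretization((50, 50, 50), 12, 3))  # ([5, 5, 5], 1728)
```

Note that both example commands earlier satisfy the rule: with p=12, both n=1000 and n=50 are multiples of p-2=10.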

If you wish to run the solver multiple times on the same 3D problem (same PDE and BC) across a range of p and n values, you can run run_models.py with the problem specified. This calls test_3d.py to assemble the command-line arguments automatically and run the solver several times in a row.
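A sweep of this kind can be sketched by building the command line for each (p, n) pair and skipping invalid combinations. The parameter values here are hypothetical; the argument names match argparse_driver.py, but this is not the actual run_models.py logic.

```python
import itertools

# Hypothetical sweep ranges for a fixed 3D Poisson problem.
p_values = [8, 12, 16]
n_values = [60, 120]

commands = []
for p, n in itertools.product(p_values, n_values):
    if n % (p - 2) != 0:  # n must be a multiple of p-2; skip invalid pairs
        continue
    commands.append(
        ["python", "hpsmultidomain/argparse_driver.py",
         "--pde", "poisson", "--domain", "square", "--bc", "log_dist",
         "--n", str(n), "--p", str(p), "--d", "3", "--solver", "MUMPS"]
    )

for cmd in commands:
    print(" ".join(cmd))
# Each command could then be launched with subprocess.run(cmd, check=True).
```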

Alternatively, you can call test_hps_multidomain.py or test_hps_multidomain_curved.py to test a variety of 2D and 3D configurations.

Associated Papers

The paper associated with this code is:

[1] Kump, Joseph; Yesypenko, Anna; and Per-Gunnar Martinsson. "A Two-Level Direct Solver for the Hierarchical Poincaré-Steklov Method" arXiv preprint arXiv:2503.04033 (2025).
