# Quantum Surrogate Modeling for Black Box Optimization


Regression with variational quantum circuits to approximate computationally expensive black-box functions. The approach targets chemical optimization, pharmaceutical development, and other domains with scarce, noisy data.

## Abstract

In many optimization tasks, the objective function is a black box because evaluating it exactly is computationally intractable. Classical neural networks often overfit when trained on small, noisy samples. This work demonstrates that quantum neural networks (QNNs) can serve as effective surrogate models, achieving:

- R² scores > 0.9 on benchmark functions
- Better noise resilience than classical CNNs
- Improved generalization with smaller sample sizes
- Robust performance at noise levels where classical models fail

For details, see our paper: *Benchmarking Quantum Surrogate Models on Scarce and Noisy Data*.

*Figure: Surface plots of the Griewank function with noise (factor 0.5) and sample scarcity (20 samples from 400 points), comparing QNN vs. CNN. The QNN maintains better approximation quality than the classical CNN under these challenging conditions.*

## Installation

```bash
# Clone repository
git clone https://github.com/your-username/your-repo-name.git
cd your-repo-name

# Create virtual environment
python3 -m venv venv
source venv/bin/activate  # Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

# Setup directories
mkdir -p src data models logs notebooks
touch src/__init__.py
```

## Quick Start

```python
from src.circuits import *
from src.datasets import *
from src.regressors import *
from qiskit_algorithms.optimizers import COBYLA

# Load benchmark dataset
datasets = load_complex_sets_2d()
dataset = datasets['griewank_2d']  # 20×20 = 400 points
x, y = get_appropriate_x_and_y(dataset)

# Configure and train quantum model
regressor = StandardRegressor(
    n_qubits=4,
    circuit_list=[(11, 1)],
    feature_map=get_standard_feature_map(2, 2),
    optimizer=COBYLA(maxiter=100),
    re_upload=True
)

model = regressor.get_regressor().fit(x, y)
score = model.score(x, y)
print(f"R² Score: {score:.4f}")
```

## Usage

### Training Quantum Models

Run `notebooks/regression.ipynb` to train quantum surrogate models on the benchmark datasets.

### Classical Baseline

Run `notebooks/CNN.ipynb` to train classical neural network baselines for comparison.

**Key comparison:** classical models overfit on small samples and degrade significantly with noise (factor > 0.3), while the quantum models maintain robust performance.
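The scarce-and-noisy setting can be reproduced with a small helper. This is a hypothetical sketch, not part of `src/`: the function name `make_scarce_noisy`, the choice of std-scaled additive Gaussian noise, and the fixed seed are all assumptions mirroring the figure's setup (noise factor 0.5, 20 of 400 points).

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

def make_scarce_noisy(y, noise_factor=0.5, n_samples=20):
    """Add Gaussian noise scaled by the targets' std, then subsample."""
    y = np.asarray(y, dtype=float)
    noisy = y + noise_factor * y.std() * rng.standard_normal(y.shape)
    idx = rng.choice(y.size, size=n_samples, replace=False)
    return idx, noisy[idx]
```

Sampling without replacement keeps the training set a strict subset of the original grid, so model quality can be scored against the clean, held-out values.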

## Benchmark Datasets

- **Griewank**: multimodal function with many local minima
- **Schwefel**: deceptive global optimum
- **Color Bob**: custom trigonometric landscape
- **Styblinski-Tang**: classic optimization benchmark
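For reference, the Griewank function used in the figure above has a simple closed form. This standalone sketch is illustrative only and independent of the repo's `src.datasets` loaders:

```python
import numpy as np

def griewank(x):
    """Standard Griewank benchmark; global minimum f(0) = 0."""
    x = np.asarray(x, dtype=float)
    i = np.arange(1, x.size + 1)
    return 1.0 + np.sum(x**2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i)))
```

The cosine product creates a dense grid of local minima on top of a shallow quadratic bowl, which is what makes the function hard for local optimizers and a useful stress test for surrogates.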

## Method

Our approach uses variational quantum circuits with:

1. **Input encoding**: data encoded into quantum feature maps
2. **Parameterized ansatz**: 15+ circuit architectures from the literature
3. **Classical optimization**: COBYLA/L-BFGS-B for parameter updates
4. **Enhanced training**:
   - Parallel encoding (multiple qubits)
   - Data reuploading (repeated feature-map uploads)
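Data reuploading can be illustrated with a minimal single-qubit simulation. This is a toy sketch in plain NumPy, not the repo's Qiskit implementation; `ry` and `qnn_predict` are hypothetical names. The encoding rotation RY(x) is re-applied before every trainable rotation, so the circuit "sees" the input once per layer, and the prediction is the Z-expectation value of the final state:

```python
import numpy as np

def ry(angle):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(angle / 2), np.sin(angle / 2)
    return np.array([[c, -s], [s, c]])

def qnn_predict(x, thetas):
    """Toy reuploading QNN: alternate RY(x) encoding with trainable RY(theta)."""
    state = np.array([1.0, 0.0])               # start in |0>
    for theta in thetas:
        state = ry(theta) @ ry(x) @ state      # re-upload x before each layer
    return state[0] ** 2 - state[1] ** 2       # <Z> expectation in [-1, 1]
```

Repeating the encoding turns the model's output into a richer trigonometric series in x, which is the usual motivation for reuploading over a single encoding layer.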

Performance is evaluated using the R² score (coefficient of determination).
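R² compares the residual sum of squares against the total variance of the targets; a minimal implementation, equivalent to scikit-learn's `r2_score`:

```python
import numpy as np

def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot
```

A perfect fit gives 1.0, predicting the mean gives 0.0, and values can go negative when the model is worse than the constant-mean baseline.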

## Project Structure

```
├── src/                      # Core implementation
│   ├── circuits.py           # Quantum circuit library (15+ ansätze)
│   ├── datasets.py           # Benchmark datasets
│   ├── regressors.py         # Quantum regression models
│   └── visualization.py      # Plotting utilities
├── notebooks/
│   ├── regression.ipynb      # Quantum training
│   ├── CNN.ipynb             # Classical baseline
│   └── load_models.ipynb     # Model evaluation
└── requirements.txt          # Dependencies
```

## Dependencies

- Qiskit ≥ 1.0
- Qiskit Algorithms ≥ 0.3.0
- Qiskit Machine Learning ≥ 0.7.0
- NumPy < 2.0
- PyTorch ≥ 2.0
- scikit-learn, Matplotlib

## Citation

If you use this code, please cite our work:

```bibtex
@inproceedings{quantum_surrogate_2024,
  author    = {Jonas Stein and Michael Poppel and Philip Adamczyk and Ramona Fabry and Zixin Wu and Michael Kölle and Jonas Nüßlein and Daniëlle Schuman and Philipp Altmann and Thomas Ehmer and Vijay Narasimhan and Claudia Linnhoff{-}Popien},
  title     = {Benchmarking Quantum Surrogate Models on Scarce and Noisy Data},
  booktitle = {Proceedings of the 16th International Conference on Agents and Artificial Intelligence},
  series    = {ICAART'24},
  year      = {2024},
  pages     = {352--359},
  publisher = {SciTePress},
  doi       = {10.5220/0012348900003636},
}
```

DOI: 10.5220/0012348900003636
arXiv: 2306.05042

## References

- S. Sim, P. Johnson, A. Aspuru-Guzik. "Expressibility and entangling capability of parameterized quantum circuits for hybrid quantum-classical algorithms." *Advanced Quantum Technologies*, 2(12):1900070, 2019.

## License

MIT License
