Merged

Dev #142

21 commits
Binary file added docs/source/_static/aiaa_challenge_statement.png
Binary file added docs/source/_static/pbox_layers_static.png
4 changes: 2 additions & 2 deletions docs/source/guides/up.md
Original file line number Diff line number Diff line change
@@ -4,9 +4,9 @@
Methods to efficiently propagate different types of uncertainty through computational models are of vital interests.
`PyUncertainNumber` includes strategies for black box uncertainty propagation (i.e.non-intrusive) as well as a library of functions for PBA ([Probability Bounds Analysis](https://en.wikipedia.org/wiki/Probability_bounds_analysis)) if the code can be accessed (i.e. intrusive). `PyUncertainNumber` provides a series of uncertainty propagation methods.

It is suggested to use [interval analysis](../interval_analysis.md) for propagating ignorance and the methods of probability theory for propagating variability. But realistic engineering problems or risk analyses will most likely involve a mixture of both types and as such probability bounds analysis provides means to rigourously propagate the uncertainty.
It is suggested to use [interval analysis](../interval_analysis.md) for propagating ignorance and the methods of probability theory for propagating variability. But realistic engineering problems or risk analyses will most likely involve a mixture of both types and as such probability bounds analysis provides means to rigorously propagate the uncertainty.

For aleatory uncertainty, probability theory already provides some established approaches, such as Taylor expansion or sampling methods, etc. This guide will mostly focuses on the propagation of intervals due to the close relations with propagation of p-boxes. A detailed review can be found in this [report](https://sites.google.com/view/dawsreports/up/report). Importantly, see {ref}`up` for a hands-on tutorial.
For aleatory uncertainty, probability theory already provides some established approaches, such as Taylor expansion or sampling methods, etc. This guide will mostly focus on the propagation of intervals due to the close relations with propagation of p-boxes. A detailed review can be found in this [report](https://sites.google.com/view/dawsreports/up/report). Importantly, see {ref}`up` for a hands-on tutorial.

## Vertex method

85 changes: 85 additions & 0 deletions docs/source/guides/vvuq.md
@@ -0,0 +1,85 @@
(vvuq_guide)=
# VV&UQ framework

Numerical simulations play an ever-growing role in simulating the behaviour of natural and engineered systems. Various sources of uncertainty throughout the computational pipeline shadow the credibility of the mathematical model and its numerical implementation {cite:p}`roy2011comprehensive`.

```{note}
The VV&UQ framework is a comprehensive framework, composed of the elements of *verification*, *validation* and *uncertainty quantification*, for estimating the predictive uncertainty of computing applications.
```

## Validation, verification, and uncertainty quantification

The uncertainty about the discrepancy between the model and reality is called *model-form* uncertainty {cite:p}`cary2026summary`. This type of uncertainty is most commonly assessed during validation campaigns, which collect experimental data about the phenomena of interest and compare this data to model predictions for conditions nominally identical to those in the tests. Although it reflects the engineer's best state of knowledge, experimental data is affected, among other things, by *measurement uncertainty*, which must be accounted for during validation.

Moreover, models are generally built to reflect a family of physical processes and are given tunable parameters which allow the engineer to set the model for use in specific scenarios. The values of these parameters which represent the true physical process, to be compared against in validation, are not precisely known, because of the presence of *parametric input uncertainty*. Models that compute numerical solutions to partial differential equations are also subject to *numerical uncertainty* due to finite spatial and/or temporal resolution, iterative convergence error, and the use of finite precision arithmetic in computer codes.

Furthermore, the main purpose of any model is to be used where experiments are impractical or impossible to conduct, so the engineer must reason about the performance of the model from the evidence obtained during validation. This results in *extrapolation uncertainty*. The individual contributions of each uncertainty source interact to determine the ultimate measure of model reliability in its application domain, termed *predictive capability*. To quantify the effect of these process-dependent sources of uncertainty, the model must typically be run many times. Many models, however, are complex enough to allow only a single run or a handful of runs at any particular setting, necessitating reduced-order models. These approximations of the full model, also called *surrogates*, come with their own surrogate-modelling uncertainty. Together, these uncertainties are referred to here as *total uncertainty*.


## The discrepancy between non-deterministic simulations and experiments

```{tip}
Validation is a process of determining the degree to which a (non-deterministic) model simulation is an accurate representation of the corresponding (imperfect) physical experiment.
```

- Both model predictions and empirical data may contain both aleatory and epistemic uncertainty.

- Effectively, the discrepancy measure compares two uncertain data-generating processes.

- The discrepancy measure indicates the model-form uncertainty.

```{figure} ../_static/area_metric_illustration_3.png
:alt: Illustration of the stochastic area metric
:class: bg-primary
:width: 400px
:align: center

Illustration of the stochastic area metric
```

```python
from pyuncertainnumber import area_metric
from pyuncertainnumber import pba  # distribution constructors live in the pba subpackage

dist = pba.normal(4, 1)
data_sample = dist.sample(5)
ecdf_ = pba.ECDF(data_sample)
area_metric(dist, ecdf_)
# e.g. 0.33125451465728933 (sample-dependent)
```

## Predictive capability

Predictive capability concerns the prediction, along with its uncertainty estimate, for a scenario in the application domain that is likely to lie beyond the validation domain. Such predictive uncertainty shall comprise all the possible sources of uncertainty in the computational pipeline, as mentioned above. Notably, some uncertainties are aleatory in nature while others are epistemic. The mechanism of probability bounding is employed to aggregate the effects of all these uncertainties to produce a reliable prediction.

```{figure} ../_static/pbox_layers_static.png
:alt: Predictive capability
:class: bg-primary
:width: 400px
:align: center

Predictive capability
```
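The bounding mechanism can be sketched with a toy example: when a distribution parameter is known only to an interval, the envelope of the resulting family of CDFs forms a p-box. The plain-NumPy/SciPy sketch below is illustrative only and does not use the library's own p-box constructors:

```python
import numpy as np
from scipy.stats import norm

# Aleatory part: unit-variance normal; epistemic part: mean known only to [3, 4].
x = np.linspace(-2, 10, 601)
family = np.array([norm.cdf(x, loc=m, scale=1.0) for m in np.linspace(3, 4, 41)])

upper_cdf = family.max(axis=0)  # left (upper) bound of the p-box
lower_cdf = family.min(axis=0)  # right (lower) bound of the p-box

# Every admissible distribution lies between the two bounds.
assert np.all(upper_cdf >= lower_cdf)
```

Aggregating further uncertainty sources widens the gap between the bounds, which is the layered structure depicted in the figure above.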



## An open challenge for VVUQ on aerodynamics

An open challenge, posed at the AIAA SciTech Forum 2026, focuses on estimating the predictive capability of an aerodynamic analysis tool (XFOIL) given a set of synthetic experimental data {cite:p}`cary2026summary`.

```{figure} ../_static/aiaa_challenge_statement.png
:alt: Problem statement
:class: bg-primary
:width: 600px
:align: center

Problem statement of the AIAA Second Uncertainty Quantification Challenge Problem for Aerodynamics
```

In response to this challenge, the UQ team at UoL presents a methodological framework {cite:p}`chen2026efficient` for tackling the Second Uncertainty Quantification Challenge Problem for Aerodynamics. The challenge requires assessments of validation and predictive capability for an aerodynamic tool given imperfect experimental measurements that are scarce and imprecise. We propose an efficient interval-based integrated approach that comprehensively accounts for multiple sources of uncertainty, including experimental data uncertainty, input incertitude, model-form discrepancy, numerical approximation, surrogate-model prediction, and, importantly, uncertainties related to extrapolated application domains. These uncertainties are tackled by efficient meta-modelling techniques involving interval predictor models, Bayesian optimisation on the basis of Gaussian processes, and stochastic neural network models. Our results underscore the effectiveness and efficiency of the approach in achieving a practical balance between comprehensive uncertainty quantification and computational cost, demonstrating broad applicability for uncertainty-driven validation and assessment of predictive capability in engineering applications.



## References

```{bibliography}
:filter: docname in docnames
```
1 change: 1 addition & 0 deletions docs/source/index.md
@@ -14,6 +14,7 @@ guides/up
pbox
cbox
interval_analysis
guides/vvuq
```

```{toctree}
28 changes: 28 additions & 0 deletions docs/source/refs.bib
@@ -37,4 +37,32 @@ @article{de2023robust
pages={109877},
year={2023},
publisher={Elsevier}
}

@article{roy2011comprehensive,
title={A comprehensive framework for verification, validation, and uncertainty quantification in scientific computing},
author={Roy, Christopher J and Oberkampf, William L},
journal={Computer Methods in Applied Mechanics and Engineering},
volume={200},
number={25-28},
pages={2131--2144},
year={2011},
publisher={Elsevier}
}


@inproceedings{cary2026summary,
title={Summary of the Second AIAA Uncertainty Quantification Challenge Problem for Aerodynamics},
author={Cary, Andrew W and Rumpfkeil, Markus and Hristov, Peter O and Schaefer, John A},
booktitle={AIAA SCITECH 2026 Forum},
pages={0091},
year={2026}
}

@inproceedings{chen2026efficient,
title={Efficient Interval-Based Uncertainty Quantification for Model Validation and Predictive Capability},
author={Chen, Yu and Ioannou, Ioanna and Ferson, Scott},
booktitle={AIAA SCITECH 2026 Forum},
pages={0296},
year={2026}
}
9 changes: 9 additions & 0 deletions docs/source/tutorials/index.md
@@ -11,6 +11,7 @@ what_is_un
uncertainty_characterisation
uncertainty_aggregation
uncertainty_propagation
validation_assessment
```

::::{grid} 1 2 2 3
@@ -51,4 +52,12 @@ Techniques for combining multiple sources of uncertainty.
Propagate uncertainty through computational models and functions.
:::


:::{card} Discrepancy assessment by validation
:link: validation_assessment
:link-type: doc
:img-top: ../_static/pbox_layers_static.png
Validation of imprecise probability models.
:::

::::
6 changes: 3 additions & 3 deletions docs/source/tutorials/uncertainty_propagation.ipynb
@@ -49,9 +49,9 @@
"jp-MarkdownHeadingCollapsed": true
},
"source": [
"## arithmetic of uncertain number\n",
"## Arithmetic of uncertain numbers\n",
"\n",
"[Probability bounds anlaysis](https://en.wikipedia.org/wiki/Probability_bounds_analysis) combines both interval analysis and probability theory, allowing\n",
"[Probability bounds analysis](https://en.wikipedia.org/wiki/Probability_bounds_analysis) combines both interval analysis and probability theory, allowing\n",
"rigorous bounds of (arithmetic) functions of random variables to be computed even with partial information."
]
},
@@ -137,7 +137,7 @@
"id": "299a3953-1a37-4e59-ab02-a769e1b54467",
"metadata": {},
"source": [
"## generic propagation of uncertain numbers"
"## Generic propagation of uncertain numbers"
]
},
{
294 changes: 294 additions & 0 deletions docs/source/tutorials/validation_assessment.ipynb

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion pyproject.toml
@@ -38,7 +38,7 @@ maintainers = [
name = "pyuncertainnumber"
readme = "README.md"
requires-python = ">= 3.11"
version = "0.1.4"
version = "0.1.15"

[project.optional-dependencies]
test = [
1 change: 0 additions & 1 deletion requirements.txt
@@ -1,7 +1,6 @@
matplotlib==3.9.2
numpy==2.1.3
pandas==2.2.3
pillow==11.0.0
Pint==0.24.4
Pygments==2.18.0
pymongo==4.10.1
2 changes: 1 addition & 1 deletion src/pyuncertainnumber/__init__.py
@@ -60,7 +60,7 @@


# * --------------------- validation ---------------------*#
from .pba.core import area_metric
from pyuncertainnumber.validation.area_metric import area_metric

# * --------------------- utils ---------------------*#
from pyuncertainnumber.gutils import inspect_un
17 changes: 17 additions & 0 deletions src/pyuncertainnumber/calibration/tmcmc.py
@@ -1203,6 +1203,23 @@ def get_hdi_bounds(
return pd.DataFrame(out).rename_axis("column").reset_index()


def filter_hdi_bounds(df, level: int = 95) -> pd.DataFrame:
"""Extract HDI bounds for a given credibility level.

args:
df (pd.DataFrame): HDI dataframe containing columns like `hdi_95_low`, `hdi_95_high`

level (int | str): credibility level (e.g. 95, 90, 20)

returns:
(pd.DataFrame): DataFrame with columns [hdi_<level>_low, hdi_<level>_high]

"""
level = str(level)
cols = [f"hdi_{level}_low", f"hdi_{level}_high"]
return df.loc[:, cols]
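A quick usage sketch for `filter_hdi_bounds` (the dataframe below is hypothetical, with column names following the pattern the docstring describes; the function body is repeated so the snippet is self-contained):

```python
import pandas as pd

def filter_hdi_bounds(df, level: int = 95) -> pd.DataFrame:
    # As defined above: select the two bound columns for the requested level.
    level = str(level)
    cols = [f"hdi_{level}_low", f"hdi_{level}_high"]
    return df.loc[:, cols]

# Hypothetical HDI summary, e.g. as returned by get_hdi_bounds.
df = pd.DataFrame({
    "column": ["theta_1", "theta_2"],
    "hdi_95_low": [0.12, 1.05],
    "hdi_95_high": [0.87, 1.93],
})

bounds = filter_hdi_bounds(df, level=95)
print(bounds.columns.tolist())  # ['hdi_95_low', 'hdi_95_high']
```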


def transform_old_trace_to_new(trace: list[list]) -> list[Stage]:
"""Transform old trace format to new Stage class format.

1 change: 0 additions & 1 deletion src/pyuncertainnumber/pba/__init__.py
@@ -15,4 +15,3 @@
from .operation import convert
from .aggregation import *
from .pbox_free import KS_bounds
from .core import area_metric
1 change: 1 addition & 0 deletions src/pyuncertainnumber/pba/aggregation.py
@@ -135,6 +135,7 @@ def stochastic_mixture(
return mixture_pbox(*converted_constructs, weights)


# TODO. weights cannot be None for intervals with same weights
def stacking(
vec_interval: Interval | list[Interval],
*,