Merged
2 changes: 1 addition & 1 deletion README.md
Original file line number Diff line number Diff line change
@@ -7,7 +7,7 @@
[![codecov](https://codecov.io/gh/fides-dev/Fides.jl/graph/badge.svg?token=J7PXRF30JG)](https://codecov.io/gh/fides-dev/Fides.jl)
[![SciML Code Style](https://img.shields.io/static/v1?label=code%20style&message=SciML&color=9558b2&labelColor=389826)](https://github.com/SciML/SciMLStyle)

Fides.jl is a Julia wrapper of the Python package [Fides.py](https://github.com/fides-dev/fides), which implements an Interior Trust Region Reflective for bounds constrained optimization problems based on [1, 2]. Fides targets problems on the form:
Fides.jl is a Julia wrapper of the Python package [Fides.py](https://github.com/fides-dev/fides), which implements an Interior Trust Region Reflective algorithm for bounds-constrained optimization problems based on [1, 2]. Fides targets problems of the form:

```math
\min_{x \in \mathbb{R}^n} f(x) \quad \mathrm{subject \ to} \quad \text{lb} \leq x \leq \text{ub}
4 changes: 2 additions & 2 deletions docs/src/index.md
@@ -1,6 +1,6 @@
# Fides.jl

Fides.jl is a Julia wrapper of the Python package [Fides.py](https://github.com/fides-dev/fides), which implements an Interior Trust Region Reflective for boundary costrained optimization problems based on [1, 2]. Fides targets problems on the form:
Fides.jl is a Julia wrapper of the Python package [Fides.py](https://github.com/fides-dev/fides), which implements an Interior Trust Region Reflective algorithm for boundary-constrained optimization problems based on [1, 2]. Fides targets problems of the form:

```math
\min_{x \in \mathbb{R}^n} f(x) \quad \mathrm{subject \ to} \quad lb \leq x \leq ub
@@ -13,7 +13,7 @@ Where `f` is a continues at least twice-differentaible function, and `lb` and `u
- Boundary-constrained interior trust-region optimization.
- Recursive reflective and truncated constraint management.
- Full and 2D subproblem solution solvers.
- Supports used provided HEssian, and BFGS, DFP, and SR1 Hessian approximations.
- Supports user-provided Hessian, and BFGS, DFP, and SR1 Hessian approximations.
- Good performance for parameter estimating Ordinary Differential Equation models [3].
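The BFGS approximation listed above can be illustrated with a short sketch (plain Python with list-based linear algebra so it stays self-contained; `bfgs_update` and the toy 2-D data are illustrative, not part of Fides):

```python
# Sketch of the standard BFGS update B+ = B - (B s s'B)/(s'B s) + (y y')/(y's),
# where s is the step taken and y the gradient difference. Toy 2-D example
# with plain lists; illustrative only, not the Fides implementation.

def mat_vec(B, v):
    return [sum(B[i][j] * v[j] for j in range(len(v))) for i in range(len(B))]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def bfgs_update(B, s, y):
    """One BFGS update of the Hessian approximation B from step s and gradient change y."""
    Bs = mat_vec(B, s)
    sBs = dot(s, Bs)
    ys = dot(y, s)
    n = len(s)
    return [[B[i][j] - Bs[i] * Bs[j] / sBs + y[i] * y[j] / ys
             for j in range(n)] for i in range(n)]

# For a quadratic f(x) = 1/2 x'A x, the gradient difference is y = A s, so a
# single update captures the curvature of A along the step direction s.
A = [[2.0, 0.0], [0.0, 10.0]]
B = [[1.0, 0.0], [0.0, 1.0]]   # initial approximation: identity
s = [1.0, 0.0]                 # step taken
y = mat_vec(A, s)              # gradient difference for the quadratic
B_new = bfgs_update(B, s, y)
print(B_new[0][0])             # curvature along s now matches A: 2.0
```

The update only corrects curvature along directions actually explored, which is why quasi-Newton approximations improve as the optimizer takes steps.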

!!! note "Star us on GitHub!"
12 changes: 6 additions & 6 deletions docs/src/tutorial.md
@@ -1,6 +1,6 @@
# Tutorial

This overarching tutorial describes how to minimize an objective function with Fides. It further provides performance tips for computationally intensive objective functions.
This overarching tutorial describes how to solve an optimization problem with Fides. It further provides performance tips for computationally intensive objective functions.

## Input - a Function to Minimize

@@ -32,7 +32,7 @@ Both the gradient and Hessian functions are expected to be in-place on the form;

## Optimization with a Hessian Approximation

Given an objective function and its gradient, the optimization is performed in a two-step procedure. First, a `FidesProblem` is created. When the Hessian is unavailable or too expensive to compute, a Hessian approximation is chosen during this step:
Given an objective function and its gradient, the optimization is performed in a two-step procedure. First, a `FidesProblem` is created.

```@example 1
using Fides
@@ -42,14 +42,14 @@ x0 = [ 2.0, 2.0]
prob = FidesProblem(f, grad!, x0; lb = lb, ub = ub)
```

Where `x0` is the initial guess for parameter estimation, and `lb` and `ub` are the lower and upper parameter bounds (defaulting to `-Inf` and `Inf` if unspecified). The problem is then minimized by calling `solve`:
Where `x0` is the initial guess for parameter estimation, and `lb` and `ub` are the lower and upper parameter bounds (defaulting to `-Inf` and `Inf` if unspecified). The problem is then minimized by calling `solve`, and when the Hessian is unavailable or too expensive to compute, a Hessian approximation is chosen during this step:

```@example 1
sol = solve(prob, Fides.BFGS()) # hide
sol = solve(prob, Fides.BFGS())
```

Here, the second argument is the Hessian approximation. Several approximations are supported (see the [API](@ref API)), and `BFGS` generally performs well. Additional tuning options can be set by providing a [`FidesOptions`](@ref) struct via the `options` keyword in `solve`, and a full list of available options is documented in the [API](@ref API).
Several Hessian approximations are supported (see the [API](@ref API)), and of these `BFGS` generally performs well. Additional tuning options can be set by providing a [`FidesOptions`](@ref) struct via the `options` keyword in `solve`, and a full list of available options is documented in the [API](@ref API).

## Optimization with a User-Provided Hessian

@@ -65,7 +65,7 @@ Since a Hessian function is provided, no Hessian approximation needs to be speci

## Performance tip: Computing Derivatives and Objective Simultaneously

Internally, the objective function and its derivatives are computed simultaneously by Fides. Hence, runtime can be reduced if intermediate quantities are reused between the objective and derivative computations. To take advantage of this, a `FidesProblem` can be created with a function that computes the objective and gradient (and optionally the Hessian) for a given `Vector`. For example, when only the gradient is available:
Internally, the objective function and its derivatives are computed simultaneously by Fides. Hence, runtime can be reduced if intermediate quantities are reused between the objective and derivative computations. To take advantage of this, a `FidesProblem` can be created with a function that computes the objective and gradient (and optionally the Hessian) for a given input. For example, when only the gradient is available:

```@example 1
function fides_obj(x)
@@ -96,4 +96,4 @@ sol = solve(prob) # hide
sol = solve(prob)
```

In this simple example, no runtime benefit is obtained as not quantities are reused. However, if quantities can be reused (for example, when gradients are computed for ODE models), runtime can be noticeably reduced.
In this simple example, no runtime benefit is obtained as no quantities are reused between objective and derivative computations. However, if quantities can be reused (for example, when gradients are computed for ODE models), runtime can be noticeably reduced.
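The reuse pattern this hunk describes can be sketched in plain Python as well (illustrative only; `rosenbrock_obj_grad` is a hypothetical analogue of the tutorial's `fides_obj`, not the Fides.jl API):

```python
# Sketch of a combined objective-and-gradient function: the residuals r1, r2
# are computed once and reused by both the objective and the gradient.
# Illustrative plain Python, not the Fides.jl API.

def rosenbrock_obj_grad(x):
    """Return (f, g) for the 2-D Rosenbrock function, reusing residuals r1, r2."""
    r1 = 1.0 - x[0]            # shared between f and g
    r2 = x[1] - x[0] ** 2      # shared between f and g
    f = r1 ** 2 + 100.0 * r2 ** 2
    g = [-2.0 * r1 - 400.0 * x[0] * r2, 200.0 * r2]
    return f, g

f, g = rosenbrock_obj_grad([1.0, 1.0])   # at the minimum: f = 0, g = 0
```

For Rosenbrock the savings are negligible, but when the residuals come from an expensive simulation (such as an ODE solve), computing them once for both quantities is where the runtime reduction comes from.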
30 changes: 15 additions & 15 deletions src/problem.jl
@@ -4,7 +4,8 @@
Optimization problem to be minimized with the Fides Newton Trust Region optimizer.

## Arguments
- `f`: The objective function to minimize. Accepts a vector as input and return a scalar.
- `f`: The objective function to minimize. Accepts a `Vector` or `ComponentVector` as input
and returns a scalar.
- `grad!`: In-place function to compute the gradient of `f` on the form `grad!(g, x)`.
- `x0`: Initial starting point for the optimization. Can be a `Vector` or `ComponentVector`
from [ComponentArrays.jl](https://github.com/SciML/ComponentArrays.jl).
@@ -16,6 +17,16 @@ Optimization problem to be minimized with the Fides Newton Trust Region optimize

See also [solve](@ref) and [FidesOptions](@ref).

FidesProblem(fides_obj, x0, hess::Bool; lb = nothing, ub = nothing)

Optimization problem created from a function that computes:
- `hess = false`: Objective and gradient; `fides_obj(x) -> (obj, g)`.
- `hess = true`: Objective, gradient and Hessian; `fides_obj(x) -> (obj, g, H)`.

Internally, Fides computes the objective function and derivatives simultaneously. Therefore,
this constructor is the most runtime-efficient option when intermediate quantities can be
reused between the objective and derivative computations.

## Description of Fides method

Fides implements an Interior Trust Region Reflective method for boundary-constrained
@@ -34,9 +45,9 @@ At each iteration, the Fides approximates the objective function by a second-ord
```

Where, `Δₖ` is the trust region radius reflecting the confidence in the second-order
approximation, `∇f(xₖ)` of `f` at the current iteration `xₖ`, and `Bₖ` is a symmetric
positive-semidefinite matrix, that is either the exact Hessian (if `hess!` is provided) or
an approximation.
approximation, `∇f(xₖ)` is the gradient of `f` at the current iteration `xₖ`, and `Bₖ` is a
symmetric positive-semidefinite matrix, that is either the exact Hessian (if `hess!` is
provided) or an approximation.
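The second-order model this docstring describes can be evaluated numerically in a short sketch (plain-Python toy 2-D data; `model` is an illustrative name, not the Fides implementation):

```python
# Sketch of the trust-region model m_k(p) = f(x_k) + grad(x_k)'p + 1/2 p'B_k p,
# with B_k symmetric positive-semidefinite as described above.
# Toy 2-D example with plain lists; illustrative only.

def model(f_k, g_k, B_k, p):
    """Evaluate the quadratic model at candidate step p."""
    gp = sum(gi * pi for gi, pi in zip(g_k, p))
    Bp = [sum(B_k[i][j] * p[j] for j in range(len(p))) for i in range(len(p))]
    pBp = sum(pi * bpi for pi, bpi in zip(p, Bp))
    return f_k + gp + 0.5 * pBp

f_k = 4.0
g_k = [2.0, 0.0]
B_k = [[2.0, 0.0], [0.0, 2.0]]   # symmetric positive-definite B_k
p = [-1.0, 0.0]                  # the Newton step -B_k^{-1} g_k for this data
print(model(f_k, g_k, B_k, p))   # 4 - 2 + 1 = 3.0
```

The trust-region radius `Δₖ` then bounds how large a step `p` the optimizer accepts from minimizing this model, shrinking when the model predicts the true objective poorly.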

## References
1. Coleman, T. F., & Li, Y. (1994). On the convergence of interior-reflective Newton
@@ -65,17 +76,6 @@ function FidesProblem(f::Function, grad!::Function, x0::InputVector; hess! = not
user_hessian = !isnothing(hess!)
return FidesProblem(fides_objective, fides_objective_py, x0, _lb, _ub, user_hessian)
end
"""
FidesProblem(fides_obj, x0, hess::Bool; lb = nothing, ub = nothing)

Optimization problem created from a function that computes:
- `hess = false`: Objective and gradient; `fides_obj(x) -> (obj, g)`.
- `hess = true`: Objective, gradient and Hessian; `fides_obj(x) -> (obj, g, H)`.

Internally, Fides computes the objective function and derivatives simultaneously. Therefore,
this constructor is the most runtime-efficient option when intermediate quantities can be
reused between the objective and derivative computations.
"""
function FidesProblem(fides_objective::Function, x0::InputVector, hess::Bool; lb = nothing,
ub = nothing)
_lb = _get_bounds(x0, lb, :lower)
4 changes: 2 additions & 2 deletions src/solve.jl
@@ -31,9 +31,9 @@ end
solve(prob::FidesProblem, hess_approximation; options = FidesOptions())

Solve the given `FidesProblem` using the Fides Trust Region method, with the specified
`hess_approximation` to approximate the Hessian matrix.
`hess_approximation` method for approximating the Hessian matrix.

A complete list of available Hessian approximations is provided in the documentation.
A complete list of available Hessian approximations can be found in the API documentation.

See also [FidesOptions](@ref).
"""
2 changes: 1 addition & 1 deletion test/problem.jl
@@ -15,7 +15,7 @@ function fides_obj2(x)
return (f, g, H)
end

x0, lb, ub, = [2.0, 2.0], [-10.0, -10.0], [10.0, 10.0]
x0, lb, ub = [2.0, 2.0], [-10.0, -10.0], [10.0, 10.0]
@testset "FidesProblem" begin
prob1 = FidesProblem(rosenbrock, rosenbrock_grad!, x0; lb = lb, ub = ub)
sol1 = solve(prob1, Fides.BFGS())
2 changes: 1 addition & 1 deletion test/show.jl
@@ -2,7 +2,7 @@ using Fides, Printf, Test

include(joinpath(@__DIR__, "common.jl"))

x0, lb, ub, = [2.0, 2.0], [-10.0, -10.0], [10.0, 10.0]
x0, lb, ub = [2.0, 2.0], [-10.0, -10.0], [10.0, 10.0]
prob = FidesProblem(rosenbrock, rosenbrock_grad!, x0; lb = lb, ub = ub)
sol = solve(prob, Fides.BFGS())
