
Better Defaults: Auto AD Selection for Newton Methods #244

Merged
merged 10 commits into from
Oct 18, 2023
2 changes: 1 addition & 1 deletion Project.toml
@@ -50,7 +50,7 @@ PrecompileTools = "1"
RecursiveArrayTools = "2"
Reexport = "0.2, 1"
SciMLBase = "2.4"
SimpleNonlinearSolve = "0.1"
SimpleNonlinearSolve = "0.1.22"
SparseDiffTools = "2.6"
StaticArraysCore = "1.4"
UnPack = "1.0"
3 changes: 1 addition & 2 deletions docs/pages.jl
@@ -2,8 +2,7 @@

pages = ["index.md",
"tutorials/getting_started.md",
"Tutorials" => Any[
"tutorials/code_optimization.md",
"Tutorials" => Any["tutorials/code_optimization.md",
"tutorials/large_systems.md",
"tutorials/small_compile.md",
"tutorials/termination_conditions.md",
8 changes: 4 additions & 4 deletions docs/src/solvers/NonlinearSystemSolvers.md
@@ -16,7 +16,7 @@ it will make sure to work.

If one is looking for more robustness, then `RobustMultiNewton` is a good choice.
It attempts a set of the most robust methods in succession and only fails if
all of the methods fail to converge. Additionally, `DynamicSS` can be a good choice
for high stability.
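
As a rough sketch (using a hypothetical toy system, not one taken from this documentation), falling back to the robust polyalgorithm might look like:

```julia
using NonlinearSolve

# Hypothetical system used only for illustration.
f(u, p) = u .* u .- p
prob = NonlinearProblem(f, [1.0, 1.0], 2.0)

# RobustMultiNewton cycles through its internal methods until one converges.
sol = solve(prob, RobustMultiNewton())
```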

As a balance, `NewtonRaphson` is a good choice for most problems that aren't too
@@ -25,13 +25,13 @@ but more stable. If the problem is well-conditioned, `Klement` or `Broyden`
may be faster, but they are highly dependent on the eigenvalues of the Jacobian being
sufficiently small.
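
A minimal sketch of trying these quasi-Newton methods, assuming the `Broyden` and `Klement` constructors are exported under exactly those names and using a hypothetical well-conditioned toy system:

```julia
using NonlinearSolve

# Hypothetical toy system; quasi-Newton methods avoid recomputing the full
# Jacobian at every iteration, which is what makes them cheap per step.
f(u, p) = u .* u .- p
prob = NonlinearProblem(f, [1.0, 1.0], 2.0)

sol_broyden = solve(prob, Broyden())
sol_klement = solve(prob, Klement())
```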

`NewtonRaphson` and `TrustRegion` are designed for large systems.
They can make use of sparsity patterns for sparse automatic differentiation
and sparse linear solving of very large systems. Meanwhile,
`SimpleNewtonRaphson` and `SimpleTrustRegion` are implementations which are specialized for
small equations. They are non-allocating on static arrays and thus well-optimized
for small systems, usually outperforming the other methods when such types are
used for `u0`.
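
A minimal sketch of this static-array usage, with a hypothetical 2x2 system and the out-of-place problem constructor:

```julia
using NonlinearSolve, StaticArrays

# Hypothetical 2x2 system; an SVector `u0` keeps the solve allocation-free.
f(u, p) = SA[u[1]^2 - p, u[1] * u[2] - 1.0]
u0 = SA[1.0, 1.0]
prob = NonlinearProblem(f, u0, 2.0)
sol = solve(prob, SimpleNewtonRaphson())
```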

## Full List of Methods

@@ -57,7 +57,7 @@ features, but have a bit of overhead on very small problems.
large-scale and numerically-difficult nonlinear systems.
- `RobustMultiNewton()`: A polyalgorithm that mixes highly robust methods (line searches and
trust regions) in order to be as robust as possible for difficult problems. If this method
fails to converge, then one can be pretty certain that most, if not all, other choices would
likely fail.
- `FastShortcutNonlinearPolyalg`: The default method. A polyalgorithm that mixes fast methods
with fallbacks to robust methods to allow for solving easy problems quickly without sacrificing
2 changes: 1 addition & 1 deletion docs/src/tutorials/code_optimization.md
@@ -55,4 +55,4 @@ prob = NonlinearProblem(f!, u0, p)
linsolve = LinearSolve.KrylovJL_GMRES()
sol = solve(prob, NewtonRaphson(; linsolve), reltol = 1e-9)
```
40 changes: 20 additions & 20 deletions docs/src/tutorials/getting_started.md
@@ -9,19 +9,19 @@ to understanding the deeper parts of the documentation.

There are four types of nonlinear systems:

1. The "standard nonlinear system", i.e. the `NonlinearProblem`. This is a
system of equations with an initial condition where you want to satisfy
all equations simultaniously.
2. The "interval rootfinding problem", i.e. the `IntervalNonlinearProblem`.
This is the case where you're given an interval `[a,b]` and need to find
where `f(u) = 0` for `u` inside the bounds.
3. The "steady state problem", i.e. find the `u` such that `u' = f(u) = 0`.
While related to (1), it's not entirely the same because there's a uniquely
defined privledged root.
4. The nonlinear least squares problem, which is an overconstrained nonlinear
system (i.e. more equations than states) which might not be satisfiable, i.e.
there may be no `u` such that `f(u) = 0`, and thus we find the `u` which
minimizes `||f(u)||` in the least squares sense.
1. The "standard nonlinear system", i.e. the `NonlinearProblem`. This is a
system of equations with an initial condition where you want to satisfy
all equations simultaniously.
2. The "interval rootfinding problem", i.e. the `IntervalNonlinearProblem`.
This is the case where you're given an interval `[a,b]` and need to find
where `f(u) = 0` for `u` inside the bounds.
3. The "steady state problem", i.e. find the `u` such that `u' = f(u) = 0`.
While related to (1), it's not entirely the same because there's a uniquely
defined privledged root.
4. The nonlinear least squares problem, which is an overconstrained nonlinear
system (i.e. more equations than states) which might not be satisfiable, i.e.
there may be no `u` such that `f(u) = 0`, and thus we find the `u` which
minimizes `||f(u)||` in the least squares sense.

For now let's focus on the first two. The other two are covered in later tutorials,
but from the first two we can show the general flow of the NonlinearSolve.jl package.
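
As a minimal sketch (the equations here are hypothetical and only for illustration), those first two problem types are set up as:

```julia
using NonlinearSolve

# (1) Standard nonlinear system: solve f(u, p) = 0 starting from an initial guess u0.
f(u, p) = u .* u .- p
u0 = [1.0, 1.0]
prob = NonlinearProblem(f, u0, 2.0)
sol = solve(prob)

# (2) Interval rootfinding: find a root of g(u, p) = 0 bracketed inside (1, 2).
g(u, p) = u^2 - p
prob_int = IntervalNonlinearProblem(g, (1.0, 2.0), 2.0)
sol_int = solve(prob_int, ITP())
```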
@@ -105,7 +105,7 @@ For a complete list of solver choices, see [the nonlinear system solvers page](@
Next we can modify the tolerances. Here let's set some really low tolerances to force a tight solution:

```@example 1
solve(prob, TrustRegion(), reltol = 1e-12, abstol = 1e-12)
```

There are many more options for configuring this. Specifically for handling termination conditions,
@@ -139,10 +139,10 @@ sol = solve(prob_int, ITP(), abstol = 0.01)
Congrats, you now know how to use the basics of NonlinearSolve.jl! However, there is so much more to
see. Next check out:

- [Some code optimization tricks to know about with NonlinearSolve.jl](@ref code_optimization)
- [An iterator interface which lets you step through the solving process step by step](@ref iterator)
- [How to handle large systems of equations efficiently](@ref large_systems)
- [Ways to use NonlinearSolve.jl that are faster to start up and can statically compile](@ref fast_startup)
- [More detailed termination conditions](@ref termination_conditions_tutorial)

And also check out the rest of the manual.
2 changes: 1 addition & 1 deletion docs/src/tutorials/iterator_interface.md
@@ -1,7 +1,7 @@
# [Nonlinear Solver Iterator Interface](@id iterator)

!!! warn

This iterator interface will be expanded with a `step!` function soon!

There is an iterator form of the nonlinear solver which mirrors the DiffEq integrator interface:
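
A minimal sketch of that pattern, assuming the common `init`/`solve!` entry points and a hypothetical problem:

```julia
using NonlinearSolve

# Hypothetical problem; `init` builds a solver cache and `solve!` runs it to
# completion, mirroring init/solve! from the DiffEq integrator interface.
f(u, p) = u .* u .- p
prob = NonlinearProblem(f, [1.0, 1.0], 2.0)
cache = init(prob, NewtonRaphson())
sol = solve!(cache)
```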
28 changes: 1 addition & 27 deletions docs/src/tutorials/large_systems.md
@@ -209,8 +209,7 @@ function incompletelu(W, du, u, p, t, newW, Plprev, Prprev, solverdata)
end

@btime solve(prob_brusselator_2d_sparse,
NewtonRaphson(linsolve = KrylovJL_GMRES(), precs = incompletelu,
concrete_jac = true));
NewtonRaphson(linsolve = KrylovJL_GMRES(), precs = incompletelu, concrete_jac = true));
nothing # hide
```

@@ -275,28 +274,3 @@ nothing # hide

For more information on the preconditioner interface, see the
[linear solver documentation](https://docs.sciml.ai/LinearSolve/stable/basics/Preconditioners/).

## Speed up Jacobian computation with sparsity exploitation and matrix coloring

To cut down on the Jacobian-building overhead, we can exploit the sparsity pattern and deploy matrix coloring during Jacobian construction. With NonlinearSolve.jl, we can simply use `autodiff = AutoSparseForwardDiff()` to automatically exploit the sparsity pattern of Jacobian matrices:

```@example ill_conditioned_nlprob
@btime solve(prob_brusselator_2d,
NewtonRaphson(linsolve = KrylovJL_GMRES(), precs = incompletelu, concrete_jac = true,
autodiff = AutoSparseForwardDiff()));
nothing # hide
```

To set up matrix coloring for the Jacobian sparsity pattern, we can simply get the coloring vector by using [ArrayInterface.jl](https://github.com/JuliaArrays/ArrayInterface.jl) for the sparsity pattern of `jac_prototype`:

```@example ill_conditioned_nlprob
using ArrayInterface
colorvec = ArrayInterface.matrix_colors(jac_sparsity)
ff = NonlinearFunction(brusselator_2d_loop; jac_prototype = float.(jac_sparsity), colorvec)
prob_brusselator_2d_sparse = NonlinearProblem(ff, u0, p)

@btime solve(prob_brusselator_2d_sparse,
NewtonRaphson(linsolve = KrylovJL_GMRES(), precs = incompletelu, concrete_jac = true,
autodiff = AutoSparseForwardDiff()));
nothing # hide
```
2 changes: 1 addition & 1 deletion docs/src/tutorials/small_compile.md
@@ -1,3 +1,3 @@
# Faster Startup and Static Compilation

This is a stub article to be completed soon.
2 changes: 1 addition & 1 deletion docs/src/tutorials/termination_conditions.md
@@ -1,3 +1,3 @@
# [More Detailed Termination Conditions](@id termination_conditions_tutorial)

This is a stub article to be completed soon.