This repository was archived by the owner on Oct 9, 2019. It is now read-only.

Return lower order results #94

Merged · 6 commits · Dec 14, 2016
2 changes: 1 addition & 1 deletion docs/src/demos.md
@@ -105,7 +105,7 @@ julia> F = one(Tensor{2,3}) + rand(Tensor{2,3});

julia> C = tdot(F);

- julia> S_AD = 2 * ContMechTensors.gradient(C -> Ψ(C, μ, Kb), C)
+ julia> S_AD = 2 * gradient(C -> Ψ(C, μ, Kb), C)
3×3 ContMechTensors.SymmetricTensor{2,3,Float64,6}:
4.30534e11 -2.30282e11 -8.52861e10
-2.30282e11 4.38793e11 -2.64481e11
27 changes: 20 additions & 7 deletions docs/src/man/automatic_differentiation.md
@@ -7,24 +7,37 @@ end

# Automatic Differentiation

```@index
Pages = ["automatic_differentiation.md"]
```

`ContMechTensors` supports forward mode automatic differentiation (AD) of tensorial functions to compute first order derivatives (gradients) and second order derivatives (Hessians).
It does this by exploiting the `Dual` number type defined in `ForwardDiff.jl`.
While `ForwardDiff.jl` can itself be used to differentiate tensor functions, it is a bit awkward because `ForwardDiff.jl` is written to work with standard Julia `Array`s: one has to pass the input argument to `ForwardDiff.jl` as an `Array`, convert it to a `Tensor` inside the function, and then convert the output `Array` back to a `Tensor`. This can also be inefficient, since these `Array`s are allocated on the heap and therefore need to be preallocated, which can be annoying.

Instead, it is simpler to use `ContMechTensors`' own AD API to do the differentiation. This requires no conversions, and everything is stack allocated, so there is no need to preallocate.

- The API for AD in `ContMechTensors` is `ContMechTensors.gradient(f, A)` and `ContMechTensors.hessian(f, A)` where `f` is a function and `A` is a first or second order tensor. For `gradient` the function can return a scalar, vector (in case the input is a vector) or a second order tensor. For `hessian` the function should return a scalar.
+ The API for AD in `ContMechTensors` is `gradient(f, A)` and `hessian(f, A)`, where `f` is a function and `A` is a first or second order tensor. For `gradient` the function can return a scalar, a vector (if the input is a vector), or a second order tensor. For `hessian` the function must return a scalar.
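
For instance, the gradient of a second order tensor valued function is a fourth order tensor. A minimal sketch of this case, assuming `gradient` returns the fourth order tensor $\partial f / \partial \mathbf{A}$ as described above (output of the first two calls omitted):

```julia
julia> A = rand(SymmetricTensor{2,3});

julia> D = gradient(dev, A);  # D = ∂(dev A)/∂A, a fourth order tensor

julia> D ⊡ A ≈ dev(A)  # dev is linear in A, so contracting D with A recovers dev(A)
true
```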

When evaluating the function with dual numbers, the lower order results are obtained automatically along with the derivative: the value for `gradient`, and both the value and the gradient for `hessian`. To obtain these lower order results, `gradient` and `hessian` accept a third argument, a `Symbol`. Note that the symbol is only used to dispatch to the correct method, and thus it can be any symbol. In the examples the symbol `:all` is used to obtain all the lower order derivatives and values.
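
For example, a minimal sketch of the three-argument form (the return order shown here, highest order derivative first followed by the lower order results, is an assumption):

```julia
julia> x = rand(Vec{2});

julia> ∇f, f = gradient(norm, x, :all);  # gradient and value of norm at x

julia> ∇²f, ∇f, f = hessian(norm, x, :all);  # hessian, gradient and value
```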

```@docs
gradient
hessian
```

## Examples

Here we give a few examples of differentiating various functions and compare with the analytical solutions.
Owner: Perhaps make this into a header now that the section is becoming quite long: `## Examples`

- ## Norm of a vector
+ ### Norm of a vector

$f(\mathbf{x}) = |\mathbf{x}| \quad \Rightarrow \quad \partial f / \partial \mathbf{x} = \mathbf{x} / |\mathbf{x}|$

```jldoctest
julia> x = rand(Vec{2});

- julia> ContMechTensors.gradient(norm, x)
+ julia> gradient(norm, x)
2-element ContMechTensors.Tensor{1,2,Float64,2}:
0.61036
0.792124
@@ -35,14 +48,14 @@ julia> x / norm(x)
0.792124
```

- ## Determinant of a second order symmetric tensor
+ ### Determinant of a second order symmetric tensor

$f(\mathbf{A}) = \det \mathbf{A} \quad \Rightarrow \quad \partial f / \partial \mathbf{A} = \mathbf{A}^{-T} \det \mathbf{A}$

```jldoctest
julia> A = rand(SymmetricTensor{2,2});

- julia> ContMechTensors.gradient(det, A)
+ julia> gradient(det, A)
2×2 ContMechTensors.SymmetricTensor{2,2,Float64,3}:
0.566237 -0.766797
-0.766797 0.590845
@@ -53,7 +66,7 @@ julia> inv(A)' * det(A)
-0.766797 0.590845
```

- ## Hessian of a quadratic potential
+ ### Hessian of a quadratic potential

$\psi(\mathbf{e}) = 1/2 \mathbf{e} : \mathsf{E} : \mathbf{e} \quad \Rightarrow \quad \partial \psi / (\partial \mathbf{e} \otimes \partial \mathbf{e}) = \mathsf{E}^\text{sym}$
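
This follows since $\partial \psi / \partial \mathbf{e} = \frac{1}{2} (\mathsf{E} : \mathbf{e} + \mathbf{e} : \mathsf{E}) = \mathsf{E}^\text{sym} : \mathbf{e}$, where $\mathsf{E}^\text{sym}$ denotes the major symmetric part of $\mathsf{E}$; differentiating once more gives $\mathsf{E}^\text{sym}$.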

@@ -64,7 +77,7 @@ julia> E = rand(Tensor{4,2});

julia> ψ(ϵ) = 1/2 * ϵ ⊡ E ⊡ ϵ;

- julia> E_sym = ContMechTensors.hessian(ψ, rand(Tensor{2,2}));
+ julia> E_sym = hessian(ψ, rand(Tensor{2,2}));

julia> norm(majorsymmetric(E) - E_sym)
0.0
1 change: 1 addition & 0 deletions src/ContMechTensors.jl
@@ -10,6 +10,7 @@ export AbstractTensor, SymmetricTensor, Tensor, Vec, FourthOrderTensor, SecondOr
export otimes, ⊗, ⊡, dcontract, dev, vol, symmetric, skew, minorsymmetric, majorsymmetric
export minortranspose, majortranspose, isminorsymmetric, ismajorsymmetric
export tdot, dotdot
+ export hessian#, gradient

@deprecate extract_components(tensor) Array(tensor)
