diff --git a/README.md b/README.md
index 683d5f1f..c9e5154b 100644
--- a/README.md
+++ b/README.md
@@ -8,43 +8,62 @@
 ForwardDiff implements methods to take **derivatives**, **gradients**, **Jacobians**, **Hessians**, and higher-order derivatives of native Julia functions (or any callable object, really) using **forward mode automatic differentiation (AD)**.
 
-While performance can vary depending on the functions you evaluate, the algorithms implemented by ForwardDiff **generally outperform non-AD algorithms in both speed and accuracy.**
+While performance can vary depending on the functions you evaluate, the algorithms implemented by ForwardDiff generally outperform non-AD algorithms (such as finite-differencing) in both speed and accuracy.
 
 Here's a simple example showing the package in action:
 
 ```julia
 julia> using ForwardDiff
 
-julia> f(x::Vector) = sum(sin, x) + prod(tan, x) * sum(sqrt, x);
+julia> f(x::Vector) = sin(x[1]) + prod(x[2:end]);  # returns a scalar
 
-julia> x = rand(5) # small size for example's sake
-5-element Array{Float64,1}:
- 0.986403
- 0.140913
- 0.294963
- 0.837125
- 0.650451
+julia> x = vcat(pi/4, 2:4)
+4-element Vector{Float64}:
+ 0.7853981633974483
+ 2.0
+ 3.0
+ 4.0
 
-julia> g = x -> ForwardDiff.gradient(f, x); # g = ∇f
-
-julia> g(x)
-5-element Array{Float64,1}:
- 1.01358
- 2.50014
- 1.72574
- 1.10139
- 1.2445
+julia> ForwardDiff.gradient(f, x)
+4-element Vector{Float64}:
+ 0.7071067811865476
+ 12.0
+ 8.0
+ 6.0
 
 julia> ForwardDiff.hessian(f, x)
-5x5 Array{Float64,2}:
- 0.585111  3.48083  1.7706    0.994057  1.03257
- 3.48083   1.06079  5.79299   3.25245   3.37871
- 1.7706    5.79299  0.423981  1.65416   1.71818
- 0.994057  3.25245  1.65416   0.251396  0.964566
- 1.03257   3.37871  1.71818   0.964566  0.140689
-```
-
-Trying to switch to the latest version of ForwardDiff? See our [upgrade guide](http://www.juliadiff.org/ForwardDiff.jl/stable/user/upgrade/) for details regarding user-facing changes between releases.
+4×4 Matrix{Float64}:
+ -0.707107  0.0  0.0  0.0
+  0.0       0.0  4.0  3.0
+  0.0       4.0  0.0  2.0
+  0.0       3.0  2.0  0.0
+```
+
+Functions like `f` which map a vector to a scalar are the best case for reverse-mode automatic differentiation,
+but ForwardDiff may still be a good choice if `x` is not too large, as it is much simpler.
+The best case for forward-mode differentiation is a function which maps a scalar to a vector, like this `g`:
+
+```julia
+julia> g(y::Real) = [sin(y), cos(y), tan(y)];  # returns a vector
+
+julia> ForwardDiff.derivative(g, pi/4)
+3-element Vector{Float64}:
+  0.7071067811865476
+ -0.7071067811865475
+  1.9999999999999998
+
+julia> ForwardDiff.jacobian(x) do x  # anonymous function, returns a length-2 vector
+         [sin(x[1]), prod(x[2:end])]
+       end
+2×4 Matrix{Float64}:
+ 0.707107   0.0  0.0  0.0
+ 0.0       12.0  8.0  6.0
+```
+
+See [ForwardDiff's documentation](https://juliadiff.org/ForwardDiff.jl/stable) for full details on how to use this package.
+ForwardDiff relies on [DiffRules](https://github.com/JuliaDiff/DiffRules.jl) for the derivatives of many simple functions such as `sin`.
+
+See the [JuliaDiff web page](https://juliadiff.org) for other automatic differentiation packages.
 
 ## Publications
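The accuracy claim in the rewritten intro (AD vs. finite-differencing) can be illustrated directly: forward-mode AD propagates exact derivative rules such as d/dx sin(x) = cos(x), whereas a finite difference trades off truncation against round-off error. A minimal sketch of this comparison (the step size `h` here is an illustrative choice, not a recommendation):

```julia
using ForwardDiff

# Forward-mode AD applies the exact rule d/dx sin(x) = cos(x),
# so the result matches cos(1.0) to the last bit:
ad = ForwardDiff.derivative(sin, 1.0)

# A forward finite difference approximates the same derivative;
# with h = 1e-8 its error is on the order of 1e-8:
h  = 1e-8
fd = (sin(1.0 + h) - sin(1.0)) / h

ad == cos(1.0)        # true: exact to machine precision
abs(fd - cos(1.0))    # roughly 1e-8
```

Both calls cost about one extra function evaluation, but only the finite difference forces a choice of `h`, where too large a step loses accuracy to truncation and too small a step loses it to round-off.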