- Working out derivatives and partial derivatives by hand can be time-consuming and error-prone.
- Functions, even Julia functions, not just mathematical ones, are composed of elementary differentiable operations such as +, -, sin, and cos. A computer program can therefore differentiate them by applying the chain rule to those operations.
- Automatic differentiation (AD) is the name for this process. It differs from both analytical (symbolic) differentiation and numerical approximation; a short comparison after the first example below illustrates the contrast with the numerical approach.
- JuliaDiff lists the available differentiation packages in Julia.
- The Flux deep learning package uses Zygote.jl for AD.
using Zygote
f(x) = 3x^2 + 2x + 1
# Gradient of f at x = 2: f'(x) = 6x + 2, so the derivative here is 14
gradient(f, 2)
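- For contrast, a central finite difference approximates the same derivative numerically; a minimal sketch, where fd_derivative is an illustrative helper, not part of Zygote:
# Numerical approximation: accurate only up to truncation and rounding error
fd_derivative(f, x; h=1e-6) = (f(x + h) - f(x - h)) / (2h)
fd_derivative(f, 2.0)   # ≈ 14.0 with small floating-point error; AD gives the exact value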
# gradient of vectors
x = [2, 1];
y = [2, 0];
f(x, y) = sum((x .- y).^2)
# gradient returns a tuple, with a gradient for each argument to the function.
grad = gradient(f, x, y)
# ([0, 2], [0, -2])
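- The returned tuple can be unpacked into one gradient per argument (gx and gy are just illustrative local names):
gx, gy = grad   # gradient with respect to x, then with respect to y
gx              # [0, 2]
gy              # [0, -2]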
# Gradients of ordinary Julia functions, including loops and other control flow
function pow(x, n)
    r = 1
    for i = 1:n
        r *= x
    end
    return r
end
gradient(x -> pow(x, 3), 5)
# (75,) -- the derivative of x^3 is 3x^2, which is 75 at x = 5
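- Zygote likewise differentiates through conditionals; a small sketch, where relu_like is an illustrative name rather than a library function:
# Each branch has its own derivative: 2x when x > 0, otherwise -1
relu_like(x) = x > 0 ? x^2 : -x
gradient(relu_like, 3.0)    # (6.0,)
gradient(relu_like, -2.0)   # (-1.0,)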