• Working out derivatives and partial derivatives of many functions by hand can be time-consuming and error-prone.
• Functions, even Julia functions, not just mathematical ones, are made up of elementary differentiable operations, such as +, -, sin, cos, etc. So a computer program should be able to differentiate those functions.
• Automatic Differentiation (AD) describes this process. The way it works is different from both analytical (symbolic) and numerical (finite-difference) solutions; see the sketch after this list.
• JuliaDiff lists available differentiation packages in Julia.
• The Flux deep learning package uses Zygote.jl for AD.
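• For example, sin(x^2) is just ^ followed by sin, and AD applies the chain rule to that composition. A minimal sketch (assuming Zygote.jl is installed; the function g and the point x0 are illustrative names), comparing Zygote's result with the analytical derivative and a numerical central difference:

using Zygote

g(x) = sin(x^2)                  # a composite of the elementary operations ^ and sin

x0 = 1.5
ad = gradient(g, x0)[1]          # automatic differentiation
analytic = cos(x0^2) * 2 * x0    # chain rule worked out by hand
numerical = (g(x0 + 1e-6) - g(x0 - 1e-6)) / 2e-6   # central finite difference

ad ≈ analytic                          # true
isapprox(ad, numerical; atol = 1e-6)   # true, up to truncation error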
using Zygote

f(x) = 3x^2 + 2x + 1

# Gradient of f at x = 2; analytically, f'(x) = 6x + 2
gradient(f, 2)
(14.0,)
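
• Zygote also overloads the adjoint operator ' on functions, so the same derivative can be written more compactly:

f'(2)
14.0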

x = [2, 1];
y = [2, 0];
f(x, y) = sum((x .- y).^2)

# gradient returns a tuple, with one gradient for each argument to the function.
gradient(f, x, y)
([0.0, 2.0], [-0.0, -2.0])
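
• This loss-and-gradient pattern is what Flux builds on: evaluate a loss, take its gradient with respect to the quantities being optimised, and step against the gradient. A minimal sketch of one gradient-descent step (the step size η and the name x_new are illustrative):

η = 0.1                        # step size (illustrative choice)
gx, gy = gradient(f, x, y)     # gradients with respect to each argument
x_new = x .- η .* gx           # one gradient-descent step on x
f(x_new, y) < f(x, y)          # true: the loss has decreased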