Calculus for robots: derivatives, gradients, Jacobians
Only what robots need. Derivatives as rates, gradients as steepest-ascent, Jacobians as the matrix that connects joint velocities to end-effector velocities — and almost no symbolic integration.
Calculus shows up in robotics roughly four ways: derivatives (velocities), gradients (optimization), Jacobians (kinematics, sensitivity), and the chain rule (backprop). Almost never as symbolic integration. Here's what to know cold.
The derivative — rate of change
For a scalar function f(x), the derivative is the limit of the difference quotient:

f′(x) = lim_{h→0} (f(x + h) − f(x)) / h
In robotics, when the input is time, the derivative is a velocity:
- v = dx/dt — velocity is the time-derivative of position
- a = dv/dt — acceleration is the time-derivative of velocity
- θ̇ = dθ/dt — joint velocity, the time-derivative of joint angle
You compute these in code via finite differences (v ≈ (x(t + Δt) − x(t)) / Δt) or by reading from a sensor that measures velocity directly (gyroscope, encoder).
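A minimal sketch of the finite-difference estimate; the sample period and position readings are illustration values:

dt = 0.01                       # 100 Hz sample period
x_prev, x_curr = 1.00, 1.02     # position at t and at t + dt
v_est = (x_curr - x_prev) / dt  # forward difference, ≈ 2.0 units/s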
Partial derivatives — change in one direction
For a function of multiple inputs f(x₁, …, xₙ), the partial derivative ∂f/∂xᵢ measures how f changes if you wiggle only xᵢ and hold the others fixed.
Example: end-effector x-coordinate of a 2-DOF planar arm with link lengths L₁, L₂:

x(θ₁, θ₂) = L₁ cos θ₁ + L₂ cos(θ₁ + θ₂)

∂x/∂θ₁ = −L₁ sin θ₁ − L₂ sin(θ₁ + θ₂)

Reads as: "if I move θ₁ a little, the end-effector x changes by this much per unit of θ₁."
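To sanity-check a partial like this, compare it against a finite difference. A minimal sketch; the link lengths and joint angles are illustration values:

import numpy as np

L1, L2 = 1.0, 0.5     # illustration link lengths
th1, th2 = 0.3, 0.7   # illustration joint angles

def ee_x(th1, th2):
    # end-effector x of the 2-DOF planar arm above
    return L1 * np.cos(th1) + L2 * np.cos(th1 + th2)

analytic = -L1 * np.sin(th1) - L2 * np.sin(th1 + th2)
h = 1e-6
numeric = (ee_x(th1 + h, th2) - ee_x(th1, th2)) / h
# analytic and numeric agree to roughly six decimal places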
Gradient — steepest-ascent direction
Stack the partials of a scalar function f with respect to each input into a vector:

∇f = [∂f/∂x₁, …, ∂f/∂xₙ]ᵀ
The gradient points in the direction of steepest increase. Gradient descent — the engine of every neural net ever trained — takes a step in the direction of the negative gradient:

x ← x − η ∇f(x)

where η is the learning rate. That's it. That's gradient descent. Every modern ML framework wraps this in autograd machinery, but the operation is the same.
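A minimal sketch of that update loop on a 2D quadratic; the objective, start point, and learning rate are illustration values:

import numpy as np

# f(p) = p[0]**2 + 4 * p[1]**2; its analytic gradient:
grad = lambda p: np.array([2 * p[0], 8 * p[1]])
p = np.array([3.0, -2.0])          # arbitrary start point
eta = 0.1                          # learning rate
for _ in range(100):
    p = p - eta * grad(p)          # p converges toward (0, 0)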
Jacobian — the matrix-valued partial derivative
For a function f: ℝⁿ → ℝᵐ (vector input, vector output), the Jacobian J is the m×n matrix of partials:

Jᵢⱼ = ∂fᵢ/∂xⱼ
For an arm's forward kinematics x = f(q), the Jacobian connects joint velocities to end-effector velocities:

ẋ = J(q) q̇
If you know your robot's Jacobian, you know:
- Forward velocity kinematics: command joint velocities, predict end-effector velocity.
- Inverse velocity kinematics: want a particular end-effector velocity? Solve q̇ = J⁺ẋ (pseudoinverse); see the sketch below.
- Manipulability: rank of J tells you in which directions the end-effector can move.
- Force mapping: end-effector forces map to joint torques via τ = Jᵀ F.
The Jacobian is the second-most-important matrix in robotics, after the rotation matrix.
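A minimal sketch of both velocity mappings for the 2-DOF planar arm above; the link lengths, angles, and velocities are illustration values:

import numpy as np

L1, L2 = 1.0, 0.5  # illustration link lengths

def jacobian(th1, th2):
    # 2x2 Jacobian of the planar arm's (x, y) end-effector position
    s1, c1 = np.sin(th1), np.cos(th1)
    s12, c12 = np.sin(th1 + th2), np.cos(th1 + th2)
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

J = jacobian(0.3, 0.7)
q_dot = np.array([0.1, -0.2])           # joint velocities
x_dot = J @ q_dot                       # forward velocity kinematics
q_dot_back = np.linalg.pinv(J) @ x_dot  # inverse via pseudoinverse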
Chain rule — the engine of backprop
If y = f(g(x)), then dy/dx = f′(g(x)) · g′(x). For vector functions, Jacobians multiply along the composition:

J_{f∘g}(x) = J_f(g(x)) · J_g(x)
This is how neural networks compute gradients: chain forward through the layers, propagate the Jacobians backward, multiply along the chain. PyTorch and JAX automate this; you almost never apply the chain rule by hand. But understanding it lets you debug autograd when it disagrees with your intuition.
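A minimal numeric check of the scalar chain rule, using an arbitrary composite for illustration:

import numpy as np

# y = sin(x**2), so dy/dx = cos(x**2) * 2*x by the chain rule
x = 0.8
analytic = np.cos(x**2) * 2 * x
h = 1e-6
numeric = (np.sin((x + h)**2) - np.sin(x**2)) / h
# analytic and numeric agree to roughly six decimal places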
Integrals you actually compute
In robotics, integrals are usually:
- Numerical: integrate a state forward through dynamics. Euler, RK4, etc. (see the sketch after this list).
- Symbolic for inertia tensors: I = ∫ ρ(r) (‖r‖² E₃ − r rᵀ) dV for rigid bodies. CAD tools or formulas, not pencil-and-paper.
- Probabilistic marginalization: p(x) = ∫ p(x, z) dz. Closed form for Gaussians; otherwise approximate.
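A minimal sketch of forward Euler integration; the dynamics (constant gravity), time step, and initial state are illustration values:

dt = 0.01
x, v = 10.0, 0.0        # initial height and velocity
for _ in range(100):    # integrate one second forward
    x += v * dt         # position update
    v += -9.81 * dt     # velocity update under constant gravity
# x ends near 10 - 0.5 * 9.81 ≈ 5.1, plus Euler's O(dt) error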
If you're solving an integral by integration-by-parts in robotics, you've taken a wrong turn.
Numerical differentiation in code
You will sometimes need a Jacobian without a closed form. Use finite differences:
import numpy as np

def numerical_jacobian(f, x, h=1e-6):
    """Forward-difference Jacobian of f at x (one column per input)."""
    n = len(x)
    fx = f(x)
    m = len(fx)
    J = np.zeros((m, n))
    for j in range(n):
        x_perturbed = x.copy()
        x_perturbed[j] += h                  # perturb one input at a time
        J[:, j] = (f(x_perturbed) - fx) / h  # forward difference column
    return J
Two gotchas:
- Step size h: too small → rounding error; too big → truncation error. h ≈ 10⁻⁶, as in the code above, is a reasonable default for double-precision inputs.
- Central differences: (f(x + h) − f(x − h)) / (2h) is more accurate than forward differences but costs 2× function evaluations; a sketch follows below.
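A minimal central-difference variant of numerical_jacobian above:

import numpy as np

def numerical_jacobian_central(f, x, h=1e-6):
    # One extra evaluation per column buys O(h**2) accuracy.
    n = len(x)
    J = np.zeros((len(f(x)), n))
    for j in range(n):
        xp, xm = x.copy(), x.copy()
        xp[j] += h                           # perturb up
        xm[j] -= h                           # perturb down
        J[:, j] = (f(xp) - f(xm)) / (2 * h)  # central difference column
    return J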
Modern robotics code increasingly uses autodiff (PyTorch, JAX) to get exact derivatives without finite differences. Faster and more accurate.
The 30-minute drill
Make sure you can do these without notes:
- Compute d/dt of sin(θ(t)) (chain rule).
- Compute the gradient of a simple quadratic like f(x, y) = x² + y².
- For the 2-DOF arm above, write the 2×2 Jacobian for the end-effector (x, y).
- Implement gradient descent in 5 lines on a 2D quadratic.
If those land easily, you have enough calculus. Add depth as topics come up.
What you almost never need
- Integration by parts
- Trigonometric substitution
- Surface integrals (unless you're doing fluid sim)
- Differential forms
- Most of "Calculus 2"
Don't grind these. If you ever need them, they take an afternoon to refresh.
Next
Coordinate frames and unit hygiene — the silent source of half of all robotics bugs. After that, you have the math foundations done.