CGO 07-1: Automatic Differentiation

Download the Jupyter Notebook file from here.

This notebook uses PyTorch. Instructions for installing it can be found at https://pytorch.org/

import torch
from torch import nn
import math

Function

We wish to model the following function:

\[f(x) = \frac{ x_1 x_2 \sin(x_3) + e^{x_1 x_2} }{ x_3 }\]

We will compute the gradient at the point $x = [x_1,\,x_2,\,x_3]^\intercal = [1,\,2,\,\pi/2]^\intercal$.
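As a hand-derived sanity check (not produced by PyTorch), the gradient is

\[\nabla f(x) = \frac{1}{x_3}\begin{bmatrix} x_2\left(\sin(x_3) + e^{x_1 x_2}\right) \\ x_1\left(\sin(x_3) + e^{x_1 x_2}\right) \\ x_1 x_2 \cos(x_3) - \frac{x_1 x_2 \sin(x_3) + e^{x_1 x_2}}{x_3} \end{bmatrix},\]

which at $x = [1,\,2,\,\pi/2]^\intercal$ evaluates to approximately $[10.68,\,5.34,\,-3.81]^\intercal$.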

# Create the input values
x = torch.tensor( [1.0, 2.0, math.pi/2] )

# requires_grad=True makes autograd track operations on x so we can
# differentiate through them (and visualize the graph later)
x.requires_grad = True

# Define the function
def f( x ):
    return ( x[0]*x[1]*torch.sin(x[2]) + torch.exp(x[0]*x[1]) ) / x[2]

# Evaluate it at x
y = f(x)

# Calculate the gradient in reverse mode
y.backward()
print( x.grad )
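An equivalent way to obtain the same numbers, sketched here as an alternative to backward(), is the functional interface torch.autograd.grad, which returns the gradient directly instead of accumulating it into x.grad (the names x_alt and g_alt below are just placeholders for this sketch):

# Functional interface: returns the gradient instead of storing it in x.grad
x_alt = torch.tensor( [1.0, 2.0, math.pi/2], requires_grad=True )
(g_alt,) = torch.autograd.grad( f(x_alt), (x_alt,) )
print( g_alt )  # should match x.grad printed above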

Visualization of the Computational Graph

We can visualize the computational graph using torchviz.

pip install torchviz

from torchviz import make_dot
make_dot( y )
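If the notebook cannot display the graph inline, the Digraph object returned by make_dot can also be written to disk. This sketch assumes the Graphviz binaries are installed on the system; the file name is just an example:

# Render the graph to a file instead of displaying it inline
dot = make_dot( y )
dot.render( 'computational_graph', format='png' )  # writes computational_graph.png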

Computing the Hessian

We can easily compute gradients of gradients using the automatic differentiation system. This lets us obtain derivatives of arbitrary order, exact up to floating-point error, for any differentiable function we can express in the framework.

As a simple example we compute the Hessian of $f(x) = x_1^2 + 4 x_2^2$ with $x=[1,1]^\intercal$.
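For this function the derivatives can be written down directly, so we know what to expect:

\[\nabla f(x) = \begin{bmatrix} 2 x_1 \\ 8 x_2 \end{bmatrix}, \qquad H(x) = \begin{bmatrix} 2 & 0 \\ 0 & 8 \end{bmatrix},\]

so at $x=[1,\,1]^\intercal$ the gradient is $[2,\,8]^\intercal$ and the Hessian is constant.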

# Trivial function
x = torch.tensor( [1.0, 1.0] )
def f(x):
    return x[0]**2 + 4*x[1]**2

# requires_grad=True makes autograd track operations on x so we can
# differentiate through them
x.requires_grad = True

# Gradient (f is scalar-valued, so the Jacobian is just the gradient);
# create_graph=True keeps the graph so we can differentiate g again
[g] = torch.autograd.grad( f(x), (x,), create_graph=True )
print( 'g:', g )

# Hessian: differentiate each component of the gradient with respect to x,
# giving one row of the Hessian per pass
h = torch.zeros( (g.numel(), g.numel()) )
for i in range(g.numel()):
    # create_graph=True also keeps the graph alive for the next iteration
    [h[i,:]] = torch.autograd.grad( g[i], (x,), create_graph=True )
print( 'h:', h )
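Newer PyTorch versions also provide a convenience routine that builds the same matrix without the explicit loop. The following is a minimal sketch using torch.autograd.functional.hessian (the variable name x0 is just a placeholder):

# Convenience routine: computes the full Hessian of a scalar function directly
from torch.autograd.functional import hessian
x0 = torch.tensor( [1.0, 1.0] )
print( hessian( f, x0 ) )  # expected: [[2., 0.], [0., 8.]]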