
First Order Derivatives in the Forward Mode of Algorithmic Differentiation

Example 1:

In this example we want to show how one can extract derivatives from the computed univariate Taylor polynomials (UTPs). For simplicity we only show first-order derivatives, but ALGOPY also supports the computation of higher-order derivatives by an interpolation-evaluation approach.

The basic observation is that, by use of the chain rule, one obtains for a function $$F: \mathbb R^N \rightarrow \mathbb R^M$$

$\left. \frac{d}{d t} F(x_0 + x_1 t) \right|_{t=0} = \left. \frac{d}{d x} F(x) \right|_{x = x_0} \cdot x_1\;,$

i.e. a Jacobian-vector product.
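To see where the derivative shows up in the Taylor coefficients, one can propagate a degree-1 polynomial through a scalar function by hand. A minimal sketch (not using ALGOPY) for the simple choice $$f(x) = x^2$$:

```python
# Propagate x(t) = x0 + x1*t through f(x) = x**2 and truncate at t**2:
#   f(x(t)) = x0**2 + (2*x0*x1)*t + O(t**2),
# so the coefficient of t is exactly f'(x0) * x1.
x0, x1 = 3.0, 1.0
y0 = x0 * x0            # zeroth coefficient: the function value f(x0) = 9.0
y1 = x0 * x1 + x1 * x0  # first coefficient: f'(x0) * x1 = 6.0
print(y0, y1)
```

This truncated-polynomial arithmetic is exactly what the UTPM data type below automates for vector-valued functions.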

Again, we look at a simple contrived example. We want to compute the first column of the Jacobian, i.e., we choose the direction $$x_1 = (1,0,0)$$.

import numpy; from numpy import array
from algopy import UTPM, zeros

def F(x):
    y = zeros(3, dtype=x)
    y[0] = x[0]*x[1]
    y[1] = x[1]*x[2]
    y[2] = x[2]*x[0]
    return y

x0 = array([1,3,5], dtype=float)
x1 = array([1,0,0], dtype=float)

D = 2; P = 1  # D Taylor coefficients (degree D-1), P directions
x = UTPM(numpy.zeros((D,P,3)))
x.data[0,0,:] = x0  # zeroth-order coefficients: the evaluation point
x.data[1,0,:] = x1  # first-order coefficients: the direction

# normal function evaluation
y0 = F(x0)

# UTP function evaluation
y = F(x)

print('y0 = ', y0)
print('y  = ', y)
print('y.shape =', y.shape)
print('y.data.shape =', y.data.shape)
print('dF/dx(x0) * x1 =', y.data[1,0])


As output one gets:

y0 =  [  3.  15.   5.]
y  =  [[[  3.  15.   5.]]

[[  3.   0.   5.]]]
y.shape = (3,)
y.data.shape = (2, 1, 3)
dF/dx(x0) * x1 = [ 3.  0.  5.]


and the question is how to interpret this result. First off, y0 is just the usual function evaluation using NumPy, whereas y represents a univariate Taylor polynomial (UTP). One can see that each coefficient of the polynomial has the shape (3,). The directional derivative is the first-order coefficient of the UTP, i.e., y.data[1,0].

One can see that this is indeed the numerical value of the first column of the Jacobian evaluated at $$(1,3,5)$$:

def J(x):
    ret = numpy.zeros((3,3), dtype=float)
    ret[0,:] = [x[1], x[0],   0 ]
    ret[1,:] = [  0 , x[2], x[1]]
    ret[2,:] = [x[2],   0 , x[0]]
    return ret
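As a quick cross-check (not part of the original example), one can evaluate this analytic Jacobian at $$x_0 = (1,3,5)$$ and multiply it by $$x_1 = (1,0,0)$$ with plain NumPy:

```python
import numpy

x0 = numpy.array([1., 3., 5.])
x1 = numpy.array([1., 0., 0.])
# analytic Jacobian of F evaluated at x0
J0 = numpy.array([[x0[1], x0[0], 0.   ],
                  [0.,    x0[2], x0[1]],
                  [x0[2], 0.,    x0[0]]])
print(J0.dot(x1))  # first column of J0: [3. 0. 5.]
```

This matches the coefficient y.data[1,0] computed by the UTP evaluation above.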


Example 2:

ALGOPY is not limited to series expansions of simple functions like the one shown above. A particular strength of ALGOPY is that it can propagate series expansions through numerical linear algebra functions. Consider the following contrived example, which appears in similar form in statistically motivated functions. The goal is to compute the gradient

$\left. \nabla_x f(x) \right|_{x = (3,7)}\;.$
import numpy; from numpy import log, exp, sin, cos, abs
import algopy; from algopy import UTPM, dot, inv, zeros

def f(x):
    A = zeros((2,2), dtype=x)
    A[0,0] = numpy.log(x[0]*x[1])
    A[0,1] = numpy.log(x[1]) + exp(x[0])
    A[1,0] = sin(x[0])**2 + abs(cos(x[0]))**3.1
    A[1,1] = x[0]**cos(x[1])
    return log( dot(x.T, dot(inv(A), x)))

x = numpy.array([3.,7.])
x = UTPM.init_jacobian(x)

y = f(x)

print('normal function evaluation f(x) = ',y.data[0,0])
print('Jacobian df/dx = ', UTPM.extract_jacobian(y))
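To gain confidence in the result, one can cross-check the ALGOPY gradient against central finite differences of a plain-NumPy version of f. A sketch (the name f_np and the step size h are illustrative choices, not part of ALGOPY):

```python
import numpy

def f_np(x):
    # plain-NumPy re-implementation of f for a finite-difference check
    A = numpy.zeros((2, 2))
    A[0, 0] = numpy.log(x[0] * x[1])
    A[0, 1] = numpy.log(x[1]) + numpy.exp(x[0])
    A[1, 0] = numpy.sin(x[0])**2 + numpy.abs(numpy.cos(x[0]))**3.1
    A[1, 1] = x[0]**numpy.cos(x[1])
    return numpy.log(x.dot(numpy.linalg.inv(A).dot(x)))

x0 = numpy.array([3., 7.])
h = 1e-7
# central differences along each coordinate direction
grad_fd = numpy.array([
    (f_np(x0 + h*numpy.eye(2)[i]) - f_np(x0 - h*numpy.eye(2)[i])) / (2*h)
    for i in range(2)])
print('finite-difference gradient =', grad_fd)
```

The finite-difference gradient should agree with UTPM.extract_jacobian(y) up to the truncation and rounding error of the difference quotient, whereas the ALGOPY result is exact up to floating-point rounding.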