
Case 3:


\begin{eqnarray*}
\int \exp\left( - (Y - B F)^{\mathsf{T}} V^{-1} (Y - B F) \right) \, dF
 &=& \int \exp\left( - Y^{\mathsf{T}} V^{-1} Y + 2 Y^{\mathsf{T}} V^{-1} B F - F^{\mathsf{T}} B^{\mathsf{T}} V^{-1} B F \right) \, dF \\
 &=& \vert\det(K)\vert \int \exp\left( - Y^{\mathsf{T}} V^{-1} Y + 2 Y^{\mathsf{T}} V^{-1} B K G - G^{\mathsf{T}} K^{\mathsf{T}} B^{\mathsf{T}} V^{-1} B K G \right) \, dG \\
 &=& \vert\det(K)\vert \int \exp\left( - Y^{\mathsf{T}} V^{-1} Y + 2 Y^{\mathsf{T}} V^{-1} B K G - G^{\mathsf{T}} G \right) \, dG \\
 &=& \vert\det(K)\vert \int \exp\left( - (G - K^{\mathsf{T}} B^{\mathsf{T}} V^{-1} Y)^{\mathsf{T}} (G - K^{\mathsf{T}} B^{\mathsf{T}} V^{-1} Y) \right) \, dG \\
 & & \qquad \times \, \exp\left( - Y^{\mathsf{T}} V^{-1} Y + Y^{\mathsf{T}} V^{-1} B K K^{\mathsf{T}} B^{\mathsf{T}} V^{-1} Y \right) \\
 &=& \vert\det(K)\vert \, \pi^{M/2} \exp\left( - Y^{\mathsf{T}} V^{-1/2} \left( I - V^{-1/2} B K K^{\mathsf{T}} B^{\mathsf{T}} V^{-1/2} \right) V^{-1/2} Y \right)
\end{eqnarray*}

where $\dim(Y) = N$, $\dim(F) = \dim(G) = M \le N$, $F = KG$ and $K^{\mathsf{T}} B^{\mathsf{T}} V^{-1} B K = I$. This assumes that $\mathrm{rank}(B^{\mathsf{T}} V^{-1} B) = M$, so that $K = (B^{\mathsf{T}} V^{-1} B)^{-1/2}$.
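The whitening condition $K^{\mathsf{T}} B^{\mathsf{T}} V^{-1} B K = I$ is easy to check numerically. A minimal NumPy sketch, assuming an arbitrary full-column-rank $B$ and a symmetric positive-definite $V$ (all values here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 5, 3

# Illustrative inputs: a full-column-rank design B and an SPD covariance V.
B = rng.standard_normal((N, M))
A = rng.standard_normal((N, N))
V = A @ A.T + N * np.eye(N)

# K = (B^T V^{-1} B)^{-1/2}: the inverse matrix square root, built from
# the eigendecomposition of the symmetric matrix B^T V^{-1} B.
S = B.T @ np.linalg.inv(V) @ B
w, Q = np.linalg.eigh(S)
K = Q @ np.diag(w ** -0.5) @ Q.T

# With the substitution F = K G, the quadratic form in G is whitened:
I_check = K.T @ B.T @ np.linalg.inv(V) @ B @ K
print(np.allclose(I_check, np.eye(M)))  # True
```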

For $M = N = \mathrm{rank}(B)$:

$\mathrm{rank}(B^{\mathsf{T}} V^{-1} B) = N$ implies that $\mathrm{rank}(B) = N$ and $\mathrm{rank}(V) = N$. Consequently, using the decompositions $B = U_B D_B W_B^{\mathsf{T}}$ (SVD) and $V = U_V D_V U_V^{\mathsf{T}}$ (eigendecomposition) gives $K = W_B D_B^{-1} U_B^{\mathsf{T}} U_V D_V^{1/2}$. Therefore $B K = U_V D_V^{1/2}$, and so $V^{-1/2} B K K^{\mathsf{T}} B^{\mathsf{T}} V^{-1/2} = I$, so that the integral is given by:
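The two identities used here ($BK = U_V D_V^{1/2}$ and hence $B K K^{\mathsf{T}} B^{\mathsf{T}} = V$, which makes the exponent vanish) can be verified directly; a sketch with arbitrary full-rank example matrices:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 4  # M = N and B full rank

B = rng.standard_normal((N, N))
A = rng.standard_normal((N, N))
V = A @ A.T + N * np.eye(N)

# B = U_B D_B W_B^T by SVD; V = U_V D_V U_V^T by eigendecomposition.
U_B, d_B, W_BT = np.linalg.svd(B)
d_V, U_V = np.linalg.eigh(V)

K = W_BT.T @ np.diag(1.0 / d_B) @ U_B.T @ U_V @ np.diag(d_V ** 0.5)

# B K collapses to U_V D_V^{1/2} ...
print(np.allclose(B @ K, U_V @ np.diag(d_V ** 0.5)))  # True

# ... so B K K^T B^T = V and the exponent in the integral vanishes.
print(np.allclose(B @ K @ K.T @ B.T, V))  # True
```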

$\displaystyle \int \exp\left( - (Y - B F)^{\mathsf{T}} V^{-1} (Y - B F) \right) \, dF = \vert\det(B^{\mathsf{T}} V^{-1} B)\vert^{-1/2} \, \pi^{M/2}$ (17)
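As a quick sanity check, equation (17) can be evaluated in the scalar case $N = M = 1$, where it reduces to $\int \exp(-(y - b f)^2 / v)\, df = (b^2/v)^{-1/2} \sqrt{\pi}$; the values below are arbitrary:

```python
import numpy as np

# Scalar instance of equation (17): B = b, V = v (illustrative values).
b, v, y = 1.7, 2.3, 0.9

# Brute-force quadrature over a wide, fine grid.
f = np.linspace(-60.0, 60.0, 400_001)
integrand = np.exp(-(y - b * f) ** 2 / v)
numeric = np.sum(integrand) * (f[1] - f[0])

# Closed form: |b^2 / v|^{-1/2} * pi^{1/2}, independent of y.
closed_form = abs(b ** 2 / v) ** -0.5 * np.pi ** 0.5
print(np.isclose(numeric, closed_form, rtol=1e-6))  # True
```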

For $\mathrm{rank}(B) = M < N$:

$B^{\mathsf{T}} V^{-1} B$ is taken to have full rank, but $B K K^{\mathsf{T}} B^{\mathsf{T}}$ does not, as $D_B$ is not a square matrix and so $D_B^{-1}$ does not exist. Instead, $K = W (D^{\mathsf{T}} D)^{-1/2}$ where $V^{-1/2} B = U D W^{\mathsf{T}}$ by SVD. As $D$ has dimensions $N$ by $M$ (the same as $B$), $D^{\mathsf{T}} D$ is $M$ by $M$ and hence invertible (by virtue of $B^{\mathsf{T}} V^{-1} B$ being full rank). Therefore $K^{\mathsf{T}} B^{\mathsf{T}} V^{-1} B K = (D^{\mathsf{T}} D)^{-1/2} (D^{\mathsf{T}} D) (D^{\mathsf{T}} D)^{-1/2} = I$ as desired, but $V^{-1/2} B K K^{\mathsf{T}} B^{\mathsf{T}} V^{-1/2} = U D (D^{\mathsf{T}} D)^{-1} D^{\mathsf{T}} U^{\mathsf{T}} = P_w$, which is a projection matrix. Consequently, $R_w = I - P_w = I - V^{-1/2} B K K^{\mathsf{T}} B^{\mathsf{T}} V^{-1/2} = I - V^{-1/2} B (B^{\mathsf{T}} V^{-1} B)^{-1} B^{\mathsf{T}} V^{-1/2}$ is not zero, but is the residual projection matrix (onto the orthogonal complement of the column space of $V^{-1/2} B$, the prewhitened version of $B$). The integral can then be written as:

\begin{eqnarray*}
\int \exp\left( - (Y - B F)^{\mathsf{T}} V^{-1} (Y - B F) \right) \, dF
 &=& \vert\det(K)\vert \, \pi^{M/2} \exp\left( - Y^{\mathsf{T}} V^{-1/2} R_w V^{-1/2} Y \right) \\
 &=& \vert\det(B^{\mathsf{T}} V^{-1} B)\vert^{-1/2} \, \pi^{M/2} \exp\left( - Y^{\mathsf{T}} R_c Y \right) \qquad (18)
\end{eqnarray*}

where $R_c = V^{-1/2} R_w V^{-1/2}$ is the residual projection matrix in the coloured space, and $R_w = I - V^{-1/2} B (B^{\mathsf{T}} V^{-1} B)^{-1} B^{\mathsf{T}} V^{-1/2}$ is the residual projection matrix in the whitened space.
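A numerical spot-check of equation (18) and of the projection structure of $R_w$, using assumed example values with $N = 2$, $M = 1$:

```python
import numpy as np

# Assumed example values: N = 2 observations, M = 1 parameter.
B = np.array([[0.8], [-1.3]])
V = np.array([[2.0, 0.3],
              [0.3, 1.5]])
Y = np.array([0.7, -0.4])

Vinv = np.linalg.inv(V)
S = B.T @ Vinv @ B                       # B^T V^{-1} B (1 x 1 here)

# Left-hand side: brute-force quadrature over the single parameter f.
f = np.linspace(-60.0, 60.0, 200_001)
R = Y[:, None] - B @ f[None, :]          # residual Y - B f at each grid point
integrand = np.exp(-np.einsum('ik,ij,jk->k', R, Vinv, R))
lhs = np.sum(integrand) * (f[1] - f[0])

# Right-hand side: note R_c = V^{-1} - V^{-1} B (B^T V^{-1} B)^{-1} B^T V^{-1}.
Rc = Vinv - Vinv @ B @ np.linalg.inv(S) @ B.T @ Vinv
rhs = abs(np.linalg.det(S)) ** -0.5 * np.pi ** 0.5 * np.exp(-Y @ Rc @ Y)
print(np.isclose(lhs, rhs, rtol=1e-6))            # True

# R_w is idempotent (a projection), and R_c = V^{-1/2} R_w V^{-1/2}.
w, Q = np.linalg.eigh(V)
V_mhalf = Q @ np.diag(w ** -0.5) @ Q.T
Rw = np.eye(2) - V_mhalf @ B @ np.linalg.inv(S) @ B.T @ V_mhalf
print(np.allclose(Rw @ Rw, Rw))                   # True
print(np.allclose(V_mhalf @ Rw @ V_mhalf, Rc))    # True
```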

For $\mathrm{rank}(B) < M$:

In this case the matrix $B$ has linearly dependent columns, and $K^{\mathsf{T}} B^{\mathsf{T}} V^{-1} B K = I$ cannot be achieved. Instead, let the number of linearly independent columns be $N_1 = \mathrm{rank}(B) < M$. Furthermore, let $V^{-1/2} B = U D W^{\mathsf{T}}$ by SVD, where

$\displaystyle D = \left[ \begin{array}{cc} D_1 & 0 \\ 0 & 0 \end{array} \right]$

where $D_1$ is an $N_1$ by $N_1$ diagonal matrix with all diagonal entries non-zero. Integration over the parameters associated with the zero singular values can be carried out provided they have finite extent. Letting $F = KG$ with $K = W$ gives
\begin{eqnarray*}
\int \exp\left( - (Y - B F)^{\mathsf{T}} V^{-1} (Y - B F) \right) \, dF
 &=& \int \exp\left( - (V^{-1/2} Y - U D G)^{\mathsf{T}} (V^{-1/2} Y - U D G) \right) \, dG \\
 &=& L^{M - N_1} \, \vert\det(D_1^{\mathsf{T}} D_1)\vert^{-1/2} \, \pi^{N_1/2} \exp\left( - Y^{\mathsf{T}} R_c Y \right) \qquad (19)
\end{eqnarray*}

where the null parameters are integrated over $[0, L]$, and $R_c = V^{-1} - V^{-1/2} U D_0 (D_1^{\mathsf{T}} D_1)^{-1} D_0^{\mathsf{T}} U^{\mathsf{T}} V^{-1/2}$, with

$\displaystyle D_0 = \left[ \begin{array}{c} D_1 \\ 0 \end{array} \right]$

which is an $ N$ by $ N_1$ matrix.
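Equation (19) can likewise be spot-checked numerically. The sketch below uses assumed values with $N = M = 2$ and $\mathrm{rank}(B) = N_1 = 1$: only the $N_1$ non-null directions are Gaussian (contributing the factor $\pi^{N_1/2}$), while the null coordinate is integrated over $[0, L]$.

```python
import numpy as np

# Assumed rank-deficient example: N = M = 2, rank(B) = N_1 = 1
# (the second column of B is twice the first).
B = np.array([[1.0, 2.0],
              [0.5, 1.0]])
V = np.array([[2.0, 0.3],
              [0.3, 1.5]])
Y = np.array([0.7, -0.4])
L = 3.0    # finite extent of the null parameter
N1 = 1

# Whiten and decompose: V^{-1/2} B = U D W^T by SVD.
w, Q = np.linalg.eigh(V)
V_mhalf = Q @ np.diag(w ** -0.5) @ Q.T
U, d, WT = np.linalg.svd(V_mhalf @ B)
W = WT.T

# LHS with K = W (so F = W G, |det W| = 1): integrate G_1 on a wide grid;
# the exponent is independent of G_0, whose range [0, L] contributes a factor L.
g1 = np.linspace(-60.0, 60.0, 200_001)
F = W @ np.vstack([g1, np.zeros_like(g1)])
R = Y[:, None] - B @ F
Vinv = np.linalg.inv(V)
integrand = np.exp(-np.einsum('ik,ij,jk->k', R, Vinv, R))
lhs = L * np.sum(integrand) * (g1[1] - g1[0])

# RHS of equation (19): here U D_0 (D_1^T D_1)^{-1} D_0^T U^T reduces to
# U_1 U_1^T, with U_1 the first N_1 columns of U (D_1 is invertible).
U1 = U[:, :N1]
Rc = Vinv - V_mhalf @ U1 @ U1.T @ V_mhalf
rhs = (L ** (2 - N1) * np.prod(d[:N1] ** 2) ** -0.5
       * np.pi ** (N1 / 2) * np.exp(-Y @ Rc @ Y))
print(np.isclose(lhs, rhs, rtol=1e-6))  # True
```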

