Large Scale Disparity

In this case a small portion of the floating volume is stretched to cover the reference volume. Therefore the floating volume (Y) is approximately constant over the volume of overlap. Consequently:

\begin{eqnarray}
C_{W} & \approx & \sum_i \frac{n_i}{N} \, \frac{\sqrt{\mathrm{Var}(Y_i)}}{\bar{Y}} \;\rightarrow\; 0 \qquad (23) \\
C_{CR} & \approx & \frac{1}{\mathrm{Var}(Y)} \sum_{i} \frac{n_i}{N} \, \mathrm{Var}(Y_i)
\;\approx\; \frac{1}{\mathrm{Var}(Y)} \, \mathrm{Var}(Y) = 1 \qquad (24) \\
C_{JE} & = & H(X,Y) \;\approx\; H(X) \qquad (25) \\
C_{MI} & \approx & H(X) - H(X) - 0 = 0 \qquad (26) \\
C_{NMI} & \approx & \frac{H(X)}{H(X) + 0} = 1. \qquad (27)
\end{eqnarray}

where the joint histogram becomes compressed into a small horizontal band (minimal variation in Y) and it is assumed that the spatial variation of Y is uncorrelated with the iso-sets of X, such that $\sum_i \frac{n_i}{N} \ensuremath{{\mathrm{Var}}} (Y_i) \approx \ensuremath{{\mathrm{Var}}} (Y)$.
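These limits for the Woods function and the Correlation Ratio can be checked numerically. The sketch below is a hypothetical illustration (the synthetic data, bin counts, and variable names are not from the source): Y is made nearly constant over the overlap and statistically independent of the iso-sets of X, so the Woods cost tends to zero while the Correlation Ratio cost tends to one.

```python
import numpy as np

# Hypothetical synthetic data: X plays the role of the reference volume's
# iso-set labels, Y the floating volume, which is almost constant over the
# overlap (large scale disparity) and uncorrelated with X.
rng = np.random.default_rng(0)
N = 100_000
X = rng.integers(0, 32, size=N)              # iso-set label per voxel
Y = 100.0 + 0.01 * rng.standard_normal(N)    # ~constant floating intensities

labels = np.unique(X)

# Woods function (eq. 23): weighted normalised standard deviation per iso-set.
CW = sum((np.sum(X == i) / N) * (Y[X == i].std() / Y.mean()) for i in labels)

# Correlation Ratio cost (eq. 24): weighted within-iso-set variance over
# the total variance of Y.
CCR = sum((np.sum(X == i) / N) * Y[X == i].var() for i in labels) / Y.var()

# CW is tiny (std/mean ~ 1e-4) and CCR is just below 1, as predicted.
```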

Once again, the Woods function and Joint Entropy give low cost values indicating a good registration, and so are likely to violate the assumption (stated in section 2.5) that the global minimum is the desired solution. Consequently, these functions should not be used. Note that although the Joint Entropy does not approach zero, the value is H(X) = H(X,X) which is the same as it would be for a perfect registration of the volume with itself.

Overall, only the Correlation Ratio, Mutual Information and Normalised Mutual Information display the correct asymptotic behaviour and therefore these are the only suitable cost functions (from this selection) to use.
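The entropy-based limits (eqs. 25–27) can be illustrated in the same way. This is a minimal sketch under the assumption of the extreme case where Y falls into a single intensity bin, so the joint histogram collapses onto one horizontal band; the histogram sizes and names are illustrative, not from the source.

```python
import numpy as np

# Hypothetical extreme case of large scale disparity: the floating volume Y
# occupies a single intensity bin, so the joint histogram is one thin band.
rng = np.random.default_rng(1)
N = 100_000
X = rng.integers(0, 32, size=N)
Y = np.full(N, 12)  # every overlap voxel of Y lands in bin 12

joint, _, _ = np.histogram2d(X, Y, bins=(32, 32), range=[[0, 32], [0, 32]])
p_xy = joint / N
p_x = p_xy.sum(axis=1)  # marginal of X
p_y = p_xy.sum(axis=0)  # marginal of Y (a single spike)

def H(p):
    """Shannon entropy (nats) of a probability array, ignoring empty bins."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

H_xy, H_x, H_y = H(p_xy), H(p_x), H(p_y)

C_JE = H_xy               # eq. 25: approx H(X), not small
C_MI = H_xy - H_x - H_y   # eq. 26: approx 0
C_NMI = H_xy / (H_x + H_y)  # eq. 27: approx 1
```

With Y exactly constant, H(Y) = 0 and H(X,Y) = H(X), so the MI cost is zero and the NMI cost is one, while the Joint Entropy cost stays at H(X), consistent with the note above that it matches the value for a perfect self-registration.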


Mark Jenkinson
2000-05-10