
Multiway multidimensional data reduction

The aim of this section is to explain the basics of the SVD-kmodes method as an extension of the SVD. The presentation is limited to $k=2$ and $k=3$, as the case $k > 3$ is then a straightforward extension; further details are given in [16]. First, the development of this generalisation of the SVD (Singular Value Decomposition) is described within the framework of tensor algebra in finite dimension, which extends matrix algebra calculus in a natural way. A tensor of order one is a vector, a tensor of order two is a matrix, a tensor of order three is a three-way array, and so on.
Let $\{e_i\}_{i=1,n}$, $\{f_j\}_{j=1,p}$ and $\{g_k\}_{k=1,t}$ be the canonical bases of $E=I\!\!R^n$, $F=I\!\!R^p$ and $G=I\!\!R^t$ respectively; for $a\in E$ and $b\in F$, let us define the bilinear map $a \otimes b$ by:
\begin{displaymath}
a \otimes b (x,y)=<a,x>_E<b,y>_F,
\end{displaymath} (1)

(where $<.,.>_E$ is the inner product in $E$). Consider now the canonical bilinear maps built from the $e_i$ and $f_j$: they constitute a basis of a space denoted $E \otimes F$, the tensor product of the spaces $E$ and $F$. Without going further into algebraic concepts, notice that because of the symmetry in equation (1), $x \otimes y$ can also be considered as a linear map on $E \otimes F$, which gives the universal property of the tensor product: transforming a bilinear map (multilinear in general) into a linear map. An $n \times p$ matrix $A$ with elements $A_{ij} \in I\!\!R$ can then be written algebraically,
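As a concrete check of equation (1), the bilinear map $a \otimes b$ on $I\!\!R^n \times I\!\!R^p$ is represented by the outer-product matrix $ab^T$, so that $(a \otimes b)(x,y) = x^T(ab^T)y = <a,x><b,y>$. A minimal numerical sketch (the dimensions and random vectors are illustrative assumptions):

```python
import numpy as np

# Sketch of equation (1): (a ⊗ b)(x, y) = <a, x>_E <b, y>_F.
# The map a ⊗ b is represented by the outer-product matrix a b^T.
rng = np.random.default_rng(0)
n, p = 4, 3                               # illustrative dimensions
a, x = rng.standard_normal(n), rng.standard_normal(n)
b, y = rng.standard_normal(p), rng.standard_normal(p)

tensor_ab = np.outer(a, b)                # a b^T, an element of E ⊗ F
lhs = x @ tensor_ab @ y                   # x^T (a b^T) y
rhs = np.dot(a, x) * np.dot(b, y)         # <a, x>_E <b, y>_F
assert np.isclose(lhs, rhs)
```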
\begin{displaymath}
A=\sum_{ij} A_{ij} \quad e_i \otimes f_j \quad ,
\end{displaymath} (2)

and $A$ is said to belong to the space $E \otimes F$, the tensor product of the spaces $E$ and $F$. Notice that the array, the associated linear map and the tensor are all denoted $A$, because of the isomorphisms between them. In the same manner, an $n \times p \times t$ three-way array $A$ with elements $A_{ijk} \in I\!\!R$ can be written algebraically,
\begin{displaymath}
A=\sum_{ijk} A_{ijk} \quad e_i \otimes f_j \otimes g_k
\mbox{, and } A \in E \otimes F \otimes G .
\end{displaymath} (3)
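Equation (3) can be verified numerically: summing $A_{ijk}$ times the outer products of canonical basis vectors reconstructs the array exactly. A minimal sketch, with illustrative dimensions:

```python
import numpy as np

# Sketch of equation (3): A = sum over (i, j, k) of A_ijk * (e_i ⊗ f_j ⊗ g_k),
# where e_i, f_j, g_k are canonical basis vectors (rows of identity matrices).
rng = np.random.default_rng(1)
n, p, t = 2, 3, 4                         # illustrative dimensions
A = rng.standard_normal((n, p, t))

E, F, G = np.eye(n), np.eye(p), np.eye(t)
recon = np.zeros((n, p, t))
for i in range(n):
    for j in range(p):
        for k in range(t):
            # e_i ⊗ f_j ⊗ g_k realised as a triple outer product
            recon += A[i, j, k] * np.einsum('i,j,k->ijk', E[i], F[j], G[k])

assert np.allclose(A, recon)
```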

The vectors of the space $E \otimes F \otimes G$ of the form $d=\alpha \otimes \beta \otimes
\gamma$ (where $ \alpha=\sum_i \alpha_i e_i, \beta=\sum_j \beta_j f_j, \gamma=\sum_k \gamma_k g_k
$) are called decomposed tensors and are said to be of rank one; a sum of $r$ linearly independent decomposed tensors gives a rank $r$ tensor. To finish with the basic tools of tensor algebra, let us also introduce the generalisation of the product of a vector by a matrix: the product of a vector (or a tensor) with a tensor, also called contraction and denoted ``..". For example, let $A \in E \otimes F \otimes G$ and $\gamma \in G$; then $A..\gamma \in
E \otimes F $ with:
\begin{displaymath}
A..\gamma = \sum_{ijk} A_{ijk} \quad e_i \otimes f_j <g_k,\gamma>_G
= \sum_{ijk} A_{ijk}\gamma_k \quad e_i \otimes f_j .
\end{displaymath} (4)

Arithmetically, this can be seen by considering $A$ as a matrix with $np$ rows and $t$ columns; calculating the image of $\gamma$ by this matrix then gives a representation of $A..\gamma$. Note also that for $B \in E \otimes F \otimes G$, the full contraction $A..B$ is the inner product $<A,B>_{E \otimes F \otimes G}$ between the two tensors.
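The arithmetic view above can be sketched directly: the contraction of equation (4) computed by summing over the last index agrees with unfolding $A$ into an $np \times t$ matrix and applying it to $\gamma$ (dimensions and data are illustrative assumptions):

```python
import numpy as np

# Sketch of equation (4): the contraction A..gamma, computed two ways --
# by summing over the index k, and via the "np rows x t columns" matrix view.
rng = np.random.default_rng(2)
n, p, t = 2, 3, 4                                 # illustrative dimensions
A = rng.standard_normal((n, p, t))
gamma = rng.standard_normal(t)

contraction = np.einsum('ijk,k->ij', A, gamma)    # A..gamma, an element of E ⊗ F

# Matrix view: unfold A into an (n*p) x t matrix, multiply by gamma,
# then fold the result back into an n x p array.
unfolded = A.reshape(n * p, t)
assert np.allclose(contraction, (unfolded @ gamma).reshape(n, p))
```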

Didier Leibovici 2001-09-04