If you’ve done much matrix algebra you will know that many linear problems can be solved quickly using the method of diagonalisation. The symmetry of the problem is used to transform the matrix until only the diagonal elements are non-zero. Matrix multiplication then reduces to multiplying the diagonal elements (the eigenvalues), and the determinant is simply their product.
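As a quick illustration of that last point (a plain-Python sketch, not part of the argument itself), here is the 2x2 case: the eigenvalues are the roots of the characteristic polynomial, and their product recovers the determinant.

```python
import math

# A small 2x2 example: after diagonalisation the matrix is described
# entirely by its eigenvalues, and the determinant is their product.
a, b, c, d = 4.0, 1.0, 2.0, 3.0   # the matrix [[4, 1], [2, 3]]

trace = a + d
det = a * d - b * c

# Eigenvalues are the roots of t^2 - trace*t + det = 0.
disc = math.sqrt(trace * trace - 4.0 * det)
lam1 = (trace + disc) / 2.0
lam2 = (trace - disc) / 2.0

print(lam1, lam2)        # 5.0 2.0  -- the eigenvalues
print(lam1 * lam2, det)  # 10.0 10.0  -- product equals determinant
```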

It would be nice if we could do a similar trick with multi-dimensional hypermatrices. A hypermatrix does not have a single distinguished diagonal, but we could choose one of its diagonals and try to use linear transforms to eliminate the other elements. If that fails we could look for other ways to reduce most of the components to zero so that computations simplify. In general the diagonalisation trick does not work well for hypermatrices, and it is easy to see why. A hypermatrix of rank n with N^{n} components can be transformed using n linear transforms, each of which has N^{2}-1 degrees of freedom. Just by counting we can see that fewer than nN^{2} components can be eliminated by any general method. If n > 2 and either N or n is large, then only a minority of the components can be removed.
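The counting argument is easy to tabulate. A minimal sketch in plain Python (the function names are my own):

```python
# Compare the number of components of a rank-n hypermatrix over C^N
# with the degrees of freedom available in n special linear transforms.

def components(n, N):
    """Number of components of an N x N x ... x N (n-fold) hypermatrix."""
    return N ** n

def transform_freedom(n, N):
    """Degrees of freedom in n independent SL(N) transforms: n*(N^2 - 1)."""
    return n * (N * N - 1)

for n, N in [(2, 4), (3, 2), (3, 3), (4, 2)]:
    print(n, N, components(n, N), transform_freedom(n, N))
```

For matrices (n = 2) the transforms have freedom to spare, but already at n = 3, N = 2 the count is borderline (8 components against 9 degrees of freedom), which is why the 2x2x2 case only just works, and at n = 3, N = 3 or n = 4, N = 2 the transforms fall short.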

Nevertheless, for specific small formats the elimination method is still useful, and often we are interested in properties of small hypermatrices. An example already came up in the construction of Cayley’s hyperdeterminant. For the 2x2x2 hypermatrix A we demonstrated that if the hyperdeterminant is non-zero then the hypermatrix (over the complex numbers) can be written in this form

A = f_{1} **x**_{1}⊗**y**_{1}⊗**z**_{1} + f_{2} **x**_{2}⊗**y**_{2}⊗**z**_{2}

And we can normalise the vectors so that **x**_{1}∧**x**_{2} = **y**_{1}∧**y**_{2} = **z**_{1}∧**z**_{2} = 1.

Although we did not mention it at the time, we can use special linear transforms to rotate to a system where these vectors form a basis for each of the three vector spaces. The hypermatrix A then has only two non-zero elements (a_{111} = f_{1} and a_{222} = f_{2}) while the other six are zero. For this specific case the hypermatrix can be diagonalised, and the hyperdeterminant in this form is just det(A) = f_{1}^{2}f_{2}^{2}.
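This is easy to check numerically. A plain-Python sketch follows; here hyperdet computes Cayley’s 2x2x2 hyperdeterminant as the discriminant of the quadratic det(A0 + t·A1), where A0 and A1 are the two 2x2 slices (an equivalent way of writing the usual degree-4 polynomial).

```python
def hyperdet(a):
    """Cayley's hyperdeterminant of a 2x2x2 hypermatrix a[i][j][k],
    computed as the discriminant of det(A0 + t*A1) in t,
    where A0 = a[0] and A1 = a[1] are the two 2x2 slices."""
    det0 = a[0][0][0] * a[0][1][1] - a[0][0][1] * a[0][1][0]
    det1 = a[1][0][0] * a[1][1][1] - a[1][0][1] * a[1][1][0]
    mixed = (a[0][0][0] * a[1][1][1] + a[1][0][0] * a[0][1][1]
             - a[0][0][1] * a[1][1][0] - a[1][0][1] * a[0][1][0])
    return mixed * mixed - 4 * det0 * det1

# Diagonalised form: only a_111 = f1 and a_222 = f2 are non-zero
# (written 0-indexed here as a[0][0][0] and a[1][1][1]).
f1, f2 = 3, 5
a = [[[f1, 0], [0, 0]], [[0, 0], [0, f2]]]
print(hyperdet(a))   # 225, which is (f1*f2)**2
```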

What of the case where the hyperdeterminant is zero? Then it is not always possible to diagonalise the 2x2x2 hypermatrix in the same way. However, we can eliminate a different set of elements. To see this, first recall that when the hyperdeterminant det(A) is zero there are three vectors **x**_{1}, **y**_{1}, **z**_{1} such that contracting any two of them with A gives a zero vector. Choose a second set of vectors that are linearly independent of these three and normalise so that **x**_{1}∧**x**_{2} = **y**_{1}∧**y**_{2} = **z**_{1}∧**z**_{2} = 1. Then we can use these as a basis in which **x**_{1} = (1,0), **x**_{2} = (0,1), etc. For the contractions to give zero vectors it must then follow that

a_{111} = a_{112} = a_{121} = a_{211} = 0

So half the components have been eliminated. A similar argument works for any hypermatrix whose hyperdeterminant is zero: it eliminates the first element together with all the others that lie in the same row as it in each of the directions.
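As a sanity check of the 2x2x2 case (again a plain-Python sketch, with hyperdet denoting Cayley’s hyperdeterminant written as the discriminant of det(A0 + t·A1)): setting those four components to zero does force the hyperdeterminant to vanish, whatever the remaining four components are.

```python
def hyperdet(a):
    """Cayley's hyperdeterminant of a 2x2x2 hypermatrix a[i][j][k],
    as the discriminant of the quadratic det(A0 + t*A1) in t."""
    det0 = a[0][0][0] * a[0][1][1] - a[0][0][1] * a[0][1][0]
    det1 = a[1][0][0] * a[1][1][1] - a[1][0][1] * a[1][1][0]
    mixed = (a[0][0][0] * a[1][1][1] + a[1][0][0] * a[0][1][1]
             - a[0][0][1] * a[1][1][0] - a[1][0][1] * a[0][1][0])
    return mixed * mixed - 4 * det0 * det1

# a_111 = a_112 = a_121 = a_211 = 0 (0-indexed: a[0][0][0], a[0][0][1],
# a[0][1][0], a[1][0][0]); the other four components are arbitrary.
a = [[[0, 0], [0, 7]], [[0, -2], [11, 4]]]
print(hyperdet(a))   # 0
```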

Another way to get rid of components is to use an elimination process similar to the familiar one for matrices. Consider a transformation by a matrix whose diagonal elements are all one and whose off-diagonal elements are all zero, except for a single entry with value u. Such a matrix always has determinant one, so the hyperdeterminant (and the other invariants) are unchanged under the transformation. You can see that applying this transformation to a hypermatrix of rank n is equivalent to taking an (n-1)-dimensional slice of the hypermatrix, multiplying it by u, and adding it to a different slice with the same orientation. This generalises the rule that the determinant of a matrix is unchanged when you add a multiple of a row (or column) to another row (or column), and it is this rule that can be used to eliminate elements.
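For the 2x2x2 case this invariance is easy to verify numerically. A sketch in plain Python (hyperdet is Cayley’s hyperdeterminant via the discriminant of det(A0 + t·A1); add_slice is my own name for the shear operation described above):

```python
def hyperdet(a):
    """Cayley's hyperdeterminant of a 2x2x2 hypermatrix a[i][j][k],
    as the discriminant of the quadratic det(A0 + t*A1) in t."""
    det0 = a[0][0][0] * a[0][1][1] - a[0][0][1] * a[0][1][0]
    det1 = a[1][0][0] * a[1][1][1] - a[1][0][1] * a[1][1][0]
    mixed = (a[0][0][0] * a[1][1][1] + a[1][0][0] * a[0][1][1]
             - a[0][0][1] * a[1][1][0] - a[1][0][1] * a[0][1][0])
    return mixed * mixed - 4 * det0 * det1

def add_slice(a, u):
    """Add u times the first slice (in the first direction) to the second,
    i.e. act on the first index with [[1, 0], [u, 1]], which has det 1."""
    b = [[row[:] for row in slc] for slc in a]   # deep copy
    for j in range(2):
        for k in range(2):
            b[1][j][k] += u * a[0][j][k]
    return b

a = [[[1, 2], [3, 4]], [[5, 6], [7, 9]]]
print(hyperdet(a), hyperdet(add_slice(a, 3)))   # 33 33 -- unchanged
```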

We will be using some of these methods later.

October 11, 2008 at 10:43 am
