We can apply a two-sided diagonal scaling to a nonnegative matrix to render it into doubly stochastic form if and only if the matrix is fully indecomposable. The scaling often reveals key structural properties of the matrix as the effects of element size and connectivity are balanced.
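The two-sided scaling described above is classically computed with the Sinkhorn–Knopp iteration, which alternately normalizes row and column sums. A minimal NumPy sketch (the function name, tolerance, and iteration cap are our own choices, not taken from the talk):

```python
import numpy as np

def sinkhorn_knopp(A, tol=1e-10, max_iter=10_000):
    """Alternately rescale rows and columns of a nonnegative matrix A
    so that diag(r) @ A @ diag(c) approaches doubly stochastic form.
    Converges when A is fully indecomposable."""
    A = np.asarray(A, dtype=float)
    n, m = A.shape
    r = np.ones(n)
    for _ in range(max_iter):
        c = 1.0 / (A.T @ r)        # force column sums toward 1
        r = 1.0 / (A @ c)          # force row sums to exactly 1
        S = (r[:, None] * A) * c[None, :]
        if np.max(np.abs(S.sum(axis=0) - 1.0)) < tol:
            break
    return S
```

After each pass the row sums are exactly one by construction, so the stopping test only needs to watch the column sums.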
We begin our talk with a review of some of the properties of doubly stochastic matrices and show their natural role in a number of modern applications from machine learning to graph theory.
As with many problems involving nonnegative matrices, the Perron--Frobenius theorem is a key tool to exploit and we aim to use spectral properties of doubly stochastic matrices to reveal hidden block structure in matrices.
By applying the scaling to |A| we can apply our ideas to any square matrix (and even to rectangular matrices by a simple extension). In particular, the structure of the basis of the principal singular vectors of the scaled matrix allows us to perform a multi-way partition of the rows and columns of the matrix in the spirit of the Fiedler vector.
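To illustrate the Fiedler-style use of singular vectors, one can permute the rows and columns of the scaled matrix by sorting the entries of the second singular vectors; entries belonging to the same block cluster together, exposing the block structure. A hypothetical sketch (it assumes the input has already been scaled to doubly stochastic form, so the leading singular vectors are uninformative):

```python
import numpy as np

def reorder_by_singular_vectors(A):
    """Sort rows by the second left singular vector and columns by the
    second right singular vector of a doubly stochastic matrix A,
    analogously to Fiedler-vector reordering of a graph Laplacian."""
    U, s, Vt = np.linalg.svd(A)
    row_perm = np.argsort(U[:, 1])   # second left singular vector
    col_perm = np.argsort(Vt[1, :])  # second right singular vector
    return A[np.ix_(row_perm, col_perm)], row_perm, col_perm
```

On a matrix whose rows and columns have been shuffled, the returned permutations group together the indices of each hidden diagonal block.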
We have succeeded in finding structure without any prior knowledge of the number of blocks. We describe how applying a Canny filter to one or more singular vectors allows us to detect the number of blocks automatically.
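To convey the flavour of this detection step: a Canny filter finds edges, and the sorted entries of a singular vector exhibit an edge (a large jump) at each block boundary. The following one-dimensional jump counter is only a crude stand-in for the actual Canny-based procedure in the talk, and the `jump_factor` threshold is an assumed tuning parameter of ours:

```python
import numpy as np

def count_blocks(v, jump_factor=5.0):
    """Estimate the number of diagonal blocks by sorting the entries of
    a singular vector v and counting gaps much larger than the typical
    gap. Each large jump is taken as one block boundary."""
    x = np.sort(np.asarray(v, dtype=float))
    gaps = np.diff(x)
    med = np.median(gaps)
    threshold = jump_factor * med if med > 0 else jump_factor * gaps.mean()
    return int(np.sum(gaps > threshold)) + 1
```

For a vector whose entries fall into well-separated clusters, the count equals the number of clusters; no prior guess of the block count is needed.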
We discuss some applications of our algorithm in sparse matrix problems and show how to use it to find partitions in real-world networks with reference to some classic examples.
This work was performed in collaboration with Iain Duff (RAL), Sandrine Mouysset (U Paul Sabatier, Toulouse), Daniel Ruiz (ENSEEIHT, Toulouse) and Bora Ucar (ENS, Lyon).