Low-Rank Quadratic Semidefinite Programming

Ganzhao Yuan, Zhenjie Zhang*, Bernard Ghanem*, and Zhifeng Hao
"Low-Rank Quadratic Semidefinite Programming"
Neurocomputing, 2012

Authors: Ganzhao Yuan, Zhenjie Zhang, Bernard Ghanem, Zhifeng Hao
Keywords: Semidefinite Programming, Metric Learning, Kernel Learning, Eigenvalue Decomposition, Low-Rank and Sparse Matrix Approximation
Year: 2012
Low-rank matrix approximation is an attractive model for large-scale machine learning problems, because it not only reduces memory and runtime complexity but also provides a natural way to regularize parameters while preserving learning accuracy. In this paper, we address a special class of nonconvex quadratic matrix optimization problems that require a low-rank positive semidefinite solution. Despite their nonconvexity, we exploit the structure of these problems to derive an efficient solver that converges to a local optimum. Furthermore, we show that the proposed solution dramatically enhances the efficiency and scalability of several concrete problems of significant interest to the machine learning community, including the Top-k Eigenvalue Problem, Distance Metric Learning, and Kernel Learning. Extensive experiments on UCI benchmarks demonstrate the effectiveness and efficiency of our proposed method.
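
For illustration, the sketch below (not the solver proposed in the paper) shows how one such problem, the top-k eigenvalue problem, can be posed over a low-rank factor V with X = V V^T: maximizing trace(V^T A V) over orthonormal V yields the sum of the k largest eigenvalues of A, without ever forming the full n-by-n matrix X. The function name, step-size choice, and test matrix are hypothetical and only meant to convey the low-rank parameterization.

import numpy as np

def top_k_eigenspace(A, k, iters=300, seed=0):
    """Gradient step plus QR retraction on the factor V, maximizing trace(V^T A V)."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    V, _ = np.linalg.qr(rng.standard_normal((n, k)))  # orthonormal start, V^T V = I
    eta = 1.0 / np.linalg.norm(A, "fro")              # step size; keeps I + eta*A positive semidefinite
    for _ in range(iters):
        V, _ = np.linalg.qr(V + eta * (A @ V))        # ascent step, then retraction onto orthonormal factors
    return V

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, k = 200, 5
    # symmetric test matrix with a clear gap between its top k eigenvalues and the rest
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    eigs = np.concatenate([np.linspace(0.0, 1.0, n - k), np.array([6.0, 7.0, 8.0, 9.0, 10.0])])
    A = (Q * eigs) @ Q.T
    V = top_k_eigenspace(A, k)
    low_rank_value = np.trace(V.T @ A @ V)            # objective evaluated on X = V V^T
    exact_value = np.linalg.eigvalsh(A)[-k:].sum()    # sum of the k largest eigenvalues
    print(low_rank_value, exact_value)                # the two values should nearly match

The QR step simply retracts V back onto the set of orthonormal factors after each gradient step; any gap between the two printed values shrinks as the number of iterations grows.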