Predictive low-rank decomposition for kernel methods

Francis R. Bach
Centre de Morphologie Mathématique, École des Mines de Paris
35 rue Saint-Honoré, 77300 Fontainebleau, France

Michael I. Jordan
Computer Science Division and Department of Statistics
University of California, Berkeley, CA 94720, USA

Abstract

Low-rank matrix decompositions are essential tools in the application of kernel methods to large-scale learning problems. These decompositions have generally been treated as black boxes: the decomposition of the kernel matrix that they deliver is independent of the specific learning task at hand, and this is a potentially significant source of inefficiency. In this paper, we present an algorithm that can exploit side information (e.g., classification labels, regression responses) in the computation of low-rank decompositions for kernel matrices. Our algorithm has the same favorable scaling as state-of-the-art methods such as incomplete Cholesky decomposition: it is linear in the number of data points and quadratic in the rank of the approximation. We present simulation results showing that our algorithm yields decompositions of significantly smaller rank than those found by incomplete Cholesky decomposition.

1. Introduction

Kernel methods provide a unifying framework for the design and analysis of machine learning algorithms (Schölkopf and Smola, 2001; Shawe-Taylor and Cristianini, 2004).
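To make the baseline concrete, the following is a minimal sketch, in Python with NumPy, of the standard (task-independent) pivoted incomplete Cholesky decomposition that the abstract compares against; the function and variable names here are illustrative, not taken from the paper. At each step it greedily selects the pivot with the largest residual diagonal entry and appends one column to a factor G with K ≈ G G^T. Each step costs O(nk), so a rank-m factorization costs O(nm^2) in total: linear in the number of data points and quadratic in the rank, the scaling quoted above.

import numpy as np

def incomplete_cholesky(K, max_rank, tol=1e-10):
    # Pivoted incomplete Cholesky of a symmetric PSD kernel matrix K.
    # Returns G of shape (n, m) with m <= max_rank and K ~= G @ G.T,
    # together with the list of chosen pivot indices.
    n = K.shape[0]
    G = np.zeros((n, max_rank))
    d = np.diag(K).astype(float).copy()   # residual diagonal of K - G G^T
    pivots = []
    for k in range(max_rank):
        i = int(np.argmax(d))             # greedy pivot: largest residual variance
        if d[i] <= tol:                   # residual negligible; stop early
            return G[:, :k], pivots
        pivots.append(i)
        G[:, k] = (K[:, i] - G[:, :k] @ G[i, :k]) / np.sqrt(d[i])
        d = np.maximum(d - G[:, k] ** 2, 0.0)  # update diagonal, guard round-off
    return G, pivots

# Illustrative usage: approximate an RBF kernel matrix on random data.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-0.5 * sq_dists)
G, pivots = incomplete_cholesky(K, max_rank=50)
print("approximation error:", np.linalg.norm(K - G @ G.T))

Note that only one column of K is touched per step, so in practice the columns can be computed on the fly from kernel evaluations and the full n-by-n matrix never formed. The pivot rule here looks only at the kernel matrix; the paper's contribution is to replace it with a rule that also uses the labels or responses.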