Learning the kernel matrix with semidefinite programming
Lanckriet, G. R. G., Cristianini, N., Bartlett, P. L., El Ghaoui, L., & Jordan, M. I. (2004) Learning the kernel matrix with semidefinite programming. Journal of Machine Learning Research, 5, pp. 27-72.
Kernel-based learning algorithms work by embedding the data into a Euclidean space, and then searching for linear relations among the embedded data points. The embedding is performed implicitly, by specifying the inner products between each pair of points in the embedding space. This information is contained in the so-called kernel matrix, a symmetric and positive semidefinite matrix that encodes the relative positions of all points. Specifying this matrix amounts to specifying the geometry of the embedding space and inducing a notion of similarity in the input space: classical model selection problems in machine learning. In this paper we show how the kernel matrix can be learned from data via semidefinite programming (SDP) techniques. When applied to a kernel matrix associated with both training and test data, this gives a powerful transductive algorithm: using the labeled part of the data, one can learn an embedding for the unlabeled part as well. The similarity between test points is inferred from training points and their labels. Importantly, these learning problems are convex, so we obtain a method for learning both the model class and the function without local minima. Furthermore, this approach leads directly to a convex method for learning the 2-norm soft margin parameter in support vector machines, solving an important open problem.
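A core idea of the paper is that the kernel matrix is searched over a cone of nonnegative combinations of candidate kernel matrices, each built over training and test points together, so the combined matrix is guaranteed symmetric and positive semidefinite. The sketch below is only a minimal numpy illustration of that search space (the combination step, not the SDP solver itself); the kernel choices, weights, and helper names are illustrative assumptions, not from the paper.

```python
import numpy as np

def linear_kernel(X):
    """Gram matrix of the linear kernel k(x, z) = <x, z>."""
    return X @ X.T

def rbf_kernel(X, gamma=1.0):
    """Gram matrix of the Gaussian (RBF) kernel (illustrative choice)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * d2)

def combine_kernels(kernels, weights):
    """Nonnegative combination K = sum_i mu_i * K_i.

    Since each K_i is PSD and each mu_i >= 0, the result stays PSD --
    this is (part of) the feasible set the paper's SDP optimizes over.
    """
    weights = np.asarray(weights, dtype=float)
    assert np.all(weights >= 0), "weights must be nonnegative"
    return sum(w * K for w, K in zip(weights, kernels))

# Training and test points embedded jointly (the transductive setting).
rng = np.random.default_rng(0)
X = rng.standard_normal((6, 3))
K = combine_kernels([linear_kernel(X), rbf_kernel(X)], [0.3, 0.7])

# The combined kernel matrix remains symmetric PSD.
print(np.allclose(K, K.T), np.linalg.eigvalsh(K).min() >= -1e-9)
```

In the paper, the weights themselves are optimized by a semidefinite program that maximizes a margin-based criterion subject to a trace constraint on K; the sketch above fixes them by hand purely to show the structure of the candidate set.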
Impact and interest:
Citation counts are sourced monthly from the Scopus and Web of Science® citation databases.
These databases contain citations from different subsets of available publications and different time periods and thus the citation count from each is usually different. Some works are not in either database and no count is displayed. Scopus includes citations from articles published in 1996 onwards, and Web of Science® generally from 1980 onwards.
Citation counts from the Google Scholar™ indexing service can be viewed at the linked Google Scholar™ search.
|Item Type:||Journal Article|
|Keywords:||Convex optimization, Kernel methods, Learning kernels, Model selection, Semidefinite programming, Support vector machines, Transduction, Data reduction, Eigenvalues and eigenfunctions, Geometry, Learning algorithms, Learning systems, Mathematical programming, Optimization, Parameter estimation, Problem solving, Matrix algebra, OAVJ|
|Subjects:||Australian and New Zealand Standard Research Classification > INFORMATION AND COMPUTING SCIENCES (080000) > ARTIFICIAL INTELLIGENCE AND IMAGE PROCESSING (080100)
Australian and New Zealand Standard Research Classification > PSYCHOLOGY AND COGNITIVE SCIENCES (170000) > COGNITIVE SCIENCE (170200)|
|Divisions:||Past > QUT Faculties & Divisions > Faculty of Science and Technology
Past > Schools > Mathematical Sciences|
|Deposited On:||12 Aug 2011 03:02|
|Last Modified:||12 Aug 2011 03:34|