
Pfahler/Morik/2018a: Nystroem-SGD: Rapidly Learning Kernel-Classifiers with Conditioned Stochastic Gradient Descent

Bibtype Article
Bibkey Pfahler/Morik/2018a
Author Pfahler, Lukas and Morik, Katharina
Ls8autor Morik, Katharina
Pfahler, Lukas
Title Nystroem-SGD: Rapidly Learning Kernel-Classifiers with Conditioned Stochastic Gradient Descent
Abstract Kernel methods are a popular choice for classification problems, but when solving large-scale learning tasks, computing the quadratic kernel matrix quickly becomes infeasible. To circumvent this problem, the Nyström method, which approximates the kernel matrix using only a smaller sample of its entries, has been proposed. Other techniques to speed up kernel learning include stochastic first-order optimization and conditioning. We introduce Nyström-SGD, a learning algorithm that trains kernel classifiers by minimizing a convex loss function with conditioned stochastic gradient descent while exploiting the low-rank structure of a Nyström kernel approximation. Our experiments suggest that Nyström-SGD enables us to rapidly train high-accuracy classifiers for large-scale classification tasks.
Year 2018
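The abstract combines two ingredients: a Nyström low-rank feature map whose outer product approximates the kernel matrix, and stochastic gradient descent on a convex loss over those features. The following is a minimal sketch of that combination, not the paper's algorithm: the conditioning step described in the abstract is omitted, plain hinge-loss SGD stands in for it, and all function names, the RBF kernel choice, and the hyperparameters are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystroem_features(X, landmarks, gamma):
    """Low-rank feature map Z with Z @ Z.T ~= K, via the Nystroem
    approximation K ~= K_nm K_mm^{-1} K_nm^T (exact on the landmark block)."""
    K_mm = rbf_kernel(landmarks, landmarks, gamma)
    K_nm = rbf_kernel(X, landmarks, gamma)
    w, V = np.linalg.eigh(K_mm)      # eigendecomposition of the small m x m block
    w = np.maximum(w, 1e-12)         # guard against tiny negative eigenvalues
    return K_nm @ V @ np.diag(w ** -0.5) @ V.T   # Z = K_nm K_mm^{-1/2}

# Toy data: labels in {-1, +1}, linearly separable in the first two coordinates.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)

m, gamma = 20, 0.1
Z = nystroem_features(X, X[:m], gamma)   # first m points serve as landmarks

# Plain SGD on the hinge loss in the low-rank feature space; the paper
# additionally conditions these updates using the low-rank structure.
theta = np.zeros(Z.shape[1])
lr = 0.1
for epoch in range(50):
    for i in rng.permutation(len(X)):
        if y[i] * (Z[i] @ theta) < 1:    # hinge-loss subgradient step
            theta += lr * y[i] * Z[i]

acc = np.mean(np.sign(Z @ theta) == y)
```

The key point of the factorization is that only the m x m landmark block is ever decomposed, so the per-step cost of SGD scales with m rather than with the full dataset size.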


