Online kernel dictionary learning on a budget

Jeon Lee, Seung Jun Kim

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Online kernel-based dictionary learning (DL) algorithms are considered, which perform DL on training data lifted to a high-dimensional feature space via a nonlinear mapping. Compared to their batch counterparts, online algorithms based on the stochastic gradient descent method incur low computational complexity, which is essential for processing Big Data. However, as with any kernel-based learning algorithm, the number of parameters needed to represent the desired dictionary equals the number of training samples, which incurs prohibitive memory requirements and computational complexity for large-scale datasets. In this work, appropriate sparsification and pruning strategies are combined with online kernel DL to mitigate this issue. Numerical tests verify the efficacy of the proposed strategies.
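To make the idea concrete, the following is a minimal illustrative sketch (not the paper's algorithm) of budgeted online kernel DL in Python/NumPy. It assumes the standard kernel-DL parameterization D = Φ(B)A, where B holds retained base samples and A their expansion coefficients; the coding step is a ridge-regularized surrogate for sparse coding, and the pruning rule (drop the base sample with the smallest coefficient-row norm) is a simple stand-in for the paper's pruning strategy. All class and parameter names are hypothetical.

```python
import numpy as np

def rbf(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class BudgetKernelDL:
    """Sketch of online kernel dictionary learning on a budget.

    Dictionary atoms live in feature space as D = Phi(B) A, where B
    stores at most `budget` base samples and A their coefficients.
    """

    def __init__(self, n_atoms=5, budget=20, lr=0.05, gamma=1.0, seed=0):
        self.n_atoms, self.budget = n_atoms, budget
        self.lr, self.gamma = lr, gamma
        self.rng = np.random.default_rng(seed)
        self.B = np.empty((0, 0))           # retained base samples
        self.A = np.empty((0, n_atoms))     # expansion coefficients

    def partial_fit(self, x):
        # admit the new sample into the kernel expansion
        self.B = x[None, :] if self.B.size == 0 else np.vstack([self.B, x])
        self.A = np.vstack([self.A, 0.1 * self.rng.standard_normal(self.n_atoms)])

        # code x against the current dictionary (ridge surrogate for
        # the sparse-coding step)
        K = rbf(self.B, self.B, self.gamma)
        k = rbf(self.B, x[None, :], self.gamma)[:, 0]
        G = self.A.T @ K @ self.A + 1e-6 * np.eye(self.n_atoms)
        c = np.linalg.solve(G, self.A.T @ k)

        # stochastic gradient step on A for ||phi(x) - Phi(B) A c||^2
        grad = 2.0 * (K @ self.A @ np.outer(c, c) - np.outer(k, c))
        self.A -= self.lr * grad

        # budget maintenance: prune the base sample whose coefficient
        # row contributes least to the dictionary
        if self.B.shape[0] > self.budget:
            drop = np.argmin(np.linalg.norm(self.A, axis=1))
            self.B = np.delete(self.B, drop, axis=0)
            self.A = np.delete(self.A, drop, axis=0)
        return c
```

Streaming samples through `partial_fit` keeps the expansion size, and hence per-sample cost and memory, bounded by `budget` instead of growing with the number of training samples.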

Original language: English (US)
Title of host publication: Conference Record of the 50th Asilomar Conference on Signals, Systems and Computers, ACSSC 2016
Editors: Michael B. Matthews
Publisher: IEEE Computer Society
Pages: 1535-1539
Number of pages: 5
ISBN (Electronic): 9781538639542
DOIs
State: Published - Mar 1 2017
Event: 50th Asilomar Conference on Signals, Systems and Computers, ACSSC 2016 - Pacific Grove, United States
Duration: Nov 6 2016 - Nov 9 2016

Publication series

Name: Conference Record - Asilomar Conference on Signals, Systems and Computers
ISSN (Print): 1058-6393

Conference

Conference: 50th Asilomar Conference on Signals, Systems and Computers, ACSSC 2016
Country/Territory: United States
City: Pacific Grove
Period: 11/6/16 - 11/9/16

ASJC Scopus subject areas

  • Signal Processing
  • Computer Networks and Communications
