Machine learning on sequential data using a recurrent weighted average

Jared Ostmeyer, Lindsay G Cowell

Research output: Contribution to journal › Article

Abstract

Recurrent Neural Networks (RNN) are a type of statistical model designed to handle sequential data. The model reads a sequence one symbol at a time. Each symbol is processed based on information collected from the previous symbols. With existing RNN architectures, each symbol is processed using only information from the previous processing step. To overcome this limitation, we propose a new kind of RNN model that computes a recurrent weighted average (RWA) over every past processing step. Because the RWA can be computed as a running average, the computational overhead scales like that of any other RNN architecture. The approach essentially reformulates the attention mechanism into a stand-alone model. The performance of the RWA model is assessed on the variable copy problem, the adding problem, classification of artificial grammar, classification of sequences by length, and classification of the MNIST images (where the pixels are read sequentially one at a time). On almost every task, the RWA model is found to fit the data significantly faster than a standard LSTM model.
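The abstract's key efficiency claim is that a weighted average over every past processing step can be maintained as a running average, so the per-step cost stays constant rather than growing with sequence length. The sketch below illustrates only that incremental-update idea in plain Python; the function name and inputs are illustrative, and in the actual RWA model the weights are produced by a learned attention function rather than supplied directly.

```python
def running_weighted_average(values, weights):
    """Incrementally compute the weighted average of all items seen so far.

    After step t this yields sum_{i<=t} w_i * x_i / sum_{i<=t} w_i,
    using O(1) work per step instead of re-summing the whole history.
    """
    numerator = 0.0    # running sum of w_i * x_i
    denominator = 0.0  # running sum of w_i
    averages = []
    for x, w in zip(values, weights):
        numerator += w * x
        denominator += w
        averages.append(numerator / denominator)
    return averages
```

With uniform weights this reduces to an ordinary running mean, e.g. `running_weighted_average([1, 2, 3], [1, 1, 1])` yields `[1.0, 1.5, 2.0]`; non-uniform weights let later (or earlier) symbols dominate the average, which is the role attention plays in the RWA.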

Original language: English (US)
Journal: Neurocomputing
DOI: 10.1016/j.neucom.2018.11.066
State: Accepted/In press - Jan 1 2018

Keywords

  • Attention mechanism
  • Recurrent neural network
  • Sequences

ASJC Scopus subject areas

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence

Cite this

Machine learning on sequential data using a recurrent weighted average. / Ostmeyer, Jared; Cowell, Lindsay G.

In: Neurocomputing, 01.01.2018.

Research output: Contribution to journal › Article

@article{d8edf574a8364ba0aca61980c3089f59,
title = "Machine learning on sequential data using a recurrent weighted average",
abstract = "Recurrent Neural Networks (RNN) are a type of statistical model designed to handle sequential data. The model reads a sequence one symbol at a time. Each symbol is processed based on information collected from the previous symbols. With existing RNN architectures, each symbol is processed using only information from the previous processing step. To overcome this limitation, we propose a new kind of RNN model that computes a recurrent weighted average (RWA) over every past processing step. Because the RWA can be computed as a running average, the computational overhead scales like that of any other RNN architecture. The approach essentially reformulates the attention mechanism into a stand-alone model. The performance of the RWA model is assessed on the variable copy problem, the adding problem, classification of artificial grammar, classification of sequences by length, and classification of the MNIST images (where the pixels are read sequentially one at a time). On almost every task, the RWA model is found to fit the data significantly faster than a standard LSTM model.",
keywords = "Attention mechanism, Recurrent neural network, Sequences",
author = "Ostmeyer, Jared and Cowell, {Lindsay G}",
year = "2018",
month = "1",
day = "1",
doi = "10.1016/j.neucom.2018.11.066",
language = "English (US)",
journal = "Neurocomputing",
issn = "0925-2312",
publisher = "Elsevier"
}

TY - JOUR

T1 - Machine learning on sequential data using a recurrent weighted average

AU - Ostmeyer, Jared

AU - Cowell, Lindsay G

PY - 2018/1/1

Y1 - 2018/1/1

N2 - Recurrent Neural Networks (RNN) are a type of statistical model designed to handle sequential data. The model reads a sequence one symbol at a time. Each symbol is processed based on information collected from the previous symbols. With existing RNN architectures, each symbol is processed using only information from the previous processing step. To overcome this limitation, we propose a new kind of RNN model that computes a recurrent weighted average (RWA) over every past processing step. Because the RWA can be computed as a running average, the computational overhead scales like that of any other RNN architecture. The approach essentially reformulates the attention mechanism into a stand-alone model. The performance of the RWA model is assessed on the variable copy problem, the adding problem, classification of artificial grammar, classification of sequences by length, and classification of the MNIST images (where the pixels are read sequentially one at a time). On almost every task, the RWA model is found to fit the data significantly faster than a standard LSTM model.

AB - Recurrent Neural Networks (RNN) are a type of statistical model designed to handle sequential data. The model reads a sequence one symbol at a time. Each symbol is processed based on information collected from the previous symbols. With existing RNN architectures, each symbol is processed using only information from the previous processing step. To overcome this limitation, we propose a new kind of RNN model that computes a recurrent weighted average (RWA) over every past processing step. Because the RWA can be computed as a running average, the computational overhead scales like that of any other RNN architecture. The approach essentially reformulates the attention mechanism into a stand-alone model. The performance of the RWA model is assessed on the variable copy problem, the adding problem, classification of artificial grammar, classification of sequences by length, and classification of the MNIST images (where the pixels are read sequentially one at a time). On almost every task, the RWA model is found to fit the data significantly faster than a standard LSTM model.

KW - Attention mechanism

KW - Recurrent neural network

KW - Sequences

UR - http://www.scopus.com/inward/record.url?scp=85057804108&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85057804108&partnerID=8YFLogxK

U2 - 10.1016/j.neucom.2018.11.066

DO - 10.1016/j.neucom.2018.11.066

M3 - Article

JO - Neurocomputing

JF - Neurocomputing

SN - 0925-2312

ER -