Learning Precise Timing with LSTM Recurrent Networks
Felix A. Gers, Nicol N. Schraudolph, Jürgen Schmidhuber; 3(Aug):115-143, 2002.
Abstract
The temporal distance between events conveys information essential
for numerous sequential tasks such as motor control and rhythm detection.
While Hidden Markov Models tend to ignore this information, recurrent
neural networks (RNNs) can in principle learn to make use of it.
We focus on Long Short-Term Memory (LSTM) because it has been shown
to outperform other RNNs on tasks involving long time lags.
We find that LSTM augmented by "peephole connections"
from its internal cells to its multiplicative gates can learn the fine
distinction between sequences of spikes spaced either 50 or 49
time steps apart without the help of any short training exemplars.
Without external resets or teacher forcing,
our LSTM variant also learns to generate
stable streams of precisely timed spikes and other highly nonlinear
periodic patterns. This makes LSTM a promising approach for
tasks that require the accurate measurement or generation of
time intervals.
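For readers unfamiliar with the "peephole" augmentation, the following is a minimal sketch of one common peephole-LSTM formulation: the input, forget, and output gates receive direct weighted connections from the internal cell state, which is what lets the gates learn to count time steps. This is an illustrative example, not the paper's exact configuration; all variable and parameter names (peephole_lstm_step, the dict p, W_xi, w_ci, etc.) are hypothetical.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def peephole_lstm_step(x, h_prev, c_prev, p):
        # Input and forget gates "peep" at the previous cell state via
        # the peephole weights w_ci and w_cf (elementwise connections).
        i = sigmoid(p['W_xi'] @ x + p['W_hi'] @ h_prev + p['w_ci'] * c_prev + p['b_i'])
        f = sigmoid(p['W_xf'] @ x + p['W_hf'] @ h_prev + p['w_cf'] * c_prev + p['b_f'])
        # Candidate cell input and cell-state update (the linear "carousel").
        g = np.tanh(p['W_xc'] @ x + p['W_hc'] @ h_prev + p['b_c'])
        c = f * c_prev + i * g
        # Output gate peeps at the *updated* cell state.
        o = sigmoid(p['W_xo'] @ x + p['W_ho'] @ h_prev + p['w_co'] * c + p['b_o'])
        h = o * np.tanh(c)
        return h, c

    # Illustrative usage: 1 input unit, 4 memory cells, a single spike
    # followed by a 50-step silent gap (the kind of interval the paper
    # asks the gates to measure).
    rng = np.random.default_rng(0)
    n_in, n_cell = 1, 4
    p = {k: 0.1 * rng.standard_normal((n_cell, n_in)) for k in ('W_xi', 'W_xf', 'W_xc', 'W_xo')}
    p.update({k: 0.1 * rng.standard_normal((n_cell, n_cell)) for k in ('W_hi', 'W_hf', 'W_hc', 'W_ho')})
    p.update({k: 0.1 * rng.standard_normal(n_cell) for k in ('w_ci', 'w_cf', 'w_co')})
    p.update({k: np.zeros(n_cell) for k in ('b_i', 'b_f', 'b_c', 'b_o')})

    h, c = np.zeros(n_cell), np.zeros(n_cell)
    for t in range(50):
        x = np.array([1.0 if t == 0 else 0.0])
        h, c = peephole_lstm_step(x, h, c, p)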