Abstract of FKI-207-95

Document-Name:  fki-207-95.ps.gz
Title:          Long Short Term Memory
Authors:        Sepp Hochreiter and Juergen Schmidhuber
Revision-Date:  8/21/95
Category:	Technical Report (Forschungsberichte Künstliche Intelligenz)
Abstract:       ``Recurrent backprop'' for learning to store information over 
                extended time periods takes too long. The main reason is 
                insufficient, decaying error back flow.  We describe a novel, 
                efficient ``Long Short Term Memory'' (LSTM) that overcomes 
                this and related problems. Unlike previous approaches,
                LSTM can learn to bridge arbitrary time lags by enforcing
                constant error flow. Using gradient descent, LSTM explicitly 
                learns when to store information and when to access it. In 
                experimental comparisons with ``Real-Time Recurrent 
                Learning'', ``Recurrent Cascade-Correlation'', ``Elman nets'', 
                and ``Neural Sequence Chunking'', LSTM leads to many more 
                successful runs, and learns much faster. Unlike its 
                competitors, LSTM can solve tasks involving minimal time lags 
                of more than 1000 time steps, even in noisy environments.
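                The abstract's core mechanism — a cell state with a fixed
                self-connection ("constant error flow"), plus gates that learn
                when to store and when to access it — can be sketched as
                follows. This is an illustrative single-cell step in the style
                of the 1995 report (input and output gates, no forget gate);
                the function and variable names are this sketch's own, not the
                report's notation.

                ```python
                import math

                def sigmoid(x):
                    return 1.0 / (1.0 + math.exp(-x))

                def lstm_cell_step(s, net_c, net_in, net_out):
                    """One step of a single 1995-style LSTM memory cell (illustrative).

                    s       -- current cell state (the self-connected unit)
                    net_c   -- net input to the cell
                    net_in  -- net input to the input gate
                    net_out -- net input to the output gate
                    Returns (new_state, cell_output).
                    """
                    y_in = sigmoid(net_in)    # input gate: learns WHEN to store
                    y_out = sigmoid(net_out)  # output gate: learns WHEN to access
                    g = math.tanh(net_c)      # squashed cell input
                    # Self-connection with fixed weight 1.0: the state carries
                    # over unchanged unless the input gate admits new content,
                    # which is what enforces constant error flow through time.
                    s_new = s + y_in * g
                    y_cell = y_out * math.tanh(s_new)
                    return s_new, y_cell
                ```

                With both gates driven hard negative (closed), the state passes
                through unchanged across arbitrarily many steps, which is how
                long time lags can be bridged without decaying error back flow.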
Keywords:       Long Short Term Memory (LSTM)
Size:		8 pages
Language:	English
ISSN:		0941-6358
Copyright:	The ``Forschungsberichte Künstliche Intelligenz''
		series includes primarily preliminary publications,
		specialized partial results, and supplementary
		material. In the interest of a subsequent final
		publication these reports should not be copied. All
		rights and the responsibility for the contents of the
		report are with the authors, who would appreciate
		critical comments.