Abstract of FKI-200-94

Document-Name:  fki-200-94.ps.gz
Title:		Flat Minimum Search Finds Simple Nets
Authors:	Sepp Hochreiter 
                Juergen Schmidhuber 
Revision-Date:	1994/12/31
Category:	Technical Report (Forschungsberichte Künstliche Intelligenz)
Abstract:	We present a new algorithm for finding low-complexity neural
                networks with high generalization capability.  The algorithm 
                searches for a ``flat'' minimum of  the  error  function.  A 
                flat minimum is a large  connected  region  in  weight-space 
                where the error remains approximately constant. An MDL-based 
                argument shows that  flat minima  correspond to low expected 
                overfitting. Although our algorithm requires the computation
                of second-order derivatives, it has backprop's order of
                complexity. It automatically and effectively prunes units,
                weights, and input lines. Various experiments with
                feedforward and recurrent nets are described. In an
                application to stock market prediction, flat minimum search
                outperforms (1) conventional backprop, (2) weight decay, and
                (3) ``optimal brain surgeon'' / ``optimal brain damage''.
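
The abstract's definition of a flat minimum (a large connected region in
weight space where the error stays approximately constant) can be
illustrated with a crude perturbation probe. The sketch below is not the
paper's flat minimum search algorithm, which uses second-order derivatives
inside an MDL-derived objective; it is a purely illustrative, assumed
flatness measure on a toy linear model standing in for a neural net:

```python
import numpy as np

def mse(w, X, y):
    """Error of a linear model y ~ X @ w (a stand-in for a trained net)."""
    return float(np.mean((X @ w - y) ** 2))

def flatness_radius(w, X, y, tolerance=0.01, trials=200, seed=0):
    """Largest tested perturbation radius within which the error increase
    over the minimum stays below `tolerance` in all sampled directions.
    A larger radius means a flatter minimum in the abstract's sense."""
    rng = np.random.default_rng(seed)
    base = mse(w, X, y)
    best = 0.0
    for radius in np.linspace(0.01, 1.0, 100):
        worst = 0.0
        for _ in range(trials):
            # Random direction scaled to the current radius.
            d = rng.standard_normal(w.shape)
            d *= radius / np.linalg.norm(d)
            worst = max(worst, mse(w + d, X, y) - base)
        if worst > tolerance:
            break
        best = radius
    return best
```

A flatter minimum tolerates larger weight perturbations before the error
rises, which is the property the MDL-based argument connects to lower
expected overfitting.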
Keywords:	Flat minima, minimum description length, overfitting error,
                generalization, stock market prediction, recurrent networks,
		feedforward networks, pruning
Size:		26 pages
Language:	English
ISSN:		0941-6358
Copyright:	The ``Forschungsberichte Künstliche Intelligenz''
		series includes primarily preliminary publications,
		specialized partial results, and supplementary
		material. In the interest of a subsequent final
		publication, these reports should not be copied. All
		rights and the responsibility for the contents of the
		report are with the authors, who would appreciate
		critical comments.

Gerhard Weiss