MML Inference of Single-Layer Neural Networks

Enes Makalic, Lloyd Allison & David L. Dowe,
School of Computer Science and Software Engineering,
Monash University, Clayton, Victoria 3800, Australia.

(The Third IASTED International Conference on Artificial Intelligence and Applications, AIA 2003, September 8-10, 2003, Benalmadena, Spain.)

Abstract: The architecture selection problem is of great importance when designing neural networks. A network that is too simple cannot learn the problem sufficiently well. Conversely, a network that is larger than necessary is prone to overfitting and generalises poorly. This paper presents a novel architecture selection criterion for single hidden layer feedforward networks. The optimal network size is determined using a version of the Minimum Message Length (MML) inference method. Performance is demonstrated on several problems and compared with a Minimum Description Length (MDL) based selection criterion.
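The selection principle described above — choosing the hidden-layer size that minimises a two-part message length (cost of stating the model plus cost of stating the data given the model) — can be sketched as follows. This is a simplified illustration only: it trains small tanh networks with gradient descent and scores each with a BIC-style code-length approximation, (k/2)·log n + (n/2)·log(SSE/n), as a hedged stand-in for the paper's actual MML87-based message length, whose details differ.

```python
# Sketch: pick the hidden-layer size minimising an approximate two-part
# code length. The (k/2)log n + (n/2)log(SSE/n) score is a BIC/MDL-style
# approximation, NOT the paper's exact MML formula.
import numpy as np

rng = np.random.default_rng(0)

def train_mlp(X, y, hidden, epochs=2000, lr=0.05):
    """Train a single-hidden-layer tanh network by full-batch gradient descent;
    return the sum of squared errors and the parameter count."""
    n, d = X.shape
    W1 = rng.normal(0.0, 0.5, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)        # hidden activations
        pred = H @ W2 + b2              # linear output unit
        err = pred - y
        # backpropagation of the mean squared error
        gW2 = H.T @ err / n;  gb2 = err.mean(axis=0)
        dH = (err @ W2.T) * (1.0 - H ** 2)
        gW1 = X.T @ dH / n;   gb1 = dH.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    pred = np.tanh(X @ W1 + b1) @ W2 + b2
    sse = float(((pred - y) ** 2).sum())
    k = W1.size + b1.size + W2.size + b2.size
    return sse, k

def code_length(sse, k, n):
    # first part: cost of the k parameters; second part: data given model
    return 0.5 * k * np.log(n) + 0.5 * n * np.log(sse / n)

# toy regression data: a noisy sine curve
n = 200
X = rng.uniform(-3.0, 3.0, (n, 1))
y = np.sin(X) + rng.normal(0.0, 0.1, (n, 1))

scores = {}
for h in (1, 2, 4, 8, 16):
    sse, k = train_mlp(X, y, h)
    scores[h] = code_length(sse, k, n)

best = min(scores, key=scores.get)
print("approximate message lengths:", {h: round(s, 1) for h, s in scores.items()})
print("selected hidden units:", best)
```

Too few hidden units leave a large data cost (poor fit); too many inflate the model cost faster than the fit improves, so the minimum of the total code length balances the two — the trade-off the abstract describes.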

[preprint.ps], [~enesm][10/'04].


© L. Allison   http://www.allisons.org/ll/   (or as otherwise indicated),
Faculty of Information Technology (Clayton), Monash University, Australia 3800.