The sample complexity of pattern classification with neural networks: the size of the weights is more important than the size of the network
Bartlett, P.L. (1998) The sample complexity of pattern classification with neural networks: the size of the weights is more important than the size of the network. IEEE Transactions on Information Theory, 44(2), pp. 525–536.

Published Version (PDF, 591 kB)

Abstract
Sample complexity results from computational learning theory, when applied to neural network learning for pattern classification problems, suggest that for good generalization performance the number of training examples should grow at least linearly with the number of adjustable parameters in the network. Results in this paper show that if a large neural network is used for a pattern classification problem and the learning algorithm finds a network with small weights that has small squared error on the training patterns, then the generalization performance depends on the size of the weights rather than the number of weights. For example, consider a two-layer feedforward network of sigmoid units, in which the sum of the magnitudes of the weights associated with each unit is bounded by A and the input dimension is n. We show that the misclassification probability is no more than a certain error estimate (that is related to squared error on the training set) plus A³√((log n)/m) (ignoring log A and log m factors), where m is the number of training patterns. This may explain the generalization performance of neural networks, particularly when the number of training examples is considerably smaller than the number of weights. It also supports heuristics (such as weight decay and early stopping) that attempt to keep the weights small during training. The proof techniques appear to be useful for the analysis of other pattern classifiers: when the input domain is a totally bounded metric space, we use the same approach to give upper bounds on misclassification probability for classifiers with decision boundaries that are far from the training examples.
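The scaling behaviour of the bound quoted in the abstract can be illustrated with a minimal sketch. This is not the paper's method, only an evaluation of the size-of-weights term A³√((log n)/m), ignoring the log A and log m factors and constants; the function name and the example values of A, n, and m are illustrative assumptions.

```python
import math

def weight_bound_term(A: float, n: int, m: int) -> float:
    """Illustrative size-of-weights term A^3 * sqrt(log(n) / m) from the
    abstract's bound (constants and log A, log m factors omitted)."""
    return A ** 3 * math.sqrt(math.log(n) / m)

# The term depends on the weight-magnitude bound A and the number of
# training patterns m, but not on the number of weights in the network.
print(weight_bound_term(A=2.0, n=100, m=100))     # fewer training patterns
print(weight_bound_term(A=2.0, n=100, m=10_000))  # more patterns: smaller term
```

Note that the term shrinks as m grows, consistent with the abstract's claim that generalization is governed by the size of the weights rather than their number.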
ID Code:  43927 

Item Type:  Journal Article 
Refereed:  Yes 
Keywords:  Computer networks, Neural networks, Pattern analysis, Pattern classification, Probability distribution, Statistical learning, Training data, Upper bound, Pattern recognition
DOI:  10.1109/18.661502 
Subjects:  Australian and New Zealand Standard Research Classification > INFORMATION AND COMPUTING SCIENCES (080000) > ARTIFICIAL INTELLIGENCE AND IMAGE PROCESSING (080100)
Australian and New Zealand Standard Research Classification > ENGINEERING (090000) > ELECTRICAL AND ELECTRONIC ENGINEERING (090600)
Australian and New Zealand Standard Research Classification > PSYCHOLOGY AND COGNITIVE SCIENCES (170000) > COGNITIVE SCIENCE (170200)
Divisions:  Past > QUT Faculties & Divisions > Faculty of Science and Technology
Past > Schools > Mathematical Sciences
Copyright Owner:  IEEE 
Copyright Statement:  Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE. 
Deposited On:  12 Aug 2011 01:07 
Last Modified:  12 Aug 2011 01:07 