ESANN'2000 proceedings - European Symposium on Artificial Neural Networks
Bruges (Belgium), 26-28 April 2000, D-Facto public., ISBN 2-930307-00-5, pp. 75-80

Confidence Estimation Methods for Neural
Networks: A Practical Comparison
G. Papadopoulos, P.J. Edwards, A.F. Murray
Department of Electronics and Electrical Engineering,
University of Edinburgh

Abstract. Feed-forward neural networks (Multi-Layered Perceptrons) are used
widely in real-world regression or classification tasks. A reliable and practical
measure of prediction “confidence” is essential in real-world tasks. This paper
compares three approaches to prediction confidence estimation, using both artificial
and real data. The three methods are maximum likelihood, approximate Bayesian
and bootstrap. Both noise inherent to the data and model uncertainty are considered.

1. Introduction

Truly reliable neural prediction systems require the prediction to be qualified by a
confidence measure. This important issue has, surprisingly, received little systematic
study, and most references to confidence measures take an ad hoc approach. This paper
offers a systematic comparison of three commonly-used confidence estimation methods
for neural networks and takes a first step towards a better understanding of the practical
issues surrounding prediction uncertainty.
Neural network predictions suffer uncertainty due to (a) inaccuracies in the training
data and (b) the limitations of the model. The training set is typically noisy and
incomplete (not all possible input-output examples are available). Noise is inherent
to all real data and contributes to the total prediction variance as the data noise
variance σ². Moreover, the limitations of the model and the training algorithm introduce
further uncertainty into the network’s prediction. Neural networks are trained using an
iterative optimisation algorithm (e.g. steepest descent). The resultant weight values
often correspond, therefore, to a local rather than the global minimum of the error
function. Additionally, as the optimisation algorithm can only “use” the information
available in the training set, the solution is likely to be valid only for regions
sufficiently represented by the training data [1]. We call this model uncertainty and
its contribution to the total prediction variance the model uncertainty variance σ_m².
These two uncertainty sources are assumed to be independent and the total prediction
variance is given by the sum of their variances: σ_TOTAL² = σ² + σ_m². Confidence
estimation must take into account both sources. Some researchers ignore model
uncertainty, assuming that the regression model is correct (e.g. [8]). However, in
reality a system can be presented with novel inputs that differ from the training data.
For this reason, the inclusion of model uncertainty estimation is crucial. Finally, we
do not make the oversimplification that the data noise variance, σ², is constant for
all input data, as this is unlikely in complex real-world problems.
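The variance decomposition σ_TOTAL² = σ² + σ_m² can be sketched numerically. The following is a minimal illustration with hypothetical per-input estimates (the values are invented for demonstration, not taken from the paper), combining the two variance components into a total predictive standard deviation and an approximate 95% confidence interval:

```python
import numpy as np

# Hypothetical per-input quantities (illustrative values only)
prediction = np.array([0.8, 1.2, 0.5])    # network outputs y(x)
noise_var = np.array([0.04, 0.09, 0.01])  # data noise variance sigma^2(x)
model_var = np.array([0.02, 0.05, 0.10])  # model uncertainty variance sigma_m^2(x)

# Independence assumption: the total variance is the sum of the components
total_var = noise_var + model_var
total_std = np.sqrt(total_var)

# Approximate 95% confidence interval around each prediction
lower = prediction - 1.96 * total_std
upper = prediction + 1.96 * total_std
```

Note that the interval is widest where the model-uncertainty component dominates (the third input here), which is exactly the behaviour wanted for novel inputs far from the training data.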

2. Methods for Neural Network prediction uncertainty estimation

In regression tasks, the problem is to estimate an unknown function f(x) given a set of
input-target pairs D = {x_n, t_n}, n = 1, …, N. The targets are assumed to be corrupted
by additive noise, t_n = f(x_n) + e_n. The errors e_n are modelled as Gaussian i.i.d.
with zero mean and variance σ²(x).
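This data model can be made concrete with a small synthetic example. The choices of f(x) and σ(x) below are illustrative assumptions, not the functions used in the paper; the point is only that the noise variance depends on the input:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative unknown target function f(x)
def f(x):
    return np.sin(2 * np.pi * x)

# Illustrative input-dependent noise level: sigma(x) grows with x,
# so the data are heteroscedastic rather than constant-variance
def sigma(x):
    return 0.05 + 0.2 * x

N = 200
x = rng.uniform(0.0, 1.0, N)
e = rng.normal(0.0, sigma(x))  # Gaussian, zero mean, variance sigma^2(x_n)
t = f(x) + e                   # targets t_n = f(x_n) + e_n
```

A confidence-estimation method that assumes constant σ² would systematically over-cover at small x and under-cover at large x on data like this.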
Several approaches to prediction confidence estimation have been reported in the
literature. In [8] a method for obtaining an input-dependent σ² estimate using maximum
likelihood (ML) is presented. The traditional network architecture is extended and a
new set of hidden units is used to compute σ²(x; ŵ), the network estimate of the data
noise variance. The variance output unit has an exponential activation function so …