TY - JOUR
T1 - Stability of neural networks for slightly perturbed training data sets
AU - Berhane, Indrias
AU - Srinivasan, C.
PY - 2004/9
Y1 - 2004/9
N2 - In learning models of artificial neural networks, randomness comes from the distribution of the training data. We show that individual observations do not excessively affect a neural network model, provided the network has an adequate number of nodes in the hidden layer, and prove that the empirical error of a neural network with p weights converges to the expected error when p/m → 0, where m is the size of the perturbed training data set.
AB - In learning models of artificial neural networks, randomness comes from the distribution of the training data. We show that individual observations do not excessively affect a neural network model, provided the network has an adequate number of nodes in the hidden layer, and prove that the empirical error of a neural network with p weights converges to the expected error when p/m → 0, where m is the size of the perturbed training data set.
KW - Artificial neural networks
KW - Stability
UR - http://www.scopus.com/inward/record.url?scp=7544234850&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=7544234850&partnerID=8YFLogxK
U2 - 10.1081/STA-200026629
DO - 10.1081/STA-200026629
M3 - Article
AN - SCOPUS:7544234850
SN - 0361-0926
VL - 33
SP - 2259
EP - 2270
JO - Communications in Statistics - Theory and Methods
JF - Communications in Statistics - Theory and Methods
IS - 9 SPEC.ISS.
ER -