Robust training termination criterion for back-propagation ANNs applicable to small data sets

V. Chandramouli, S. Lingireddy, G. M. Brion

Research output: Contribution to journal › Article › peer-review

12 Scopus citations


One of the daunting tasks facing a neural network modeler is prescribing an appropriate training termination criterion, one that avoids underfitting or overfitting the underlying functional relationship between the input and output variables. This is particularly true when dealing with smaller data sets that do not offer the luxury of splitting the database into the traditional training, testing, and validation sets. In the absence of a testing data set, or when the testing data set is small, which is not uncommon when working with environmental databases, it is extremely difficult to know when to terminate the training exercise. This paper proposes a new criterion that provides adequate guidance on training termination without requiring a testing data set, and illustrates the validity of the proposed criterion on three data sets from water resources and environmental engineering applications. An extensive study of a number of large and small data sets indicated that the moving average of the relative strength index of a randomly generated dummy input variable tends to reach zero at the optimal termination point and to move away from zero beyond that point. Based on this observation, a training termination index was developed, tested, and validated on three data sets.
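The abstract does not give the exact formula for the relative strength index, so the sketch below is only a minimal illustration of the general idea: append a random dummy input to the training data, track a moving average of that dummy's relative importance (here approximated by its share of absolute input-to-hidden weight mass, a hypothetical stand-in for the paper's RSI), and stop training once that moving average stops shrinking. All function names and parameters are assumptions, not the authors' implementation.

```python
import numpy as np

def relative_importance(W_in):
    # Share of total absolute input-to-hidden weight mass per input
    # (a simple, hypothetical proxy for the paper's relative strength index).
    contrib = np.abs(W_in).sum(axis=1)
    return contrib / contrib.sum()

def train_with_dummy_stop(X, y, hidden=5, lr=0.1, epochs=2000,
                          window=20, tol=1e-3, seed=0):
    """Train a one-hidden-layer tanh network on X plus one random dummy
    column; terminate when the moving average of the dummy input's
    relative importance is no longer decreasing."""
    rng = np.random.default_rng(seed)
    Xd = np.hstack([X, rng.normal(size=(X.shape[0], 1))])  # dummy input
    n_in = Xd.shape[1]
    W1 = rng.normal(scale=0.5, size=(n_in, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1));    b2 = np.zeros(1)
    history = []
    for epoch in range(epochs):
        # Forward pass
        H = np.tanh(Xd @ W1 + b1)
        out = H @ W2 + b2
        err = out - y
        # Backward pass (mean squared error gradients)
        dW2 = H.T @ err / len(y)
        db2 = err.mean(axis=0)
        dH = (err @ W2.T) * (1 - H ** 2)
        dW1 = Xd.T @ dH / len(y)
        db1 = dH.mean(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
        # Record the dummy input's current importance share
        history.append(relative_importance(W1)[-1])
        if len(history) >= 2 * window:
            recent = np.mean(history[-window:])
            prev = np.mean(history[-2 * window:-window])
            if recent - prev > -tol:  # moving average no longer shrinking
                return epoch, recent
    return epochs, np.mean(history[-window:])
```

Because the dummy column is pure noise, any weight the network assigns to it reflects fitting noise rather than signal, which is why its importance trend can serve as an overfitting alarm without a held-out testing set.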

Original language: English
Pages (from-to): 39-46
Number of pages: 8
Journal: Journal of Computing in Civil Engineering
Issue number: 1
State: Published - 2007


Keywords

  • Data analysis
  • Databases
  • Microbes

ASJC Scopus subject areas

  • Civil and Structural Engineering
  • Computer Science Applications


