## Abstract

We discuss Bayesian decision theory in the context of neural networks. In the two-category case where the state-conditional probabilities are normal, a three-layer neural network with d hidden-layer units can approximate the posterior probability in L^{p}(R^{d}, p), where d is the dimension of the space of observables. We extend this result to multicategory cases. The number of hidden-layer units must then be increased, but it can be bounded by (1/2) d (d + 1) irrespective of the number of categories, provided the neural network has direct connections between the input and output layers. When the state-conditional probability belongs to a familiar family of distributions, such as the binomial, multinomial, Poisson, or negative binomial distributions, a two-layer neural network can approximate the posterior probability.
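The two-category Gaussian claim rests on a standard identity: by Bayes' rule, the posterior is the logistic transform of the log-odds, and for normal class-conditional densities the log-odds is a quadratic function of the observable. The sketch below (an illustration of that identity, not the paper's construction; all function names are hypothetical) checks that the direct Bayes posterior agrees with the logistic of the quadratic discriminant.

```python
import numpy as np

def gaussian_pdf(x, mean, cov):
    """Multivariate normal density at x."""
    d = len(mean)
    diff = x - mean
    norm = 1.0 / np.sqrt((2 * np.pi) ** d * np.linalg.det(cov))
    return norm * np.exp(-0.5 * diff @ np.linalg.inv(cov) @ diff)

def posterior_direct(x, m0, c0, m1, c1, prior1=0.5):
    """P(class 1 | x) computed straight from Bayes' rule."""
    p0 = (1.0 - prior1) * gaussian_pdf(x, m0, c0)
    p1 = prior1 * gaussian_pdf(x, m1, c1)
    return p1 / (p0 + p1)

def posterior_logistic(x, m0, c0, m1, c1, prior1=0.5):
    """Same posterior as sigmoid(quadratic form in x):
    the form a three-layer network with logistic output can learn."""
    i0, i1 = np.linalg.inv(c0), np.linalg.inv(c1)
    # Log-odds = quadratic in x plus a constant when both classes are Gaussian.
    quad = 0.5 * (x - m0) @ i0 @ (x - m0) - 0.5 * (x - m1) @ i1 @ (x - m1)
    const = (0.5 * np.log(np.linalg.det(c0) / np.linalg.det(c1))
             + np.log(prior1 / (1.0 - prior1)))
    return 1.0 / (1.0 + np.exp(-(quad + const)))
```

With equal covariances the quadratic terms cancel and the log-odds becomes linear in x, which is the regime where a two-layer (no hidden layer) network with a logistic output already suffices.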

| Original language | English |
| --- | --- |
| Pages (from-to) | 209-228 |
| Number of pages | 20 |
| Journal | Neurocomputing |
| Volume | 63 |
| Issue number | SPEC. ISS. |
| State | Published - Jan 2005 |

## Keywords

- Approximation
- Bayesian decision
- Direct connection
- Layered neural network
- Logistic transform

## ASJC Scopus subject areas

- Computer Science Applications
- Cognitive Neuroscience
- Artificial Intelligence