Revising Bayesian Network Parameters Using Backpropagation (1996)
Learning Bayesian networks with hidden variables is known to be hard. Even the simpler task of learning just the conditional probabilities of a Bayesian network with hidden variables is hard. In this paper, we present an approach that learns the conditional probabilities of a Bayesian network with hidden variables by transforming it into a multi-layer feedforward neural network (ANN). The conditional probabilities are mapped onto weights in the ANN, which are then learned using standard backpropagation techniques. To avoid the problem of exponentially large ANNs, we focus on Bayesian networks with noisy-or and noisy-and nodes. Experiments on real-world classification problems demonstrate the effectiveness of our technique.
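To illustrate the kind of weight mapping the abstract describes, the sketch below trains the parameters of a single noisy-or node by gradient descent. This is a minimal illustration, not the paper's exact transformation: each inhibition probability p_i is reparameterized through a sigmoid of an unconstrained weight w_i so that ordinary gradient updates keep it in (0, 1), and the gradient of the squared error is backpropagated through the noisy-or function P(Y=1 | x) = 1 - prod_i (1 - p_i)^{x_i}. The function names and the squared-error objective are illustrative assumptions.

```python
import numpy as np

def sigmoid(w):
    # Map an unconstrained weight to a probability in (0, 1).
    return 1.0 / (1.0 + np.exp(-w))

def noisy_or(p, x):
    """P(Y=1 | x) for a noisy-or node with inhibition parameters p
    and binary parent vector x: 1 - prod_i (1 - p_i)^{x_i}."""
    return 1.0 - np.prod((1.0 - p) ** x)

def train(X, y, steps=5000, lr=1.0):
    """Fit noisy-or parameters by gradient descent on squared error.
    (Illustrative objective; the paper uses standard backpropagation.)"""
    rng = np.random.default_rng(0)
    w = rng.normal(size=X.shape[1])   # unconstrained weights
    for _ in range(steps):
        p = sigmoid(w)
        grad = np.zeros_like(w)
        for x, t in zip(X, y):
            q = noisy_or(p, x)                     # forward pass
            prod_all = np.prod((1.0 - p) ** x)
            # dq/dp_i = x_i * prod_{j != i} (1 - p_j)^{x_j}
            dq_dp = x * prod_all / (1.0 - p)
            # chain rule through squared error and the sigmoid
            grad += 2.0 * (q - t) * dq_dp * p * (1.0 - p)
        w -= lr * grad / len(X)
    return sigmoid(w)

# Two binary parents with true inhibition parameters 0.9 and 0.8:
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([noisy_or(np.array([0.9, 0.8]), x) for x in X])
p_learned = train(X, y)
```

Because the targets here are generated by a noisy-or with parameters (0.9, 0.8), the learned probabilities should approach those values; in the paper the same idea is applied network-wide, with each conditional probability tied to an ANN weight.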
View:
PDF, PS
Citation:
In Proceedings of the International Conference on Neural Networks (ICNN-96), Special Session on Knowledge-Based Artificial Neural Networks, pp. 82--87, Washington DC, June 1996.
Raymond J. Mooney, Faculty (mooney [at] cs utexas edu)
Sowmya Ramachandran, Ph.D. Alumni (sowmya [at] shai com)