UTCS Artificial Intelligence
Revising Bayesian Network Parameters Using Backpropagation (1996)
Sowmya Ramachandran and Raymond J. Mooney
The problem of learning Bayesian networks with hidden variables is known to be hard; even the simpler task of learning just the conditional probabilities of a Bayesian network with hidden variables is difficult. In this paper, we present an approach that learns the conditional probabilities of a Bayesian network with hidden variables by transforming it into a multi-layer feedforward artificial neural network (ANN). The conditional probabilities are mapped onto weights in the ANN, which are then learned using standard backpropagation techniques. To avoid the problem of exponentially large ANNs, we focus on Bayesian networks with noisy-or and noisy-and nodes. Experiments on real-world classification problems demonstrate the effectiveness of our technique.
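The core idea of the mapping can be illustrated for a single noisy-or node. A noisy-or with link probabilities p_i gives P(y=1|x) = 1 - prod_i (1 - p_i)^{x_i}; reparameterizing with w_i = -log(1 - p_i) >= 0 turns this into P(y=1|x) = 1 - exp(-x . w), a single "neural" unit whose weights can be fit by gradient descent on cross-entropy. The sketch below is illustrative only (it is not the paper's code, handles one node rather than a full network, and all variable names and constants are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# True noisy-or link probabilities (chosen arbitrarily for the demo).
p_true = np.array([0.8, 0.3, 0.6])

# Sample binary parent configurations and noisy-or outputs.
n = 5000
X = rng.integers(0, 2, size=(n, 3)).astype(float)
prob_y = 1.0 - np.prod((1.0 - p_true) ** X, axis=1)
y = (rng.random(n) < prob_y).astype(float)

# Reparameterize: w_i = -log(1 - p_i) >= 0, so
# P(y=1|x) = 1 - exp(-x . w) -- one unit with an
# exponential activation, trainable by gradient descent.
w = np.full(3, 0.1)
lr = 0.3
for step in range(3000):
    z = X @ w
    p = np.clip(1.0 - np.exp(-z), 1e-9, 1 - 1e-9)
    # Gradient of mean binary cross-entropy w.r.t. z
    # (using dp/dz = 1 - p), then chained back to w.
    dz = (1.0 - y) - y * (1.0 - p) / p
    grad = X.T @ dz / n
    w = np.maximum(w - lr * grad, 0.0)  # keep link probabilities valid

# Map the learned weights back to conditional probabilities.
p_learned = 1.0 - np.exp(-w)
print(np.round(p_learned, 2))
```

Because the loss is convex in w under this parameterization, plain gradient descent recovers link probabilities close to the generating ones; the paper's contribution is doing this jointly, via backpropagation, through networks of such nodes with hidden variables.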
View: PDF, PS
Citation:
In Proceedings of the International Conference on Neural Networks (ICNN-96), Special Session on Knowledge-Based Artificial Neural Networks, pp. 82--87, Washington, DC, June 1996.
Bibtex:
@inproceedings{ramachandran:icnn96kbai,
  title={Revising Bayesian Network Parameters Using Backpropagation},
  author={Sowmya Ramachandran and Raymond J. Mooney},
  booktitle={Proceedings of the International Conference on Neural Networks (ICNN-96), Special Session on Knowledge-Based Artificial Neural Networks},
  month={June},
  address={Washington DC},
  pages={82--87},
  url={http://www.cs.utexas.edu/users/ai-lab?ramachandran:icnn96kbai},
  year={1996}
}
People
Raymond J. Mooney
Faculty
mooney [at] cs utexas edu
Sowmya Ramachandran
Ph.D. Alumni
sowmya [at] shai com
Areas of Interest
Machine Learning
Neural-Symbolic Learning
Theory and Knowledge Refinement
Uncertain and Probabilistic Reasoning
Labs
Machine Learning