UTCS Artificial Intelligence
Parameter Revision Techniques for Bayesian Networks with Hidden Variables: An Experimental Comparison (1997)
Sowmya Ramachandran and Raymond J. Mooney
Learning Bayesian networks inductively in the presence of hidden variables is still an open problem. Even the simpler task of learning just the conditional probabilities on a Bayesian network with hidden variables is not completely solved. In this paper, we present an approach that learns the parameters of a Bayesian network composed of noisy-or and noisy-and nodes by using a gradient descent back-propagation approach similar to that used to train neural networks. For the task of causal inference, it has the advantage of being able to learn in the presence of hidden variables. We compare the performance of this approach with the adaptive probabilistic networks technique on a real-world classification problem in molecular biology, and show that our approach trains faster and learns networks with higher classification accuracy.
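To make the approach concrete, here is a minimal sketch of gradient-descent parameter learning for a single noisy-or node, in the spirit of the back-propagation training the abstract describes. All specifics (the synthetic data, the true parameters `true_p`, the learning rate, and the clipping scheme) are illustrative assumptions, not details from the paper, which trains full networks on a molecular-biology task and handles hidden variables.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_or(p, x):
    """P(Y=1 | x) for a noisy-or node: each active parent i
    independently triggers the child with probability p[i]."""
    return 1.0 - np.prod(np.where(x == 1, 1.0 - p, 1.0), axis=-1)

# Synthetic fully observed data from a known noisy-or node
# (hypothetical parameters, for illustration only).
true_p = np.array([0.8, 0.6, 0.3])
X = rng.integers(0, 2, size=(2000, 3))
Y = (rng.random(2000) < noisy_or(true_p, X)).astype(float)

# Gradient descent on the negative log-likelihood of the data,
# clipping to keep each parameter inside (0, 1).
p = np.full(3, 0.5)
lr, eps = 0.5, 1e-6
for _ in range(500):
    q = np.prod(np.where(X == 1, 1.0 - p, 1.0), axis=1)   # P(Y=0 | x)
    y_hat = np.clip(1.0 - q, eps, 1.0 - eps)              # P(Y=1 | x)
    # dNLL/dy_hat, then chain rule: dy_hat/dp_i = q / (1 - p_i)
    # for active parents, 0 for inactive ones (the X mask below).
    dL_dyhat = -(Y / y_hat) + (1.0 - Y) / (1.0 - y_hat)
    grad = (dL_dyhat * q)[:, None] * (X / (1.0 - p))
    p = np.clip(p - lr * grad.mean(axis=0), eps, 1.0 - eps)

print(np.round(p, 2))  # should approach true_p
```

The closed-form gradient here plays the role of back-propagation through one node; in a multi-layer network of noisy-or and noisy-and nodes, the same chain-rule computation is propagated from the output back through each layer, which is what lets the method update parameters even when intermediate nodes are hidden.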
View: PDF, PS
Citation: Unpublished Technical Note, January 1997.
Bibtex:
@unpublished{ramachandran:unpublished97,
  title={Parameter Revision Techniques for Bayesian Networks with Hidden Variables: An Experimental Comparison},
  author={Sowmya Ramachandran and Raymond J. Mooney},
  month={January},
  year={1997},
  note={Unpublished Technical Note},
  url={http://www.cs.utexas.edu/users/ai-lab?ramachandran:unpublished97}
}
People
Raymond J. Mooney
Faculty
mooney [at] cs utexas edu
Sowmya Ramachandran
Ph.D. Alumni
sowmya [at] shai com
Areas of Interest
Machine Learning
Theory and Knowledge Refinement
Labs
Machine Learning