UTCS Artificial Intelligence
Growing Layers of Perceptrons: Introducing the Extentron Algorithm (1992)
Paul T. Baffes
and
John M. Zelle
The ideas presented here are based on two observations about perceptrons: (1) when the perceptron learning algorithm cycles among hyperplanes, the hyperplanes may be compared to select the one that gives the best split of the examples, and (2) it is always possible for the perceptron to build a hyperplane that separates at least one example from all the rest. We describe the Extentron, which grows multi-layer networks capable of distinguishing non-linearly-separable data using the simple perceptron rule for linear threshold units. The resulting algorithm is simple, very fast, scales well to large problems, retains the convergence properties of the perceptron, and can be completely specified using only two parameters. Results are presented comparing the Extentron to other neural network paradigms and to symbolic learning systems.
View: PDF, PS
Citation: In Proceedings of the 1992 International Joint Conference on Neural Networks, pp. 392--397, Baltimore, MD, June 1992.
Bibtex:
@inproceedings{baffes:ijcnn92,
  title={Growing Layers of Perceptrons: Introducing the Extentron Algorithm},
  author={Paul T. Baffes and John M. Zelle},
  booktitle={Proceedings of the 1992 International Joint Conference on Neural Networks},
  month={June},
  address={Baltimore, MD},
  pages={392--397},
  url={http://www.cs.utexas.edu/users/ai-lab?baffes:ijcnn92},
  year={1992}
}
People
Paul Baffes
Ph.D. Alumni
John M. Zelle
Ph.D. Alumni
john zelle [at] wartburg edu
Areas of Interest
Inductive Learning
Machine Learning
Neural-Symbolic Learning
Labs
Machine Learning