Sequence Learning with Incremental Higher-Order Neural Networks,
University of Texas at Austin AI Lab technical report, 1993.
An incremental, higher-order, non-recurrent neural network combines two
properties found to be useful for sequence learning in neural networks:
higher-order connections and the incremental introduction of new
units. The network adds higher orders when needed by adding new units
that dynamically modify connection weights: each new unit modifies the
weights at the next time step using information from the previous time
step. Since a theoretically unlimited number of units can be added to
the network, information from the arbitrarily distant past can be
brought to bear on each prediction. Temporal tasks can thereby be
learned without feedback, in contrast to recurrent neural networks.
Because there are no recurrent connections, training is simple and
fast.
Experiments have demonstrated speedups of two orders of magnitude over
recurrent networks.
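
To make the mechanism concrete, the following is a minimal sketch of
the core idea in Python, not the report's actual algorithm: a
first-order prediction matrix whose entries are adjusted one time step
ahead by higher-order units, with a new unit created for any weight
whose error will not settle. The network size, learning rate,
error-trace growth heuristic, and all names are illustrative
assumptions.

```python
import numpy as np

N = 3                         # symbols, represented as indices 0..N-1
W = np.zeros((N, N))          # first-order weights: next symbol from current
units = []                    # (i, j, v): unit adds v[prev] to W[i, j]
err_trace = np.zeros((N, N))  # running error per weight, drives growth
LR, THRESH = 0.5, 0.4         # assumed learning rate and growth threshold

def one_hot(k):
    e = np.zeros(N)
    e[k] = 1.0
    return e

def step(prev_k, cur_k, next_k):
    target = one_hot(next_k)
    # Effective weights: static part plus the one-step-ahead modification
    # contributed by higher-order units that saw the previous symbol.
    W_eff = W.copy()
    for i, j, v in units:
        W_eff[i, j] += v[prev_k]
    y = W_eff[:, cur_k]              # prediction of the next symbol
    err = target - y
    W[:, cur_k] += LR * err          # first-order delta rule
    for i, j, v in units:            # higher-order units learn the same way
        if j == cur_k:
            v[prev_k] += LR * err[i]
    # Incremental growth: persistent error on a weight means first-order
    # statistics cannot resolve it, so dedicate a new unit to that weight.
    err_trace[:, cur_k] = 0.9 * err_trace[:, cur_k] + 0.1 * np.abs(err)
    for i in range(N):
        if err_trace[i, cur_k] > THRESH and not any(
                u[0] == i and u[1] == cur_k for u in units):
            units.append((i, cur_k, np.zeros(N)))

# The next symbol after 2 depends on the symbol before 2, which a purely
# first-order network cannot represent; the added units resolve this.
seqs = [(0, 2, 1), (1, 2, 0)]
for _ in range(200):
    for s in seqs:
        step(*s)

for a, b, c in seqs:
    W_eff = W.copy()
    for i, j, v in units:
        W_eff[i, j] += v[a]
    print(f"after {a},{b}: predict {np.argmax(W_eff[:, b])} (want {c})")
```

Run as written, the first-order weights for the ambiguous transition
oscillate until the error trace triggers unit creation, after which
the sketch should print the correct prediction for both sequences;
note that training here is a plain delta rule with no backpropagation
through time, which is the source of the speed advantage the abstract
describes.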