DistAl: An Inter-pattern Distance-based Constructive Learning Algorithm

Date
1997-02-10
Authors
Yang, Jihoon
Parekh, Rajesh
Honavar, Vasant
Department
Computer Science
Abstract

Multi-layer networks of threshold logic units offer an attractive framework for the design of pattern classification systems. A new constructive neural network learning algorithm (DistAl) based on inter-pattern distances is introduced. DistAl uses spherical threshold neurons in its hidden layer, each of which covers (or classifies) a cluster of patterns. It does not rely on an iterative, expensive, and time-consuming perceptron training algorithm to find the weight settings for the neurons in the network, and is therefore extremely fast even on large data sets. Experimental results (in terms of generalization capability and network size) on a number of benchmark classification problems show that DistAl performs reasonably well compared to other learning algorithms despite its simplicity and fast learning time. DistAl is therefore a good candidate for tasks that involve very large data sets (such as large-scale data mining and knowledge acquisition), that require reasonably accurate classifiers to be learned in near real time, or that use neural network learning as the inner loop of a more complex optimization process in hybrid learning systems.
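
The abstract only sketches how the hidden layer is built. The Python fragment below is a minimal, hypothetical illustration of a distance-based constructive scheme of the kind described: hidden units are spheres centered on training patterns, added greedily until every pattern is covered. The greedy selection rule, the Euclidean metric, and the names build_spherical_units and classify are assumptions made for illustration only; they are not the published DistAl procedure.

# Hypothetical sketch of a distance-based constructive classifier in the
# spirit of the DistAl description above; the exact selection rule and
# distance metric used by DistAl are assumptions here, not the algorithm.
import numpy as np

def build_spherical_units(X, y):
    """Greedily add hidden 'spherical threshold' units until every training
    pattern is covered.  Each unit is centered on an uncovered pattern; its
    radius is the distance to the nearest pattern of a different class, so
    the sphere stays class-pure.  At each step the unit covering the most
    still-uncovered patterns of its own class is kept."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    uncovered = np.ones(len(X), dtype=bool)
    units = []                       # list of (center, radius, class_label)
    while uncovered.any():
        best = None
        for i in np.flatnonzero(uncovered):
            d = np.linalg.norm(X - X[i], axis=1)
            other = d[y != y[i]]
            radius = other.min() if other.size else np.inf
            covered = uncovered & (d < radius) & (y == y[i])
            covered[i] = True        # a unit always claims its own center
            if best is None or covered.sum() > best[0]:
                best = (covered.sum(), X[i], radius, y[i], covered)
        _, center, radius, label, covered = best
        units.append((center, radius, label))
        uncovered &= ~covered
    return units

def classify(units, x):
    """Predict with the first unit whose sphere contains x (construction
    order acts as a priority); fall back to the nearest center otherwise."""
    x = np.asarray(x, dtype=float)
    for center, radius, label in units:
        if np.linalg.norm(x - center) < radius:
            return label
    return min(units, key=lambda u: np.linalg.norm(x - u[0]))[2]

Because each unit's center and radius are read directly off pairwise distances, no iterative weight training is performed, which is the property the abstract credits for DistAl's speed.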
