A cost function based dynamic node architecture algorithm

Date
2000-01-01
Authors
Dantuluri, Varma
Computer Science

Computer Science—the theory, representation, processing, communication and use of information—is fundamentally transforming every aspect of human endeavor. The Department of Computer Science at Iowa State University advances computational and information sciences through (1) educational and research programs within and beyond the university; (2) active engagement to help define national and international research and educational agendas; and (3) sustained commitment to graduating leaders for academia, industry and government.

History
The Computer Science Department was officially established in 1969, with Robert Stewart serving as the founding Department Chair. The faculty initially held joint appointments with Mathematics, Statistics, and Electrical Engineering. Also in 1969, the building which now houses the Computer Science department, then simply called the Computer Science building, was completed; it was later named Atanasoff Hall. From the 1980s to the present, the department has expanded and developed its teaching and research agendas to cover many areas of computing.

Dates of Existence
1969-present

Department
Computer Science
Abstract

Static architecture learning algorithms keep the network architecture fixed, yet the amount of learning and generalization depends on the architecture being trained. Static architecture algorithms therefore require careful selection of an architecture before training, but how good an architecture is cannot be known until training is complete. To overcome this shortcoming, a Cost Function Based Dynamic Node Architecture Algorithm (DNAC) is proposed, which dynamically changes the architecture of the network and finds one with high learning and generalization capability. The DNAC algorithm starts with a minimal architecture and stops at a good architecture for the data to be learned. The Scaled Conjugate Gradient algorithm (Moller, 1993) is used as the underlying learning algorithm, and the RMS error on a cross-validation set is used as the measure of generalization. Computationally simple nodal and layer importance functions are proposed. Results on two benchmark problems and a real-world problem are presented.
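The loop the abstract describes — start minimal, train, watch cross-validation RMS, and grow or prune hidden nodes — can be sketched roughly as follows. This is a toy illustration, not the thesis's method: plain batch gradient descent stands in for the Scaled Conjugate Gradient learner, and the weight-magnitude node importance is a hypothetical stand-in for the cost-function-based importance functions the thesis proposes.

```python
import numpy as np

rng = np.random.default_rng(0)

def rms(y, t):
    """Root-mean-square error, the generalization measure named in the abstract."""
    return float(np.sqrt(np.mean((y - t) ** 2)))

class GrowingMLP:
    """One-hidden-layer network whose hidden layer can grow or shrink.
    Plain batch gradient descent stands in for the SCG learner."""

    def __init__(self, n_in, n_hidden, n_out):
        self.W1 = rng.normal(0.0, 0.5, (n_hidden, n_in))
        self.W2 = rng.normal(0.0, 0.5, (n_out, n_hidden))

    def forward(self, X):
        self.H = np.tanh(X @ self.W1.T)   # hidden activations
        return self.H @ self.W2.T         # linear output layer

    def train_step(self, X, T, lr=0.05):
        Y = self.forward(X)
        E = Y - T
        gW2 = E.T @ self.H / len(X)
        gH = (E @ self.W2) * (1.0 - self.H ** 2)   # tanh derivative
        gW1 = gH.T @ X / len(X)
        self.W2 -= lr * gW2
        self.W1 -= lr * gW1

    def add_node(self):
        """Grow: append one hidden node with small random weights."""
        self.W1 = np.vstack([self.W1, rng.normal(0.0, 0.5, (1, self.W1.shape[1]))])
        self.W2 = np.hstack([self.W2, rng.normal(0.0, 0.5, (self.W2.shape[0], 1))])

    def node_importance(self):
        """Illustrative importance: magnitude of each node's outgoing weights
        (an assumed stand-in, not the thesis's cost-function-based measure)."""
        return np.abs(self.W2).sum(axis=0)

    def prune_weakest(self, threshold=0.05):
        """Remove a hidden node whose importance has fallen near zero."""
        imp = self.node_importance()
        if self.W1.shape[0] > 1 and imp.min() < threshold:
            k = int(imp.argmin())
            self.W1 = np.delete(self.W1, k, axis=0)
            self.W2 = np.delete(self.W2, k, axis=1)

def dnac(X, T, X_val, T_val, max_nodes=8, epochs=300, tol=0.05):
    """Start minimal, grow when validation RMS stalls, stop when it is low."""
    net = GrowingMLP(X.shape[1], 1, T.shape[1])
    prev = rms(net.forward(X_val), T_val)
    for _ in range(30):                    # cap total train/grow rounds
        for _ in range(epochs):
            net.train_step(X, T)
        val = rms(net.forward(X_val), T_val)
        if val < tol or net.W1.shape[0] >= max_nodes:
            break
        if prev - val < 1e-3:              # learning stalled: grow
            net.add_node()
        net.prune_weakest()
        prev = val
    return net, val
```

As a usage example, the loop can be run on XOR, reusing the training set as a stand-in cross-validation set for the toy demo: `net, val = dnac(X, T, X, T)` returns the grown network and its final validation RMS.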

Copyright
2000