Neural networks: genetic-based learning, network architecture, and applications to nondestructive evaluation

Date
1990
Authors
Mann, James
Department
Electrical and Computer Engineering
Abstract

Even before the invention of the electronic digital computer by John Vincent Atanasoff, there was a desire to create a machine that could emulate the functionality and thought processes of a living organism. This quest began as a movement called "cybernetics" in the 1940s and was later formalized and coined Artificial Intelligence (AI) by researchers in the mid-1950s. Since that time, the majority of advances in the field of AI have been related to symbolic processing of information using expert systems. In such systems, a set of rules is developed which can be manipulated by a program, or inference engine, to draw conclusions from a given set of input data, often in conjunction with a database of facts, and produce an appropriate response. While this method has proven effective in many situations, it relies on the ability to create formal representations of the problem at hand and to generate a set of rules that appropriately describes the interaction of these representations. Rules are usually generated by a knowledge engineer (the person implementing the expert system) through consultation with a human expert. Many refinements of the rules are often necessary, requiring repeated interviews and adjustments to the expert system; this can involve hundreds or thousands of person-hours. Furthermore, these systems are often sensitive to noise in the input data: a slight perturbation can produce significantly different results. Expert systems can also require hundreds or thousands of inferences for a given set of input data, and thus great computational power. Clearly this is not the sort of processing that occurs in biological organisms, whose neuronal elements are typically orders of magnitude slower than their electronic, artificial counterparts. These underlying assumptions and limitations make it extremely difficult, if not impossible, to produce expert systems that perform tasks nearly every human performs routinely, such as reading handwritten text.
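For readers unfamiliar with the rule-based approach contrasted above, the following minimal Python sketch illustrates a forward-chaining inference engine of the kind described: rules fire against a working set of facts until no new conclusions can be drawn. The specific rules and fact names (flaw-detection conditions) are illustrative assumptions, not taken from the thesis.

# Minimal sketch of a forward-chaining rule-based system: rules fire
# against a working set of facts until no new conclusions can be drawn.
# Rule contents below are hypothetical examples, not from the thesis.

rules = [
    # (premises, conclusion)
    ({"ultrasonic_echo_delayed", "amplitude_drop"}, "possible_internal_flaw"),
    ({"possible_internal_flaw", "flaw_near_weld"}, "reject_part"),
]

def infer(facts, rules):
    """Repeatedly apply rules whose premises are all present in the facts."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)   # derive a new fact and iterate again
                changed = True
    return facts

if __name__ == "__main__":
    observed = {"ultrasonic_echo_delayed", "amplitude_drop", "flaw_near_weld"}
    print(infer(observed, rules))
    # includes the derived facts 'possible_internal_flaw' and 'reject_part'

Even this toy example shows the dependence on hand-crafted formal representations that the abstract identifies as the central limitation of the symbolic approach.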

Copyright
1990