Implementation of automated 3D defect detection for low signal-to-noise features in NDE data

Date
2014-01-01
Authors
Grandin, Robert
Gray, Joseph
Department
Mechanical Engineering; Center for Nondestructive Evaluation
Abstract

The need for robust defect detection in NDE applications requires the identification of subtle, low-contrast changes in measurement signals, usually in very noisy data. Most algorithms rarely perform at the level of a human inspector, and with data sets now routinely exceeding 10 GB, manual inspection is laborious. We present two automated defect segmentation methods, a simple threshold and a binomial hypothesis test, and compare the effectiveness of these approaches in noisy data with signal-to-noise ratios of 1:1. The defect-detection ability of our algorithm will be demonstrated on a 3D CT volume, UT C-scan data, magnetic particle images, and simulated data generated by XRSIM. The latter is a physics-based forward model useful for demonstrating the effectiveness of data-processing approaches in a simulation that includes complex defect geometry and realistic measurements. These large data sets place significant demands on compute resources and easily overwhelm typical PC platforms; however, the emergence of graphics processing unit (GPU) computing provides a means to overcome this bottleneck. Processing large, multi-dimensional data sets requires an optimal GPU implementation that addresses both computational complexity and memory-bandwidth usage.
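To illustrate the two segmentation approaches named in the abstract, the sketch below shows one plausible reading, not the authors' implementation: a global k-sigma threshold, and a voxel-neighborhood binomial test that flags locations where the count of above-threshold samples is improbable under a noise-only Gaussian null. The window size, per-voxel threshold, significance level, and function names are illustrative assumptions.

```python
# Hedged sketch only; parameters and structure are assumptions, not the paper's method.
import numpy as np
from scipy.ndimage import uniform_filter
from scipy.stats import binom, norm

def simple_threshold(volume, k=3.0):
    """Flag voxels more than k standard deviations above the volume mean."""
    mu, sigma = volume.mean(), volume.std()
    return volume > mu + k * sigma

def binomial_test_segmentation(volume, window=5, t=1.0, alpha=1e-3):
    """Flag voxels whose local neighborhood contains an improbable number of
    above-threshold samples under a Gaussian noise-only null hypothesis."""
    mu, sigma = volume.mean(), volume.std()
    exceed = (volume > mu + t * sigma).astype(np.float64)
    p_exceed = norm.sf(t)                       # P(noise alone exceeds threshold)
    n = window ** volume.ndim                   # samples per neighborhood
    counts = np.rint(uniform_filter(exceed, size=window) * n)
    p_values = binom.sf(counts - 1, n, p_exceed)  # P(count >= observed) under the null
    return p_values < alpha

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    vol = rng.normal(0.0, 1.0, size=(64, 64, 64))   # unit-variance noise
    vol[28:36, 28:36, 28:36] += 1.0                 # embedded defect at roughly 1:1 SNR
    print("threshold hits:", simple_threshold(vol).sum())
    print("binomial hits: ", binomial_test_segmentation(vol).sum())
```

At 1:1 SNR the neighborhood-count test trades spatial resolution for robustness: a single noisy voxel rarely triggers it, whereas a compact cluster of weakly elevated voxels does, which is the motivation for comparing it against a simple threshold.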

Comments

The following article appeared in AIP Conference Proceedings 1581 (2014): 1840, and may be found at doi: 10.1063/1.4865047.

Copyright
2014