Doctor of Philosophy
Electrical and Computer Engineering
Texture analysis is a fundamental issue in image analysis and computer vision. While considerable research has been carried out in the texture analysis domain, problems relating to texture representation have been only partially addressed, and active research continues. The vast majority of texture analysis algorithms assume, either explicitly or implicitly, that all images are captured under the same measurement conditions, such as orientation and illumination. These assumptions are often unrealistic in practical applications.

This dissertation addresses the viewpoint-invariance problem in texture classification by introducing a rotated wavelet filterbank. The proposed filterbank, used in conjunction with a standard wavelet filterbank, provides greater freedom in orientation tuning for texture analysis. This makes it possible to obtain texture features that are invariant with respect to texture rotation and linear grayscale transformation. In this study, the energy estimates of channel outputs that are commonly used as texture features in texture classification are transformed into a set of viewpoint-invariant features. Texture properties that have a physical connection with human perception are taken into account in this transformation of the energy estimates.

Experiments were conducted on natural texture image sets that have been used to evaluate other successful approaches, in order to facilitate comparison. The proposed feature set outperformed previously published methods. A channel selection method is also proposed to reduce computational complexity and improve the performance of a texture segmentation algorithm. Results demonstrating the validity of the approach are presented for experimental ultrasound tendon images.
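The abstract's core idea, turning per-channel energy estimates into features that are invariant to texture rotation and linear grayscale change, can be illustrated with a minimal sketch. This is not the dissertation's actual transformation (which uses rotated wavelet channels and perceptually motivated properties); it is a hypothetical stand-in assuming that rotating a texture circularly shifts the orientation-indexed energy vector and that a linear grayscale gain scales all energies equally, so normalizing and taking circular-shift-invariant DFT magnitudes yields invariant features.

```python
import numpy as np

def invariant_features(channel_energies):
    """Illustrative only: map per-orientation channel energies to features
    invariant to (a) a circular shift of the orientation index, modeling
    texture rotation, and (b) a global scale factor, modeling the energy
    change from a linear grayscale gain."""
    e = np.asarray(channel_energies, dtype=float)
    e = e / e.sum()                 # removes global energy scaling
    return np.abs(np.fft.fft(e))   # DFT magnitude is circular-shift invariant

# Rotating the texture (roll) and rescaling the grayscale (factor 3.5)
# leave the feature vector unchanged under these assumptions.
e = np.array([4.0, 1.0, 2.0, 3.0])
f1 = invariant_features(e)
f2 = invariant_features(3.5 * np.roll(e, 2))
print(np.allclose(f1, f2))  # True
```

In practice the rotated wavelet filterbank supplies the oriented channels whose energies would populate such a vector; this sketch only shows why an orientation-cyclic, scale-normalized transform produces viewpoint-invariant features.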
Digital Repository @ Iowa State University, http://lib.dr.iastate.edu/
Kim, Nam-Deuk, "Texture representation using wavelet filterbanks" (2000). Retrospective Theses and Dissertations. 12338.