Degree Type

Dissertation

Date of Award

2020

Degree Name

Doctor of Philosophy

Department

Mathematics

Major

Mathematics

First Advisor

Eric S Weber

Abstract

This thesis is an ensemble of four research projects that I developed and to which I subsequently made significant contributions, together with an ongoing research project. A chapter is dedicated to each published or submitted paper, and the content of each chapter adheres to its corresponding publication except for additional detail and some important observations and results. Accordingly, each of the five chapters opens with an abstract that outlines the research focus and corresponding results for that project; a brief description of the importance of each project follows.

The weighted Cantor measures are a special class of singular measures on the unit interval that have attracted much attention from the harmonic analysis community. They arise naturally from the iteration process associated with the classic construction of a Cantor set. Motivated by the construction of a class of Legendre polynomials, we study the moments of these measures: their calculation, approximation, and asymptotic behavior.
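
To make the moment calculation concrete, here is a minimal sketch assuming the familiar two-map middle-thirds construction with weights p and 1 − p (the thesis treats a more general weighted class, and the function name is illustrative). Self-similarity of the measure turns the moments into a finite recursion:

```python
from math import comb

def cantor_moments(p, N):
    """Moments m_n = ∫ x^n dμ of the weighted 1/3 Cantor measure
    μ = p·μ∘S1⁻¹ + (1−p)·μ∘S2⁻¹, S1(x) = x/3, S2(x) = (x+2)/3.
    Expanding ∫ x^n dμ via self-similarity and solving for m_n gives
      m_n = (1−p) · Σ_{k<n} C(n,k) 2^{n−k} m_k / (3^n − 1),  m_0 = 1.
    """
    m = [1.0]
    for n in range(1, N + 1):
        s = sum(comb(n, k) * 2 ** (n - k) * m[k] for k in range(n))
        m.append((1 - p) * s / (3 ** n - 1))
    return m

# Equal weights recover the standard Cantor measure: mean 1/2, m_2 = 3/8.
moments = cantor_moments(0.5, 4)
```

With p = 1 all mass sits under the left map, and the recursion correctly yields m_n = 0 for n ≥ 1 (the point mass at 0).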

The Walsh functions form an orthonormal basis and can be described as a discrete analog of the Fourier basis. Because of these properties, the ubiquity of the classic Walsh functions in electrical engineering, especially in signal processing, cannot be overstated. We generalize the Walsh system, encoding it as the entries of a rectangular matrix, and study its properties, including an application to image processing that is analogous to the discrete wavelet transform.
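
For reference, the classic Walsh system in Hadamard (natural) order can be sampled as the rows of the Sylvester Hadamard matrix; this sketch shows only that classical object, not the rectangular generalization studied in the thesis:

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of the n x n Hadamard matrix (n a power of 2):
    H_{2m} = [[H_m, H_m], [H_m, -H_m]]. In Hadamard (natural) order, its
    rows are the first n Walsh functions sampled on a uniform grid of [0, 1)."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

H = hadamard(8)
# Orthogonality of the Walsh samples: H @ H.T == n * I.
assert np.array_equal(H @ H.T, 8 * np.eye(8))
```

The ±1 entries make the associated transform multiplication-free, one reason for the system's popularity in signal processing.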

The Kaczmarz algorithm is a row-action iterative projection method for solving linear systems of equations. Its simplicity and its effectiveness on large sparse systems have led to a wide range of applications. Typically, a single sequence of vectors plays both the analysis and the synthesis role in the algorithm; we develop a modified version of the algorithm that takes two vector sequences, one performing the role of analysis and the other the role of synthesis. We identify conditions on the pair of sequences under which a vector can be recovered from its inner products against one of the sequences.
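
The classic (single-sequence) iteration described above can be sketched as follows; the system here is illustrative, not from the thesis:

```python
import numpy as np

def kaczmarz(A, b, sweeps=300):
    """Classic cyclic Kaczmarz: repeatedly project the iterate onto the
    solution hyperplane {x : a_i . x = b_i} of each row a_i in turn."""
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(sweeps):
        for i in range(m):
            a = A[i]
            # Orthogonal projection onto the i-th hyperplane.
            x = x + (b[i] - a @ x) / (a @ a) * a
    return x

# Small consistent system with solution (2, 3).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([9.0, 8.0])
x = kaczmarz(A, b)
```

Each update touches only one row of A, which is what makes the method attractive for large sparse systems.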

Further, in a collaboration as part of an REU mentorship, we consider the situation in which a linear system of equations is distributed over a network, interpreted as a tree in graph-theoretic parlance. The Kaczmarz algorithm with relaxation is applied iteratively from node to node along the network structure, with a weighted-average back-propagation. We prove that the relaxation parameters may be chosen larger than the standard bound in the literature and provide empirical evidence that such a selection leads to an improved convergence rate.
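
A single relaxed Kaczmarz update, the building block that is iterated node to node, looks as follows; the tree structure and weighted-average back-propagation of the chapter are not reproduced here, and the classical analysis referred to assumes a relaxation parameter in (0, 2):

```python
import numpy as np

def relaxed_step(x, a, b_i, lam):
    """One relaxed Kaczmarz update toward the hyperplane a . x = b_i.
    lam = 1 gives the exact orthogonal projection; lam < 1 undershoots
    and lam > 1 overshoots the hyperplane."""
    return x + lam * (b_i - a @ x) / (a @ a) * a

a = np.array([3.0, 4.0])
x0 = np.zeros(2)
x1 = relaxed_step(x0, a, 10.0, 1.0)      # lands exactly on the hyperplane
x_half = relaxed_step(x0, a, 10.0, 0.5)  # moves halfway toward it
```

The chapter's result concerns choosing lam beyond the standard range while preserving (and empirically improving) convergence.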

An artificial neural network is an algorithm whose architecture -- nodes (neurons) and the connections (synapses) between them -- derives loosely from the human brain. The output of such a neural network takes the form of a linear combination of an activation function evaluated over affine transformations of the input data. We study the integral representation of the neural network for unbounded activation functions and consider a generalization of the neural network.
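
The output form described above can be written down directly; this is a minimal sketch with illustrative names and sizes, using ReLU as one example of an unbounded activation function:

```python
import numpy as np

def relu(t):
    # ReLU: a standard example of an unbounded activation function.
    return np.maximum(t, 0.0)

def shallow_net(x, W, beta, c):
    """Output: sum_i c_i * sigma(w_i . x + beta_i), i.e. a linear
    combination of the activation sigma evaluated over affine
    transformations of the input x."""
    return c @ relu(W @ x + beta)

rng = np.random.default_rng(0)
W = rng.standard_normal((5, 3))   # 5 hidden neurons, 3-dimensional input
beta = rng.standard_normal(5)     # biases of the affine transformations
c = rng.standard_normal(5)        # outer linear-combination weights
y = shallow_net(rng.standard_normal(3), W, beta, c)
```

Integral representations replace the finite sum over neurons with an integral against a measure on the weight-bias parameters.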

DOI

https://doi.org/10.31274/etd-20200902-54

Copyright Owner

Steven Nathan Harding

Language

en

File Format

application/pdf

File Size

149 pages
