Doctor of Philosophy
Major Professors: Daniel J. Nordman and Stephen B. Vardeman
Markov chain Monte Carlo (MCMC) is a computational statistical approach for numerically approximating distributional quantities useful for inference that might otherwise be intractable to calculate directly. A challenge with MCMC methods is developing implementations that are both statistically rigorous and computationally scalable to large data sets. This work aims to bridge these two aspects by exploiting conditional independence, or Markov structures, in data models. Chapter 2 investigates the model properties and Bayesian fitting of a graphical model with Markovian dependence used in deep machine learning and image classification, called a restricted Boltzmann machine (RBM), and Chapter 3 presents a framework for describing inherent instability in a general class of models that includes RBMs. Chapters 4 and 5 introduce a fast method for simulating data from a Markov random field (MRF) by exploiting the conditional independence specified in the model, together with a flexible `R` package that implements the approach in C++.
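To illustrate the kind of speed-up conditional independence permits when simulating from an MRF, consider a minimal sketch (not the dissertation's `R`/C++ implementation, and all function and parameter names here are hypothetical): under a 4-nearest-neighbour Ising-type model on a lattice, the two checkerboard colour classes are each conditionally independent given the other, so an entire colour class can be updated in one vectorised Gibbs draw rather than site by site.

```python
# Hedged sketch: block Gibbs sampling for a binary Markov random field
# on a torus lattice, exploiting conditional independence. Sites of one
# checkerboard colour are conditionally independent given the other
# colour under a 4-neighbour Ising-type model, so each colour class is
# updated with a single vectorised draw.
import numpy as np

def gibbs_checkerboard(n=32, beta=0.4, sweeps=100, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.choice([-1, 1], size=(n, n))
    # colour[i, j] = (i + j) % 2 splits the lattice into two classes
    # within which sites share no edges (a checkerboard partition)
    colour = np.add.outer(np.arange(n), np.arange(n)) % 2
    for _ in range(sweeps):
        for c in (0, 1):
            # sum of the four neighbours (wrap-around for simplicity)
            nb = (np.roll(x, 1, 0) + np.roll(x, -1, 0)
                  + np.roll(x, 1, 1) + np.roll(x, -1, 1))
            # conditional P(x_ij = +1 | neighbours) for the Ising model
            p = 1.0 / (1.0 + np.exp(-2.0 * beta * nb))
            draw = np.where(rng.random((n, n)) < p, 1, -1)
            # update only the sites in the current colour class
            x = np.where(colour == c, draw, x)
    return x

field = gibbs_checkerboard()
```

The design point is that each inner step touches half the lattice in one array operation, so a full sweep costs two vectorised updates regardless of lattice size, which is the essence of why exploiting Markov structure makes MRF simulation scalable.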
Kaplan, Andrea, "On advancing MCMC-based methods for Markovian data structures with applications to deep learning, simulation, and resampling" (2017). Graduate Theses and Dissertations. 15546.