#### Publication Date

2017

#### Abstract

A probability model exhibits instability if small changes in a data outcome result in large, and often unanticipated, changes in probability. This instability is a property of the probability model itself, rather than of the fitted parameter vector. For correlated data structures found in several application areas, there is increasing interest in predicting or identifying such sensitivity in model probability structure. We consider the problem of quantifying instability for general probability models defined on sequences of observations, where each sequence of length N has a finite number of possible values. A sequence of probability models results, indexed by N, that accommodates data of expanding dimension. Model instability is formally shown to occur when a certain log-probability ratio under such models grows faster than N. In this case, a one-component change in the data sequence can shift probability by orders of magnitude. Also, as instability becomes more extreme, the resulting probability models are shown to tend to degeneracy, placing all their probability on potentially small portions of the sample space. These results on instability apply to large classes of models commonly used in random graphs, network analysis, and machine learning contexts.
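The contrast the abstract draws can be sketched numerically. The toy models below are illustrative assumptions, not the paper's exact formalism: an independent-component exponential-family model (where flipping one entry of a length-N binary sequence shifts the log-probability by a constant) versus a fully connected pairwise-interaction model (where the same one-component flip shifts the log-probability by an amount of order N, so the overall log-probability ratio grows like N², faster than N).

```python
import numpy as np

THETA = 0.5  # illustrative interaction strength (hypothetical parameter)

def log_unnorm_independent(x, theta=THETA):
    # Stable toy model: log p(x) = theta * sum_i x_i + const.
    # Flipping one component changes this by at most |theta|.
    return theta * x.sum()

def log_unnorm_pairwise(x, theta=THETA):
    # Unstable toy model: log p(x) = theta * sum_{i<j} x_i x_j + const.
    # For a binary x with s ones, the sufficient statistic is s*(s-1)/2,
    # so flipping one component can change the log-probability by O(N).
    s = x.sum()
    return theta * s * (s - 1) / 2.0

for N in (100, 1000):
    x = np.ones(N)          # all-ones sequence
    y = x.copy()
    y[0] = 0.0              # one-component change
    d_ind = abs(log_unnorm_independent(x) - log_unnorm_independent(y))
    d_pair = abs(log_unnorm_pairwise(x) - log_unnorm_pairwise(y))
    # d_ind stays constant in N; d_pair grows linearly in N,
    # i.e. a single flip shifts probability by a factor exp(O(N)).
    print(f"N={N}: independent shift={d_ind:.1f}, pairwise shift={d_pair:.1f}")
```

In the pairwise case the shift from a single flip is theta*(N-1), so the resulting probability change is exp(theta*(N-1)): the "orders of magnitude" sensitivity the abstract describes.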

#### Language

en

#### Included in

Industrial Engineering Commons, Industrial Technology Commons, Statistics and Probability Commons

## Comments

This is a preprint of the article Kaplan, Andee, Daniel Nordman, and Stephen Vardeman. "A note on the instability and degeneracy of deep learning models." arXiv preprint arXiv:1612.01159v2 (2017).