Recursive robust PCA or recursive sparse recovery in large but structured noise

Date
2013-01-01
Authors
Qiu, Chenlu
Major Professor
Namrata Vaswani
Department
Electrical and Computer Engineering
Abstract

In this work, we study the problem of recursively recovering a time sequence of sparse vectors, St, from measurements Mt := St + Lt that are corrupted by structured noise Lt, which is dense and can have large magnitude. The structure we require is that Lt lie in a low-dimensional subspace that is either fixed or changes "slowly enough," and that the eigenvalues of its covariance matrix be "clustered." We assume nothing about the sequence of sparse vectors except a bound on their support size; their support sets and their nonzero element values may be either independent or correlated over time (in many applications they are correlated). A key application where this problem occurs is video surveillance, where the goal is to separate a slowly changing background (Lt) from moving foreground objects (St) on-the-fly.

To solve this problem, we introduce a novel solution called Recursive Projected Compressive Sensing (ReProCS). Under mild assumptions, we show that ReProCS exactly recovers the support set of St at all times, and that the reconstruction errors of both St and Lt are upper bounded by a time-invariant and small value at all times. ReProCS is designed under the assumption that the subspace in which the most recent several Lt's lie can only grow over time; it therefore needs a bound on the total number of subspace changes, J. To address this limitation, we introduce a novel subspace estimation scheme called cluster-PCA, and we refer to the resulting algorithm as ReProCS with cluster-PCA (ReProCS-cPCA). ReProCS-cPCA does not need a bound on J as long as the delay between subspace change times increases in proportion to log J. The extra assumption it needs is that the eigenvalues of the covariance matrix of Lt are sufficiently clustered. As a by-product, at certain times, the basis vectors for the subspace in which the most recent several Lt's lie are also recovered.
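The core projection-then-recover idea of one ReProCS time step can be sketched as follows. This is a minimal illustration, not the algorithm from the thesis: it assumes the current subspace basis (P_hat below) is known exactly rather than estimated online, and it replaces the full compressive-sensing step with simple thresholding followed by restricted least squares. All dimensions, variable names, and the threshold are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, s = 200, 5, 10           # ambient dim, subspace dim, sparsity

# Orthonormal basis of the low-dimensional subspace containing L_t
# (assumed known here; ReProCS estimates and tracks it recursively)
P_hat, _ = np.linalg.qr(rng.standard_normal((n, r)))

# Ground truth: sparse S_t plus dense, large-magnitude L_t in span(P_hat)
support = rng.choice(n, size=s, replace=False)
S = np.zeros(n)
S[support] = rng.choice([-1.0, 1.0], size=s) * rng.uniform(1.0, 2.0, size=s)
L = P_hat @ (10.0 * rng.standard_normal(r))
M = S + L                      # observed measurement M_t = S_t + L_t

# Step 1: project M_t onto the orthogonal complement of span(P_hat).
# This nulls L_t, so y = Phi @ M ~= Phi @ S_t.
Phi = np.eye(n) - P_hat @ P_hat.T
y = Phi @ M

# Step 2: estimate the support of S_t (here by thresholding; the thesis
# uses a compressive-sensing step since Phi @ S_t undersamples S_t) ...
T = np.flatnonzero(np.abs(y) > 0.5)

# ... then recover the nonzero values by least squares restricted to T
S_hat = np.zeros(n)
S_hat[T] = np.linalg.lstsq(Phi[:, T], y, rcond=None)[0]

# The low-rank part follows by subtraction: L_hat_t = M_t - S_hat_t
L_hat = M - S_hat
```

Note the key point the abstract makes: because L lies (near-exactly) in span(P_hat), the projection removes the large dense noise while only mildly attenuating the sparse signal, which is what makes the subsequent sparse recovery well-posed.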

Copyright
2013