Support Recovery from Linear Sketches

Abstract

In this talk, I will describe the problem of support recovery, where samples sharing a common unknown support are observed through low-dimensional projections or "linear sketches", and the goal is to recover the common support. This problem has been well studied in the single-sample setting, where it is known that, for certain classes of random projections, the projection dimension only needs to scale linearly in the support size and logarithmically in the dimension of the samples to guarantee support recovery. In the multiple-sample setting, a natural question is whether we can make do with a smaller projection dimension per sample at the cost of a larger number of samples, and how to characterize the resulting tradeoff between the support size, the projection dimension, and the number of samples. We will see that the nature of this tradeoff differs depending on whether the projection dimension is larger or smaller than the support size. I will mention some results from the literature, which has mostly focused on characterizing this tradeoff in the large projection dimension regime, followed by our results for the case when the projection dimension is small. We will see some commonly used algorithms for this problem, as well as some lower bound results that are used to establish the optimality of these algorithms.
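
To make the setup concrete, the following is a minimal, illustrative sketch of the measurement model and of a simple correlation/energy-based support estimator in the multiple-sample regime, assuming i.i.d. Gaussian projection matrices. The function and variable names are hypothetical, and this is not necessarily the estimator presented in the talk.

```python
import numpy as np

def recover_support(Y, Phis, k):
    """Estimate the common support from sketches Y[i] = Phis[i] @ x_i.

    A simple correlation/energy-based estimator: for each coordinate j,
    accumulate the squared correlation of the j-th column of each
    projection matrix with the corresponding sketch, then keep the k
    coordinates with the largest scores.
    """
    d = Phis[0].shape[1]
    scores = np.zeros(d)
    for Phi, y in zip(Phis, Y):
        # (Phi^T y)_j measures how much coordinate j contributes to y.
        scores += (Phi.T @ y) ** 2
    return np.sort(np.argpartition(scores, -k)[-k:])

# Toy example: n samples in dimension d with a common support of size k,
# each observed through its own m x d Gaussian projection (here m < k).
rng = np.random.default_rng(0)
d, k, m, n = 200, 10, 5, 2000
support = rng.choice(d, size=k, replace=False)
Phis, Y = [], []
for _ in range(n):
    x = np.zeros(d)
    x[support] = rng.normal(size=k)             # sample sharing the support
    Phi = rng.normal(size=(m, d)) / np.sqrt(m)  # random linear sketch
    Phis.append(Phi)
    Y.append(Phi @ x)

print("true support:     ", np.sort(support))
print("estimated support:", recover_support(Y, Phis, k))
```

In this toy setting the per-coordinate score concentrates, over many samples, around a larger value for coordinates inside the support than outside it, which is why recovery can succeed even when each individual projection dimension m is smaller than the support size k.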

Lekshmi Ramesh

Lekshmi Ramesh is a PhD student in the Department of ECE, working with Prof. Chandra R. Murthy and Prof. Himanshu Tyagi.