In this paper, the problem of one-bit compressed sensing (OBCS) is formulated as a problem in probably approximately correct (PAC) learning. It is shown that the Vapnik-Chervonenkis (VC) dimension of the set of half-spaces in R^n generated by k-sparse vectors is bounded below by k(⌊lg(n/k)⌋ + 1) and above by ⌊2k lg(en)⌋. By coupling this estimate with well-established results in PAC learning theory, we show that a consistent algorithm can recover a k-sparse vector with O(k lg n) measurements, given only the signs of the measurement vector. This result holds for all probability measures on R^n. The theory is also applicable to the case of noisy labels, where the signs of the measurements are flipped with some unknown probability.
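To make the measurement model concrete, the sketch below generates one-bit measurements of a k-sparse vector: only the signs of the linear measurements Ax are observed. This is an illustration of the OBCS setup only (the matrix dimensions, the Gaussian choice of A, and the random support are hypothetical example values, not taken from the paper); the paper's recovery guarantee concerns any consistent algorithm applied to such sign data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example sizes: ambient dimension n, sparsity k,
# and m = O(k lg n) measurements.
n, k, m = 100, 5, 60

# Build a k-sparse signal x in R^n with a random support.
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.standard_normal(k)

# Random measurement matrix; in OBCS only sign(Ax) is observed,
# so each measurement y_i = sign(<a_i, x>) is a single bit.
A = rng.standard_normal((m, n))
y = np.sign(A @ x)
```

Each entry of y is a label in {-1, +1} (zero only in the degenerate case of an exact zero inner product), which is what casts recovery of x as a binary classification problem over half-spaces.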
Original language: English (US)
Journal: Journal of Machine Learning Research
State: Published - Jan 1 2019
ASJC Scopus subject areas
- Control and Systems Engineering
- Statistics and Probability
- Artificial Intelligence