Tremendous efforts have been made to study the theoretical and algorithmic aspects of sparse recovery and low-rank matrix recovery. This paper establishes (near-)optimal sample complexities for stable matrix recovery, free of unspecified constants and logarithmic factors. We treat sparsity, low-rankness, and other parsimonious structures within a unified framework: constraint sets with small covering numbers or Minkowski dimensions, which include notoriously challenging cases such as simultaneously sparse and low-rank matrices. We consider three types of random measurement matrices (unstructured, rank-1, and symmetric rank-1 matrices), following probability distributions that satisfy mild conditions. In all of these cases, we prove a fundamental achievability result: the recovery of matrices with parsimonious structures, from an optimal (or near-optimal) number of measurements, is stable with high probability.
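To fix notation for the setting described above, a standard linear measurement model of the kind such results concern can be sketched as follows (the specific symbols here are illustrative assumptions, not taken from the abstract):

```latex
% Linear measurements of an unknown structured matrix X^\star \in \mathbb{R}^{n_1 \times n_2}:
%   y_i = \langle A_i, X^\star \rangle + z_i, \qquad i = 1, \dots, m,
% where \langle A, X \rangle = \mathrm{tr}(A^\top X), the A_i are random measurement
% matrices (unstructured, or rank-1 of the form A_i = a_i b_i^\top, or symmetric
% rank-1 of the form A_i = a_i a_i^\top), and z_i is noise with \|z\|_2 \le \epsilon.
%
% Stable recovery means an estimate \widehat{X} over the constraint set K satisfies
%   \| \widehat{X} - X^\star \|_F \le C \, \epsilon
% with high probability, once m exceeds a threshold governed by the covering
% numbers (or Minkowski dimension) of K.
\[
  y_i = \langle A_i, X^\star \rangle + z_i, \qquad i = 1, \dots, m,
  \qquad
  \| \widehat{X} - X^\star \|_F \le C \, \epsilon .
\]
```

Here the constraint set $K$ (e.g., sparse, low-rank, or simultaneously sparse and low-rank matrices) is what the covering-number and Minkowski-dimension conditions are imposed on.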