Maximum-likelihood (ML) spectrum estimation is a notoriously ill-posed problem. In this paper, we address this fundamental issue with a regularization method. We investigate a method of sieves and present two main results. The first is a criterion for selecting the mesh size of the sieve, which determines the rate of convergence of the estimates; the criterion is based on information-theoretic measures of convergence in the parameter set and applies to a wide class of estimation problems. The second is a method of sieves based on a spline representation of the spectral density, whose estimates are computationally tractable and consistent in information. The setup of the problem is very general and extends without major difficulty to the estimation of higher-dimensional spectral functions.