Many aspects of multivariate analysis involve obtaining the precision matrix, i.e., the inverse of the covariance matrix. When the dimension is larger than the sample size, the sample covariance matrix is no longer positive definite and its inverse does not exist. Under a sparsity assumption on the elements of the precision matrix, the problem can be solved by fitting a Gaussian graphical model with a lasso penalty. In high-dimensional settings in the behavioral sciences, however, the sparsity assumption does not necessarily hold. The dimension is often greater than the sample size, although the two tend to be comparable in magnitude. Under such circumstances, introducing a covariance structure can resolve the difficulty of estimating the precision matrix. Factor analysis is employed to model the covariance structure, and the Woodbury identity is used to obtain the precision matrix. Several estimation methods are compared: unweighted least squares, factor analysis with equal unique variances (i.e., probabilistic principal component analysis), and ridge factor analysis with small ridge parameters. Results indicate that all of these methods yield relatively small mean squared errors even when the dimension exceeds the sample size.
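To make the Woodbury step concrete, the following NumPy sketch inverts a factor-analytic covariance matrix Sigma = Lambda Lambda' + Psi (Psi diagonal) without ever inverting the full p x p matrix directly; the dimensions p = 50 and k = 3 and the simulated loadings are purely illustrative and are not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
p, k = 50, 3  # dimension and number of factors (illustrative values only)

# Factor-analytic covariance: Sigma = Lambda @ Lambda.T + Psi, with Psi diagonal.
Lam = rng.normal(size=(p, k))                 # factor loadings
psi = rng.uniform(0.5, 1.5, size=p)           # unique variances (diagonal of Psi)
Sigma = Lam @ Lam.T + np.diag(psi)

# Woodbury identity:
# Sigma^{-1} = Psi^{-1} - Psi^{-1} Lambda (I_k + Lambda' Psi^{-1} Lambda)^{-1} Lambda' Psi^{-1}
# Only a k x k system needs to be solved, so this works even when p exceeds n.
Psi_inv = np.diag(1.0 / psi)
M = np.eye(k) + Lam.T @ Psi_inv @ Lam          # small k x k matrix
Precision = Psi_inv - Psi_inv @ Lam @ np.linalg.solve(M, Lam.T @ Psi_inv)

# Sanity check against direct inversion (feasible here because Sigma is well conditioned).
assert np.allclose(Precision, np.linalg.inv(Sigma))
```

In practice the loadings and unique variances would come from a fitted factor model (e.g., unweighted least squares or probabilistic PCA as mentioned above) rather than being simulated.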