Algorithmic noise tolerance (ANT) is an effective statistical error compensation technique for digital signal processing systems. This paper proves a long-held hypothesis that ANT has a strong Bayesian foundation, and develops an analytical framework for predicting the performance of ANT-based systems and for designing performance-optimal ones. ANT is shown to approximate an optimal Bayesian detector and an optimal minimum mean squared error (MMSE) estimator. We show that the theoretically optimal threshold and the threshold obtained via Monte Carlo simulations agree to within 8%, with performance degradation of at most 2.1% across a variety of error probability mass functions. For a 2D-DCT implemented in a 45 nm CMOS process, we find similar results, with the two thresholds differing by 7.8%. Furthermore, both analysis and simulations indicate that ANT's probability of error detection is robust to the choice of threshold.
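As background for the abstract's threshold discussion, the basic ANT decision rule can be sketched as follows. This is a minimal illustration under common assumptions (a main error-prone block, a low-complexity estimator, and a fixed detection threshold `tau`); it is not the paper's implementation, and the numeric values are hypothetical.

```python
def ant_output(y_main: float, y_est: float, tau: float) -> float:
    """Sketch of an ANT error-compensation rule.

    y_main: output of the main (error-prone) computation block
    y_est:  output of a low-complexity estimator of the correct output
    tau:    detection threshold (the quantity optimized in the paper)

    If the main output deviates from the estimate by more than tau,
    an error is declared and the estimator output is used instead.
    """
    if abs(y_main - y_est) > tau:
        return y_est   # error detected: fall back to the estimator
    return y_main      # no error detected: keep the main output


# Hypothetical example: a large-magnitude error (e.g., an MSB flip)
# corrupts the main output, while the estimator is noisy but close.
print(ant_output(228.0, 97.0, tau=16.0))  # error detected -> 97.0
print(ant_output(101.0, 97.0, tau=16.0))  # within tau -> 101.0
```

The choice of `tau` trades off false alarms (rejecting correct main outputs) against missed detections, which is why the optimality of the threshold is central to the analysis.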