The noisiness of a channel can be measured by comparing suitable functionals of the input and output distributions. For instance, by the data processing theorem, the worst-case ratio of output relative entropy to input relative entropy, taken over all pairs of input distributions, is bounded from above by one. However, when one of the two input distributions is fixed as a reference, this quantity may be strictly smaller than one, giving so-called strong data processing inequalities (SDPIs). This paper shows that the problem of determining both the best constant in an SDPI and any input distributions that achieve it can be addressed using logarithmic Sobolev inequalities, which relate input relative entropy to certain measures of input-output correlation. Another contribution is a proof of equivalence between SDPIs and a limiting case of certain strong data processing inequalities for the Rényi divergence.
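In symbols (a minimal sketch; the abstract fixes no notation, so the channel $K$, reference input $\mu$, alternative input $\nu$, and constant $\eta$ below are assumed for illustration), the SDPI constant at a fixed reference input is
\[
  % Contraction coefficient of relative entropy at reference input \mu;
  % \nu K and \mu K denote the output distributions induced by the channel.
  \eta(K,\mu) \;=\; \sup_{\nu :\, 0 < D(\nu \,\|\, \mu) < \infty}
    \frac{D(\nu K \,\|\, \mu K)}{D(\nu \,\|\, \mu)},
\]
where the data processing theorem gives $\eta(K,\mu) \le 1$, and an SDPI is the strict statement $\eta(K,\mu) < 1$. The analogous quantity for the Rényi divergence $D_\alpha$ of order $\alpha$ is
\[
  % Same contraction coefficient with relative entropy replaced by
  % Rényi divergence; D_\alpha recovers D in the limit \alpha \to 1.
  \eta_\alpha(K,\mu) \;=\; \sup_{\nu :\, 0 < D_\alpha(\nu \,\|\, \mu) < \infty}
    \frac{D_\alpha(\nu K \,\|\, \mu K)}{D_\alpha(\nu \,\|\, \mu)},
\]
and since $D_\alpha \to D$ as $\alpha \to 1$, the relative-entropy setting sits at the boundary of this Rényi family.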