Recently, there has been a lot of interest in the connections between information-theoretic and estimation-theoretic properties of various noisy channel models. For example, Guo, Shamai, and Verdú have shown that mutual information in Gaussian channels is related in a simple way to the minimum mean-square error (MMSE), regardless of the input distribution. In this paper, we consider the class of E-type channels, i.e., additive noise channels induced by an exponential family of distributions. We derive several differential and integral representations of the mutual information and the posterior information gain that are valid for any E-type channel, regardless of the input distribution. Next, we establish an extremal property of E-type channels that connects the Bayesian concept of a posterior estimate with a natural rate-distortion problem and makes precise a qualitative observation by Mitter and Newton concerning information-theoretic properties of optimal nonlinear filters. Finally, we indicate how our results may be used to show monotonicity of the mutual information in E-type channels as a function of a "channel quality" parameter without assuming stochastic degradation.
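As a concrete illustration of the Guo–Shamai–Verdú relation mentioned above, the sketch below numerically checks the identity dI(snr)/dsnr = (1/2)·mmse(snr) in the special case of a standard Gaussian input, where both sides have closed forms: I(snr) = (1/2)log(1 + snr) and mmse(snr) = 1/(1 + snr). This is only a sanity check of the known formula for one input distribution, not a derivation of the general result.

```python
import math

def mutual_info(snr):
    # Mutual information of a scalar Gaussian channel with
    # standard Gaussian input, in nats: I(snr) = (1/2) log(1 + snr).
    return 0.5 * math.log1p(snr)

def mmse(snr):
    # MMSE of estimating a standard Gaussian input from the
    # channel output at this SNR: mmse(snr) = 1 / (1 + snr).
    return 1.0 / (1.0 + snr)

# Central-difference approximation of dI/dsnr at a sample SNR.
snr, h = 2.0, 1e-6
dI = (mutual_info(snr + h) - mutual_info(snr - h)) / (2 * h)

# I-MMSE relation: dI/dsnr = (1/2) * mmse(snr).
assert abs(dI - 0.5 * mmse(snr)) < 1e-9
```

For Gaussian input the check reduces to calculus, but the point of the I-MMSE result is that the same identity holds for an arbitrary input distribution, where neither side has a closed form.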