This paper studies the information capacity of Gaussian channels under small (but nonvanishing) peak power constraints. We prove that, when the peak amplitude is below 1.05, the capacity of the scalar Gaussian channel is achieved by symmetric equiprobable signaling and is at least 80% of the corresponding average-power capacity. The proof uses the identity of Guo, Shamai and Verdú relating mutual information and minimum mean square error in Gaussian channels, together with several results on minimax estimation of a bounded parameter in white Gaussian noise. We also give upper and lower bounds on the peak-power capacities of vector Gaussian channels whose inputs are constrained to lie in suitably small ellipsoids, and we show that at least 80% of the average-power capacity is achieved when the transmitters use symmetric equiprobable signaling at amplitudes determined by the usual water-filling policy. The 80% figure comes from an upper bound on the ratio of the nonlinear to the linear minimax risk of estimating a bounded parameter in white Gaussian noise.
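For reference, a sketch of the two standard tools invoked above, stated under the usual conventions (scalar channel Y = √snr·X + N with N ~ N(0,1); vector channel with independent noise variances σᵢ² and total power budget P); the notation here is generic background, not taken verbatim from the paper:

```latex
% I-MMSE identity of Guo, Shamai and Verdu (scalar Gaussian channel):
I(\mathrm{snr}) \;=\; \frac{1}{2}\int_0^{\mathrm{snr}} \mathrm{mmse}(\gamma)\,d\gamma,
\qquad
\mathrm{mmse}(\gamma) \;=\; \mathbb{E}\!\left[\bigl(X - \mathbb{E}[X \mid \sqrt{\gamma}\,X + N]\bigr)^2\right].

% Water-filling power allocation over parallel Gaussian subchannels
% with noise variances \sigma_i^2 and total power P:
p_i \;=\; \bigl(\nu - \sigma_i^2\bigr)^{+},
\qquad \text{with } \nu \text{ chosen so that } \textstyle\sum_i p_i = P.
```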
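As a numerical illustration of the scalar claim (a sketch under assumed parameters, not code from the paper): for unit-variance noise and peak amplitude A, symmetric equiprobable (±A) signaling over the Gaussian channel has mutual information log 2 − E[log(1 + e^(−2A(A+N)))] nats, which can be compared against the average-power capacity (1/2)·log(1 + A²) at the same power A².

```python
import numpy as np

def bpsk_mutual_info(a, n_nodes=80):
    """Mutual information (nats) of Y = X + N with X = +/-a equiprobable
    and N ~ N(0,1), via Gauss-Hermite quadrature for the expectation
    E[log(1 + exp(-2a(a + N)))]."""
    x, w = np.polynomial.hermite.hermgauss(n_nodes)
    n = np.sqrt(2.0) * x  # map physicists' nodes to standard-normal samples
    expected = np.dot(w, np.log1p(np.exp(-2.0 * a * (a + n)))) / np.sqrt(np.pi)
    return np.log(2.0) - expected

def avg_power_capacity(a):
    """Average-power (Shannon) capacity in nats at SNR a^2: (1/2) log(1 + a^2)."""
    return 0.5 * np.log1p(a * a)

# Peak amplitude at the edge of the regime studied in the paper (assumed setup).
a = 1.05
ratio = bpsk_mutual_info(a) / avg_power_capacity(a)
print(f"I_bpsk({a}) / C_avg({a}) = {ratio:.3f}")
```

At this amplitude the ratio comfortably exceeds the 80% guarantee, consistent with the bound in the abstract being conservative (it is derived from a minimax-risk ratio rather than from direct evaluation of the mutual information).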