Among the functions of the brain is information storage, which is physically implemented through changes in the strengths of synapses. Experimental investigations have revealed that synapses possess interesting and, in some cases, unexpected properties. Adopting an optimization approach to biology, we describe an information-theoretic framework that accounts for several of these properties: typical central synapses are noisy, the distribution of synaptic weights is wide, and synaptic connectivity between neurons is sparse. Our approach is based on maximizing the channel capacity of neural tissue under resource constraints. We cast volume as a limited resource and use the empirical relationship between volume and synaptic strength. We find that capacity-achieving input distributions not only explain existing experimental measurements but also make non-trivial predictions about the physical structure of the mammalian brain. We also comment on the robustness of our optimization principles to the uncertainties that are inherent in science.
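The capacity-under-resource-constraints argument can be illustrated numerically. The sketch below runs the Blahut–Arimoto algorithm on a hypothetical discrete channel whose inputs are synaptic weight levels read out through Gaussian-like noise, with a linear volume cost penalizing stronger synapses via a Lagrange multiplier `s`. The channel matrix, cost function, and the value of `s` are illustrative assumptions, not the model from the text; the point is only that, with a positive cost on weight, the capacity-achieving input distribution shifts mass toward the zero-weight level, echoing the sparse-connectivity prediction.

```python
import numpy as np

def blahut_arimoto(W, cost, s, iters=1000):
    """Capacity-achieving input distribution for a discrete memoryless
    channel W[x, y] = P(y | x), with a linear penalty s * cost[x] on
    each input level (Lagrangian form of a resource constraint)."""
    m = W.shape[0]
    p = np.full(m, 1.0 / m)                      # start from uniform
    for _ in range(iters):
        q = p @ W                                # output marginal
        # D[x] = KL( W[x, :] || q ), using the convention 0 log 0 = 0
        ratio = np.divide(W, q, out=np.ones_like(W), where=W > 0)
        D = (W * np.log(ratio)).sum(axis=1)
        p = p * np.exp(D - s * cost)             # multiplicative update
        p /= p.sum()
    # mutual information at the final distribution
    q = p @ W
    ratio = np.divide(W, q, out=np.ones_like(W), where=W > 0)
    I = float((p * (W * np.log(ratio)).sum(axis=1)).sum())
    return p, I

# Hypothetical channel: 5 synaptic weight levels observed through noise.
levels = np.arange(5, dtype=float)
W = np.exp(-0.5 * ((levels[:, None] - levels[None, :]) / 0.7) ** 2)
W /= W.sum(axis=1, keepdims=True)                # rows are P(y | x)
cost = levels                                    # volume grows with strength
p, I = blahut_arimoto(W, cost, s=0.5)
```

With `s = 0` the algorithm reduces to the standard unconstrained Blahut–Arimoto iteration; raising `s` trades mutual information for volume and drives probability mass toward the cheap (weak or absent) synaptic states.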