This paper focuses on two shortcomings of radiative transfer codes commonly used in climate models. The first concerns the partitioning of solar versus infrared spectral energy. In most climate models, the solar spectrum comprises wavelengths shortward of 4 μm, with all incoming solar energy deposited in that range. In reality, however, the solar spectrum extends into the infrared, with about 12 W m−2 in the 4–1000 μm range. In this paper, a simple method is proposed in which the longwave radiative transfer equation is solved with a solar energy input included. In comparison with the traditional method, the new solution results in more solar energy absorbed in the atmosphere and less at the surface. As noted in a recent intercomparison of IPCC-AR4 and line-by-line (LBL) radiation models (Collins et al. 2006), most climate model radiation schemes neglect shortwave absorption by methane. Yet the shortwave radiative forcing at the surface due to CH4 since the preindustrial period is estimated to exceed that due to CO2. We show that the CH4 shortwave effect can be included in a correlated k-distribution model, with the additional flux accurately simulated in comparison with LBL models. We also present 10-year GCM simulations showing the detailed climatic effect of these changes in radiation treatment. It is demonstrated that the inclusion of solar flux in the infrared range produces a significant amount of extra warming in the atmosphere, specifically: (i) in the tropical stratosphere, where the heating rate can exceed 1 K day−1; and (ii) near the tropical tropopause layer. Additional GCM simulations show that the inclusion of CH4 in the shortwave calculations also produces a warming of the atmosphere and a consequent reduction of the upward flux at the top of the atmosphere.
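The ~12 W m−2 figure for solar energy beyond 4 μm can be checked to first order by treating the Sun as a blackbody. The sketch below is an illustrative approximation only (the true solar spectrum departs from a Planck curve, which is why the blackbody estimate comes out slightly above the quoted value); the effective temperature of 5772 K and total solar irradiance of 1361 W m−2 are standard values, not taken from this paper.

```python
import numpy as np

# Physical constants (SI units)
h = 6.626e-34   # Planck constant (J s)
c = 2.998e8     # speed of light (m s-1)
k = 1.381e-23   # Boltzmann constant (J K-1)

T_sun = 5772.0  # effective solar temperature (K), an assumed proxy value
S0 = 1361.0     # total solar irradiance at top of atmosphere (W m-2)

def planck(lam, T):
    """Planck spectral radiance vs. wavelength (normalization cancels below)."""
    return 1.0 / (lam**5 * (np.expm1(h * c / (lam * k * T))))

# Wavelength grid from 0.1 to 1000 um, log-spaced to resolve the peak
lam = np.logspace(np.log10(0.1e-6), np.log10(1000e-6), 200_000)
B = planck(lam, T_sun)

# Trapezoidal integration done explicitly (avoids NumPy-version API changes)
def integrate(y, x):
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

total = integrate(B, lam)
mask = lam > 4e-6
beyond_4um = integrate(B[mask], lam[mask])

frac = beyond_4um / total
print(f"fraction of solar output beyond 4 um: {frac:.4f}")  # roughly 0.01
print(f"implied flux in 4-1000 um: {frac * S0:.1f} W m-2")
```

For a 5772 K blackbody, roughly 1% of the total output lies beyond 4 μm, i.e. on the order of 13 W m−2 of the incident 1361 W m−2, consistent with the ~12 W m−2 cited for the real solar spectrum.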