Exact asymptotics are derived for composite hypothesis testing between two product probability measures Pⁿ vs. Qⁿ, subject to a type-I error-probability constraint ϵ. Here P is known, but Q is an unknown element of a given d-dimensional regular exponential family. We study the Rao score test, which is a quadratic approximation to the generalized likelihood ratio test (GLRT). The type-II error probability is shown to vanish as

exp{ −nD + √(n V τ(ϵ; d)) + β_d log n + γ_d + ϵ_d },

where D and V are, respectively, the Kullback–Leibler divergence and the variance of the information divergence between P and Q; τ(ϵ; d) is the 1 − ϵ quantile of the χ²_d distribution; the constants β_d > 0 and γ_d are explicitly identified; and ϵ_d = ϵ_d(n) vanishes as n → ∞. The asymptotic regret relative to the Neyman–Pearson test (which knows Q) is reflected in the coefficient τ(ϵ; d), as is the cost of dimensionality. Looser asymptotics, with an O(1) term in place of ϵ_d, are obtained for the GLRT.
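As a numerical illustration of the asymptotic formula above, the sketch below evaluates log β_n ≈ −nD + √(n V τ(ϵ; d)) + β_d log n + γ_d using only the Python standard library. The quantile τ(ϵ; d) is computed with the Wilson–Hilferty approximation to the χ²_d quantile; the values used for β_d and γ_d are hypothetical placeholders, not the constants identified in the paper.

```python
# Hedged sketch: evaluate the asymptotic type-II error formula
#   log beta_n ~ -n*D + sqrt(n * V * tau(eps; d)) + beta_d * log n + gamma_d.
# beta_d and gamma_d below are PLACEHOLDER values for illustration only;
# the paper identifies the true constants explicitly.
import math
from statistics import NormalDist

def tau(eps: float, d: int) -> float:
    """1 - eps quantile of the chi-squared distribution with d degrees of
    freedom, via the Wilson-Hilferty cube-root approximation."""
    z = NormalDist().inv_cdf(1.0 - eps)   # standard normal (1 - eps)-quantile
    a = 2.0 / (9.0 * d)
    return d * (1.0 - a + z * math.sqrt(a)) ** 3

def approx_log_type2(n, D, V, eps, d, beta_d=0.5, gamma_d=0.0):
    """Approximate log of the type-II error probability (placeholder constants)."""
    return -n * D + math.sqrt(n * V * tau(eps, d)) + beta_d * math.log(n) + gamma_d

# Sanity check: for d = 1, tau(eps; 1) should be close to the exact value
# (Phi^{-1}(1 - eps/2))^2, which is about 3.84 at eps = 0.05.
t1 = tau(0.05, 1)
exact1 = NormalDist().inv_cdf(0.975) ** 2
lb = approx_log_type2(n=1000, D=0.1, V=0.05, eps=0.05, d=1)
```

With D = 0.1 dominating at n = 1000, `lb` is large and negative, i.e., the type-II error probability is exponentially small, as the abstract states.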