In this paper, we address the problem of coordinating a set of distributed energy resources (DERs) to regulate voltages in power distribution systems at desired levels. To this end, we formulate the voltage control problem as an optimization problem whose objective is to determine the optimal DER power injections that minimize voltage deviations from the desired levels subject to a set of constraints. The nonlinear relationship between the voltage magnitudes and the nodal power injections is approximated by a linear model whose parameters can be estimated efficiently in real time from measurements. In particular, by exploiting the structural characteristics of the power distribution system, the parameter estimation requires far fewer measurements. As such, the voltage control framework is intrinsically adaptive to parameter changes. Numerical studies on the IEEE 37-bus power distribution test feeder validate the effectiveness of the proposed framework.
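To make the idea concrete, the following is a minimal sketch (not the paper's implementation) of estimating the parameters of a linear voltage model of the form V ≈ R p + v0, where p denotes nodal power injections, from a batch of measurement snapshots via least squares; all dimensions, the noise level, and the ground-truth matrices are hypothetical and chosen only for illustration.

```python
# Illustrative sketch: fit V ≈ R @ p + v0 from measurement pairs (p, V)
# using ordinary least squares. Sizes and data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_bus, n_meas = 4, 12  # hypothetical network size and snapshot count

# Synthetic ground-truth sensitivity matrix and no-load voltage profile.
R_true = 0.01 * rng.random((n_bus, n_bus))
v0_true = np.ones(n_bus)

# Collect measurement snapshots with small additive measurement noise.
P = rng.random((n_meas, n_bus))  # power injections, one row per snapshot
V = P @ R_true.T + v0_true + 1e-4 * rng.standard_normal((n_meas, n_bus))

# Augment each injection vector with a constant 1 so that least squares
# recovers the sensitivity matrix R and the offset v0 jointly.
X = np.hstack([P, np.ones((n_meas, 1))])
theta, *_ = np.linalg.lstsq(X, V, rcond=None)
R_hat, v0_hat = theta[:n_bus].T, theta[n_bus]

print(np.max(np.abs(R_hat - R_true)))  # small estimation error
```

In this dense formulation the number of snapshots must exceed the number of unknowns per bus; the structural sparsity exploited in the paper is precisely what reduces that measurement requirement in practice.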