With ever-increasing demand and depleting energy resources, there has been growing interest in energy conservation measures such as the conservation voltage reduction (CVR) program, which reduces energy consumption by lowering feeder voltage. Several utilities are conducting pilot projects on their feeder systems to determine the feasibility and actual payoff of CVR. One major challenge in analyzing CVR field test data lies in the uncertainty of the power system load and the variety of influential factors, including temperature and time. This paper proposes a methodology to facilitate CVR performance analysis at utilities by accounting for all potentially influential factors. A linear model of the system power demand is presented, which allows a sparse linear regression method to estimate the demand's sensitivity to voltage magnitude and thereby quantify the CVR payoff. All input factors can also be ranked according to their statistical influence on the power demand output. The proposed method is first tested and validated using synthetic CVR data simulated for a 13.8 kV distribution feeder in OpenDSS. It is further tested using field CVR test data provided by a major U.S. Midwest electric utility. Both tests demonstrate the effectiveness of the proposed method as well as the usefulness and validity of the input factor ranking.
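The sparse-regression idea can be illustrated with a minimal sketch. The sketch below is not the paper's implementation: it fits a lasso (L1-penalized least squares) model, solved by cyclic coordinate descent, to hypothetical demand samples generated from voltage, temperature, and one irrelevant candidate factor. The L1 penalty drives the coefficient of the irrelevant factor to (near) zero while recovering the voltage sensitivity, which is the quantity a CVR payoff estimate would use. All variable names, ranges, and coefficients are illustrative assumptions.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=300):
    """Lasso via cyclic coordinate descent with soft-thresholding."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: remove every column's contribution except j's.
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r
            # Soft-threshold: small correlations are pruned to exactly zero.
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / (X[:, j] @ X[:, j])
    return beta

# Synthetic samples (hypothetical, for illustration only).
rng = np.random.default_rng(0)
n = 500
V = rng.uniform(0.95, 1.05, n)       # feeder voltage, per unit
T = rng.uniform(10.0, 35.0, n)       # temperature, deg C
hour = rng.uniform(0.0, 24.0, n)     # candidate factor with no true influence
P = 100.0 + 80.0 * V + 1.5 * T + rng.normal(0.0, 0.5, n)  # demand, kW

X = np.column_stack([V, T, hour])
mu, sd = X.mean(axis=0), X.std(axis=0)
Xs = (X - mu) / sd                   # standardize so the L1 penalty is scale-fair
yc = P - P.mean()                    # center the target: no intercept needed
beta = lasso_cd(Xs, yc, lam=50.0) / sd   # map coefficients back to original units
print(beta)  # roughly [80, 1.5, 0]: voltage sensitivity kept, 'hour' pruned
```

The magnitudes of the fitted coefficients (on standardized inputs) also give a simple statistical ranking of the input factors, mirroring the factor ranking described above; the penalty weight `lam` controls how aggressively weak factors are discarded.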