Adaptive techniques such as voltage and frequency scaling, process variations, and the randomness of input data contribute significantly to the statistical nature of contemporary hardware designs. Therefore, the performance metrics of such designs are also statistical. In previous work, we employed probabilistic model checking to rigorously evaluate the statistical performance of hardware designs. In this paper, we present an automatic compositional reasoning technique to improve the scalability of probabilistic model checking of hardware systems. We partition the set of system observables into disjoint subsets and use them to structurally decompose the system into smaller components. We employ an assume-guarantee style of reasoning and analyze the space of environmental constraints using a value-based case splitting approach: we split the space of values of all the observables of one component into separate value-based cases. We provide an argument for the correctness of our technique. We demonstrate its effectiveness by making probabilistic model checking feasible for evaluating performance metrics such as delay and Bit Error Rate (BER) of non-trivial hardware designs that we use as case studies. For example, we are able to determine the statistical delay of a 64-bit adder design with over 10^40 states. We use PRISM as the probabilistic model checking engine in all our experiments.
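The value-based case splitting described above can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's implementation or the PRISM API: `check_component` stands in for a per-case probabilistic model-checking query (in practice, a PRISM invocation with the component's observable constrained to the case), and the per-case results are combined by the law of total probability over the partitioned value space.

```python
# Hypothetical sketch of value-based case splitting for assume-guarantee
# reasoning. All names here are illustrative assumptions, not real tooling.

from fractions import Fraction

def split_cases(domain):
    """Partition an observable's value space into singleton cases."""
    return [[v] for v in domain]

def check_component(case_values):
    """Placeholder for the per-case model-checking query.

    In the actual flow this would be a probabilistic model checker
    (e.g. PRISM) run on the component with the observable restricted
    to `case_values`; here we return a toy guarantee probability.
    """
    evens = [v for v in case_values if v % 2 == 0]
    return Fraction(len(evens), len(case_values))

def compositional_probability(domain, case_weight):
    """Combine per-case guarantees via the law of total probability."""
    total = Fraction(0)
    for case in split_cases(domain):
        total += case_weight(case) * check_component(case)
    return total

# Uniform weighting over the four singleton cases of a 2-bit observable:
domain = [0, 1, 2, 3]
uniform = lambda case: Fraction(len(case), len(domain))
print(compositional_probability(domain, uniform))  # prints 1/2
```

The point of the decomposition is that each per-case query involves a smaller, constrained state space than the monolithic model, which is what makes designs on the order of 10^40 states tractable.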