This paper outlines applications of large-deviation theory and asymptotic analysis to the design of wireless sensor networks. Sensor networks are envisioned to contain a large number of wireless nodes. As such, asymptotic regimes in which the number of nodes grows large are important tools for identifying good design rules for future sensor systems. Through a simple example, we show how the Gärtner-Ellis theorem can be used to study the impact of node density on overall performance in resource-constrained systems. Specifically, we consider a problem in which sensor nodes receive partial information about their environment and send summaries of their observations to a fusion center for the purpose of detection. Each node transmits its data over a noisy communication channel. Observations are assumed to become increasingly correlated as sensor nodes are placed in closer proximity. We find that high node density performs well even when observations from adjacent sensors are highly correlated. Furthermore, the tools presented in this paper can be employed in a more complete analysis of the tradeoff between resource allocation, system complexity, and overall performance in wireless sensor networks.
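As a toy numerical illustration of the kind of density-versus-correlation tradeoff described above (not the specific model analyzed in the paper), the sketch below considers a Gaussian mean-shift detection problem with n sensors equally spaced on a unit interval, where adjacent observations follow an assumed AR(1) correlation rho = exp(-a/n) that strengthens as density grows. For equal-covariance Gaussian hypotheses the large-deviation (Chernoff) error exponent has the closed form (1/8) m^2 1^T Sigma^{-1} 1, which stands in for the Legendre transform that the Gärtner-Ellis theorem would supply in general. The model parameters a and m, and the function name, are hypothetical choices for illustration.

```python
import numpy as np

def chernoff_exponent(n, a=1.0, m=1.0):
    """Chernoff error exponent for detecting a mean shift m with n sensors.

    Assumed toy model: sensors equally spaced on [0, 1], observations
    jointly Gaussian with AR(1) correlation rho = exp(-a/n) between
    neighbors, identical covariance under both hypotheses.
    """
    rho = np.exp(-a / n)
    idx = np.arange(n)
    # AR(1) covariance matrix: Sigma_ij = rho^{|i-j|}
    Sigma = rho ** np.abs(idx[:, None] - idx[None, :])
    ones = np.ones(n)
    # Equal-covariance Gaussian Chernoff information:
    # C = (1/8) * m^2 * 1^T Sigma^{-1} 1
    return 0.125 * m**2 * ones @ np.linalg.solve(Sigma, ones)

if __name__ == "__main__":
    for n in (2, 4, 8, 16, 32, 64, 128):
        print(f"n = {n:4d}  exponent = {chernoff_exponent(n):.5f}")
```

Under these assumptions the exponent increases with n and saturates at a finite limit, consistent with the abstract's observation that high node density remains beneficial even when adjacent observations are highly correlated: densification never hurts the detection exponent, although its marginal gain vanishes.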