This paper addresses the problems of real-time localization and 3D depth estimation across disparate sensing systems, including wireless micro-electro-mechanical systems (MEMS) sensor networks such as MICA sensors by Crossbow Inc., Radio Frequency Identification (RFID) tags, and cameras that capture a variety of spectra. Some of the sensing is adaptive in time and space, using a remotely controlled robot for sensor deployment. The motivation for integrating and analyzing multiple sensing systems and spectral modalities comes from the fact that, in many applications, a single sensing system or modality does not provide robust and accurate performance. In this work we design systems for localization using RFID tags and for real-time 3D depth estimation from stereo vision, while accounting for the power constraints imposed on the deployment of battery-operated wireless MICA sensors. The resulting methods are applied to the development of (a) hazard aware spaces (HAS) to alert people in the event of fire, and (b) tele-immersive spaces (TEEVE) to enable remote collaboration, training, and art performances. The novelty of our work lies in the power-efficient deployment of wireless sensors for location-aware applications, achieved by combining multiple sensors with advanced signal and image processing algorithms.