TY - GEN
T1 - ARIA
T2 - ACM Multimedia 2004 - Proceedings of the 12th ACM International Conference on Multimedia
AU - Peng, Lina
AU - Candan, K. Selçuk
AU - Ryu, Kyung D.
AU - Chatha, Karamvir S.
AU - Sundaram, Hari
PY - 2004
Y1 - 2004
AB - We are developing an adaptive and programmable media-flow ARchitecture for Interactive Arts (ARIA) to enable real-time control of audio, video, and lighting on an intelligent stage. The intelligent stage is being equipped with a matrix of floor sensors for object localization, microphone arrays for sound localization and beamforming, and a motion-capture system. The ARIA system provides an interface for specifying the intended mappings from sensory inputs to audio-visual responses. Based on these specifications, the sensory inputs are streamed, filtered, and fused to actuate a controllable projection system, a surround-sound system, and a lighting system. The actuated responses take place in real time and satisfy the QoS requirements of live performance. In this paper, we present the ARIA quality-adaptive architecture. We model the basic information unit as a data object, consisting of a meta-data header and an object payload, streamed between nodes in the system, and we use a directed acyclic network to model media-stream processing. We define performance metrics for output precision, resource consumption, and end-to-end delay. The filters and fusion operators are being implemented with quality-aware signal-processing algorithms. The appropriate node behavior is chosen at run time to meet the QoS requirements and to adapt to the properties of the input objects. For this purpose, ARIA utilizes a two-phase approach: static pre-optimization and dynamic run-time adaptation.
KW - Interactive
KW - Multi-modal art
KW - Tools for creating multimedia art
UR - http://www.scopus.com/inward/record.url?scp=13444306254&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=13444306254&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:13444306254
SN - 1581138938
SN - 9781581138931
T3 - ACM Multimedia 2004 - Proceedings of the 12th ACM International Conference on Multimedia
SP - 532
EP - 535
BT - ACM Multimedia 2004 - Proceedings of the 12th ACM International Conference on Multimedia
Y2 - 10 October 2004 through 16 October 2004
ER -