Abstract
Recent advances in computer vision algorithms and video streaming technologies have facilitated the development of edge-server-based video analytics systems, enabling them to handle sophisticated real-world tasks such as traffic surveillance and workspace monitoring. Meanwhile, owing to their omnidirectional recording capability, 360-degree cameras have been proposed as replacements for traditional cameras in video analytics systems to offer enhanced situational awareness. Yet we found that providing an efficient 360-degree video analytics framework is a non-trivial task: due to the higher resolution and geometric distortion of 360-degree videos, existing video analytics pipelines fail to meet performance requirements for end-to-end latency and query accuracy. To address these challenges, we introduce ST-360, a framework specifically designed for 360-degree video analytics. ST-360 features a spatial-temporal filtering algorithm that reduces both data transmission and computational workloads. Evaluation of the ST-360 framework on a unique dataset of 360-degree first-responder videos shows that it yields accurate query results with a 50% reduction in end-to-end latency compared to state-of-the-art methods.
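The abstract's spatial-temporal filtering idea can be illustrated with a minimal sketch. This is not the paper's actual algorithm; it is a hypothetical example assuming a simple tile-based scheme: an equirectangular frame is split into a grid of tiles (spatial filtering), and only tiles whose content changed noticeably since the previous frame are forwarded to the edge server (temporal filtering). All names, the grid size, and the difference threshold are illustrative assumptions.

```python
import numpy as np

def spatial_temporal_filter(prev_frame, frame, grid=(4, 8), diff_thresh=10.0):
    """Hypothetical sketch of tile-based spatial-temporal filtering.

    Splits an equirectangular frame into grid[0] x grid[1] tiles and keeps
    only tiles whose mean absolute pixel difference from the previous frame
    exceeds diff_thresh. Returned tiles would be the only data transmitted
    to the edge server, cutting both bandwidth and inference workload.
    """
    h, w = frame.shape[:2]
    th, tw = h // grid[0], w // grid[1]
    kept = []
    for r in range(grid[0]):
        for c in range(grid[1]):
            cur = frame[r * th:(r + 1) * th, c * tw:(c + 1) * tw]
            prev = prev_frame[r * th:(r + 1) * th, c * tw:(c + 1) * tw]
            # Temporal filter: per-tile mean absolute difference.
            diff = np.abs(cur.astype(np.float32) - prev.astype(np.float32)).mean()
            if diff > diff_thresh:
                kept.append(((r, c), cur))
    return kept  # static tiles are dropped instead of being analyzed
```

In a real pipeline, per-tile distortion near the poles of the equirectangular projection would also need to be accounted for; this sketch omits that for brevity.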
| Original language | English (US) |
|---|---|
| Article number | 248 |
| Journal | ACM Transactions on Multimedia Computing, Communications and Applications |
| Volume | 21 |
| Issue number | 9 |
| Early online date | Sep 11 2025 |
| DOIs | |
| State | Published - Sep 12 2025 |
Keywords
- 360-Degree Video Analysis
- Edge Computing
- Smart Filtering
ASJC Scopus subject areas
- Hardware and Architecture
- Computer Networks and Communications