TY - GEN
T1 - Redefine the A in ABR for 360-degree Videos
T2 - 22nd IEEE International Symposium on Multimedia, ISM 2020
AU - Lee, Kuan Ying
AU - Yoo, Andrew
AU - Park, Jounsup
AU - Nahrstedt, Klara
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/12
Y1 - 2020/12
N2 - 360-degree video has become popular due to the immersive experience it provides to the viewer. While watching, the viewer can control the field of view (FoV; in this paper, we use viewport interchangeably with FoV) in the range of 360° by 180°. As this trend continues, adaptive bitrate (ABR) streaming is becoming a prevalent issue. Most existing ABR algorithms for 360 videos (360 ABR algorithms) require real-time head traces and certain computation resources from the client for streaming, which largely constrains the range of the audience. Also, while more 360 ABR algorithms rely upon machine learning (ML) for viewport prediction, ML and ABR are research topics that grow mostly independently. In this paper, we propose a two-fold ABR algorithm for 360 video streaming that utilizes 1) an off-the-shelf ABR algorithm for ordinary videos, and 2) an off-the-shelf viewport prediction model. Our algorithm requires neither real-time head traces nor additional computation on the viewing device. In addition, it adapts easily to the newest developments in viewport prediction and ABR. As a consequence, the proposed method fits nicely into the existing streaming framework, and any advancement in viewport prediction or ABR could enhance its performance. With quantitative experiments, we demonstrate that the proposed method achieves twice the quality of experience (QoE) of the baseline.
AB - 360-degree video has become popular due to the immersive experience it provides to the viewer. While watching, the viewer can control the field of view (FoV; in this paper, we use viewport interchangeably with FoV) in the range of 360° by 180°. As this trend continues, adaptive bitrate (ABR) streaming is becoming a prevalent issue. Most existing ABR algorithms for 360 videos (360 ABR algorithms) require real-time head traces and certain computation resources from the client for streaming, which largely constrains the range of the audience. Also, while more 360 ABR algorithms rely upon machine learning (ML) for viewport prediction, ML and ABR are research topics that grow mostly independently. In this paper, we propose a two-fold ABR algorithm for 360 video streaming that utilizes 1) an off-the-shelf ABR algorithm for ordinary videos, and 2) an off-the-shelf viewport prediction model. Our algorithm requires neither real-time head traces nor additional computation on the viewing device. In addition, it adapts easily to the newest developments in viewport prediction and ABR. As a consequence, the proposed method fits nicely into the existing streaming framework, and any advancement in viewport prediction or ABR could enhance its performance. With quantitative experiments, we demonstrate that the proposed method achieves twice the quality of experience (QoE) of the baseline.
KW - 360° Video Streaming
KW - Machine Learning
UR - http://www.scopus.com/inward/record.url?scp=85101431744&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85101431744&partnerID=8YFLogxK
U2 - 10.1109/ISM.2020.00020
DO - 10.1109/ISM.2020.00020
M3 - Conference contribution
AN - SCOPUS:85101431744
T3 - Proceedings - 2020 IEEE International Symposium on Multimedia, ISM 2020
SP - 82
EP - 84
BT - Proceedings - 2020 IEEE International Symposium on Multimedia, ISM 2020
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 2 December 2020 through 4 December 2020
ER -