Energy-efficient video processing for virtual reality

Yue Leng, Chi Chun Chen, Qiuyue Sun, Jian Huang, Yuhao Zhu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Virtual reality (VR) has huge potential to enable radically new applications, for which spherical panoramic video processing is one of the backbone techniques. However, current VR systems reuse the techniques designed for processing conventional planar videos, resulting in significant energy inefficiencies. Our characterizations show that operations that are unique to processing 360° VR content constitute 40% of the total processing energy consumption. We present EVR, an end-to-end system for energy-efficient VR video processing. EVR recognizes that the major contributor to the VR tax is the projective transformation (PT) operations. EVR mitigates the overhead of PT through two key techniques: semantic-aware streaming (SAS) on the server and hardware-accelerated rendering (HAR) on the client device. EVR uses SAS to reduce the chances of executing projective transformation on VR devices by pre-rendering 360° frames in the cloud. Unlike conventional pre-rendering techniques, SAS exploits the key semantic information inherent in VR content that was previously ignored. Complementary to SAS, HAR mitigates the energy overhead of on-device rendering through a new hardware accelerator specialized for projective transformation. We implement an EVR prototype on an Amazon AWS server instance and an NVIDIA Jetson TX2 board combined with a Xilinx Zynq-7000 FPGA. Real-system measurements show that EVR reduces the energy of VR rendering by up to 58%, which translates to up to 42% energy savings for VR devices.
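The projective transformation the abstract identifies as the main VR-tax contributor maps each pixel of the viewer's viewport back to a sample location in the stored 360° equirectangular frame. As a rough illustrative sketch only (this is not EVR's implementation; the pinhole-camera model, rotation order, and function name are assumptions for illustration), the per-pixel mapping can be written as:

```python
import math

def viewport_to_equirect(px, py, width, height, fov_deg, yaw_deg, pitch_deg):
    """Map a viewport pixel to normalized (u, v) coordinates in an
    equirectangular 360-degree frame. Minimal sketch of a projective
    transformation (PT) step; a real renderer does this per pixel,
    per frame, which is why it dominates the VR tax."""
    # Camera-space ray through the pixel (simple pinhole model).
    f = (width / 2) / math.tan(math.radians(fov_deg) / 2)
    x = px - width / 2
    y = py - height / 2
    z = f
    # Rotate the ray by head pitch (about x-axis), then yaw (about y-axis).
    pitch = math.radians(pitch_deg)
    yaw = math.radians(yaw_deg)
    y, z = (y * math.cos(pitch) - z * math.sin(pitch),
            y * math.sin(pitch) + z * math.cos(pitch))
    x, z = (x * math.cos(yaw) + z * math.sin(yaw),
            -x * math.sin(yaw) + z * math.cos(yaw))
    # Ray direction -> spherical angles -> equirectangular (u, v) in [0, 1].
    lon = math.atan2(x, z)                   # longitude in [-pi, pi]
    lat = math.atan2(-y, math.hypot(x, z))   # latitude in [-pi/2, pi/2]
    u = (lon / math.pi + 1) / 2
    v = 0.5 - lat / math.pi
    return u, v
```

For example, the center pixel of a 1920×1080 viewport with no head rotation maps to the center of the equirectangular frame (u = v = 0.5). SAS avoids running this mapping on the device by pre-rendering in the cloud; HAR accelerates it in hardware when it must run on-device.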

Original language: English (US)
Title of host publication: ISCA 2019 - Proceedings of the 2019 46th International Symposium on Computer Architecture
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 91-103
Number of pages: 13
ISBN (Electronic): 9781450366694
DOI: 10.1145/3307650.3322264
State: Published - Jun 22 2019
Event: 46th International Symposium on Computer Architecture, ISCA 2019 - Phoenix, United States
Duration: Jun 22 2019 - Jun 26 2019

Publication series

Name: Proceedings - International Symposium on Computer Architecture
ISSN (Print): 1063-6897

Conference

Conference: 46th International Symposium on Computer Architecture, ISCA 2019
Country: United States
City: Phoenix
Period: 6/22/19 - 6/26/19

Keywords

  • Energy efficiency
  • Hardware accelerator
  • Pre-rendering
  • Projective transformation
  • Video processing
  • Virtual reality

ASJC Scopus subject areas

  • Hardware and Architecture

Cite this

Leng, Y., Chen, C. C., Sun, Q., Huang, J., & Zhu, Y. (2019). Energy-efficient video processing for virtual reality. In ISCA 2019 - Proceedings of the 2019 46th International Symposium on Computer Architecture (pp. 91-103). (Proceedings - International Symposium on Computer Architecture). Institute of Electrical and Electronics Engineers Inc.. https://doi.org/10.1145/3307650.3322264

@inproceedings{c757a707f621407094a98449fe8135f9,
title = "Energy-efficient video processing for virtual reality",
abstract = "Virtual reality (VR) has huge potential to enable radically new applications, behind which spherical panoramic video processing is one of the backbone techniques. However, current VR systems reuse the techniques designed for processing conventional planar videos, resulting in significant energy inefficiencies. Our characterizations show that operations that are unique to processing 360° VR content constitute 40{\%} of the total processing energy consumption. We present EVR, an end-to-end system for energy-efficient VR video processing. EVR recognizes that the major contributor to the VR tax is the projective transformation (PT) operations. EVR mitigates the overhead of PT through two key techniques: semantic-aware streaming (SAS) on the server and hardware-accelerated rendering (HAR) on the client device. EVR uses SAS to reduce the chances of executing projective transformation on VR devices by pre-rendering 360° frames in the cloud. Different from conventional pre-rendering techniques, SAS exploits the key semantic information inherent in VR content that is previously ignored. Complementary to SAS, HAR mitigates the energy overhead of on-device rendering through a new hardware accelerator that is specialized for projective transformation. We implement an EVR prototype on an Amazon AWS server instance and an NVIDIA Jetson TX2 board combined with a Xilinx Zynq-7000 FPGA. Real system measurements show that EVR reduces the energy of VR rendering by up to 58{\%}, which translates to up to 42{\%} energy saving for VR devices.",
keywords = "Energy efficiency, Hardware accelerator, Pre-rendering, Projective transformation, Video processing, Virtual reality",
author = "Yue Leng and Chen, {Chi Chun} and Qiuyue Sun and Jian Huang and Yuhao Zhu",
year = "2019",
month = "6",
day = "22",
doi = "10.1145/3307650.3322264",
language = "English (US)",
series = "Proceedings - International Symposium on Computer Architecture",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
pages = "91--103",
booktitle = "ISCA 2019 - Proceedings of the 2019 46th International Symposium on Computer Architecture",
address = "United States",

}
