TY - GEN
T1 - ViX
T2 - 2023 Design, Automation and Test in Europe Conference and Exhibition, DATE 2023
AU - Misra, Ashitabh
AU - Laurel, Jacob
AU - Misailovic, Sasa
N1 - Publisher Copyright:
© 2023 EDAA.
PY - 2023
Y1 - 2023
N2 - As large quantities of stochastic data are processed onboard tiny edge devices, these systems must constantly make decisions under uncertainty. This challenge necessitates principled embedded compiler support for time- and energy-efficient probabilistic inference. However, compiling probabilistic inference to run on the edge is significantly understudied, and existing research is limited to computationally expensive MCMC algorithms. Hence, these works cannot leverage faster variational inference algorithms, which scale better to the larger data sizes representative of realistic edge workloads. Moreover, naively writing code for differentiable inference on resource-constrained edge devices is challenging due to the need for expensive floating-point computations. Even when using reduced precision, a developer still faces the challenge of choosing the right quantization scheme, as gradients can be notoriously unstable at low precision. To address these challenges, we propose ViX, the first compiler for low-precision probabilistic programming with variational inference. ViX generates optimized variational inference code in reduced precision by automatically exploiting Bayesian domain knowledge and analytical mathematical properties to ensure that low-precision gradients can still be used effectively. ViX scales inference to much larger datasets than previous compilers for resource-constrained probabilistic programming while attaining both high accuracy and significant speedup. Our evaluation of ViX across 7 benchmarks shows that ViX-generated code is up to 8.15× faster than performing the same variational inference in 32-bit floating point and up to 22.67× faster than performing it in 64-bit double precision, all with minimal accuracy loss. Further, on a subset of our benchmarks, ViX scales inference to data sizes 16-80× larger than the existing state-of-the-art tool Statheros.
AB - As large quantities of stochastic data are processed onboard tiny edge devices, these systems must constantly make decisions under uncertainty. This challenge necessitates principled embedded compiler support for time- and energy-efficient probabilistic inference. However, compiling probabilistic inference to run on the edge is significantly understudied, and existing research is limited to computationally expensive MCMC algorithms. Hence, these works cannot leverage faster variational inference algorithms, which scale better to the larger data sizes representative of realistic edge workloads. Moreover, naively writing code for differentiable inference on resource-constrained edge devices is challenging due to the need for expensive floating-point computations. Even when using reduced precision, a developer still faces the challenge of choosing the right quantization scheme, as gradients can be notoriously unstable at low precision. To address these challenges, we propose ViX, the first compiler for low-precision probabilistic programming with variational inference. ViX generates optimized variational inference code in reduced precision by automatically exploiting Bayesian domain knowledge and analytical mathematical properties to ensure that low-precision gradients can still be used effectively. ViX scales inference to much larger datasets than previous compilers for resource-constrained probabilistic programming while attaining both high accuracy and significant speedup. Our evaluation of ViX across 7 benchmarks shows that ViX-generated code is up to 8.15× faster than performing the same variational inference in 32-bit floating point and up to 22.67× faster than performing it in 64-bit double precision, all with minimal accuracy loss. Further, on a subset of our benchmarks, ViX scales inference to data sizes 16-80× larger than the existing state-of-the-art tool Statheros.
UR - http://www.scopus.com/inward/record.url?scp=85162633998&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85162633998&partnerID=8YFLogxK
U2 - 10.23919/DATE56975.2023.10137324
DO - 10.23919/DATE56975.2023.10137324
M3 - Conference contribution
AN - SCOPUS:85162633998
T3 - Proceedings - Design, Automation and Test in Europe, DATE
BT - 2023 Design, Automation and Test in Europe Conference and Exhibition, DATE 2023 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 17 April 2023 through 19 April 2023
ER -