Abstract
We present the first systematic study of the effectiveness of robustness transformations on a diverse set of 24 probabilistic programs representing generalized linear models, mixture models, and time-series models. We evaluate five robustness transformations from the literature on each model. We quantify and present insights on (1) the improvement in posterior prediction accuracy and (2) the execution time overhead of the robustified programs, in the presence of three input noise models. To automate the evaluation of various robustness transformations, we developed ASTRA, a novel framework for quantifying the robustness of probabilistic programs and exploring the trade-offs between robustness and execution time. Our experimental results indicate that the existing transformations are often suitable only for specific noise models, can significantly increase execution time, and interact non-trivially with the inference algorithms.
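The abstract does not enumerate the five transformations studied, but one well-known robustness transformation from the probabilistic programming literature is replacing a Gaussian likelihood with a heavy-tailed Student-t likelihood, which reduces the influence of noisy or outlying observations on the posterior. The sketch below (an illustration, not the paper's implementation) compares the two log-densities to show why the transformed model is less sensitive to an outlier:

```python
import math

def normal_logpdf(x, mu=0.0, sigma=1.0):
    """Log-density of a Normal(mu, sigma) likelihood at x."""
    z = (x - mu) / sigma
    return -0.5 * math.log(2 * math.pi) - math.log(sigma) - 0.5 * z * z

def student_t_logpdf(x, nu=4.0, mu=0.0, sigma=1.0):
    """Log-density of a Student-t likelihood (nu degrees of freedom) at x.

    A common "robustifying" substitute for the Normal: its heavier tails
    assign non-negligible probability to extreme observations.
    """
    z = (x - mu) / sigma
    return (math.lgamma((nu + 1) / 2) - math.lgamma(nu / 2)
            - 0.5 * math.log(nu * math.pi) - math.log(sigma)
            - (nu + 1) / 2 * math.log1p(z * z / nu))

# Near the mode the two likelihoods are nearly identical...
print(normal_logpdf(0.0), student_t_logpdf(0.0))
# ...but an outlier at x = 5 is penalized far more by the Gaussian,
# so a single noisy input can dominate the Gaussian posterior.
print(normal_logpdf(5.0), student_t_logpdf(5.0))
```

The trade-off the abstract highlights shows up even here: the heavier-tailed model is more forgiving of input noise, but its likelihood is more expensive to evaluate and can slow down inference.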
| Original language | English (US) |
|---|---|
| Pages (from-to) | 900-910 |
| Number of pages | 11 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 216 |
| State | Published - 2023 |
| Externally published | Yes |
| Event | 39th Conference on Uncertainty in Artificial Intelligence, UAI 2023 - Pittsburgh, United States. Duration: Jul 31 2023 → Aug 4 2023 |
ASJC Scopus subject areas
- Artificial Intelligence
- Software
- Control and Systems Engineering
- Statistics and Probability