Abstract
We propose injective generative models called TRUMPETs that generalize invertible normalizing flows. The proposed generators progressively increase dimension from a low-dimensional latent space. We demonstrate that TRUMPETs can be trained orders of magnitude faster than standard flows while yielding samples of comparable or better quality. They retain many of the advantages of standard flows, such as training based on maximum likelihood and a fast, exact inverse of the generator. Since TRUMPETs are injective and have fast inverses, they can be effectively used for downstream Bayesian inference. To wit, we use TRUMPET priors for maximum a posteriori estimation in the context of image reconstruction from compressive measurements, outperforming competitive baselines in terms of reconstruction quality and speed. We then propose an efficient method for posterior characterization and uncertainty quantification with TRUMPETs by taking advantage of the low-dimensional latent space.
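To make the MAP estimation step concrete, the snippet below is a minimal sketch, not the authors' implementation: `generator` stands in for a trained TRUMPET, and the dimensions, measurement operator `A`, noise level, and the simplified Gaussian-latent objective (which omits the injective flow's Jacobian term) are all illustrative assumptions. It only shows how optimizing in the low-dimensional latent space keeps the reconstruction cheap.

```python
# Hypothetical sketch (not the paper's code): MAP reconstruction from
# compressive measurements with an injective-generator prior.
import torch

torch.manual_seed(0)

d, n, m = 16, 256, 64                       # latent / signal / measurement dims (illustrative)
generator = torch.nn.Sequential(            # placeholder for a trained injective generator G: R^d -> R^n
    torch.nn.Linear(d, 128), torch.nn.ReLU(), torch.nn.Linear(128, n)
)
for p in generator.parameters():            # the generator is fixed; only the latent code is optimized
    p.requires_grad_(False)

A = torch.randn(m, n) / m ** 0.5            # compressive measurement operator (assumed known)
x_true = generator(torch.randn(d))
y = A @ x_true + 0.01 * torch.randn(m)      # noisy measurements
sigma2 = 0.01 ** 2

z = torch.zeros(d, requires_grad=True)      # search over the low-dimensional latent space
opt = torch.optim.Adam([z], lr=1e-2)
for _ in range(2000):
    opt.zero_grad()
    residual = A @ generator(z) - y
    # Negative log-posterior up to constants: data fit + standard-Gaussian latent prior
    # (the flow's Jacobian correction is dropped in this simplified objective).
    loss = residual.pow(2).sum() / (2 * sigma2) + 0.5 * z.pow(2).sum()
    loss.backward()
    opt.step()

x_map = generator(z).detach()               # MAP estimate G(z*)
```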
| Original language | English (US) |
| --- | --- |
| Pages | 1269-1278 |
| Number of pages | 10 |
| State | Published - 2021 |
| Event | 37th Conference on Uncertainty in Artificial Intelligence, UAI 2021, Virtual, Online; Duration: Jul 27 2021 → Jul 30 2021 |
Conference
| Conference | 37th Conference on Uncertainty in Artificial Intelligence, UAI 2021 |
| --- | --- |
| City | Virtual, Online |
| Period | 7/27/21 → 7/30/21 |
ASJC Scopus subject areas
- Artificial Intelligence
- Applied Mathematics