Barbano, Riccardo; et al. (2020) Quantifying sources of uncertainty in deep learning-based image reconstruction. In: NeurIPS 2020 Workshop on Deep Learning and Inverse Problems, Accepted poster papers, https://openreview.net/pdf?id=iUGcSYdJogv

Quantifying sources of uncertainty in deep learning-based image reconstruction

Author: Barbano, Riccardo¹; Kereta, Željko¹; Zhang, Chen²;
Organizations: 1University College London, UK
2Huawei Technologies R&D UK
3University of Oulu, Finland
Format: poster
Version: published version
Access: open
Online Access: PDF Full Text (PDF, 2.2 MB)
Persistent link: http://urn.fi/urn:nbn:fi-fe20201214100577
Language: English
Published: Deepinverse, 2020
Publish Date: 2020-12-14
Abstract

Image reconstruction methods based on deep neural networks have shown outstanding performance, equalling or exceeding the state-of-the-art results of conventional approaches, but often do not provide uncertainty information about the reconstruction. In this work we propose a scalable and efficient framework to simultaneously quantify aleatoric and epistemic uncertainties in learned iterative image reconstruction. We build on a Bayesian deep gradient descent method for quantifying epistemic uncertainty and incorporate the heteroscedastic variance of the noise to account for the aleatoric uncertainty. We show that our method exhibits competitive performance against conventional benchmarks for computed tomography with both sparse-view and limited-angle data. The estimated uncertainty captures the variability in the reconstructions caused by the restricted measurement model and by missing information due to the limited-angle geometry.
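
The abstract only sketches the approach, so the following minimal Python/PyTorch sketch illustrates the general idea of separating the two uncertainty types; it is not the authors' code. It assumes a network that predicts a per-pixel mean and log-variance (the heteroscedastic, aleatoric part), and it approximates epistemic uncertainty by Monte Carlo sampling of stochastic forward passes with dropout kept active, standing in for the posterior weight samples of the Bayesian deep gradient descent method. All class, function, and parameter names are illustrative.

    # Minimal sketch (not the authors' code): combining aleatoric and epistemic
    # uncertainty estimates for an image-reconstruction network.
    import torch
    import torch.nn as nn


    class HeteroscedasticHead(nn.Module):
        """Toy reconstruction head predicting a per-pixel mean and log-variance."""

        def __init__(self, channels: int = 1):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(channels, 32, 3, padding=1), nn.ReLU(),
                nn.Dropout2d(p=0.1),                       # kept active for MC sampling
                nn.Conv2d(32, 2 * channels, 3, padding=1)  # outputs [mean, log-variance]
            )

        def forward(self, x):
            mean, log_var = self.body(x).chunk(2, dim=1)
            return mean, log_var


    def heteroscedastic_nll(mean, log_var, target):
        """Gaussian negative log-likelihood with a learned per-pixel variance."""
        return 0.5 * (torch.exp(-log_var) * (target - mean) ** 2 + log_var).mean()


    @torch.no_grad()
    def predict_with_uncertainty(model, x, n_samples: int = 20):
        """MC estimate: epistemic = variance of means, aleatoric = mean of variances."""
        model.train()  # keep dropout stochastic at inference time
        means, variances = [], []
        for _ in range(n_samples):
            mean, log_var = model(x)
            means.append(mean)
            variances.append(torch.exp(log_var))
        means = torch.stack(means)
        epistemic = means.var(dim=0)
        aleatoric = torch.stack(variances).mean(dim=0)
        return means.mean(dim=0), aleatoric, epistemic


    if __name__ == "__main__":
        model = HeteroscedasticHead()
        x = torch.randn(1, 1, 64, 64)  # stand-in for an intermediate reconstruction
        recon, aleatoric, epistemic = predict_with_uncertainty(model, x)
        print(recon.shape, aleatoric.shape, epistemic.shape)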

Pages: 1 - 13
Host publication: NeurIPS 2020 Workshop on Deep Learning and Inverse Problems
Conference: Workshop on Deep Learning and Inverse Problems
Field of Science: 111 Mathematics
112 Statistics and probability
113 Computer and information sciences
Copyright information: © 2020 The Authors.