R. Barbano et al., "An Educated Warm Start for Deep Image Prior-Based Micro CT Reconstruction," in IEEE Transactions on Computational Imaging, vol. 8, pp. 1210-1222, 2022, doi: 10.1109/TCI.2022.3233188

An educated warm start for deep image prior-based micro CT reconstruction

Author: Barbano, Riccardo1; Leuschner, Johannes2; Schmidt, Maximilian2; Denker, Alexander2; Hauptmann, Andreas3; Maass, Peter2; Jin, Bangti4
Organizations: 1Department of Computer Science, University College London, London, U.K.
2Center for Industrial Mathematics, University of Bremen, Bremen, Germany
3Research Unit of Mathematical Sciences, University of Oulu, Oulu, Finland
4Department of Mathematics, The Chinese University of Hong Kong, Shatin, N.T., Hong Kong
Format: article
Version: accepted version
Access: open
Online Access: PDF Full Text (PDF, 4.8 MB)
Persistent link: http://urn.fi/urn:nbn:fi-fe2023021627505
Language: English
Published: Institute of Electrical and Electronics Engineers, 2022
Publish Date: 2023-02-16
Description:

Abstract

Deep image prior (DIP) was recently introduced as an effective unsupervised approach for image restoration tasks. DIP represents the image to be recovered as the output of a deep convolutional neural network, and learns the network’s parameters such that the model output matches the corrupted observation. Despite its impressive reconstruction quality, the approach is slow compared to supervised learning-based or traditional reconstruction techniques. To address this computational challenge, we equip DIP with a two-stage learning paradigm: (i) perform a supervised pretraining of the network on a simulated dataset; (ii) fine-tune the network’s parameters to adapt to the target reconstruction task. We provide a thorough empirical analysis to shed light on the impact of pretraining in the context of image reconstruction. We show that pretraining considerably speeds up and stabilizes the subsequent reconstruction from real-measured 2D and 3D micro computed tomography data of biological specimens.
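
The two-stage paradigm described above maps onto a short training loop. The following is a minimal sketch only, assuming a PyTorch setting with a small stand-in network and a random matrix in place of the CT forward operator; the names (net, forward_op, y_obs, "pretrained.pt") are hypothetical illustrations, not the authors' actual code.

import torch
import torch.nn as nn

# Stand-in convolutional network; the paper uses a U-Net-type architecture.
net = nn.Sequential(
    nn.Conv2d(1, 32, 3, padding=1), nn.LeakyReLU(),
    nn.Conv2d(32, 32, 3, padding=1), nn.LeakyReLU(),
    nn.Conv2d(32, 1, 3, padding=1),
)

# Stand-in linear forward operator A (a real setup would use a ray transform).
A = torch.randn(2048, 128 * 128) / 128.0
def forward_op(x):                                  # (1, 1, 128, 128) -> (2048,)
    return A @ x.reshape(-1)

# Stage (i): supervised pretraining on simulated data happens offline; the
# resulting weights provide the "educated warm start", e.g.
# net.load_state_dict(torch.load("pretrained.pt"))  # hypothetical checkpoint

# Stage (ii): unsupervised DIP fine-tuning against the measured data y_obs.
z = torch.randn(1, 1, 128, 128)                     # fixed network input
y_obs = forward_op(torch.rand(1, 1, 128, 128))      # placeholder measurement
opt = torch.optim.Adam(net.parameters(), lr=1e-4)
for _ in range(200):            # warm-started DIP needs far fewer iterations
    opt.zero_grad()
    x_hat = net(z)                                  # current reconstruction
    loss = ((forward_op(x_hat) - y_obs) ** 2).mean()  # data fit ||A x - y||^2
    loss.backward()
    opt.step()

Note that the warm start only changes the initialization of stage (ii); the fine-tuning objective itself is the standard DIP data-fidelity loss.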


Series: IEEE transactions on computational imaging
ISSN: 2573-0436
ISSN-E: 2333-9403
ISSN-L: 2573-0436
Volume: 8
Pages: 1210 - 1222
DOI: 10.1109/TCI.2022.3233188
OADOI: https://oadoi.org/10.1109/TCI.2022.3233188
Type of Publication: A1 Journal article – refereed
Field of Science: 113 Computer and information sciences
Funding: The work of Riccardo Barbano was supported in part by the i4health Ph.D. studentship, U.K. EPSRC under Grant EP/S021930/1, and in part by The Alan Turing Institute through U.K. EPSRC under Grant EP/N510129/1. The work of Johannes Leuschner, Maximilian Schmidt, and Alexander Denker was supported in part by the German Research Foundation (DFG) under Grant GRK 2224/1. The work of Johannes Leuschner and Maximilian Schmidt was supported by the Federal Ministry of Education and Research (BMBF) through the DELETO Project under Grant 05M20LBB. The work of Alexander Denker was supported by the Klaus Tschira Stiftung through the Project MALDISTAR under Grant 00.010.2019. The work of Andreas Hauptmann was supported by the Academy of Finland under Grants 338408, 336796, 353093, and 334817. The work of Peter Maass was supported by the Sino-German Center for Research Promotion (CDZ) through the Mobility Programme 2021: Inverse Problems – Theories, Methods and Implementations (IP–TMI). The work of Bangti Jin was supported by U.K. EPSRC under Grants EP/T000864/1 and EP/V026259/1.
Academy of Finland Grant Number: 338408, 336796, 353093, 334817
Detailed Information: 338408 (Academy of Finland Funding decision)
336796 (Academy of Finland Funding decision)
353093 (Academy of Finland Funding decision)
334817 (Academy of Finland Funding decision)
Copyright information: © 2022 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.