Blur invariants for image recognition
|
Author: | Flusser, Jan¹; Lébl, Matěj¹; Šroubek, Filip¹ |
Organizations: |
¹ Czech Academy of Sciences, Institute of Information Theory and Automation, Pod Vodárenskou věží 4, 18208, Prague 8, Czech Republic
² Department of Computer Science and Engineering, Center for Machine Vision and Signal Analysis, University of Oulu, 90014, Oulu, Finland |
Format: | article |
Version: | published version |
Access: | open |
Online Access: | PDF Full Text (PDF, 1.5 MB) |
Persistent link: | http://urn.fi/urn:nbn:fi-fe20231106143271 |
Language: | English |
Published: | Springer Nature, 2023 |
Publish Date: | 2023-11-06 |
Description: |
Abstract: Blur is an image degradation that makes object recognition challenging. Restoration approaches solve this problem via image deblurring, while deep learning methods rely on augmentation of the training set. Invariants with respect to blur offer an alternative way of describing and recognising blurred images without any deblurring or data augmentation. In this paper, we present an original theory of blur invariants. Unlike all previous attempts, the new theory requires no prior knowledge of the blur type. The invariants are constructed in the Fourier domain by means of orthogonal projection operators, and moment expansion is used for efficient and stable computation. Applying a general substitution rule, combined invariants to blur and spatial transformations are easy to construct and use. Experimental comparison to Convolutional Neural Networks shows the advantages of the proposed theory.
|
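The record does not reproduce the paper's construction itself; as a rough orientation only, the minimal NumPy sketch below illustrates the best-known special case of blur invariance, namely that the Fourier phase of an image is unchanged by convolution with a centrosymmetric PSF. It is not the projection-operator theory proposed in the paper, and all names and parameter choices (the Gaussian PSF, the 64×64 random test image, the masking threshold) are illustrative assumptions.

```python
# Minimal sketch (assumption-laden): Fourier-phase invariance to
# centrosymmetric blur, the classical special case of blur invariants.
import numpy as np

def circular_gaussian_psf(shape, sigma):
    """Gaussian PSF centred at the origin of the periodic grid.
    It is centrosymmetric, so its DFT is real (here also positive)."""
    fy = np.fft.fftfreq(shape[0]) * shape[0]   # signed pixel offsets 0, 1, ..., -1
    fx = np.fft.fftfreq(shape[1]) * shape[1]
    yy, xx = np.meshgrid(fy, fx, indexing="ij")
    psf = np.exp(-(yy**2 + xx**2) / (2.0 * sigma**2))
    return psf / psf.sum()

rng = np.random.default_rng(0)
img = rng.random((64, 64))                      # stand-in for a test image
psf = circular_gaussian_psf(img.shape, sigma=1.5)

# Blur = circular convolution, i.e. pointwise product of the spectra.
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf)))

F_img, F_blur = np.fft.fft2(img), np.fft.fft2(blurred)
mask = np.abs(F_blur) > 1e-6                    # skip numerically dead frequencies

# The phase difference between sharp and blurred spectra is negligible,
# so the phase can serve as a feature that "ignores" this class of blurs.
phase_diff = np.angle(F_img[mask] * np.conj(F_blur[mask]))
print("max |phase difference| =", np.abs(phase_diff).max())
```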
Series: | International Journal of Computer Vision |
ISSN: | 0920-5691 |
ISSN-E: | 1573-1405 |
ISSN-L: | 0920-5691 |
Volume: | 131 |
Issue: | 9 |
Pages: | 2298 - 2315 |
DOI: | 10.1007/s11263-023-01798-7 |
OADOI: | https://oadoi.org/10.1007/s11263-023-01798-7 |
Type of Publication: | A1 Journal article – refereed |
Field of Science: | 213 Electronic, automation and communications engineering, electronics; 113 Computer and information sciences |
Funding: | Open access publishing supported by the National Technical Library in Prague. |
Copyright information: |
© The Author(s) 2023. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. |