University of Oulu

A. Arjas et al., "Neural Network Kalman Filtering for 3-D Object Tracking From Linear Array Ultrasound Data," in IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, vol. 69, no. 5, pp. 1691-1702, May 2022, doi: 10.1109/TUFFC.2022.3162097

Neural network Kalman filtering for 3D object tracking from linear array ultrasound data

Author: Arjas, Arttu (1); Alles, Erwin J. (2,3); Maneas, Efthymios (2,3);
Organizations: (1) Research Unit of Mathematical Sciences, University of Oulu, Finland
(2) Department of Medical Physics & Biomedical Engineering, University College London, UK
(3) Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, UK
(4) Department of Computer Science, University College London, UK
Format: article
Version: published version
Access: open
Online Access: PDF Full Text (PDF, 2 MB)
Persistent link: http://urn.fi/urn:nbn:fi-fe2022042530184
Language: English
Published: Institute of Electrical and Electronics Engineers, 2022
Publish Date: 2022-04-25

Abstract

Many interventional surgical procedures rely on medical imaging to visualise and track instruments. Such imaging methods not only need to be real-time capable, but must also provide accurate and robust positional information. In ultrasound applications, typically only two-dimensional data from a linear array are available, and as such obtaining an accurate positional estimate in three dimensions is non-trivial. In this work, we first train a neural network, using realistic synthetic training data, to estimate the out-of-plane offset of an object with the associated axial aberration in the reconstructed ultrasound image. The obtained estimate is then combined with a Kalman filtering approach that utilises positioning estimates obtained in previous time frames to improve localisation robustness and reduce the impact of measurement noise. The accuracy of the proposed method is evaluated using simulations, and its practical applicability is demonstrated on experimental data obtained using a novel optical ultrasound imaging setup. Accurate and robust positional information is provided in real time. Axial and lateral coordinates for out-of-plane objects are estimated with a mean error of 0.1 mm for simulated data and 0.2 mm for experimental data. Three-dimensional localisation is most accurate for elevational distances larger than 1 mm, with a maximum distance of 6 mm considered for a 25 mm aperture.
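As a rough illustration of the two-stage approach described in the abstract, the Python sketch below implements a single Kalman filter predict/update step in which the measurement is a 3-D position estimate produced by a neural network. The constant-velocity motion model, the noise covariances, the frame interval, and the placeholder function estimate_position_from_frame are illustrative assumptions, not the authors' exact formulation or code.

import numpy as np

# Illustrative constant-velocity state-space model (assumption, not the paper's exact model).
# State: [x, y, z, vx, vy, vz]; measurement: [x, y, z] estimated by the neural network.
dt = 0.05  # frame interval in seconds (assumed)
F = np.block([[np.eye(3), dt * np.eye(3)],
              [np.zeros((3, 3)), np.eye(3)]])   # state transition matrix
H = np.hstack([np.eye(3), np.zeros((3, 3))])    # position-only measurement matrix
Q = 1e-4 * np.eye(6)                            # process noise covariance (assumed)
R = np.diag([0.01, 0.01, 0.04])                 # measurement noise covariance, mm^2 (assumed)

def estimate_position_from_frame(frame):
    """Hypothetical stand-in for the trained network that maps a reconstructed
    ultrasound frame to an (axial, lateral, elevational) position estimate."""
    raise NotImplementedError

def kalman_step(x, P, z):
    """One linear Kalman filter predict/update cycle."""
    # Predict the current state from the estimate of the previous time frame.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with the network's position measurement z.
    y = z - H @ x_pred                       # innovation
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(6) - K @ H) @ P_pred
    return x_new, P_new

# Example usage: track an object across a stream of reconstructed frames.
# x = np.zeros(6); P = np.eye(6)
# for frame in frames:
#     z = estimate_position_from_frame(frame)
#     x, P = kalman_step(x, P, z)

In this sketch the Kalman gain weighs the network's per-frame estimate against the prediction carried over from previous frames, which is how the filtering stage suppresses measurement noise in the position track.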


Series: IEEE transactions on ultrasonics, ferroelectrics and frequency control
ISSN: 0885-3010
ISSN-E: 1525-8955
ISSN-L: 0885-3010
Volume: 69
Issue: 5
Pages: 1691 - 1702
DOI: 10.1109/TUFFC.2022.3162097
OADOI: https://oadoi.org/10.1109/TUFFC.2022.3162097
Type of Publication: A1 Journal article – refereed
Field of Science: 112 Statistics and probability
113 Computer and information sciences
217 Medical engineering
Funding: This work was supported by Academy of Finland Projects 336796, 326291, and 338408, the CMIC-EPSRC platform grant (EP/M020533/1), the Wellcome Trust (203145Z/16/Z), the Engineering and Physical Sciences Research Council (EPSRC) (NS/A000050/1), and the Rosetrees Trust (PGS19-2/10006).
Academy of Finland Grant Numbers: 336796, 338408
Detailed Information: 336796 (Academy of Finland Funding decision)
338408 (Academy of Finland Funding decision)
Copyright information: © The Author(s) 2022. This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/.