E. Maneas et al., "Deep Learning for Instrumented Ultrasonic Tracking: From Synthetic Training Data to In Vivo Application," in IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, vol. 69, no. 2, pp. 543-552, Feb. 2022, doi: 10.1109/TUFFC.2021.3126530
Deep learning for instrumented ultrasonic tracking: from synthetic training data to in vivo application
|Author:||Maneas, Efthymios1,2; Hauptmann, Andreas3,4; Alles, Erwin J.1,2;|
1Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London W1W 7TY, U.K.
2Department of Medical Physics and Biomedical Engineering, University College London, London WC1E 6BT, U.K.
3Research Unit of Mathematical Sciences, University of Oulu, Oulu FI-90014, Finland
4Department of Computer Science, University College London, London WC1E 6BT, U.K.
5School of Biomedical Engineering and Imaging Sciences, King’s College London, London SE1 7EH, U.K.
6Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London W1W 7TY, U.K., and the Institute for Women’s Health, University College London, London WC1E 6HX, U.K.
7NIHR UCLH Biomedical Research Centre, London W1T 7DN, U.K.
|Persistent link:||http://urn.fi/urn:nbn:fi-fe2021111856028|
|Publisher:||Institute of Electrical and Electronics Engineers|
|Publish Date:||2021-11-18|
Instrumented ultrasonic tracking is used to improve needle localisation during ultrasound guidance of minimally invasive percutaneous procedures. Here, it is implemented with transmitted ultrasound pulses from a clinical ultrasound imaging probe that are detected by a fibre-optic hydrophone integrated into a needle. The detected transmissions are then reconstructed to form the tracking image. Two challenges are considered with the current implementation of ultrasonic tracking. First, tracking transmissions are interleaved with the acquisition of B-mode images, and thus the effective B-mode frame rate is reduced. Second, it is challenging to achieve accurate localisation of the needle tip when the signal-to-noise ratio is low. To address these challenges, we present a framework based on a convolutional neural network (CNN) to maintain spatial resolution with fewer tracking transmissions and to enhance signal quality. A major component of the framework was the generation of realistic synthetic training data. The trained network was applied to unseen synthetic data and to experimental in vivo tracking data. Needle localisation performance was investigated when reconstruction was performed with fewer (up to eight-fold) tracking transmissions. CNN-based processing of conventional reconstructions showed that the axial and lateral spatial resolution could be improved even with an eight-fold reduction in tracking transmissions. The framework presented in this study will significantly improve the performance of ultrasonic tracking, leading to faster image acquisition rates and increased localisation accuracy.
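The core idea in the abstract — reconstruct the tracking image from a subset of transmissions, then enhance it with a learned convolutional model — can be illustrated with a toy NumPy sketch. This is not the authors' pipeline: the signal model, the eight-fold subsampling, and the fixed smoothing kernel (standing in for the trained CNN) are all illustrative assumptions.

```python
import numpy as np

def reconstruct(signals):
    # Coherent sum over tracking transmissions gives a 1D "tracking image" profile.
    return signals.sum(axis=0)

rng = np.random.default_rng(0)
n_tx, n_samples, tip = 128, 256, 100  # hypothetical acquisition parameters
t = np.arange(n_samples)

# Each transmission contributes a pulse centred on the needle-tip sample, plus noise.
signals = (np.exp(-0.5 * ((t - tip) / 3.0) ** 2)[None, :]
           + 0.3 * rng.standard_normal((n_tx, n_samples)))

full = reconstruct(signals)        # all 128 tracking transmissions
sparse = reconstruct(signals[::8]) # eight-fold fewer transmissions (noisier)

# Stand-in for the CNN: a fixed low-pass kernel applied to the sparse image.
# In the paper this step is a trained network; here it is just a convolution.
kernel = np.array([0.05, 0.25, 0.4, 0.25, 0.05])
enhanced = np.convolve(sparse, kernel, mode="same")

# Needle-tip localisation: take the argmax of each reconstructed profile.
print(int(np.argmax(full)), int(np.argmax(enhanced)))
```

Even with an eight-fold reduction in transmissions, the convolutional post-processing suppresses enough noise that the argmax still lands near the true tip position, which is the behaviour the paper demonstrates with a learned model.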
IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control
|Pages:||543 - 552|
|Type of Publication:||A1 Journal article – refereed|
|Field of Science:||113 Computer and information sciences; 217 Medical engineering|
This work was supported by the Wellcome Trust (WT101957; 203145Z/16/Z; 203148/Z/16/Z) and the Engineering and Physical Sciences Research Council (EPSRC) (NS/A000027/1; NS/A000050/1; NS/A000049/1; EP/L016478/1), by a Starting Grant from the European Research Council (ERC-2012-StG, Proposal 310970 MOPHIM), by a CMIC-EPSRC platform grant (EP/M020533/1), and by the Academy of Finland Project 336796 (Finnish Centre of Excellence in Inverse Modelling and Imaging, 2018–2025) as well as Project 338408. A. L. David is supported by the UCL/UCLH NIHR Comprehensive Biomedical Research Centre.
|Academy of Finland Grant Numbers:||336796; 338408|
© 2021 The Authors. This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/.