
Maneas, E., Hauptmann, A., Alles, E.J. et al. Enhancement of instrumented ultrasonic tracking images using deep learning. Int J CARS 18, 395–399 (2023).

Enhancement of instrumented ultrasonic tracking images using deep learning

Author: Maneas, Efthymios1,2; Hauptmann, Andreas3,4; Alles, Erwin J.2,3; et al.
Organizations: 1Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, W1W 7TY, UK
2Department of Medical Physics and Biomedical Engineering, University College London, London, WC1E 6BT, UK
3Department of Computer Science, University College London, London, WC1E 6BT, UK
4Research Unit of Mathematical Sciences, University of Oulu, FI-90014, Oulu, Finland
5School of Biomedical Engineering and Imaging Sciences, King’s College London, London, SE1 7EH, UK
6Institute for Women’s Health, University College London, London, WC1E 6HX, UK
7NIHR UCLH Biomedical Research Centre, London, W1T 7DN, UK
Format: article
Version: published version
Access: open
Online Access: PDF Full Text (PDF, 0.8 MB)
Language: English
Published: Springer Nature, 2022
Publish Date: 2023-05-05


Purpose: Instrumented ultrasonic tracking provides needle localisation during ultrasound-guided minimally invasive percutaneous procedures. Here, a post-processing framework based on a convolutional neural network (CNN) is proposed to improve the spatial resolution of ultrasonic tracking images.

Methods: The custom ultrasonic tracking system comprised a needle with an integrated fibre-optic ultrasound (US) transmitter and a clinical US probe for receiving those transmissions and for acquiring B-mode US images. For post-processing of tracking images reconstructed from the received fibre-optic US transmissions, a recently developed framework based on a ResNet architecture, trained with a purely synthetic dataset, was employed. A preliminary evaluation of this framework was performed with data acquired from needle insertions in the heart of a fetal sheep in vivo. The axial and lateral spatial resolutions of the tracking images served as performance metrics for the trained network.
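The abstract does not state how the axial and lateral resolutions were measured; a common convention for point-like sources such as a needle-tip transmitter is the full width at half maximum (FWHM) of a 1-D intensity profile taken through the tip in the tracking image. The sketch below illustrates that convention only (the function name and half-maximum interpolation scheme are assumptions, not the authors' implementation):

```python
import numpy as np

def fwhm(profile, pixel_spacing_mm):
    """Full width at half maximum of a 1-D intensity profile, in mm.

    Walks outward from the peak sample and linearly interpolates the
    two half-maximum crossings. Assumes a single dominant peak that is
    not at the edge of the profile.
    """
    profile = np.asarray(profile, dtype=float)
    half = profile.max() / 2.0
    peak = int(np.argmax(profile))

    # Walk left from the peak to the first sample at or below half maximum.
    left = peak
    while left > 0 and profile[left] > half:
        left -= 1
    # Fractional index of the left crossing (linear interpolation).
    l = left + (half - profile[left]) / (profile[left + 1] - profile[left])

    # Walk right from the peak to the first sample at or below half maximum.
    right = peak
    while right < len(profile) - 1 and profile[right] > half:
        right += 1
    r = right - 1 + (half - profile[right - 1]) / (profile[right] - profile[right - 1])

    return (r - l) * pixel_spacing_mm
```

For a Gaussian profile with standard deviation sigma, this should recover the analytic value 2.3548 * sigma, which is a convenient sanity check for the interpolation.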

Results: Application of the CNN yielded improvements in the spatial resolution of the tracking images. In three needle insertions, in which the tip depth ranged from 23.9 to 38.4 mm, the lateral resolution improved from 2.11 to 1.58 mm, and the axial resolution improved from 1.29 to 0.46 mm.
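Expressed as relative gains, the figures reported above correspond to roughly a quarter reduction in the lateral point spread and nearly two-thirds in the axial direction; the snippet below simply recomputes those ratios from the abstract's numbers:

```python
# Resolution values (mm) taken directly from the reported results.
lateral_before, lateral_after = 2.11, 1.58
axial_before, axial_after = 1.29, 0.46

# Fractional reduction in the resolvable feature size.
lateral_gain = (lateral_before - lateral_after) / lateral_before
axial_gain = (axial_before - axial_after) / axial_before

print(f"lateral: {lateral_gain:.0%} finer, axial: {axial_gain:.0%} finer")
# prints "lateral: 25% finer, axial: 64% finer"
```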

Conclusion: The results provide strong indications of the potential of CNNs to improve the spatial resolution of ultrasonic tracking images and thereby to increase the accuracy of needle tip localisation. These improvements could have broad applicability across multiple clinical fields, potentially improving procedural efficiency and reducing the risk of complications.


Series: International journal of computer assisted radiology and surgery
ISSN: 1861-6410
ISSN-E: 1861-6429
ISSN-L: 1861-6410
Volume: 18
Pages: 395 - 399
DOI: 10.1007/s11548-022-02728-7
Type of Publication: A1 Journal article – refereed
Field of Science: 217 Medical engineering
113 Computer and information sciences
Funding: This work was funded by the Wellcome (WT101957; 203145Z/16/Z; 203148/Z/16/Z) and the Engineering and Physical Sciences Research Council (EPSRC) (NS/A000027/1; NS/A000050/1; NS/A000049/1; EP/L016478/1; EP/M020533/1; EP/S001506/1), by the European Research Council (ERC-2012-StG, Proposal 310970 MOPHIM), by the Rosetrees Trust (PGS19-2/10006) and by the Academy of Finland (336796; 338408). A.L.D. is supported by the UCL/UCL Hospital National Institute for Health Research Comprehensive Biomedical Research Centre.
Academy of Finland Grant Number: 336796
Detailed Information: 336796 (Academy of Finland Funding decision)
338408 (Academy of Finland Funding decision)
Copyright information: © The Author(s) 2022. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit