University of Oulu

Nguyen, K., Nguyen, H.H., Tiulpin, A. (2022). AdaTriplet: Adaptive Gradient Triplet Loss with Automatic Margin Learning for Forensic Medical Image Matching. In: Wang, L., Dou, Q., Fletcher, P.T., Speidel, S., Li, S. (eds) Medical Image Computing and Computer Assisted Intervention – MICCAI 2022. MICCAI 2022. Lecture Notes in Computer Science, vol 13438. Springer, Cham.

AdaTriplet: Adaptive gradient triplet loss with automatic margin learning for forensic medical image matching

Author: Nguyen, Khanh1; Nguyen, Huy Hoang1; Tiulpin, Aleksei1
Organizations: 1University of Oulu, Oulu, Finland
Format: article
Version: accepted version
Access: embargoed
Persistent link:
Language: English
Published: Springer Nature, 2022
Publish Date: 2023-09-16


This paper tackles the challenge of forensic medical image matching (FMIM) using deep neural networks (DNNs). FMIM is a particular case of content-based image retrieval (CBIR). The main challenge in FMIM compared to the general case of CBIR is that the subject to whom a query image belongs may be affected by aging and progressive degenerative disorders, making it difficult to match data on a subject level. CBIR with DNNs is generally solved by minimizing a ranking loss, such as the Triplet loss (TL), computed on image representations extracted by a DNN from the original data. TL, in particular, operates on triplets: an anchor, a positive (similar to the anchor), and a negative (dissimilar to the anchor). Although TL has been shown to perform well in many CBIR tasks, it still has limitations, which we identify and analyze in this work. In this paper, we introduce (i) the AdaTriplet loss — an extension of TL whose gradients adapt to different difficulty levels of negative samples, and (ii) the AutoMargin method — a technique to dynamically adjust hyperparameters of margin-based losses such as TL and our proposed loss. Our results are evaluated on two large-scale benchmarks for FMIM based on the Osteoarthritis Initiative and Chest X-ray-14 datasets. The code allowing replication of this study has been made publicly available at
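For readers unfamiliar with the ranking loss mentioned above, the sketch below shows the standard triplet loss that AdaTriplet extends — not the paper's adaptive-gradient variant itself, just the baseline formulation max(0, d(a, p) − d(a, n) + margin) with a fixed margin and squared Euclidean distance; the embedding vectors and margin value are illustrative assumptions:

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Standard triplet loss on embedding vectors:
    max(0, d(a, p) - d(a, n) + margin), with squared Euclidean d."""
    d_pos = np.sum((anchor - positive) ** 2)  # anchor-positive distance
    d_neg = np.sum((anchor - negative) ** 2)  # anchor-negative distance
    return max(0.0, d_pos - d_neg + margin)

a = np.array([1.0, 0.0])   # anchor embedding (toy values)
p = np.array([0.9, 0.1])   # positive: close to the anchor
n_easy = np.array([-1.0, 0.0])  # easy negative: far away, loss is zero
n_hard = np.array([0.8, 0.2])   # hard negative: near the anchor, loss > 0

print(triplet_loss(a, p, n_easy))  # 0.0
print(triplet_loss(a, p, n_hard))  # positive
```

Easy negatives yield zero loss and hence zero gradient, while all violating negatives contribute equally regardless of how hard they are — the limitation that motivates the paper's adaptive gradients and automatic margin.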


Series: Lecture notes in computer science
ISSN: 0302-9743
ISSN-E: 1611-3349
ISSN-L: 0302-9743
Volume: 13438
Pages: 725–735
DOI: 10.1007/978-3-031-16452-1_69
Host publication: Medical Image Computing and Computer Assisted Intervention – MICCAI 2022: 25th International Conference, Singapore, September 18–22, 2022, Proceedings, Part VIII
Host publication editors: Wang, Linwei; Dou, Qi; Fletcher, P. Thomas; Speidel, Stefanie; Li, Shuo
Conference: International conference on medical image computing and computer assisted intervention
Type of Publication: A4 Article in conference proceedings
Field of Science: 217 Medical engineering
Funding: The OAI is a public-private partnership comprised of five contracts (N01-AR-2-2258; N01-AR-2-2259; N01-AR-2-2260; N01-AR-2-2261; N01-AR-2-2262) funded by the National Institutes of Health, a branch of the Department of Health and Human Services, and conducted by the OAI Study Investigators. Private funding partners include Merck Research Laboratories; Novartis Pharmaceuticals Corporation; GlaxoSmithKline; and Pfizer, Inc. Private sector funding for the OAI is managed by the Foundation for the National Institutes of Health. We would like to thank the strategic funding of the University of Oulu, the Academy of Finland Profi6 336449 funding program, the Northern Ostrobothnia hospital district, Finland (VTR project K33754), and the Sigrid Jusélius Foundation for funding this work. Furthermore, the authors wish to acknowledge CSC – IT Center for Science, Finland, for generous computational resources.
Copyright information: © 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG. This is a post-peer-review, pre-copyedit version of an article published in Lecture Notes in Computer Science. The final authenticated version is available online at: