Authors: S. Hicks, V. Thambawita, A. Storås, T. B. Haugen, H. L. Hammer, P. Halvorsen, M. Riegler and M. H. Stensen
Title: Automatic Tracking of the ICSI procedure using Deep Learning
Affiliation: Machine Learning
Project(s): Department of Holistic Systems
Publication Type: Journal Article
Year of Publication: 2022
Journal: Human Reproduction
Date Published: 07/2022
Publisher: Oxford University Press

Study question

Can deep learning be used to detect and track spermatozoa and the different parts of an ICSI procedure?

Summary answer

Deep learning can be used as a tool to assist and organize the contents of an ICSI procedure.

What is known already

Sperm tracking has been a topic of research and practice for many years, especially in the context of computer-aided sperm analysis (CASA). Recent studies have proposed using deep learning algorithms to track spermatozoa for spermatozoon selection in human and animal samples. One critical part of performing ICSI involves the selection of the “best” spermatozoon for injection, but other parts of the procedure may also be of importance. However, as far as we know, tracking using deep learning has not been applied to the ICSI procedure, where detecting instruments and the oocyte could also be helpful in post-analysis and training.

Study design, size, duration

The study was performed using three anonymized videos of the ICSI procedure. The frames of the videos were manually annotated by data scientists and verified by an embryologist. The annotations were bounding boxes around specific parts of the ICSI procedure, including sperm, pipettes, and the oocyte. We trained a YOLOv5 model on the collected data, where two videos were used for training and one video for validation.
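The split described above, two videos for training and one for validation, rotated across three folds, amounts to leave-one-video-out cross-validation. A minimal sketch of such a fold generator (the video names are hypothetical placeholders, not the study's actual files):

```python
def leave_one_out_folds(videos):
    """Yield (train, held_out) splits, holding out one video per fold."""
    for i, held_out in enumerate(videos):
        train = [v for j, v in enumerate(videos) if j != i]
        yield train, held_out

# With three videos, this produces the three folds used for evaluation.
folds = list(leave_one_out_folds(["video_1", "video_2", "video_3"]))
```

Splitting at the video level, rather than at the frame level, avoids near-duplicate frames from the same procedure leaking between training and validation sets.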

Participants/materials, setting, methods

The videos of the ICSI procedure were captured at 200x magnification with a DeltaPix camera at Fertilitetssenteret in Oslo, Norway. ICSI was performed using a Nikon ECLIPSE TE2000-S microscope connected with Eppendorf TransferMan 4m micromanipulators. The spermatozoa were immobilised in 5 µl Polyvinylpyrrolidone (PVP; CooperSurgical). The videos had a resolution of 1920x1080 and were resized to 640x640 before being processed by the YOLOv5 model. The data will be made public in a later study.
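Because the 1920x1080 frames are resized to 640x640 before being fed to YOLOv5, the bounding-box annotations must be rescaled by the same (non-uniform) factors. A minimal sketch of that coordinate transform, assuming boxes in (x1, y1, x2, y2) pixel format:

```python
def scale_box(box, src=(1920, 1080), dst=(640, 640)):
    """Rescale an (x1, y1, x2, y2) box from source to destination resolution."""
    sx, sy = dst[0] / src[0], dst[1] / src[1]  # per-axis scale factors
    x1, y1, x2, y2 = box
    return (x1 * sx, y1 * sy, x2 * sx, y2 * sy)

# A box centred in the original frame, rescaled for the 640x640 input.
scaled = scale_box((960, 540, 1160, 740))
```

Note that squashing a 16:9 frame into a square input distorts the aspect ratio; an alternative is letterboxing, which pads the shorter side and preserves object shapes.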

Main results and the role of chance

Mean average precision (mAP) at an intersection-over-union (IoU) threshold of 0.5 (mAP@.5) is the main quantitative metric used to evaluate the YOLOv5 model. All experiments were performed using three-fold cross-validation, and we report the average metrics over the three folds. Overall, the method achieved an average mAP@.5 of 0.50 across all predicted classes. Looking closer at the individual classes, the instruments are detected with high accuracy: the holding pipette and the ICSI pipette reach a mAP@.5 of 0.87 and 0.94, respectively. The oocyte is also tracked well, with a mAP@.5 of 0.92, and the first polar body is detected with a mAP@.5 of 0.65. The model struggles to detect and track individual sperm, achieving a mAP@.5 of 0.46 for sperm outside the pipette and 0.03 for sperm inside the pipette. The low score for sperm inside the pipette can be explained by the often poor visibility of the sperm through the pipette and the low number of training samples.
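The "@.5" in mAP@.5 refers to the IoU threshold at which a predicted box counts as a correct detection: a prediction matching a ground-truth box with IoU of at least 0.5 is a true positive. A minimal sketch of the IoU computation for axis-aligned (x1, y1, x2, y2) boxes:

```python
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)  # overlap area (0 if disjoint)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

# Two boxes overlapping by 75 of 125 combined units per axis: IoU = 0.6,
# so this prediction would count as a true positive at the 0.5 threshold.
match = iou((0, 0, 100, 100), (25, 0, 125, 100)) >= 0.5
```

Average precision is then computed per class from the precision-recall curve over all such matches, and mAP is the mean across classes.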

Limitations, reasons for caution

The limited sample size makes the generalizability of the method difficult to determine, and a more extensive evaluation is necessary. Moreover, as the current study focuses on tracking, patient information and clinical outcomes were not included in the analysis.

Wider implications of the findings

Deep learning has the potential to aid embryologists in performing successful ICSI through the tracking and detection of spermatozoa, pipettes, and the oocyte. This could lead to better internal quality control and teaching opportunities, and, hopefully, better outcomes.

Trial registration number

Not applicable

Citation Key: 42723