Tracking particles in time-lapse fluorescence microscopy image sequences is of fundamental importance for quantifying dynamic processes of intracellular structures as well as virus structures. We introduce a probabilistic deep learning approach for fluorescent particle tracking based on a recurrent neural network that mimics classical Bayesian filtering. In contrast to previous deep learning methods for particle tracking, our approach takes both aleatoric and epistemic uncertainty into account, thereby providing information about the reliability of the computed trajectories. Manual tuning of tracking parameters is unnecessary, and no prior knowledge of the noise statistics is required. Short- and long-term temporal dependencies of individual object dynamics are exploited for state prediction, and assigned detections are used to update the predicted states. For correspondence finding, we introduce a neural network that computes assignment probabilities jointly across multiple detections and also determines the probabilities of missing detections. Training requires only simulated data, so tedious manual annotation of ground truth is not needed. We performed a quantitative performance evaluation on synthetic and real 2D as well as 3D fluorescence microscopy images, using image data from the Particle Tracking Challenge as well as real time-lapse fluorescence microscopy images displaying virus structures and chromatin structures. Our approach yields state-of-the-art results or improves on the tracking results of previous methods.
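The classical Bayesian filtering cycle that the recurrent network mimics can be illustrated with a minimal one-dimensional Kalman filter. This is a sketch of the predict/update principle only, not the authors' network; all model parameters (`F`, `H`, `Q`, `R`) are illustrative assumptions, not values from the paper.

```python
def kalman_step(x, P, z, F=1.0, H=1.0, Q=0.01, R=0.1):
    """One Bayesian filtering cycle: predict the state from the motion
    model, then correct it with a measurement z. Returns the updated
    state mean x and variance P (the filter's uncertainty estimate)."""
    # Prediction: propagate state mean and variance through the motion model
    x_pred = F * x
    P_pred = F * P * F + Q
    # Update: weight the measurement residual by the Kalman gain K
    K = P_pred * H / (H * P_pred * H + R)
    x_new = x_pred + K * (z - H * x_pred)
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new

# Track a stationary particle with repeated measurements at position 1.0:
# the estimate converges toward the measurement and the variance shrinks,
# i.e. the filter reports increasing confidence in the trajectory.
x, P = 0.0, 1.0
for _ in range(5):
    x, P = kalman_step(x, P, z=1.0)
```

In the paper's approach, this hand-crafted prediction step is replaced by a learned recurrent state update, and the assignment of measurements `z` to tracks is itself computed by a neural network.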
Keywords: Biomedical imaging; Deep learning; Microscopy images; Tracking.
Copyright © 2021 Elsevier B.V. All rights reserved.