This code release contains the implementation of the feature-tracking algorithms for event streams described in the two papers listed below. In particular, the sparsity and asynchronicity of the event stream are explicitly exploited, enabling efficient, asynchronous tracking of features via event-by-event processing. We rely on a novel hypothesis-based tracking paradigm for event-based features that avoids explicitly optimizing the underlying, expensive alignment problem. This event-based feature-tracking software is publicly available and can be accessed from link.
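To give a rough intuition for the hypothesis-based, event-by-event paradigm, the toy sketch below maintains a small set of candidate feature states (hypotheses), scores each incoming event against all of them, and jumps to the dominating hypothesis instead of solving an explicit alignment optimization. All names, the scoring rule, and the parameters are illustrative assumptions, not the actual algorithm from the papers or the released code:

```python
import math

class HypothesisTracker:
    """Toy sketch of hypothesis-based, event-by-event feature tracking.

    A small set of candidate displacements (hypotheses) is kept; every
    incoming event updates the score of each hypothesis, and the tracker
    state shifts only when a non-null hypothesis clearly dominates.
    The scoring and parameters here are illustrative, not the released code.
    """

    # Candidate displacements of the feature: null hypothesis + 4 shifts.
    HYPOTHESES = [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]

    def __init__(self, x, y, radius=3.0, threshold=5.0):
        self.x, self.y = float(x), float(y)   # current feature position
        self.radius = radius                  # spatial support of the feature
        self.threshold = threshold            # score margin to accept a shift
        self.scores = [0.0] * len(self.HYPOTHESES)

    def update(self, ex, ey):
        """Process one event at pixel (ex, ey); return True if the state moved."""
        for i, (dx, dy) in enumerate(self.HYPOTHESES):
            # Score by proximity of the event to the hypothesized feature center.
            d = math.hypot(ex - (self.x + dx), ey - (self.y + dy))
            if d <= self.radius:
                self.scores[i] += self.radius - d
        best = max(range(len(self.scores)), key=self.scores.__getitem__)
        # Accept a shift only when it beats the null hypothesis by a margin.
        if best != 0 and self.scores[best] - self.scores[0] >= self.threshold:
            dx, dy = self.HYPOTHESES[best]
            self.x += dx
            self.y += dy
            self.scores = [0.0] * len(self.HYPOTHESES)
            return True
        return False
```

For example, feeding a stream of events slightly to the right of the current position gradually builds evidence for the rightward hypothesis until the tracker shifts, with no batch optimization ever being run.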
Users of this software are kindly asked to cite at least one of the following papers, where it was introduced:
Ignacio Alzugaray and Margarita Chli, "Asynchronous Multi-Hypothesis Tracking of Features with Event Cameras", in Proceedings of the IEEE International Conference on 3D Vision (3DV), 2019.
Ignacio Alzugaray and Margarita Chli, "HASTE: Multi-Hypothesis Asynchronous Speeded-up Tracking of Events", in Proceedings of the British Machine Vision Conference (BMVC), 2020.