Code: Event-based Feature Tracking
Hypotheses-based Asynchronous Feature Tracking for Event Cameras
This code release contains the implementation of the feature tracking algorithms operating within the event stream, described in the two papers listed below. The sparsity and asynchronicity of the event stream are explicitly exploited here, enabling efficient, asynchronous tracking of features via event-by-event processing. We rely on a novel hypothesis-based tracking paradigm for event-based features that avoids explicitly solving the expensive underlying alignment problem. This event-based feature tracking software is publicly available and can be accessed via the link.
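To illustrate the idea behind hypothesis-based, event-by-event tracking (this is a simplified sketch, not the released implementation; the voting scheme, scoring function, and thresholds below are illustrative assumptions), one can maintain a small set of candidate feature displacements and let each incoming event vote for the hypothesis that best explains it. When one hypothesis clearly dominates, it is committed and the competition restarts, so no explicit alignment optimization is ever solved:

```python
import numpy as np

def hypothesis_track(events, center, hypotheses,
                     margin=1.5, commit_score=5.0):
    """Sketch of hypothesis-based asynchronous event tracking.

    Each event casts a soft vote for every candidate displacement
    (hypothesis); when one hypothesis clearly dominates the runner-up,
    it is committed and the hypothesis race restarts.
    """
    center = np.asarray(center, dtype=float)
    hyps = np.asarray(hypotheses, dtype=float)
    scores = np.zeros(len(hyps))
    for x, y, _t, _polarity in events:       # event-by-event processing
        ev = np.array([x, y], dtype=float)
        # soft vote: hypotheses placing the feature closer to the event
        # explain it better and receive a larger score increment
        dists = np.linalg.norm(ev - (center + hyps), axis=1)
        scores += 1.0 / (1.0 + dists)
        best = int(np.argmax(scores))
        runner_up = np.sort(scores)[-2]
        if scores[best] > commit_score and scores[best] >= margin * runner_up:
            center = center + hyps[best]     # commit winning hypothesis
            scores[:] = 0.0                  # restart the competition

    return center

# Toy usage: events cluster at (12, 10) while the feature starts at (10, 10)
hyps = [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1), (2, 0)]
events = [(12, 10, t, 1) for t in range(30)]
print(hypothesis_track(events, (10, 10), hyps))  # -> [12. 10.]
```

The key property mirrored here is that each event triggers only a cheap per-hypothesis score update, so the tracker's state advances asynchronously with the event stream rather than on a fixed frame clock.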
Users of this software are kindly asked to cite at least one of the following papers, where it was introduced:
Ignacio Alzugaray and Margarita Chli, "Asynchronous Multi-Hypothesis Tracking of Features with Event Cameras" in Proceedings of the IEEE International Conference on 3D Vision (3DV), 2019. Research Collection | Video
Ignacio Alzugaray and Margarita Chli, "HASTE: multi-Hypothesis Asynchronous Speeded-up Tracking of Events" in Proceedings of the British Machine Vision Conference (BMVC), 2020. Research Collection | Presentation Video