Event-Based Feature Tracking Using the Iterative Closest Point Algorithm (for a DAVIS event camera)

Event-based Feature Tracking

In this project, we implement a feature detection and tracking algorithm for an event-based vision sensor: the Dynamic and Active-pixel Vision Sensor (DAVIS) event camera.


Event cameras offer advantages such as high temporal resolution, minimal latency, and extended dynamic range compared to traditional frame-based cameras; however, specialized algorithms are required to exploit the asynchronous data they produce. This motivated our feature detection and tracking project.

The ICP-based tracker is initialized with edge and corner features detected in the DAVIS grayscale frames using OpenCV. A collection of interest points is generated by drawing a bounding box around each corner feature. For every n events generated by the camera, an event image is created by binning those n events, and the same bounding boxes are used to extract a corresponding collection of event interest points. The ICP algorithm then matches each pair of corresponding point sets, and the estimated displacement is used to transform the original features and their bounding boxes. This process repeats iteratively over the entire sequence.
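The two core steps of this loop, binning a batch of events into an event image and aligning event points to a template point set with ICP, can be sketched roughly as follows. This is a minimal NumPy illustration, not the project's exact implementation: the function names are ours, and the ICP variant shown estimates a pure 2D translation with brute-force nearest-neighbour correspondences, whereas a full tracker might fit a richer transform.

```python
import numpy as np

def bin_events(events, shape, n=100):
    """Accumulate the first n events (given as (x, y) pixel coordinates)
    into a binary event image of the given (height, width) shape."""
    img = np.zeros(shape, dtype=np.uint8)
    for x, y in events[:n]:
        img[y, x] = 1
    return img

def icp_translation(src, dst, iters=20):
    """Estimate the 2D translation aligning point set src to dst.
    Each iteration pairs every (translated) source point with its
    nearest neighbour in dst, then updates the translation by the
    mean residual of those pairs."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    t = np.zeros(2)
    for _ in range(iters):
        moved = src + t
        # squared distances between every moved source point and every dst point
        d2 = ((moved[:, None, :] - dst[None, :, :]) ** 2).sum(axis=-1)
        nn = dst[d2.argmin(axis=1)]          # nearest neighbour per source point
        t += (nn - moved).mean(axis=0)       # translation-only update
    return t
```

On a synthetic pair of point sets related by a pure shift, `icp_translation` recovers that shift; in the tracker, the recovered displacement would then be applied to the feature positions and their bounding boxes.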

To mitigate the inevitable loss of features, a feature reinitialization step is also included in the pipeline. Together, these steps realize feature detection and tracking by way of events. One key limitation of this approach is the loss of the exact asynchronous nature of the events, since groups of events are binned into event images.
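The reinitialization step can be sketched as follows. This is an illustrative assumption about how such a step might be structured, not the project's actual code: we assume each tracked feature's health is judged by how many recent events fell inside its bounding box, and `detect_corners` is a hypothetical callback that re-runs corner detection on the latest grayscale frame.

```python
def maybe_reinitialize(features, events_per_feature, detect_corners, min_events=10):
    """Drop features supported by too few recent events; if any were lost,
    re-detect features on the latest grayscale frame to restore the set.

    features          -- list of tracked feature positions
    events_per_feature-- event count inside each feature's bounding box
    detect_corners    -- hypothetical callback returning fresh corner features
    """
    kept = [f for f, n in zip(features, events_per_feature) if n >= min_events]
    if len(kept) < len(features):
        # at least one feature was lost: reinitialize from the grayscale frame
        kept = detect_corners()
    return kept
```

In practice the event-count threshold, and whether to re-detect all features or only replace the lost ones, are design choices the pipeline would tune per sequence.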