Realtime Time Synchronized Event-based Stereo
Alex Zihao Zhu, Yibo Chen, Kostas Daniilidis; The European Conference on Computer Vision (ECCV), 2018, pp. 433-447
Abstract
In this work, we propose a novel event-based stereo method which addresses the problem of motion blur for a moving event camera. Our method uses the velocity of the camera and a range of disparities to synchronize the positions of the events, as if they were captured at a single point in time. We represent these events using a pair of novel time-synchronized event disparity volumes, which we show remove motion blur for pixels at the correct disparity in the volume, while further blurring pixels at the wrong disparity. We then apply a novel matching cost over these time-synchronized event disparity volumes, which rewards similarity between the volumes while penalizing blurriness. We show that our method outperforms more expensive, smoothing-based event stereo methods by evaluating on the Multi Vehicle Stereo Event Camera dataset.
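As a rough illustration of the time-synchronization idea described in the abstract, the sketch below warps events to a common reference time using the camera's velocity and the depth implied by a candidate disparity, then accumulates them into an image whose sharpness indicates whether the disparity is correct. The event array layout, the purely translational flow model, and the function names are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def synchronize_events(events, t_ref, velocity, disparity, focal, baseline):
    """Warp events to a common reference time t_ref.

    events    : (N, 4) array of (x, y, t, polarity)      -- assumed layout
    velocity  : (vx, vy, vz) camera linear velocity       -- assumed model
    disparity : candidate disparity in pixels
    focal     : focal length in pixels
    baseline  : stereo baseline in meters

    Events at the correct disparity collapse onto sharp edges after
    warping, while events at the wrong disparity blur further (the core
    idea of the paper, sketched loosely here without rotation).
    """
    x, y, t, p = events.T
    depth = focal * baseline / max(disparity, 1e-6)        # Z = f * b / d
    vx, vy, vz = velocity
    # Image-plane motion field induced by pure camera translation.
    flow_x = (-focal * vx + x * vz) / depth
    flow_y = (-focal * vy + y * vz) / depth
    dt = t_ref - t
    x_sync = x + flow_x * dt
    y_sync = y + flow_y * dt
    return np.stack([x_sync, y_sync, np.full_like(t, t_ref), p], axis=1)

def event_image(events, shape):
    """Accumulate event counts into an image; sharper means a better disparity."""
    img = np.zeros(shape)
    xi = np.clip(events[:, 0].round().astype(int), 0, shape[1] - 1)
    yi = np.clip(events[:, 1].round().astype(int), 0, shape[0] - 1)
    np.add.at(img, (yi, xi), 1.0)
    return img
```

In this toy version, one would repeat the warp over a range of candidate disparities for both cameras and compare the resulting images; the paper's actual matching cost over the two disparity volumes additionally penalizes blur rather than only rewarding similarity.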
Related Material
[pdf] [bibtex]
@InProceedings{Zhu_2018_ECCV,
author = {Zihao Zhu, Alex and Chen, Yibo and Daniilidis, Kostas},
title = {Realtime Time Synchronized Event-based Stereo},
booktitle = {The European Conference on Computer Vision (ECCV)},
month = {September},
year = {2018}
}