Temporal-Mapping Photography for Event Cameras

Yuhan Bao, Lei Sun*, Yuqin Ma, Kaiwei Wang*

Abstract


"Event cameras, or Dynamic Vision Sensors (DVS) are novel neuromorphic sensors that capture brightness changes as a continuous stream of “events” rather than traditional intensity frames. Converting sparse events to dense intensity frames faithfully has long been an ill-posed problem. Previous methods have primarily focused on converting events to video in dynamic scenes or with a moving camera. In this paper, for the first time, we realize events to dense intensity image conversion using a stationary event camera in static scenes with a transmittance adjustment device for brightness modulation. Different from traditional methods that mainly rely on event integration, the proposed Event-Based Temporal Mapping Photography () measures the time of event emitting for each pixel. Then, the resulting is converted to an intensity frame with a temporal mapping neural network. At the hardware level, the proposed is implemented by combining a transmittance adjustment device with a DVS, named Adjustable Transmittance Dynamic Vision Sensor (). Additionally, we collected under various conditions including low-light and high dynamic range scenes. The experimental results showcase the high dynamic range, fine-grained details, and high-grayscale resolution of the proposed . The code and dataset are available in https://github.com/YuHanBaozju/EvTemMap."

Related Material


[pdf] [supplementary material] [DOI]