Event-based imaging, also known as neuromorphic imaging, is a recent visual information modality that relies on emerging event-based vision sensors: bio-inspired sensors that mimic the sensing behavior of a biological retina. In conventional cameras, all sensor pixels acquire visual information simultaneously at regular time intervals (i.e., the frame rate). In event cameras, by contrast, each sensor pixel controls the light acquisition process asynchronously and independently, according to the dynamics of the visual scene (e.g., when lighting conditions change or due to scene/camera motion), thus producing an output with a variable data rate.
Generally, event cameras follow a differential visual sampling model: time-domain changes in the incoming light intensity (i.e., temporal contrast) are detected pixel-wise and compared to a threshold, triggering a so-called event whenever the change exceeds that threshold. This sampling model gives event cameras notable advantages over conventional cameras, such as high temporal resolution (the small time interval at which a sensor pixel can react to scene dynamics), very high dynamic range, low latency, and low power consumption. These are compelling properties in scenarios that are particularly challenging for conventional cameras, such as scenes with high-speed motion and/or uncontrolled lighting conditions. Event cameras may therefore benefit application scenarios such as autonomous driving, drones, and robotics, where conventional cameras often fail to perform well. Moreover, event cameras may find further applications in industrial automation, visual surveillance, augmented reality, and mobile environments, where fast response, high dynamic range, or low power consumption is critically needed.
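The differential sampling model described above can be sketched in a few lines of code. The following is a minimal, illustrative simulation of a single event-camera pixel, assuming the commonly used log-intensity temporal-contrast model; the function name, the threshold value, and the (timestamp, intensity) input format are assumptions for illustration, not part of any camera API or the prospective JPEG format.

```python
import math

def generate_events(samples, threshold=0.2):
    """Simulate one event-camera pixel under a differential sampling model:
    an event fires whenever the log-intensity change since the last event
    crosses the contrast threshold. `samples` is a list of
    (timestamp, intensity) pairs with strictly positive intensities.
    """
    events = []
    _, i0 = samples[0]
    ref = math.log(i0)                     # reference log-intensity at last event
    for t, intensity in samples[1:]:
        delta = math.log(intensity) - ref
        while abs(delta) >= threshold:     # large changes may emit several events
            polarity = 1 if delta > 0 else -1
            events.append((t, polarity))   # (timestamp, ON/OFF polarity)
            ref += polarity * threshold    # move the reference by one threshold step
            delta = math.log(intensity) - ref
    return events

# A brightening then constant signal triggers ON events only while it changes:
samples = [(0, 1.0), (1, 1.5), (2, 2.25), (3, 2.25)]
print(generate_events(samples, threshold=0.4))  # → [(1, 1), (2, 1)]
```

Note how the output illustrates the variable data rate mentioned above: events are produced only at timestamps 1 and 2, while the sensor stays silent once the scene is static.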
A standardized event data coding framework is not yet available, which is an obstacle to the deployment of applications where this type of visual sensing could be beneficial. Standardization can offer a well-defined format for representing visual information captured by event cameras, allowing sensor manufacturers to interoperate with other technology providers and to exploit event data efficiently. For this reason, the JPEG committee believes the time has come to conduct exploratory research toward the standardization of event-based imaging formats.
In this context, the EVASION project targets the research, development, and evaluation of event data coding solutions towards standardizing an event data coding format in JPEG. This European project was proposed by an international consortium formed by three SMEs (RayShaper, IntoPix, and Prophesee), where Touradj Ebrahimi (RayShaper founder) assumed the role of the Project Coordinator. IT participation will be led by João Ascenso, with Catarina Brites and Ahmadreza Sezavar as participants.
More on the project: