Interview with OmniVision

Ahead of the much-anticipated Image Sensors Europe 2023, we spoke to industry expert Andreas Süss from OmniVision Technologies.

We asked Andreas to give us a sneak peek of his upcoming presentation at the event, as well as insights into what he thinks are the major challenges and breakthroughs in image sensor technology advancements.

Andreas, you are speaking on “Towards hybrid event/image vision” at this year’s conference. Can you give us a snapshot of what delegates will hear?

Neuromorphic imaging, also known as Event-based Vision Sensing (EVS), uses smart pixels that create events upon changes of light intensity. This makes it possible to save bandwidth and power for portions of the scene that remain static and dedicate them to where things are actually happening. Hence, EVS is a promising imaging technology for capturing sparse, fast motion in a power-efficient manner. As EVS lacks an absolute reference, a significant number of use-cases combine it with conventional CMOS image sensors (CIS). We will present our new hybrid EVS+CIS sensor and showcase how it benefits use-cases such as rolling-shutter correction, deblurring and video frame interpolation for slow-motion imaging. We will further illuminate how we generate ground-truth reference data for algorithm development, which is a key requirement to kickstart these new applications.
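To make the event-generation principle concrete, here is a minimal, illustrative Python sketch that emits an event whenever a pixel's log intensity has changed by more than a contrast threshold since that pixel's last event. The function name, the fixed threshold and the frame-based input are assumptions for illustration only; a real EVS pixel works asynchronously in analog circuitry, and this is not OmniVision's implementation.

```python
import numpy as np

def simulate_events(frames, timestamps, threshold=0.2, eps=1e-6):
    """Toy event generator: emit an event whenever the change in log intensity
    since the last event at a pixel exceeds `threshold`.
    Returns a list of (t, x, y, polarity) tuples."""
    ref = np.log(frames[0].astype(np.float64) + eps)    # per-pixel reference level
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_i = np.log(frame.astype(np.float64) + eps)
        diff = log_i - ref
        fired = np.abs(diff) >= threshold                # pixels crossing the contrast threshold
        ys, xs = np.nonzero(fired)
        for x, y in zip(xs, ys):
            events.append((t, int(x), int(y), 1 if diff[y, x] > 0 else -1))
        ref[fired] = log_i[fired]                        # reset reference only where events fired
    return events
```

Static pixels never cross the threshold and produce no data, which is where the bandwidth and power savings mentioned above come from; frame-based simulations in this spirit are also one common way to approximate event data from conventional video.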

What do you see as the major challenges for, and breakthroughs in, image sensor technology advancements in the short to medium term?

Use-cases that combine fast but reference-free Event-based Vision Sensors (EVS) with high-quality but slow CMOS Image Sensors (CIS) require the two sensors to be well aligned in space and time so that events can be overlaid with image data. This places significant demands on mechanical alignment tolerances and on the electronic interface between the sensors. We solve these aspects with our new hybrid EVS+CIS sensor, a technology that only recently became possible thanks to advancements in wafer-stacking technology. Now that advanced hybrid EVS+CIS sensors are available, we will see a significant amount of work on the application side. Use-cases based on neural networks often require end-to-end training, so we need to be able to synthesize realistic ground-truth training data. Another challenge is that a significant number of algorithms, e.g. for deblurring or slow motion, require very large networks, which prohibits integration into mobile platforms. We will see progress in algorithm development to enable mobile solutions, as well as in neuromorphic computing platforms that utilize the sparse event data more efficiently.
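As a rough illustration of the temporal-alignment step (a hypothetical helper that ignores spatial registration, lens distortion and the actual sensor interface), events can be binned into the exposure window of each CIS frame so that the two data streams can be overlaid:

```python
import numpy as np

def events_to_frame(events, t_start, t_end, shape):
    """Accumulate events whose timestamps fall inside a CIS exposure window
    [t_start, t_end) into a signed per-pixel count, so they can be overlaid
    on the corresponding image frame (e.g. for deblurring or frame interpolation).
    `events` is an iterable of (t, x, y, polarity) tuples; `shape` is (height, width)."""
    acc = np.zeros(shape, dtype=np.int32)
    for t, x, y, p in events:
        if t_start <= t < t_end:
            acc[y, x] += p       # net sign of brightness change at this pixel
    return acc
```

Even this toy version shows why alignment matters: if the event and image timestamps drift relative to each other, the accumulated events land in the wrong exposure window and no longer correspond to the motion captured in that frame.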

What are you most looking forward to hearing about and discussing with your fellow speakers and delegates at this year’s conference?

I am looking forward to discussing new use-cases for neuromorphic imaging. 

Andreas Süss is Senior Manager of Novel Image Sensor Systems and Principal Engineer in the CTO Office at OmniVision Technologies.