Autonomous driving in LFM and HDR situations

15th July 2019
Posted By : Lanna Cooper

LED Flicker Mitigation (LFM) and High Dynamic Range (HDR) address common scenarios that many autonomous vehicles struggle with. Learn here how to design autonomous vehicles that operate effectively in these dangerous scenarios.

Baumann Hajji from the automotive solutions division at ON Semiconductor presents a number of different challenges facing image sensors.

These sensors are used in the cameras that act as the eyes of the vehicle in safety-critical advanced driver assistance systems (ADAS), providing features such as adaptive cruise control and 360° surround-view systems. These eyes have to be able to operate in extremely low light situations as well as on a bright sunny day.

The image sensor must capture all of the scene detail in these high dynamic range imaging conditions, because the brain behind the wheel relies on it to make decisions, whether that brain belongs to the driver or to the artificial intelligence made up of the vehicle's ADAS algorithms. Consider traffic lights like the ones you see in the video below.

Modern traffic lights are operated with pulse-width modulation (PWM) to save power and control intensity. The LEDs making up the light are pulsed on and off at a rate that our eyes can't perceive, which can be 90 times or more every second, and it isn't just traffic lights that have adopted this approach.

Pulsed LEDs have become prominent in modern vehicle headlight and taillight design, as well as in electronic traffic signs and variable messaging systems intended to convey information such as traffic conditions and speed limits to drivers on the road. To capture a high dynamic range scene, particularly its bright areas, a traditional image sensor would use a short exposure to avoid oversaturation, much shorter than one PWM period of these LEDs. But this results in the appearance of the lights flickering in the video, as its frames capture the LEDs sometimes while they're on and sometimes while they're off.
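To see why short exposures miss pulses, here is a minimal sketch (not from the article) modelling an idealised PWM waveform: the LED is on for the first fraction of each period, and a frame records the LED only if its exposure window overlaps an on-pulse. The 90 Hz pulse rate comes from the article; the 29 fps frame rate, 10% duty cycle, and 0.5 ms exposure are illustrative assumptions.

```python
def frame_catches_led(frame_start, exposure, pwm_freq, duty):
    """Return True if an exposure window overlaps any LED on-pulse.

    Idealised model: the LED is on for the first `duty` fraction
    of each PWM period.
    """
    period = 1.0 / pwm_freq
    on_time = duty * period
    t = frame_start % period        # phase within the PWM period
    if t < on_time:                 # exposure starts inside an on-pulse
        return True
    return t + exposure > period    # exposure runs into the next pulse

# Illustrative values: 90 Hz PWM, 10% duty, 29 fps, 0.5 ms exposure
pwm_freq, duty, fps, exposure = 90.0, 0.10, 29.0, 0.0005
frames = 300
hits = sum(frame_catches_led(i / fps, exposure, pwm_freq, duty)
           for i in range(frames))
print(f"{hits}/{frames} frames captured the LED on")
```

Because the frame rate and the PWM rate are not locked together, the camera drifts in and out of phase with the pulses: some frames catch the LED lit, others catch it dark, which is exactly the on-screen flicker described above.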

This undesired effect can be distracting to the driver and confusing to ADAS algorithms. Exposing for a period long enough to guarantee capturing the LEDs on within each frame would instead lead to an oversaturated image, which is also not desirable.
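The conflict can be put into numbers. One common flicker-mitigation requirement is that the exposure spans at least one full PWM period, so at the article's 90 Hz rate the exposure floor is about 11 ms. The sub-millisecond bright-scene exposure below is an illustrative assumption, not a figure from the article.

```python
pwm_freq_hz = 90.0                      # slowest expected LED PWM rate
min_exposure_ms = 1000.0 / pwm_freq_hz  # must span one full PWM period
print(f"Minimum flicker-free exposure: {min_exposure_ms:.1f} ms")

# A bright daytime scene might need well under 1 ms to avoid
# saturation (illustrative value), so a single exposure cannot
# satisfy both constraints at once.
bright_exposure_ms = 0.5
print(f"Conflict factor: {min_exposure_ms / bright_exposure_ms:.0f}x")
```

This tension between a long flicker-free exposure and a short saturation-free one is why LFM and HDR have to be solved together in the sensor design.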

When ON Semiconductor’s engineering teams design and characterise its image sensors, they take all of these considerations into account. While the company’s testing includes controlled laboratory environments, ON Semiconductor also puts its sensors to the test on the road to capture real-world scenes. The company also tests the sensors in the same scenarios and situations that its customers do, to ensure they work not just in the lab but in real life too.






