Author: Bahman Hadji, Senior Product Marketing Manager, Automotive Perception Division, onsemi
Traffic safety is a major global challenge: road traffic accidents reportedly kill more than 1.1 million people each year and injure an estimated 20 to 50 million more.
One of the leading causes of these accidents is driver error, and automakers and government regulators have long sought ways to improve safety. In recent years, Advanced Driver Assistance Systems (ADAS) have made significant strides in helping to reduce road fatalities.
This article explores the role of ADAS in improving road safety and the sensor technologies that are crucial to achieving this goal.
The Evolution and Importance of ADAS
Since the introduction of Anti-lock Braking Systems (ABS) in the 1970s, ADAS technology has steadily expanded in passenger vehicles, improving safety along the way. According to the National Safety Council (NSC) in the United States, ADAS has the potential to prevent approximately 62% of traffic fatalities, saving more than 20,000 lives each year. In recent years, features such as Automatic Emergency Braking (AEB) and Forward Collision Warning (FCW) have become increasingly common, with more than a quarter of vehicles now equipped with them, helping drivers prevent accidents and ultimately saving lives.

ADAS requires multiple technologies working in concert. A perception suite acts as the system's "eyes," sensing the vehicle's surroundings and feeding data to the system's "brain," which uses that data to decide how to assist the driver. For example, when a vehicle is detected ahead and the driver has not applied the brakes, AEB automatically brakes to prevent a rear-end collision.

The perception suite is built around a vision system: an automotive-grade camera with a high-performance image sensor that captures video streams of the vehicle's surroundings to detect vehicles, pedestrians, traffic signs, and more, and that can also display those images to assist the driver during low-speed maneuvers and parking. Cameras are typically paired with depth-sensing systems such as millimeter-wave radar, LiDAR, or ultrasonic sensors, which add depth information to the camera's two-dimensional images, increase redundancy, and remove ambiguity in object distance measurements.
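To make the AEB example concrete, here is a minimal sketch of how such a trigger decision could be structured. It is purely illustrative: the function name, inputs, and time-to-collision threshold are hypothetical, not onsemi's or any production algorithm.

```python
# Minimal AEB trigger sketch (hypothetical names and threshold, illustration only).

def aeb_should_brake(range_m: float, closing_speed_mps: float,
                     driver_braking: bool, ttc_threshold_s: float = 1.5) -> bool:
    """Brake automatically when time-to-collision (TTC) drops below a threshold
    and the driver has not already applied the brakes."""
    if driver_braking or closing_speed_mps <= 0.0:
        return False  # driver is already reacting, or the gap is opening
    time_to_collision_s = range_m / closing_speed_mps
    return time_to_collision_s < ttc_threshold_s

# A vehicle 20 m ahead, closing at 15 m/s, driver not braking: TTC ~1.33 s
print(aeb_should_brake(20.0, 15.0, driver_braking=False))  # True
```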
For automakers and their Tier 1 system suppliers, implementing ADAS can be challenging: processing power to handle the data generated by multiple sensors is limited, and the sensors themselves have performance constraints. Automotive requirements demand extremely high reliability from every component, covering not only the hardware but also the associated software algorithms, which necessitates extensive testing to ensure safety. The system must also perform consistently under the harshest lighting and weather conditions, withstand extreme temperatures, and operate reliably throughout the vehicle's lifecycle.
Key Sensor Technologies in ADAS Systems
Let's take a closer look at some of the key sensor technologies used in ADAS: image sensors, LiDAR, and ultrasonic sensors. Each type of sensor provides specific kinds of data, which software algorithms process and combine to build an accurate, comprehensive understanding of the environment. This process, known as sensor fusion, improves the accuracy and reliability of perception algorithms through the redundancy of multiple sensor modalities, enabling higher-confidence decisions and advanced levels of safety; a simple fusion step is sketched below. The complexity of these multi-sensor suites can escalate quickly, demanding ever more processing power for the algorithms. Meanwhile, the sensors themselves are becoming more advanced, allowing some processing to happen locally at the sensor rather than relying solely on a central ADAS processor.
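As a concrete illustration of that fusion step, the sketch below combines two independent range estimates of the same object, a coarse camera-derived range and a precise LiDAR range, by weighting each inversely to its variance. The numbers and function are hypothetical, not a production pipeline.

```python
# Inverse-variance fusion of two range measurements (illustrative only).

def fuse_ranges(r1_m: float, var1: float, r2_m: float, var2: float):
    """Fuse two independent estimates; the fused variance is lower than
    either input variance, reflecting the gain from redundancy."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused_m = (w1 * r1_m + w2 * r2_m) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused_m, fused_var

# Coarse camera range (24.0 m, variance 4.0) + precise LiDAR range (22.5 m, variance 0.04)
fused, var = fuse_ranges(24.0, 4.0, 22.5, 0.04)
print(f"{fused:.2f} m, variance {var:.3f}")  # ~22.51 m, variance ~0.040
```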

(1) Automotive Image Sensors
Image sensors are the "eyes" of the vehicle, and arguably the most critical sensor type in any ADAS-equipped vehicle. The image data they provide serves a wide range of ADAS functions: machine-vision driving assistance features such as Automatic Emergency Braking, Forward Collision Warning, and Lane Departure Warning; "human view" functions such as 360° surround-view cameras for parking assistance and camera monitoring systems for electronic rearview mirrors; and driver monitoring systems that detect distracted or fatigued drivers and issue alerts to prevent accidents.
onsemi offers a range of image sensors, including the Hyperlux family, which delivers excellent image quality at low power. The Hyperlux pixel architecture includes an innovative Super Exposure imaging scheme that captures high dynamic range (HDR) frames while providing LED Flicker Mitigation (LFM), preventing the misreadings caused by the pulsed flicker of LED headlights and LED traffic signs.
Hyperlux image sensors are designed for challenging automotive scenes, such as emerging from the shadow of an overpass into direct sunlight, and can capture a dynamic range of up to 150 dB. Cameras built on Hyperlux sensors far outperform the human eye in extreme conditions, functioning normally even at light levels below 1 lux.
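For a sense of what 150 dB means, the standard definition of image-sensor dynamic range is 20·log10 of the ratio between the brightest and darkest signals captured in one frame, so a quick back-of-the-envelope check (my arithmetic, not an onsemi specification) looks like this:

```python
import math

# Dynamic range in decibels: DR_dB = 20 * log10(max_signal / min_signal)
def dynamic_range_db(max_signal: float, min_signal: float) -> float:
    return 20.0 * math.log10(max_signal / min_signal)

# A 150 dB rating implies a brightest-to-darkest ratio of about 31.6 million to 1
print(f"{10 ** (150 / 20):.3e}")  # 3.162e+07
```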

onsemi's Hyperlux image sensors include the 8-megapixel AR0823AT and the 3-megapixel AR0341AT. These digital CMOS image sensors use Hyperlux 2.1 µm Super Exposure single-photodiode pixel technology, delivering excellent low-light performance while capturing a wide dynamic range across bright and dark regions of the same frame. The Super Exposure pixel provides enough dynamic range within a single frame to enable a "worry-free" exposure scheme, effectively eliminating the need for automatic exposure adjustments when lighting changes abruptly, such as when driving out of a tunnel or parking garage on a sunny day.
(2) Depth Sensors (LiDAR)
Depth perception is the accurate measurement of the distance between objects and the sensor. Depth information removes ambiguity from the scene and is crucial both for a variety of ADAS functions and for achieving higher levels of ADAS and fully autonomous driving.
Several technologies are available for depth perception, and for raw depth performance, Light Detection and Ranging (LiDAR) is the best choice. LiDAR delivers both high depth resolution and high angular resolution, and because it pairs the sensor with active near-infrared (NIR) laser illumination, it operates under all ambient light conditions and suits both short-range and long-range applications. Low-cost millimeter-wave radar sensors are more common in today's vehicles, but they lack LiDAR's angular resolution and cannot provide the high-resolution three-dimensional point clouds of the environment that advanced autonomous driving, beyond basic ADAS, requires.
The most common LiDAR architecture uses the time-of-flight (ToF) method, which calculates distance by emitting a short infrared light pulse and measuring the time it takes the reflection to return to the sensor. LiDAR sensors repeat this measurement while scanning the light across their field of view to capture the entire scene.
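The ToF range equation itself is simple: the pulse travels to the target and back, so the distance is half the round trip at the speed of light. A minimal sketch (the example numbers are mine, not from a specific sensor):

```python
# Time-of-flight ranging: distance = (speed of light * round-trip time) / 2
C_M_PER_S = 299_792_458  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_s: float) -> float:
    return C_M_PER_S * round_trip_s / 2.0

# A return pulse arriving 667 ns after emission corresponds to ~100 m
print(f"{tof_distance_m(667e-9):.1f} m")  # 100.0 m
```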

onsemi's ARRAYRDM-0112A20 silicon photomultiplier (SiPM) is a single-photon-sensitive sensor with 12 channels in a monolithic array, offering high photon detection efficiency (PDE) at near-infrared wavelengths such as 905 nm for detecting return pulses. This SiPM array has been designed into a LiDAR fitted to some of the world's first passenger vehicles to offer true "eyes-off" operation, in which the driver no longer needs to watch the road, a capability well beyond basic driving assistance. This level of autonomy has yet to be reliably achieved in consumer vehicles without the support of LiDAR depth perception.
(3) Ultrasonic Sensors
Another distance-measurement technology is ultrasonic detection, in which the sensor emits sound waves at frequencies above the range of human hearing and detects the reflected echo, measuring distance via time of flight.
Ultrasonic sensors suit close-range obstacle detection and low-speed maneuvering applications such as parking assistance. Because sound travels far more slowly than light, the reflected sound waves typically take a few milliseconds to return to the sensor, whereas light returns in nanoseconds; as a result, ultrasonic sensing demands far less processing performance, which lowers system cost.
One example is onsemi's NCV75215 park-distance-measurement ASSP. During parking, the device measures the time of flight to obstacles using a piezoelectric ultrasonic transducer, detecting objects at distances from 0.25 to 4.5 meters with high sensitivity and low noise.
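The same time-of-flight arithmetic, applied at the speed of sound, shows why ultrasonic echo times sit in the millisecond range across a parking-assist span like the NCV75215's. The sketch below is illustrative; it assumes roughly 343 m/s for sound in dry air at about 20 °C.

```python
# Ultrasonic time of flight: round-trip time = 2 * distance / speed of sound
SPEED_OF_SOUND_M_PER_S = 343.0  # dry air at ~20 degrees C (assumption)

def echo_round_trip_ms(distance_m: float) -> float:
    return 2.0 * distance_m / SPEED_OF_SOUND_M_PER_S * 1000.0

# Echo times across a 0.25 m to 4.5 m detection range
for d_m in (0.25, 1.0, 4.5):
    print(f"{d_m} m -> {echo_round_trip_ms(d_m):.1f} ms")
# 0.25 m -> 1.5 ms, 1.0 m -> 5.8 ms, 4.5 m -> 26.2 ms
```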
Conclusion
onsemi has played a significant role in developing the sensor technologies that ADAS requires. The company invented the dual-conversion-gain pixel and HDR capture modes now adopted across the industry, and pioneered the innovative Super Exposure design that lets a single-photodiode sensor deliver excellent low-light performance while capturing HDR scenes without saturation. Thanks to this market and technology leadership, most ADAS image sensors on the road today were developed by onsemi. These innovations have enabled onsemi to supply high-performance sensors for automotive applications for over two decades, contributing significantly to the role of ADAS in improving vehicle safety.
The automotive industry continues to invest heavily in ADAS in pursuit of fully autonomous driving, moving beyond basic driving assistance (SAE Levels 1 and 2) toward true autonomous capability (SAE Levels 3, 4, and 5). Reducing road fatalities is one of the main forces driving this trend, and onsemi's sensor technologies will play a crucial role in this transformation of automotive safety. For more information on ADAS system solutions, please visit onsemi's website (www.onsemi.cn/design/technical-documentation).