
Countries around the world require automotive manufacturers to install various advanced driver assistance system (ADAS) features. These features depend on complex detection capabilities provided by a variety of sensors, including image sensors (cameras), LiDAR, radar, ultrasonic, and far-infrared (FIR) sensors. Industry analysis firm P&S Intelligence projects that “by 2030, the global ADAS sensor market is expected to reach $40.8 billion, with a compound annual growth rate of 11.7% from 2020 to 2030…”
The diagram below shows a typical arrangement of sensors on a car, where each sector represents the area covered by a different sensor in ADAS applications.
Typical automotive sensors used for ADAS applications
Cameras capture high-definition images of the road and the surrounding environment, which are analyzed by onboard computers; however, these images are limited to what the camera can “see” in visible light, and environmental factors such as fog and rain can degrade their quality. Radar and ultrasonic sensors emit radio-frequency and ultrasonic pulses, respectively, and receive the signals reflected from objects. These sensors operate in all weather conditions but cannot readily distinguish between different types of objects. Similarly, LiDAR devices emit light pulses and receive the reflected signals. Besides observing road features such as lane markings, LiDAR sensors (which work in both bright and dark conditions) can also detect animals, pedestrians, and other vehicles. Meanwhile, far-infrared thermal sensors capture the infrared radiation emitted by objects, operate in dark environments, and can distinguish between living and non-living objects. All of these sensor signals must be processed, and in many cases the sensor data must also be presented to the driver on one or more onboard displays.
The information from multiple sensors often needs to be merged through sensor fusion. In-cabin applications (such as gesture recognition), ADAS applications, and in-vehicle infotainment (IVI) applications (such as e-mirrors and rearview/backup systems) increasingly require a considerable degree of artificial intelligence (AI) and machine learning (ML). In fact, forecasts suggest that AI will be used in most ADAS and IVI systems as early as 2025. This poses a challenge for automotive system designers: supporting multiple sensors and processing the data they generate requires an application processor (AP) with many I/Os and strong processing capabilities, which is often expensive. To reduce the overall cost of an ADAS system, designers can consider alternatives to the CPUs and GPUs commonly used in ADAS hardware platforms. One solution is to pair a less powerful AP (with fewer I/Os and lower processing capability, and therefore lower cost) with a low-power embedded vision IC such as the Lattice CrossLink™-NX FPGA.
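To make the sensor fusion idea concrete, here is a minimal sketch of object-level fusion that merges per-object detections from a camera and a radar. The tuple format, function name, and confidence-weighted averaging scheme are illustrative assumptions for this article, not the API of any specific ADAS stack.

```python
# Minimal sketch of late (object-level) sensor fusion. Each sensor reports
# detections as (object_id, distance_m, confidence) -- an assumed format.

def fuse_detections(camera, radar):
    """Merge per-object detections from two sensors.

    Distance is a confidence-weighted average; fused confidence grows
    (capped at 1.0) when both sensors report the same object.
    """
    fused = {}
    for obj_id, dist, conf in camera + radar:
        if obj_id not in fused:
            fused[obj_id] = (dist, conf)
        else:
            d0, c0 = fused[obj_id]
            total = c0 + conf
            fused[obj_id] = ((d0 * c0 + dist * conf) / total,
                             min(1.0, total))
    return fused

# Example: both sensors see pedestrian "ped_1"; only the camera sees "car_2".
camera = [("ped_1", 12.4, 0.7), ("car_2", 30.1, 0.9)]
radar = [("ped_1", 12.0, 0.6)]
result = fuse_detections(camera, radar)
```

A production system would fuse tracks over time (e.g. with a Kalman filter) rather than single frames, but the principle of combining complementary sensors to raise confidence is the same.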
In this solution, most sensor data is pre-processed by the FPGA, freeing the AP to focus its processing resources on the ADAS system’s AI functions. The CrossLink-NX FPGA can merge multiple sensor data streams and send them to the AP as a single aggregated stream, reducing the number of I/Os the AP needs to support. In other cases, such as taking in a video signal and driving multiple displays, CrossLink-NX can replicate the video data stream through display splitting. Additionally, for systems with stringent safety requirements that need redundant data, a single CrossLink-NX FPGA can duplicate the data from one or more sensors and send it to multiple APs.
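The aggregation and duplication roles described above can be sketched conceptually as follows. This is Python purely for illustration; in a real design these functions would be FPGA logic, and the frame-tagging scheme is an assumption made for this sketch.

```python
# Conceptual model of two FPGA roles described in the text:
# aggregation (many sensor streams -> one AP interface) and
# duplication (one stream -> multiple redundant APs).

def aggregate(streams):
    """Interleave frames from several sensor streams into one stream,
    tagging each frame with its source index so the AP can demultiplex."""
    merged = []
    for frame_set in zip(*streams):
        for sensor_idx, frame in enumerate(frame_set):
            merged.append((sensor_idx, frame))
    return merged

def duplicate(stream, n_consumers):
    """Replicate one stream to several consumers (e.g. redundant APs)."""
    return [list(stream) for _ in range(n_consumers)]

# Example: two camera streams aggregated, then duplicated to two APs.
cam0 = ["c0_f0", "c0_f1"]
cam1 = ["c1_f0", "c1_f1"]
merged = aggregate([cam0, cam1])
copies = duplicate(merged, 2)
```

The point of the model is the topology: aggregation cuts the AP’s I/O count, while duplication provides the redundancy that safety-critical systems require.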
Developers may wonder what impact adding another IC will have on power consumption and design size. They need not worry: the CrossLink-NX FPGA is built on the Lattice Nexus™ FPGA platform, offers up to 75% lower power consumption than competing FPGAs, and comes in a package as small as 6 x 6 mm. CrossLink-NX also supports various high-speed interfaces, including two dedicated 4-lane MIPI D-PHY transceivers, each running at up to 10 Gbps, providing industry-leading performance for vision processing, sensor processing, and AI inference applications.
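As a rough sanity check on that 10 Gbps figure (and ignoring D-PHY protocol overhead), the back-of-envelope arithmetic below shows that one such link could carry, for example, a 4K 60 fps RAW10 camera stream; the resolution and bit depth chosen here are illustrative assumptions, not a stated CrossLink-NX use case.

```python
# Back-of-envelope bandwidth check (illustrative assumptions):
# can a 10 Gbps 4-lane MIPI D-PHY link carry a 4K 60 fps RAW10 stream?
width, height, fps, bits_per_pixel = 3840, 2160, 60, 10

# Raw payload rate in Gbps, before any protocol overhead.
payload_gbps = width * height * fps * bits_per_pixel / 1e9  # 4.97664 Gbps

# 4 lanes at 2.5 Gbps each gives the 10 Gbps aggregate cited in the text.
link_gbps = 4 * 2.5
```

At roughly 5 Gbps of payload, such a stream would use about half of one transceiver’s bandwidth, leaving headroom for overhead or a second, lower-resolution sensor.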
The CrossLink-NX FPGA also has a soft error rate (SER) up to 100 times lower than competing FPGAs, making it well suited to critical applications like ADAS that must operate safely and reliably. Moreover, CrossLink-NX is now AEC-Q100 Grade 2 (TA = 105°C) qualified, making it an ideal choice for automotive applications.
Finally, Lattice’s mVision™ and sensAI™ solution sets fully support CrossLink-NX FPGA. These solution sets provide developers with modular hardware platforms, reference designs, neural network IP cores, and custom design services to accelerate and simplify automotive vision system design, significantly reducing the time and engineering expertise required to implement automotive embedded vision applications.
Automotive applications supported by mVision and sensAI solution sets
Lattice’s mVision solution set includes all the resources embedded vision system designers need to evaluate, develop, and deploy FPGA-based embedded vision applications such as machine vision, robotics, ADAS, video surveillance, and drones. Likewise, the comprehensive Lattice sensAI solution set includes all the resources needed to evaluate, develop, and deploy FPGA-based AI/ML applications.
In summary, the AEC-Q100-qualified CrossLink-NX FPGA enables Lattice to serve automotive applications with low power consumption (its 28 nm process offers significant energy-efficiency advantages), high reliability (a best-in-class soft error rate among competing FPGAs), high performance (10 Gbps MIPI, SERDES, and industry-leading I/O), small size (saving PCB area without compromising performance), security (compliance with the ISO 21434 standard and NIST SP 800-193 guidelines), functional safety (an FPGA toolchain compliant with the ISO 26262 FuSa standard is planned for applications with stringent safety requirements), low cost, and a comprehensive feature set. We are witnessing the arrival of a golden age for automotive ADAS/IVI applications.
