The Dilemma of ADAS and the Bottleneck of Autonomous Driving

Introduction:

This article is published with the authorization of Zhiche Technology and written by Old Hong.

Recently, China's State Administration for Market Regulation and four other departments summoned Tesla for talks. Setting autonomous driving aside, even basic safety issues remain unresolved: battery fires, unintended acceleration, and remote vehicle upgrades (OTA), among others. As for automotive ADAS, the technology is not yet mature and is further muddled by assorted "autonomous driving" claims. How should we rationally assess the relationship between ADAS and autonomous driving, and what each can actually do?

1

The Myth of ADAS Seems Exaggerated

To improve automobile safety, ADAS has been developing for more than a decade. The combination of a sensor suite (mainly radar and cameras) with a powerful ECU (Electronic Control Unit) has made significant breakthroughs over the past ten years. By 2012, several academic studies had already shown that Automatic Emergency Braking (AEB) could reduce rear-end collisions by 40% and related fatalities by 15%, just as the concept of "Vision Zero" was spreading globally. High technology can indeed change people's lives, but the myth of ADAS seems to have been exaggerated beyond reason.
In fact, there is another story behind ADAS, and two companies have come to epitomize the shift in its goals: Mobileye and Tesla. Around that time, a new framing of ADAS was put on the table: autonomous driving. As ADAS rolled out, sudden events threw both companies into turmoil.
On May 7, 2016, Tesla devotee Joshua Brown died when his Model S, with Autopilot (in reality an ADAS system) engaged and running Mobileye EyeQ3 technology, collided with an 18-wheel truck. It was the first fatality attributed to autonomous driving technology.
In July 2016, Mobileye announced that it would stop providing technical support for Tesla Autopilot once the existing contract expired, and that the collaboration would be limited to the EyeQ3.
Eight months later, Mobileye was acquired by Intel for $15.3 billion; at the time, it was powering autonomous vehicles worldwide with a $50 SoC. By 2020, for most people, ADAS meant a technology for climbing the autonomous driving ladder defined by SAE (the Society of Automotive Engineers) in 2014, despite the lack of evidence that it can actually climb a rung. ADAS has been talked up with L2, L2+, and now L2++ labels, but when it comes to autonomous driving, the reality may be another story.

2

How to Advance Computing Power, Sensors, and Chip Technology?

This chart from Yole is complex and worth studying closely. The question to consider is: can today's ADAS become tomorrow's AD? The transition demands new computing and new sensors, along with ever-growing computing power, up to brute-force computation. Will it be quantum computing or neural computing? With so many companies involved, all of them industry leaders, how should we choose?
As for computing power, what about Moore's Law? For automotive applications, does shrinking chips to a few nanometers really matter that much? After all, cars are not phones; what matters is not how small the chips are but how harsh the operating environment is. And how do we solve the heat problem of ever-smaller chips? There are too many issues to research and solve; it is hard indeed!
Innovation Directions for Autonomous Vehicles

3

Safety Performance Changes Little

As early as 2012, the European Union enacted regulations requiring all new cars produced from 2014 onward to be equipped with AEB systems. Forty countries, led by Japan and the EU, hope that by 2022 all new cars and light commercial vehicles will carry AEB.
China's numbers look respectable: a survey by Zosi Automotive Research found that the AEB installation rate in China's passenger-car market reached 33.9% in 2020, nearly doubling year on year.
However, the safety performance of ADAS has not changed much. In June 2019, the US Insurance Institute for Highway Safety (IIHS) confirmed at a meeting that AEB reduced rear-end collisions by 50%, yet the related injury-claim rate fell by only 23%. In the same year, tests by the AAA (American Automobile Association) showed that pedestrian detection has significant limitations in many situations, such as with children, groups of people, or pedestrians at night.
Despite ten years of technological upgrades, safety seems to have been set aside. The reason should be the desire to do better at autonomous driving, but unfortunately, reality says otherwise. The milestone the whole industry is waiting for is L3, the level at which the driver may take their eyes off the road and which is defined as the entry point of autonomous driving. Besides Tesla, Audi was the first company to trial L3, on its 2018 A8, but it abandoned the effort in 2020, a billion-euro lesson for other autonomous driving players. Currently, Lexus labels only one model with similar technology, the LS, and calls it "L2".
Industry experts believe that the current mobility methods are facing five major limitations, the most criticized being the deteriorating safety for pedestrians. The other four limitations are: public transport faces challenges in efficiency and cost; traffic congestion and the cost of owning a vehicle diminish the status of cars as an important mobility solution; air travel is rapidly expanding, but the traffic conditions from cities to airports remain poor; all existing mobility methods generate carbon dioxide emissions, and transformation is urgent.

4

Where Is the Bottom Line for “False Autonomous Driving”?

The Tesla Model S involved in the May 7 tragedy used a forward radar and cameras as Autopilot's eyes. Unfortunately, during that crash, neither "eye" saw a white trailer against a clear sky. A similar white truck claimed a Model 3 in June 2020.
The first pedestrian fatality occurred in March 2018, when an Uber self-driving test vehicle, a Volvo XC90 armed to the teeth with sensors, struck a pedestrian. The system's logs show that radar and LiDAR observed the pedestrian about six seconds before impact, while the vehicle was traveling at 43 miles per hour. Yet the vehicle neither intervened nor took any action.
The first two fatalities, one driver and one pedestrian, and the white behemoths that shared a similar fate
So how do we explain the gap between ADAS expectations and delivery? Is it a problem with sensor performance? Are more (redundant) sensors needed, such as LiDAR (for all-weather operation) and thermal cameras (to detect humans)? Is more computing power needed? Or is it a combination of all of these?
The answers can indeed be many. And while many automotive professionals claim that autonomous vehicles are not for tomorrow and may be nothing but a fantasy, more than 4,000 autonomous vehicles are already roaming streets worldwide, mainly for research purposes.
By the end of 2019, with Waymo running its commercial ride-hailing service, Waymo One, driverless taxis had become a reality. Today about a quarter of its 600-vehicle fleet provides driverless taxi service to customers, and the company is preparing to retrofit 100,000 vehicles a year once its Michigan factory opens.
Companies developing autonomous vehicles are not alone. General Motors has Cruise, Lyft is backed by Aptiv, Baidu has the Apollo project, Amazon has Zoox, Ford and Volkswagen invest in Argo.ai, Alibaba supports AutoX, and Toyota invests in Pony.ai. More than 20 leading companies worldwide have joined the race, with real autonomous vehicles on the roads.
Yole analysts believe that while it is certainly easy to have a car with some autonomous driving capabilities on the road, very few companies can reach the level of confidence required for true autonomous driving. Waymo may be the first company to provide the necessary “suitable quality” benchmarks for autonomous driving.

5

AV Is a Money-Burning Game

Waymo's "suitable quality" starts with a sensor suite that delivers ten times the data of current ADAS systems, using two to three times as many sensors, each with two to three times the resolution.

In terms of computing power, Waymo is looking for more than 100 times the capability. The computing power of a typical ADAS system built on Intel Mobileye EyeQ4 chips has risen from 0.25 TOPS (ten times that of a high-end laptop) to 2.5 TOPS, while the most powerful autonomous-vehicle platforms already exceed 250 TOPS. Both Zhiji and NIO are adopting chips rated at over a thousand TOPS, a world of difference. But does computing power equal real performance, or is it more meaningful to measure how many frames per second are accurately recognized? Opinions in the industry differ.
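As a quick sanity check, the computing-power gaps quoted above reduce to simple ratios; the numbers below are the article's own figures, rounded, not hardware benchmarks:

```python
# Rough TOPS comparison using the figures cited in the article (not measured values).
adas_eyeq4_tops = 2.5     # typical current ADAS system (Mobileye EyeQ4 era)
av_platform_tops = 250.0  # high-end autonomous-vehicle platforms
new_ev_tops = 1000.0      # "over a thousand TOPS" class chips (Zhiji, NIO)

print(av_platform_tops / adas_eyeq4_tops)  # 100.0 -> matches the "more than 100x" Waymo target
print(new_ev_tops / adas_eyeq4_tops)       # 400.0 -> the "world of difference" vs. today's ADAS
```

The ratio, not the absolute TOPS figure, is what the article's "100 times" claim rests on.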

While an L2 ADAS system costs around $400, a driverless retrofit kit has reached $130,000, more than 300 times as much. Yet people still question whether such driverless equipment performs better than an ordinary human driver.
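The "more than 300 times" figure checks out against the two costs the article gives:

```python
# Cost comparison using the article's figures.
adas_l2_cost = 400        # USD, typical L2 ADAS system
retrofit_kit_cost = 130_000  # USD, driverless retrofit equipment

ratio = retrofit_kit_cost / adas_l2_cost
print(ratio)  # 325.0 -> consistent with "more than 300 times"
```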

Players in the ADAS space may eventually climb the ladder to autonomous driving, but it will take more money, more time, and more innovation. Like every disruptive technology, the robotaxi has been undervalued by most incumbents.

Only emerging manufacturers like Tesla are prepared to invest heavily in developing their own FSD (Full Self-Driving) chips, and even FSD reaches only about 70 TOPS per chip (144 TOPS with dual chips). Despite the heavy promotion of ADAS, true AV disruption is approaching, with Amazon and Alibaba potentially the biggest threats to Volkswagen and Toyota, even as all the Teslas and XPengs flaunt their AV credentials.

6

Traditional Automaker Structures Struggle

Even as ADAS vehicles climb the SAE levels, the E/E (Electrical/Electronic) architecture used by most automakers has changed little since electrical architectures first appeared in the 1960s. The distributed E/E architecture, built on the principle of "one ECU, one function," persists to this day.

Over time, more sensors have been added to deliver more safety, sensing, powertrain, infotainment, and comfort functions, with high-end cars embedding up to 150 ECUs. The result: every added ECU has made wiring harnesses ever more complex, heavy, and bloated.
Evolution of E/E Architecture
To solve this problem, automakers must change the architecture by creating domain controllers (DCUs) for five functional domains. This is what Audi has started to do with its zFAS domain controller, which connects sensors such as the forward ADAS camera, long-range radar, and LiDAR; all processing for these sensors is handled by EyeQ3 vision processors, Altera and Nvidia SoCs, and Infineon MCUs.
With its Autopilot hardware, Tesla is doing this on another scale: a domain controller built from eight electronic boards carrying three times the components of the zFAS. Besides fusing large amounts of raw sensor data, the Autopilot hardware also controls audio, navigation, and RF communication, and supports over-the-air (OTA) updates. Developing such an architecture was easier for Tesla because it started from a blank slate; traditional automakers will have to shift to a more centralized E/E architecture to achieve autonomous driving functionality.

7

Growth of ADAS-Related Sensing and Computing Expected

According to Yole's "2020 ADAS Vehicle Sensing and Computing" report, market revenue for ADAS sensing and computing reached $8.6 billion in 2020, with the radar and camera modules required for AEB systems accounting for $3.8 billion and $3.5 billion, respectively. From 2020 to 2025, the market is expected to grow at a compound annual growth rate of 21%, to over $22 billion. As AEB penetration keeps rising, radar- and camera-module revenues will reach $9.1 billion and $8.1 billion, respectively; ADAS computing will reach $3.5 billion, while LiDAR, used only in high-end vehicles, will contribute $1.7 billion.
2020-2025 ADAS Vehicle Sensor and Computing Forecast
On the driverless-vehicle side, revenues from cameras, radar, and LiDAR reached an estimated $145 million in 2020, led by LiDAR at $107 million, followed by cameras and radar at $27 million and $11 million, respectively. By 2025, these sensor revenues are expected to reach $956 million, a compound annual growth rate close to 46%.
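Both growth rates cited from the Yole report can be verified with the standard CAGR formula, (end/start)^(1/years) - 1, applied to the article's dollar figures:

```python
def cagr(start, end, years):
    """Compound annual growth rate between two values over a number of years."""
    return (end / start) ** (1 / years) - 1

# ADAS sensing and computing: $8.6B (2020) -> $22B (2025)
print(round(cagr(8.6, 22.0, 5) * 100, 1))  # 20.7 -> consistent with the cited 21%

# Driverless-vehicle sensors: $145M (2020) -> $956M (2025)
print(round(cagr(145, 956, 5) * 100, 1))   # 45.8 -> consistent with "close to 46%"
```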

It is worth mentioning that due to the gradual transition from high-end to low-end models, LiDAR will take the lead, reaching $639 million, followed by cameras and radar at $228 million and $89 million, respectively. Meanwhile, the annual production of unmanned vehicles will grow from the current 4,000 units to 25,000 units, including unmanned taxis and shuttle buses.

8

Expectations for ADAS

In conclusion, although hopes are high that ADAS vehicles will reach autonomous driving, automakers now realize that getting there is far more complex than expected: different types of sensors must be fused, massive amounts of data must be processed, and collisions must be avoided.
Meanwhile, robotaxis have already become a reality, landing first in China. Technology giants capable of processing massive amounts of data have set the example, their chips' computing power climbing steadily, and robotaxis are disrupting the rhythm of ADAS. Waymo is one example: it has successively partnered with Renault-Nissan, Fiat Chrysler, Jaguar Land Rover, and Volvo to develop driverless taxis, aiming for L4 autonomous driving within limited geographic areas and weather conditions, with such vehicles expected on highways by 2022.
However, even Waymo, reputedly the safest driverless operator, has been involved in two traffic accidents: one in May 2018, when another car swerved to avoid an oncoming vehicle and collided with it; the other a rear-end collision caused by another driver's recklessness.
Nevertheless, ADAS and autonomous driving will not stagnate; they will keep moving forward.

END
