With the rise of China's domestic automotive manufacturing industry, the development of Advanced Driver Assistance Systems (ADAS) is accelerating. The industry's push toward intelligent vehicles is currently in the transition from Level 2 to Level 3: capabilities have moved beyond Level 2 but have not yet reached Level 3, a stage commonly called Level 2+.
In late April, Counterpoint Research released statistics and forecasts projecting that global sales of passenger cars with Level 3 ADAS will exceed 25,000 units in 2024, with the Chinese market as a key driver. It further expects that by 2026, installations of Level 3 passenger cars in China will exceed 1 million units, accounting for 10% of total shipments.
Counterpoint Research believes that the Chinese market has several advantages in the development of ADAS, including government support, the issuance of multiple Level 3 test licenses, and the progress and technological accumulation of many suppliers in Level 2 testing.
The above forecast is quite optimistic, especially the 1-million-unit level for Level 3 passenger cars in 2026, which would be a remarkable milestone. The 2024 figure of 25,000 Level 3 passenger cars globally is more realistic: a tiny fraction of the tens of millions of ADAS-equipped passenger cars worldwide, and such vehicles are more likely to appear first in commercial fleets, such as Baidu's robotaxis. In 2024, Level 3 passenger cars will still struggle to enter ordinary households.
So why are Level 3 ADAS so difficult to realize and promote? To answer that, we need to look at what the levels from L1 to L5 actually mean.
According to the definitions of SAE International (the Society of Automotive Engineers), a Level 1 vehicle can assist the driver with some driving tasks in certain situations; a Level 2 vehicle can independently complete some driving tasks, but the driver must continuously monitor the surroundings and take over when necessary; a Level 3 vehicle can drive itself, with the driver no longer required to be ready to take over at every moment; at Level 4, driver control is entirely unnecessary under certain specific conditions; and a Level 5 vehicle can drive autonomously under any conditions.
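The key boundary in these definitions is whether the driver must monitor the road continuously. A minimal sketch of the level taxonomy (an illustrative simplification; the names and flags are my own shorthand, not the SAE J3016 text):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SaeLevel:
    level: int
    name: str
    driver_must_monitor: bool  # must the driver watch the road at all times?
    system_drives: bool        # can the system perform the full driving task?

# Simplified summary of the levels as described above
SAE_LEVELS = [
    SaeLevel(1, "Driver Assistance", True, False),       # assists with some tasks
    SaeLevel(2, "Partial Automation", True, False),      # drives partially, driver monitors
    SaeLevel(3, "Conditional Automation", False, True),  # drives itself; takeover on request
    SaeLevel(4, "High Automation", False, True),         # no driver needed in defined conditions
    SaeLevel(5, "Full Automation", False, True),         # drives itself under any conditions
]

def needs_constant_supervision(level: int) -> bool:
    """The L2/L3 boundary: does the driver have to monitor continuously?"""
    return next(l for l in SAE_LEVELS if l.level == level).driver_must_monitor

print(needs_constant_supervision(2))  # True
print(needs_constant_supervision(3))  # False
```

The jump from `True` to `False` between Level 2 and Level 3 is exactly why the regulatory and liability questions discussed later become acute at Level 3.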
It can be seen that the current ADAS system is in the transition phase from Level 2 to Level 3. In most cases, it cannot achieve autonomous driving, and in a few cases, it can, but the market scale and proportion are very small.
01
Gains in the Process of Advancing to Level 3
Four years ago, the automotive industry proposed slogans such as "Software-Defined Vehicles," "Computational Power Arms Race," and "Centralization of E/E Architecture." A new track subsequently emerged, represented by computing chips, underlying software, algorithms, domain controllers, and so on.
At that time, companies such as Pony.ai and WeRide advocated a "leapfrog" approach, believing the best scenario for deploying L4 autonomous driving was the Robotaxi. In contrast, newer automakers represented by Tesla believed autonomous driving should advance "gradually," starting from L2 and moving through L2+ and L3 toward L4.
From a technical perspective, there is also a dispute between multi-sensor fusion and pure vision routes. Chinese companies advocate enhancing the perception capabilities of a single vehicle by equipping it with LiDAR, millimeter-wave radar, cameras, and other methods, while companies like Tesla and Mobileye are pursuing a pure vision route, focusing on the development and iteration of visual algorithms.
After several years of iteration, some players on the L4 track have made initial deployments in demonstration zones across China. However, due to cost, laws, and regulations, large-scale commercialization of Robotaxis remains distant. In contrast, EV startups such as NIO, XPeng, and Li Auto have begun rolling out L2+ functions such as commuter NOA and urban NOA at scale. Under these circumstances, some companies that previously focused on L4 have had to take on L2+ solution business in order to survive. Overall, the "gradual" approach has become the mainstream path to deploying autonomous driving.
On the technical side, Tesla's "BEV+Transformer," occupancy network, and other vision-based architectures have proven successful in North America through the FSD system. Chinese manufacturers have begun to follow Tesla's example and rebuild their own perception stacks, making ADAS perception algorithms a focal point of competition.
02
ADAS Hardware Iteration
Under the traditional distributed E/E architecture, the assisted driving system is composed of several independent subsystems (such as forward ADAS, side/rear ADAS, parking assistance, and surround view), each with its own ECU built around a "microcontroller + peripheral circuitry" structure. In this architecture, Tier 1 suppliers deliver a "black box" to the OEM that bundles hardware and software, with Mobileye being a typical representative.
As the vehicle's overall E/E architecture moves from distributed to centralized, the ECUs of the individual ADAS subsystems are consolidated into an assisted driving domain controller, the main control chip evolves from an MCU to a higher-performance SoC, and the software stack is upgraded to a service-oriented architecture (SOA) with three parts: system software (hypervisor, middleware, etc.), the algorithm module, and the application layer, achieving "hardware-software decoupling." The ADAS industry chain correspondingly splits into several major segments: chips, hardware integration and production, software development, algorithm development, and application development.
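The "hardware-software decoupling" described here can be pictured as the algorithm layer programming against an abstract service rather than a specific ECU, with middleware binding that service to whatever hardware is present. A toy sketch (all class names are hypothetical, not a real middleware API):

```python
from abc import ABC, abstractmethod

# In the "black box" era, perception code was welded to one supplier's ECU.
# Under an SOA-style stack, the algorithm layer depends only on an abstract
# service interface; the hardware binding can be swapped underneath it.

class CameraService(ABC):
    """Abstract service: what the algorithm layer is allowed to depend on."""
    @abstractmethod
    def get_frame(self) -> bytes: ...

class FrontCameraDriver(CameraService):
    """Hardware-specific driver, supplied alongside the SoC/domain controller."""
    def get_frame(self) -> bytes:
        return b"raw-frame"  # stand-in for a real image buffer

class LaneDetector:
    """Algorithm layer: knows nothing about the concrete camera hardware."""
    def __init__(self, camera: CameraService):
        self.camera = camera

    def detect(self) -> str:
        frame = self.camera.get_frame()
        return "lanes-found" if frame else "no-frame"

detector = LaneDetector(FrontCameraDriver())
print(detector.detect())  # lanes-found
```

Swapping `FrontCameraDriver` for another vendor's implementation leaves `LaneDetector` untouched, which is the commercial point of decoupling: each segment of the chain can be sourced independently.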
In the early stage of this transformation, segments such as chips, middleware, and algorithm development spawned a wave of startups, whose technical moats lay in whether they had sufficient development capability and mass-production experience in their niche. For example, over the past three years Desay SV has won numerous orders from automakers on the strength of its mass-production experience with ADAS domain controllers based on NVIDIA Orin chips. However, as low- to medium-compute intelligent driving domain controllers become standardized, the leading Tier 1 companies (including chip, integration, and algorithm suppliers) can no longer confine themselves to a single segment. Instead, they need to integrate upstream and downstream of the chain from a position of strength, building a comprehensive supply capability spanning chips, algorithms, and manufacturing to establish an ecosystem.

Judging by developments in 2023 and 2024, Chinese manufacturers that have built such ecosystem capabilities include Huawei (full-stack self-development from underlying chips to upper-level algorithms) and DJI (self-developed algorithms, in-house manufacturing, and the ability to squeeze maximum performance out of a given chip).
At present, NVIDIA's Orin chip holds 75% of the market for NOA main control chips, while Chinese chipmaker Horizon Robotics released a new product, the Journey 6, in April of this year, which supports urban NOA functions. Both NVIDIA and Horizon are working to strengthen their algorithm capabilities and gradually offer complete intelligent driving solutions. In addition, Momenta, the leader in intelligent driving algorithms, has established a chip team to shore up its underlying hardware capabilities.
Looking ahead, ADAS hardware such as domain controllers, cameras, and various radars still needs further iteration to prepare for the deployment of Level 3 driving.
Level 3 ADAS functions are more intelligent, requiring the underlying chips (mainly in the domain controller) to deliver higher computing power, while the bar for low power consumption and compatibility also rises.
Achieving Level 3 ADAS requires strong single-vehicle perception, raising the requirements for both the fitment and performance of sensors such as cameras, millimeter-wave radar, and LiDAR. Cameras will evolve toward higher pixel counts, while millimeter-wave radar and LiDAR are expected to provide stronger road-information collection for ADAS until pure vision approaches mature, with their penetration rates expected to keep rising.
Compared with traditional mechanical-hydraulic braking and steering, brake-by-wire and steer-by-wire offer faster response, better compatibility with electrified architectures, energy recovery, and the ability to configure multiple redundancies, making them better suited to Level 3 vehicles. As the technology matures, by-wire braking and steering are expected to become standard for intelligent driving at Level 3 and above.
The driver monitoring system (DMS) detects the driver's identity, fatigue, and dangerous behavior. Level 3 vehicles require that the driver be able to take over control in special circumstances, and some national regulations also specify whether drivers may make calls or watch the entertainment system under Level 3 operation, which requires a DMS to monitor the driver and help determine responsibility in the event of an accident. DMS is therefore expected to become standard equipment for Level 3 driving.
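The policy logic a DMS supports can be sketched as a simple rule check; the rules below are purely illustrative, not taken from any actual regulation:

```python
# Toy policy check for the DMS scenario above: whether the driver may use
# the infotainment system depends on the active automation level and on
# whether the system has issued a takeover request. Illustrative rules only.

def entertainment_allowed(level: int, takeover_requested: bool) -> bool:
    if level <= 2:
        return False                   # driver must monitor at all times
    if level == 3:
        return not takeover_requested  # allowed until the car asks for control
    return True                        # L4/L5: no driver attention required

print(entertainment_allowed(2, False))  # False
print(entertainment_allowed(3, False))  # True
print(entertainment_allowed(3, True))   # False
```

The DMS supplies the inputs to checks like this (is the driver present, awake, responsive to the takeover request), and its logs are what would establish responsibility after an accident.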
China has issued a draft for comments on the "Light Signal Devices and Systems for Motor Vehicles and Trailers" standard, which requires intelligent driving vehicles at Level 3 and above to carry at least four blue-green autonomous-driving indicator lights, at the front, rear, left, and right of the vehicle, to inform surrounding vehicles of their autonomous driving state. Once the standard takes formal effect, autonomous-driving state indicators will open a new market for the automotive lighting industry.
03
Algorithm and Software Iteration
In 2016, Tesla began collecting large amounts of data from its vehicles. By 2018 it had a preliminary data closed-loop system in place and was gradually improving cloud computing resources, automated annotation, simulation, and other links. By 2023, Tesla's matured data closed loop allowed its models to iterate at remarkable speed, with the public beta of FSD Beta V11 updated every 20 days.
At present, intelligent driving manufacturers represented by XPeng and Huawei are steadily improving their infrastructure and data closed-loop systems, making them the leaders of China's ADAS industry.
Huawei's accumulated engineering experience across chips, communications, and mobile devices can deeply empower its intelligent driving data closed loop. Huawei can develop both its own cloud training chips and its own in-vehicle intelligent driving chips, something very few manufacturers can do, so it can achieve genuine hardware-software co-design to improve efficiency. At the September 2023 launch of the updated M7, Yu Chengdong disclosed that Huawei's cloud computing power was 1.8 EFLOPS, learning from 10 million kilometers of data per day. By November 2023, cloud computing power had been raised to 2.8 EFLOPS (two to three times that of other domestic manufacturers), learning from 12 million kilometers of data per day, with models updated every 5 days.
XPeng was the earliest of the EV startups to establish a data closed-loop system and deploy a cloud supercomputing center. Since 2023, the company's closed-loop efficiency has improved greatly across the entire chain of data collection, model training, deployment, and simulation. In the simulation link, the company could simulate data based on real scenes in 2022, and by 2023 it could use AI to generate extreme scenarios and blend them into large volumes of training data. With this closed-loop capability, XPeng's software release cadence has accelerated significantly.
Starting in the second half of 2023, the iteration speed of intelligent driving software versions accelerated markedly. In the fourth quarter of 2023, the leading OEMs began racing to hit the urban NOA rollout targets they had set at the beginning of the year, while commuting NOA was also gaining popularity.
Commuting NOA, also known as memory driving or AI driving, refers to point-to-point navigation-assisted driving along a single user-designated route, achieved after the vehicle has learned that route.
The algorithm stack of commuting NOA is identical to that of urban NOA, but it greatly narrows the scope of the scenario. By driving the same route multiple times, the system "reconstructs" a lightweight high-precision map of that single route. In terms of hardware cost, because conditions along a single route are relatively simple and controllable, the demands on model generalization are lower, and the requirements for chip computing power and sensors are far below those of urban NOA.
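The idea of "reconstructing" a single route from repeated drives can be sketched as merging several noisy traversals into one consolidated map, here by majority vote per waypoint (a toy model; real systems fuse full sensor data, not discrete labels):

```python
from collections import Counter, defaultdict

def build_route_map(traversals):
    """Merge several noisy traversals of the same commute into one
    lightweight map: for each waypoint index, keep the majority label."""
    votes = defaultdict(Counter)
    for traversal in traversals:
        for idx, label in enumerate(traversal):
            votes[idx][label] += 1
    return [votes[i].most_common(1)[0][0] for i in sorted(votes)]

# Three drives of the same commute; one drive mislabels a waypoint.
drives = [
    ["straight", "left", "straight", "right"],
    ["straight", "left", "merge",    "right"],  # noisy observation
    ["straight", "left", "straight", "right"],
]
print(build_route_map(drives))  # ['straight', 'left', 'straight', 'right']
```

Because the map only has to be valid along one fixed route, the voting base stays small and the model never needs to generalize to unseen roads, which is why commuting NOA gets away with cheaper compute and sensors than urban NOA.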
In practice, highway NOA plus commuting NOA covers 85% of users' travel scenarios. For automakers, then, leading with the commuting mode in the early stage of the urban NOA rollout can not only satisfy user needs as far as possible with limited capability and gradually cultivate usage habits for advanced intelligent driving, but also accumulate data for the OEM's subsequent urban NOA upgrades and promotion. Under the demonstration effect of the leading automakers, Denza, IM Motors, Leapmotor, and other manufacturers have all put commuting NOA launches on their agendas.

04
Obstacles to L3 Adoption
Unlike L2/L2+, L3 is no longer an assisted driving system but a conditional autonomous driving system: the driving task is primarily the responsibility of the intelligent driving system itself, and the driver does not need to be ready to take over at all times. The current state of development, however, shows that large-scale commercialization of L3 is difficult, and beyond the technical factors, regulations and ethics are hard-to-surmount obstacles.
The regulatory issue is particularly complex.
Autonomous driving technology involves driving safety and human life. For L3 and above, more driving responsibility is borne by the vehicle itself, and the uncertainty of driving increases further. Governments around the world are therefore cautious about deploying high-level autonomous driving, and supporting laws and regulations have arrived relatively slowly, which to some extent restricts the technology's development.
Since the main operation of an L3+ vehicle is performed by the vehicle itself, traffic accidents that occur while the autonomous driving system is operating normally should arguably be the manufacturer's responsibility. Under the current traffic policies of most countries, however, L3 technology is not yet widely recognized, and the driver is usually the first party held responsible for an accident.
Taking the current "Road Traffic Safety Law of the People's Republic of China" as an example, it clearly stipulates that "during the driving process, the driver must not engage in behaviors that affect safe driving," indicating that the driver still needs to be responsible for the driving task at all times, and the driver will be the first person responsible in the event of an accident.
Current U.S. federal traffic regulations draw a clear division of responsibility for accidents involving autonomous vehicles, stipulating that "if an autonomous driving vehicle is involved in a traffic accident, the human driver acting as backup bears responsibility," while adding that the vehicle manufacturer cannot evade its share of responsibility for such accidents.
Germany's Autonomous Driving Act, enacted in 2021, stipulates that L3 vehicles may drive on 13,200 kilometers of German highways at speeds not exceeding 60 km/h; drivers may take their hands off the wheel but may not sleep, look back continuously, or leave the seat, and must still take over the vehicle when necessary. If a traffic accident occurs in a vehicle operating within these conditions, responsibility lies with the vehicle manufacturer.
Japan's relevant laws stipulate that when an L3 vehicle is involved in an accident on its territory, the driver is responsible in principle; the manufacturer's liability is limited to cases of clear defects in the vehicle's systems, and accidents caused by system hacking are covered by a government relief scheme.