Designing a Centralized Computing Architecture for Software-Defined Vehicles

The software-defined vehicle is inseparable from the future central computing architecture, as the two complement each other.


The image above shows the central computing architecture conceived by Texas Instruments. The author has designed a vehicle central computing architecture built around two NVIDIA Orin chips in the vehicle computing center: one focused on ADAS, named Orin A, and the other focused on the cockpit, named Orin B.


The image above shows Renesas's central computing architecture. Renesas's strength lies in MCUs, so its diagram shows only the MCU and the central computing server, which comprises both the AD/ADAS intelligent-driving server and the IVI cockpit server. Each zonal module consists of sensors, actuators, and gateways, with switches and MCUs effectively acting as the gateways. Sections that mainly use CAN need no switches, since the CAN network is not that complex, while the Ethernet sections require Ethernet switches.


The Ethernet switch is Marvell's latest third-generation automotive Ethernet switch, the 88Q5112, a 9-port device. It integrates six PHYs: three 100/10BASE-T1(S), two 1000/100BASE-T1, and one 100BASE-T1/TX. These can serve five zonal ECUs plus one diagnostic OBD port; the OBD port uses 1000BASE-T1. Outside intelligent driving and the cockpit, even 100 Mbps is rarely needed: the zones mainly carry body and comfort traffic, while the chassis and powertrain can make do with CAN, CAN FD, or FlexRay.
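A sketch of one possible port budget for such a zonal switch (the zone names and link speeds below are illustrative assumptions, not taken from Marvell's datasheet):

```python
# Hypothetical port plan for a 9-port zonal switch: five zonal ECUs
# plus one diagnostic OBD port. Speeds in Mbps are assumptions.
PORT_PLAN = {
    "zonal_front_left":  100,   # 100BASE-T1, body/comfort traffic
    "zonal_front_right": 100,
    "zonal_rear_left":   100,
    "zonal_rear_right":  100,
    "zonal_center":      100,
    "obd_diagnostic":    1000,  # 1000BASE-T1 per the text
}

def aggregate_mbps(plan):
    """Total edge bandwidth the switch fabric must carry."""
    return sum(plan.values())

print(aggregate_mbps(PORT_PLAN))  # 1500
```

Even with every edge port saturated, the aggregate stays far below what the uplink toward the central computer must carry for camera traffic, which is why the zones can stay on low-speed PHYs.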


The image above is the internal block diagram of the 88Q5112. It supports TSN extensively, including 802.1AS-2020 and 802.1Qat/Qav/Qbu/Qbv/Qci/Qcr, plus 802.1CB frame replication for reliability.
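Among these, 802.1Qbv is the time-aware shaper: the switch follows a cyclic gate control list that opens and closes transmission gates per traffic class. A minimal sketch of the idea (the interval values and class assignments are invented for the example):

```python
# Minimal model of an 802.1Qbv gate control list (illustrative values).
# Each entry: (set of open traffic classes, interval in ns).
# The list repeats every cycle.
GATE_CONTROL_LIST = [
    ({7}, 250_000),           # 250 us reserved for class 7 (e.g. control traffic)
    ({0, 1, 2, 3}, 750_000),  # remaining 750 us for best-effort classes
]
CYCLE_NS = sum(interval for _, interval in GATE_CONTROL_LIST)

def open_classes(t_ns):
    """Traffic classes whose gates are open at time t within the cycle."""
    t = t_ns % CYCLE_NS
    for classes, interval in GATE_CONTROL_LIST:
        if t < interval:
            return classes
        t -= interval
    return set()

print(open_classes(100_000))  # {7}
print(open_classes(500_000))  # {0, 1, 2, 3}
```

Because every switch runs the same schedule against the 802.1AS-synchronized clock, high-priority frames never contend with best-effort traffic.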


The image above shows the camera input diagram for the two Orins. Orin supports a maximum of 16 lanes of MIPI CSI-2 input, in either D-PHY or C-PHY mode, with virtual-channel support. An 8-megapixel camera requires 4 lanes; a camera below 2 megapixels requires only 1 lane, and anything over 2 megapixels requires 2 lanes. To be safe, 2 lanes are also used for 2-megapixel cameras here; all sensor models are for reference only. For the intelligent-driving portion, the front main camera can use three 2.5-megapixel cameras covering short, medium, and long range, or one 8-megapixel camera, or two 2.5-megapixel cameras for stereo vision. The remaining 4-6 cameras can be 2.5-megapixel or 2-megapixel units. Currently the only deserializer handling more than 2 megapixels is ADI's MAX9296 (with the MAX9295 as the matching serializer), effectively monopolizing the market. The MAX9296 supports adjustable link rates of 1.5, 3, and 6 Gbps and can handle inputs of up to 16 megapixels.
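The lane arithmetic above can be checked against Orin's 16-lane ceiling with a small budget calculator (the camera mix below is one illustrative configuration, not the author's final design):

```python
# Lanes per camera, following the rule of thumb in the text:
# 8 MP -> 4 lanes; >2 MP -> 2 lanes; <2 MP -> 1 lane.
# The text conservatively also gives exactly-2 MP cameras 2 lanes.
def lanes_needed(megapixels, conservative=True):
    if megapixels >= 8:
        return 4
    if megapixels > 2 or (conservative and megapixels == 2):
        return 2
    return 1

# Hypothetical mix: one 8 MP front main camera + six 2.5 MP satellites
cameras_mp = [8] + [2.5] * 6
total_lanes = sum(lanes_needed(mp) for mp in cameras_mp)
print(total_lanes, total_lanes <= 16)  # 16 True -> fits the 16-lane budget exactly
```

With this mix the CSI-2 budget is fully consumed, which is exactly the situation where the Ethernet camera bridging discussed below becomes attractive.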
The electronic rearview mirror, also known as a camera monitor system (CMS) or e-mirror, replaces the conventional mirror with a screen and requires two video inputs. Because the display is small, 1-megapixel resolution is sufficient. ADAS functions can also be added to this display, which is why it is grouped with the ADAS system.
In the cockpit domain, video inputs mainly comprise the 360-degree surround view, AR navigation, and driver monitoring (DMS). The surround view uses advanced 2-megapixel cameras, with four streams aggregated into a MAX9286 deserializer. AR navigation does not require high resolution; 1.3 megapixels is sufficient. DMS is trending toward higher resolution and the introduction of ToF cameras, tentatively set at 2 megapixels.
Camera data can also be transmitted via Ethernet. A 2-megapixel camera in RAW10 format at 30 fps produces roughly 0.6 Gbps; in RGB888 format, about 1.44 Gbps; an 8-megapixel camera, around 5.7 Gbps. Physical-layer chips supporting 10 Gbps automotive Ethernet already exist, making transmission of 8-megapixel video entirely feasible. High-resolution cameras generally output RAW8 or RAW10, while low-resolution cameras typically use YUV422. The advantage of Ethernet is that it can run over a single unshielded twisted pair, whereas serialized MIPI CSI-2 links require coaxial cable, and Ethernet connectors cost significantly less than those used for serialized links. Ethernet also supports traffic shaping and policing via TSN, and routing through Ethernet switches is more flexible than point-to-point MIPI CSI-2. Ethernet PHY chips likewise cost less than deserializer chips, though more than serializers. The downside is that Ethernet debugging is exceptionally complex, the debugging equipment is very expensive, and most engineers are unfamiliar with it.
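The bitrates quoted above follow directly from pixel count × bits per pixel × frame rate; a quick sanity check (ignoring blanking and protocol overhead):

```python
def raw_bitrate_gbps(megapixels, bits_per_pixel, fps):
    """Uncompressed video bitrate in Gbps, ignoring blanking and protocol overhead."""
    return megapixels * 1e6 * bits_per_pixel * fps / 1e9

print(round(raw_bitrate_gbps(2, 10, 30), 2))  # RAW10, 2 MP   -> 0.6
print(round(raw_bitrate_gbps(2, 24, 30), 2))  # RGB888, 2 MP  -> 1.44
print(round(raw_bitrate_gbps(8, 24, 30), 2))  # RGB888, 8 MP  -> 5.76
```

The 8-megapixel figure of about 5.7 Gbps is what makes a 10 Gbps PHY necessary for raw high-resolution camera traffic over Ethernet.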
Ethernet can also bridge cameras, such as Marvell’s 88QB5224.


If the SoC's MIPI CSI-2 interfaces are exhausted but more cameras must be connected, Ethernet bridging can be considered, as shown in the figure. The 88QB5224 supports the IEEE 802.3ch standard at up to 10 Gbps, so connecting four 2-megapixel or 2.5-megapixel cameras is no problem.


The above image shows the input cluster for lidar and millimeter-wave radar sensors.
Lidar may require 1-3 units, with one main lidar. Considering how quickly lidar technology is advancing, ample bandwidth is reserved, up to 10 Gbps, equivalent to a 12-megapixel camera; lidar data is, of course, 3D. At present, 1 Gbps is sufficient for any type of lidar. Auxiliary lidars usually fill blind spots and have very low resolution, needing only 100 Mbps. Corner radars are generally millimeter-wave radars that today mostly output over CAN FD, but since supporting CAN FD is cumbersome for Orin, Ethernet is used instead; corner radars usually support Ethernet output and typically need only 10 Mbps. 4D millimeter-wave radars output much larger data volumes, but 100 Mbps is still sufficient. Finally, for chassis control, the MCU selected is Infineon's latest AURIX TC4xx series, for which 1 Gbps of bandwidth is adequate.
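Adding up the figures above shows how modest the non-camera sensor load is (the unit counts below are one plausible configuration, not a fixed spec):

```python
# Bandwidth budget for the sensor cluster described above, in Mbps.
SENSORS_MBPS = {
    "main_lidar":       1000,    # 1 Gbps suffices for current lidar
    "blind_spot_lidar":  100,    # low-resolution blind-spot unit
    "corner_radars":    4 * 10,  # four corner radars at ~10 Mbps each
    "radar_4d":          100,    # 4D imaging radar
    "chassis_mcu":      1000,    # link to the chassis MCU
}

total_mbps = sum(SENSORS_MBPS.values())
print(total_mbps)  # 2240 -> comfortably within a 10 Gbps backbone
```

Even with the main lidar's reserve, the whole cluster fits with room to spare, so the 10 Gbps headroom is really for future higher-resolution lidar.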
Marvell's 88Q4364 and Broadcom's BCM8989X are currently the only two physical-layer chips supporting 10 Gbps automotive Ethernet.


88Q6113 is Marvell’s latest third-generation 11-port Ethernet switch.


The image above is the internal block diagram of the 88Q6113, which supports TSN with 802.1Qav/Qbv, 802.1Qat, and 802.1AS.


The image above shows the video output cluster for Orin B. Orin's display output is limited to DisplayPort (DP) 1.4a, which can also be configured as HDMI or eDP. However, Orin has up to seven USB ports: three USB 3.2 and four USB 2.0. USB and DisplayPort convert easily between each other, and DisplayPort can be converted to other formats. Most high-resolution displays today use DP or eDP input, while automotive displays mainly use LVDS, though the trend is toward eDP input, as seen in current 2K and 4K automotive OLED screens; this saves considerable format-conversion time. DP can also be converted to LVDS, for example with Parade Technologies' PS8625, which offers a maximum bandwidth of 2.7 Gbps and a maximum resolution of 1920*1200; incidentally, this is the chip Tesla uses. USB 2.0 has relatively low bandwidth, sufficient for small rear screens of approximately 1280*720 and also adequate for AR-HUD. USB 3.2 is sufficient to support 2K resolution, and with Orin's powerful GPU, driving five 2K screens should be relatively easy. Orin A also requires three video outputs: one to the central display (CID), with manual signal-source switching, and two for the electronic rearview mirrors; Orin A's DP outputs can likewise be converted to LVDS.
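As a rough check on which link class can drive which panel, raw uncompressed bandwidth can be computed from resolution, color depth, and refresh rate (the panel list below is illustrative; real links add line-coding and blanking overhead, and USB-attached displays typically compress):

```python
def panel_gbps(width, height, bpp=24, fps=60):
    """Raw uncompressed video bandwidth in Gbps (no blanking or line-coding overhead)."""
    return width * height * bpp * fps / 1e9

# Illustrative panels from the discussion above
panels = {
    "LVDS-class 1920x1200": (1920, 1200),
    "2K (2560x1440)":       (2560, 1440),
    "4K (3840x2160)":       (3840, 2160),
}
for name, (w, h) in panels.items():
    print(f"{name}: {panel_gbps(w, h):.2f} Gbps")
```

The 1920*1200 panel lands just above 3.3 Gbps raw, which is why a dedicated DP-to-LVDS bridge handles it, while 4K panels demand the full DP 1.4a link.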


The cockpit audio section can follow ADI's reference design, a top-tier cockpit audio system that includes engine noise cancellation and road-noise cancellation; low cabin noise is an important prerequisite for good sound quality. The head unit is Orin B. The outdated MOST bus requires optical fiber, which is cumbersome and has been abandoned by the industry. The diagram also lacks a branch to the T-Box for eCall.


The image above shows the cockpit peripherals. Generally, the SoC's PCIe ports connect to the PCIe interfaces of Bluetooth and Wi-Fi wireless modules, but the highest-speed PCIe lanes on both Orins are already occupied, and board-to-board connections sometimes use PCIe as well. A PCIe-to-Ethernet conversion is therefore designed in, using Microchip's LAN7431. The Ethernet switch selected is the 7-port 88EA6321. Since many of Orin's USB interfaces are occupied, an additional USB hub is needed; Microchip's USB4914 is chosen. The cockpit MCU remains Renesas's S4 or the traditional RH850. USS refers to the ultrasonic sensors, which generally connect over the DSI3 bus with one master and eleven slaves; the ultrasonic system connects to Orin via UART+SPI, which is relatively cumbersome.
NVIDIA DRIVE 6.0 Network Topology Safety Mode


Finally, here is the network topology diagram of NVIDIA's latest autonomous driving system, DRIVE 6.0, for reference. DRIVE 6.0 has two network topologies: BASE and Safety, the latter shown in the image. In Safety mode, the Ethernet switch's internal CPU is bypassed, so OTA is not supported, and the maximum MTU shrinks from 16 KB to 1500 bytes. Tegra here refers to Orin, and AURIX to Infineon's TC397T. Ethernet switch 1 is Marvell's 88Q5072, and Ethernet switch 2 is the 88Q6113. LAN7431 is Microchip's PCIe-to-Ethernet chip. Safety mode sacrifices some switch performance for safety by adding a higher-level switch, the 88Q6113, although in reality one 88Q5072 would suffice.
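The MTU reduction mainly costs framing efficiency and per-frame processing. A rough calculation, assuming standard Ethernet framing overhead of 38 bytes (preamble, header, FCS, interframe gap) and treating the 16 KB jumbo MTU as 16000 bytes:

```python
import math

OVERHEAD = 38  # preamble(8) + Ethernet header(14) + FCS(4) + interframe gap(12)

def efficiency(mtu):
    """Fraction of wire time carrying payload for back-to-back full frames."""
    return mtu / (mtu + OVERHEAD)

def frames_for(payload_bytes, mtu):
    """Frames needed to carry one payload at the given MTU."""
    return math.ceil(payload_bytes / mtu)

image_bytes = 8_000_000 * 3  # one 8 MP RGB888 frame, 24 MB
print(round(efficiency(1500), 4), frames_for(image_bytes, 1500))    # 0.9753 16000
print(round(efficiency(16000), 4), frames_for(image_bytes, 16000))  # 0.9976 1500
```

The bandwidth efficiency loss is only about two percent, but moving one camera frame takes over ten times as many Ethernet frames, which is the real cost of Safety mode's smaller MTU.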
NVIDIA DRIVE 6.0 Network Topology Base Mode


This is the BASE mode of the DRIVE 6.0 network topology, with the Ethernet switch being the 88Q5072. Orin has two 10 Gbps links: one to the Ethernet switch and one to the central gateway, which carries the vehicle's backbone Ethernet.
NVIDIA also specifically documents APIs for the automotive Ethernet TSN standards, including time synchronization (802.1AS gPTP), traffic shaping and scheduling (802.1Qav, 802.1Qbv), frame preemption (802.1Qbu/802.3br), frame replication for redundancy (802.1CB), per-stream filtering and policing (802.1Qci), and stream reservation (802.1Qat, 802.1Qcc).
The future centralized computing architecture resembles a PC: all data processing relies on a central compute unit, while foundational middleware such as AUTOSAR, together with operating systems like Linux, QNX, and Android, forms something akin to Windows, essentially transparent to most programmers, who simply design application software. That is the true software-defined vehicle.
Statement: This article represents the author’s personal views only.

