ADAS stands for Advanced Driver Assistance Systems. Since ADAS is an "assistance system", it is by definition somewhat different from "autopilot". From another perspective, however, developing ADAS is also the only way for us to move toward the ultimate goal of autopilot.
From the functional point of view, at present, ADAS mainly includes:
In addition, systems such as the emergency braking system, automatic headlight on/off system, automatic parking assist system, night vision system, crosswind stabilization system, and driver fatigue detection system also belong to the category of ADAS.
From the perspective of system architecture, ADAS is mainly composed of three modules: sensors, a processor, and actuators.
(1) Sensor: Sensors detect various external signals. For ADAS, ultrasonic sensors, radar, LiDAR, cameras, etc. are mainly used to detect distance. Figure 1 below shows the relationship between sensor functions and their applications.
Figure 1 : Overview of sensing technologies used in response to different functional requirements of ADAS
(Quoted from: https://www.synopsys.com/automotive/what-is-adas.html)
(2) Processor: The processor's function is to process the incoming signals; in cars it is usually called the Electronic Control Unit (ECU). After receiving a signal, the ECU classifies and processes it appropriately, and then outputs a control signal to the actuator. Besides the ECU, processors such as the microprocessor (MPU) and digital signal processor (DSP) are also used.
(3) Actuator: According to the control signals sent by the processor, actuators drive the various actuated devices so that they complete the required actions. For example: engaging the automatic brake to stop the car, displaying a warning message on the screen, or sounding the buzzer as an audible warning.
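The three-module flow described above can be illustrated with a toy sketch. All function names, thresholds, and actions here are hypothetical and purely illustrative; a real ECU is a hard real-time embedded system, not a Python script:

```python
# Toy sketch of the sensor -> processor (ECU) -> actuator flow.
# Thresholds and commands are hypothetical illustration values.

def ecu_process(distance_m: float, brake_threshold_m: float = 5.0) -> str:
    """Classify a sensor distance reading and return a control command."""
    if distance_m < brake_threshold_m:
        return "BRAKE"   # obstacle very close: start the automatic brake
    elif distance_m < 2 * brake_threshold_m:
        return "WARN"    # obstacle near: warn the driver
    return "NONE"

def actuator(command: str) -> str:
    """Map an ECU control signal to a (simulated) device action."""
    actions = {
        "BRAKE": "automatic brake engaged",
        "WARN": "buzzer warning sounded",
        "NONE": "no action",
    }
    return actions[command]

print(actuator(ecu_process(3.0)))   # prints "automatic brake engaged"
print(actuator(ecu_process(8.0)))   # prints "buzzer warning sounded"
print(actuator(ecu_process(50.0)))  # prints "no action"
```

The point of the sketch is only the division of labor: the sensor supplies a measurement, the ECU turns it into a control signal, and the actuator turns the signal into a physical action.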
LiDAR stands for Light Detection and Ranging. Within advanced driver assistance systems (ADAS), LiDAR can be applied to the adaptive cruise control, emergency braking, pedestrian detection, and forward collision warning systems. Its main function is precise ranging.
LiDAR basically consists of three parts: a laser light source, a light sensor, and an imaging mechanism. Laser light sources are generally semiconductor lasers; light sensors are generally photodiodes (PD) or avalanche photodiodes (APD); imaging mechanisms are divided into scanning and non-scanning types. The distance measurement method commonly used in automotive LiDAR is Time of Flight (ToF) technology, which is explained in detail in the next section of this article.
At present, the development of autopilot cars is divided into two factions according to whether LiDAR is adopted. The first faction is the camp headed by Tesla, which relies only on millimeter-wave radar and cameras and does not use LiDAR. The second faction is headed by Google, which uses not only millimeter-wave radar and cameras but also the Velodyne HDL-64E LiDAR to capture 360-degree 3D images.
The Tesla camp decided not to use LiDAR because it was too expensive. However, judging from the record accumulated by the two camps over the past ten years, Tesla's autopilot cars have had some serious accidents, while Google's autopilot cars have driven more than 3 million miles of real-world tests with only a dozen or so minor incidents. In addition, from the functional point of view, LiDAR can provide 0.1-degree angular resolution, 100-meter ranging, and a 5~10 Hz frame update rate. This has led many autopilot development teams around the world to a general consensus: at the current level of technology, autopilot cars that do not use LiDAR as a sensor have no problem reaching Level 2~3; however, to reach Level 4~5, that is, "High Driving Automation" or even "Full Driving Automation", LiDAR must be used.
Table 1: The levels of driving automation defined by the Society of Automotive Engineers (SAE)
Because LiDAR is so important to the development of autonomous driving, its products are being pushed toward lower cost, greater durability, and higher safety. The specific directions are as follows:
ToF is the abbreviation of Time of Flight. Once we know the flight time of light, we can calculate the distance by multiplying the speed of light by that time. For example, the distance that light travels in one year is called a light-year.
ToF technology can be subdivided into two types: (1) iToF (Indirect Time of Flight) and (2) dToF (Direct Time of Flight). Both require a transmitter and a receiver; the difference lies mainly in the formula used to calculate the distance.
The transmitter of iToF uses light modulated with a specific period and amplitude. When this modulated incident light is reflected from the surface of an object, the receiver receives reflected light of the same period, but with a phase delay relative to the incident light. Once this phase difference (Δφ) is measured, the distance can be calculated by the following formula, where c is the speed of light and f_mod is the modulation frequency:

d = c × Δφ / (4π × f_mod)
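As a quick sketch, the phase-to-distance conversion can be written in a few lines of Python. The 20 MHz modulation frequency and π/2 phase delay below are illustrative values, not figures from any particular sensor:

```python
import math

C = 299_792_458.0  # speed of light (m/s)

def itof_distance(phase_delay_rad: float, mod_freq_hz: float) -> float:
    """Distance from the measured phase delay of modulated light:
    d = c * Δφ / (4π * f_mod)."""
    return C * phase_delay_rad / (4 * math.pi * mod_freq_hz)

# Example: 20 MHz modulation, measured phase delay of π/2 rad
f_mod = 20e6
d = itof_distance(math.pi / 2, f_mod)
print(f"distance = {d:.3f} m")  # prints "distance = 1.874 m"

# The phase wraps every 2π, so the unambiguous range is c / (2 * f_mod):
print(f"unambiguous range = {C / (2 * f_mod):.2f} m")  # prints "unambiguous range = 7.49 m"
```

The phase-wrapping note is the key practical limitation of iToF: a target beyond c/(2·f_mod) produces the same phase reading as a nearer one, which is why iToF suits shorter ranges than dToF.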
The distance calculation formula of dToF technology is more direct:

d = c × Δt / 2
The light source of dToF generally uses pulsed light, that is, a beam emitted in a very short time. The dToF sensor records the time when the pulse is emitted and the time when the reflected light is received, calculates the time difference (Δt), and then multiplies the time difference by the speed of light and divides by 2 to obtain the distance between the object and the car.
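The dToF calculation can be sketched the same way. The 667 ns round-trip time below is an illustrative value chosen to land near the 100-meter range mentioned earlier:

```python
C = 299_792_458.0  # speed of light (m/s)

def dtof_distance(delta_t_s: float) -> float:
    """Distance from the round-trip time of a light pulse: d = c * Δt / 2.
    Dividing by 2 accounts for the pulse traveling out and back."""
    return C * delta_t_s / 2

# A pulse that returns after 667 ns corresponds to roughly 100 m:
print(f"{dtof_distance(667e-9):.1f} m")  # prints "100.0 m"
```

Note the timescale involved: resolving distance to within a few centimeters requires timing the pulse to a few hundred picoseconds, which is why dToF depends on fast detectors such as the SPAD described below.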
So, what are the advantages and disadvantages of iToF and dToF? A detailed comparison is given in the table below. A short conclusion: in the short term, iToF should be able to hold a certain market share thanks to the cost advantage of CMOS; however, as SPAD process technology continues to evolve, the cost of dToF is expected to drop significantly. At that point, dToF's advantage in detection distance will win it more market share and product applications.
Table 2 Comparison of advantages and disadvantages of iToF and dToF
The Single Photon Avalanche Diode (SPAD) is a semiconductor photodetector. When a sufficiently high reverse bias (usually 100~200 V for silicon) is applied across the SPAD, a photon entering the silicon triggers impact ionization with a multiplication of roughly 100 times (avalanche breakdown), and this internal current gain in turn induces a cascading multiplication effect. The resulting current is large enough to be easily detected by the readout circuit. In terms of process, different doping techniques allow the SPAD to withstand a higher applied voltage without being damaged, so as to obtain greater gain. In general, the higher the reverse voltage, the greater the gain.
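To get a feel for why this cascading gain makes a single photon detectable, here is a back-of-the-envelope sketch. The overall Geiger-mode gain of 10^6 and the 1 ns pulse width are assumed illustrative values; the article itself only cites the roughly 100× ionization factor per stage:

```python
E_CHARGE = 1.602e-19  # elementary charge (C)

def avalanche_current(n_photons: int, gain: float, pulse_width_s: float) -> float:
    """Average current of the avalanche pulse.

    Total charge Q = n_photons * gain * e, spread over the pulse width:
    I = Q / t. Gain and pulse width are illustrative assumptions."""
    charge = n_photons * gain * E_CHARGE
    return charge / pulse_width_s

# One photon, an assumed overall gain of 1e6, a 1 ns avalanche pulse:
i = avalanche_current(1, 1e6, 1e-9)
print(f"{i * 1e6:.1f} uA")  # prints "160.2 uA"
```

Without the avalanche, a single photon yields one electron (about 0.16 aC), far below any readout circuit's noise floor; with a gain of 10^6, the same photon produces a current pulse in the hundreds of microamps, which a simple comparator can register.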
Figure 2 Schematic diagram of the multiplication effect of single-photon avalanche diodes
SPAD is mainly used in LiDAR and long-distance optical fiber communication; in addition, it is beginning to be used in fields such as positron emission tomography (PET) and particle physics. SPAD arrays have also been commercialized; well-known manufacturers include SONY, STMicroelectronics, and ON Semiconductor.
More about SPAD characteristic testing, including the bias regions and current-voltage characteristics, dark count rate (DCR), jitter, etc., will be introduced in the second article of this series. Coming soon!