What is ADAS?

Definition

The vast majority of vehicle accidents are caused by human error, much of which Advanced Driver Assistance Systems (ADAS) can help avoid. The role of ADAS is to prevent deaths and injuries by reducing the number of car accidents and lessening the severity of those that cannot be avoided.

Essential safety-critical ADAS applications include:

  • Pedestrian detection/avoidance
  • Lane departure warning/correction
  • Traffic sign recognition
  • Automatic emergency braking
  • Blind spot detection

These lifesaving systems incorporate the latest interface standards and run multiple vision-based algorithms to support real-time multimedia, vision co-processing, and sensor fusion subsystems.

The “smartphonization” of ADAS applications is an early step toward the realization of autonomous vehicles.


What is ASIL?

ASIL (Automotive Safety Integrity Level) is the risk classification scheme defined by ISO 26262, the functional safety standard for road vehicles. Safety-critical ADAS applications must be developed and verified to meet the appropriate ASIL for their function.

How does ADAS work?

Automobiles are the foundation of the next generation of mobile-connected devices, with rapid advances being made in autonomous vehicles. Autonomous application solutions are partitioned into various chips, called systems on a chip (SoCs). These chips connect sensors to actuators through interfaces and high-performance electronic control units (ECUs).

Self-driving cars use a variety of these applications and technologies to gain 360-degree vision, both near (in the vehicle’s immediate vicinity) and far. That means hardware designs are using more advanced process nodes to meet ever-higher performance targets while simultaneously reducing demands on power and footprint. 


ADAS Applications

Significant automotive safety improvements in the past (e.g., shatter-resistant glass, three-point seatbelts, airbags) were passive safety measures designed to minimize injury during an accident. Today, ADAS systems actively improve safety with the help of embedded vision by reducing the occurrence of accidents and injury to occupants.

The implementation of cameras in the vehicle involves AI functions that use sensor fusion to identify and process objects. Sensor fusion, similar to how the human brain processes information, combines large amounts of data with the help of image recognition software, ultrasound sensors, lidar, and radar. This technology can physically respond faster than a human driver ever could. It can analyze streaming video in real time, recognize what the video shows, and determine how to react to it.
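To make the fusion idea concrete, here is a minimal Python sketch (with invented sensor readings and noise figures, not values from any real system) of how two independent distance estimates, say from radar and a camera, could be combined by weighting each inversely to its measurement variance:

```python
def fuse_estimates(radar_dist, radar_var, camera_dist, camera_var):
    """Combine two noisy distance estimates (in meters) by
    inverse-variance weighting, the optimal linear fusion for
    independent Gaussian measurements."""
    w_radar = 1.0 / radar_var
    w_camera = 1.0 / camera_var
    fused = (w_radar * radar_dist + w_camera * camera_dist) / (w_radar + w_camera)
    fused_var = 1.0 / (w_radar + w_camera)
    return fused, fused_var

# Radar is typically precise in range; the camera estimate is noisier,
# so the fused distance lands much closer to the radar reading.
dist, var = fuse_estimates(radar_dist=42.0, radar_var=0.25,
                           camera_dist=44.0, camera_var=4.0)
```

Production sensor fusion runs recursive filters (such as Kalman filters) over many sensors and time steps; the inverse-variance weighting above is the single-step core of that idea.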

Some of the most common ADAS applications are:

 

1. Adaptive Cruise Control

Adaptive cruise control (ACC) is particularly helpful on the highway, where drivers can find it difficult to monitor their speed and other cars over a long period of time. ACC can automatically accelerate, slow down, and at times stop the vehicle, depending on the actions of other objects in the immediate area.
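As a rough illustration, ACC can be thought of as choosing a target speed from the gap to the vehicle ahead. The Python sketch below is simplified, and the gain and time-gap values are made up for the example, not taken from a real controller:

```python
def acc_target_speed(set_speed, own_speed, gap, lead_speed,
                     desired_time_gap=2.0, k=0.5):
    """Return a target speed in m/s: cruise at the driver's set speed
    when the road ahead is clear, otherwise slow toward the lead
    vehicle's speed, with a correction proportional to the gap error."""
    desired_gap = own_speed * desired_time_gap  # meters needed at this speed
    if gap >= desired_gap:
        return set_speed                        # clear road: plain cruising
    correction = k * (desired_gap - gap) / desired_time_gap
    return max(0.0, min(set_speed, lead_speed - correction))

# Closing in on a slower car: back off below the lead vehicle's speed
# until the desired time gap is restored.
target = acc_target_speed(set_speed=30.0, own_speed=30.0,
                          gap=40.0, lead_speed=25.0)
```

Note that the same function also covers stop-and-go: if the lead vehicle halts and the gap shrinks, the returned target speed bottoms out at zero.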

 

2. Glare-Free High Beam and Pixel Light

Glare-free high beam and pixel light uses sensors to adjust to darkness and the vehicle’s surroundings without disturbing oncoming traffic. This new headlight application detects the lights of other vehicles and redirects the vehicle’s lights away to prevent other road users from being temporarily blinded. 

 

3. Adaptive Light Control

Adaptive light control adapts the vehicle’s headlights to external lighting conditions. It changes the strength, direction, and rotation of the headlights depending on the vehicle’s environment and darkness.

 

4. Automatic Parking 

Automatic parking helps inform drivers of blind spots so they know when to turn the steering wheel and stop. Vehicles equipped with rearview cameras have a better view of their surroundings than traditional side mirrors. Some systems can even complete parking automatically without the driver’s help by combining the input of multiple sensors.

 

5. Autonomous Valet Parking

Autonomous valet parking is a new technology that works through vehicle sensor meshing, 5G network communication, and cloud services that manage autonomous vehicles in parking areas. The vehicle’s sensors provide information about where it is, where it needs to go, and how to get there safely. All of this information is methodically evaluated and used to control acceleration, braking, and steering until the vehicle is safely parked.

 

6. Navigation System

Car navigation systems provide on-screen instructions and voice prompts to help drivers follow a route while concentrating on the road. Some navigation systems can display live traffic data and, if necessary, plan a new route to avoid traffic jams. Advanced systems may even offer head-up displays (HUD) to reduce driver distraction.

 

7. Night Vision

Night vision systems enable drivers to see things that would otherwise be difficult or impossible to see at night. There are two categories of night vision implementations: active night vision systems project infrared light, while passive systems rely on the thermal energy emitted by cars, animals, and other objects.

 

8. Blind Spot Monitoring

Blind spot detection systems use sensors to provide drivers with important information that is otherwise difficult or impossible to obtain. Some systems sound an alarm when they detect an object in the driver’s blind spot, such as when the driver tries to move into an occupied lane. 

 

9. Automatic Emergency Braking

Automatic emergency braking uses sensors to detect whether the vehicle is about to collide with another vehicle or other objects on the road. This application can measure the distance of nearby traffic and alert the driver to any danger. Some emergency braking systems can take preventive safety measures, such as tightening seat belts, reducing speed, and applying adaptive steering to avoid a collision.
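The core decision can be sketched with a time-to-collision (TTC) calculation. This Python illustration is simplified; the thresholds are invented for the example, and real systems also model braking dynamics, road conditions, and sensor confidence:

```python
def time_to_collision(gap_m, closing_speed_mps):
    """Seconds until impact at the current closing speed;
    infinite if the gap is not shrinking."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return gap_m / closing_speed_mps

def aeb_action(gap_m, own_speed, lead_speed, warn_ttc=2.5, brake_ttc=1.5):
    """Escalate from monitoring to a driver warning (plus pre-crash
    measures such as belt tensioning) to full autonomous braking."""
    ttc = time_to_collision(gap_m, own_speed - lead_speed)
    if ttc <= brake_ttc:
        return "full_brake"
    if ttc <= warn_ttc:
        return "warn_driver"
    return "monitor"
```

For example, at a 20 m gap and a 10 m/s closing speed the TTC is 2.0 s, which falls in the warning band above; halve the gap and the system commands full braking.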

 

10. Crosswind Stabilization

This relatively new ADAS feature supports the vehicle in counteracting strong crosswinds. The sensors in this system can detect strong pressure acting on the vehicle while driving and apply brakes to the wheels affected by crosswind disturbance. 

 

11. Driver Drowsiness Detection

Driver drowsiness detection warns drivers when they show signs of sleepiness or distraction. There are several ways to determine whether a driver’s attention is decreasing. In one approach, sensors analyze the movement of the driver’s head and heart rate for signs of drowsiness. Other systems issue driver alerts similar to the warning signals for lane detection.

 

12. Driver Monitoring System

The driver monitoring system is another way of measuring the driver’s attention. The camera sensors can analyze whether the driver’s eyes are on the road or drifting. Driver monitoring systems can alert drivers with noises, vibrations in the steering wheel, or flashing lights. In some cases, the car will take the extreme measure of stopping the vehicle completely. 
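The escalation described above can be sketched as a simple mapping from gaze time to alert stage. In this Python illustration the time thresholds and alert names are invented placeholders, not values from any production system:

```python
def escalation_level(eyes_off_road_s):
    """Map the continuous time a driver's gaze has been off the road
    to an increasingly intrusive alert stage."""
    if eyes_off_road_s < 2.0:
        return "none"
    if eyes_off_road_s < 4.0:
        return "chime"               # audible warning
    if eyes_off_road_s < 6.0:
        return "steering_vibration"  # haptic warning
    return "controlled_stop"         # extreme measure: bring the car to a stop
```

Real systems feed this kind of stage machine from camera-based gaze and head-pose estimates sampled many times per second, resetting the timer whenever attention returns to the road.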

 

13. 5G and V2X

This new 5G-enabled ADAS feature, with increased reliability and lower latency, provides communication between the vehicle and other vehicles or pedestrians, generally referred to as V2X (vehicle-to-everything). Today, millions of vehicles connect to cellular networks for real-time navigation. V2X will enhance existing methods and the cellular network to improve situational awareness, control or suggest speed adjustments to account for traffic congestion, and deliver real-time GPS map updates. V2X is also essential to support over-the-air (OTA) software updates for the now-extensive range of software-driven systems in cars, from map updates to bug fixes to security updates and more.

 


The Adoption of 5G for Automotive Applications

New 5G data communication enables V2X applications.

Why is ADAS important?

According to the August 2016 Traffic Safety Facts Research Note by the National Highway Traffic Safety Administration (NHTSA), “The Nation lost 35,092 people in crashes on U.S. roadways during 2015.” This 7.2% increase was “the largest percentage increase in nearly 50 years.” An analysis revealed that about 94% of those accidents were caused by human error, and the rest by the environment and mechanical failures.

The opportunity to reduce car accidents is making automotive ADAS even more critical. Automatic emergency braking, pedestrian detection, surround view, parking assist, driver drowsiness detection, and gaze detection are among the many ADAS applications that assist drivers with safety-critical functionality to reduce car accidents and save lives. 

The future of ADAS

The increasing amount of automotive electronic hardware and software requires significant changes in today’s automobile design process to address the convergence of conflicting goals:

  • Increased reliability
  • Reduced costs
  • Shorter development cycles

The trend is shifting from distributed ADAS electronic control units (ECUs) to a more integrated ADAS domain controller with centralized ECUs. This means that we are currently at what SAE International designates as Level 2 (Partial Driving Automation), where the vehicle can control both steering and acceleration/deceleration but falls short of self-driving because a human must remain in the driver’s seat, supervise the system, and be ready to take control of the car at any time.

[Figure: Levels of driving automation]

Shifting toward fully autonomous cars (vehicles capable of sensing their environment and operating without human involvement) requires increasingly sophisticated electronic architectures.

With the increase in electronic architecture comes an increase in the volume of data. To handle this data, the new integrated domain controllers require higher computing performance, lower power consumption, and smaller packaging.

The adoption of 64-bit processors, neural networks, and AI accelerators to handle the high volume of data requires the latest semiconductor features, semiconductor process technologies, and interconnect technologies to support ADAS capabilities.

The reduction of electronic modules leads to centralized computing architectures, requiring critical automotive building blocks, including processors with vision processing capabilities, neural networks, and sensor fusion, all while addressing the need for quality, safety, and security.

Every aspect of the car is designed to be more connected, requiring subsystem and SoC designers to expand the scope of safety measures beyond the traditional steps taken to ensure physical safety. Applying the latest embedded computer vision and deep learning techniques to automotive SoCs brings greater accuracy, power efficiency, and performance to ADAS systems.
