3 min read

ADAS Sensors and Beyond

If one needs an illustration of the sensor complexity arising as ADAS-capable vehicles evolve into new autonomous and semi-autonomous modes of transport, look no further than the highly automated Delphi Drive concept vehicle. External awareness sensors such as radar, LiDAR, and cameras are just the tip of the iceberg for this vehicle. On board one can find touch sensors on the steering wheel, driver-facing cameras in the A-pillars, a fingerprint reader in the console, and logic-activated haptic feedback motors in the seats.

With its myriad sensors, feedback devices, and information-processing strategies, this highly modified Audi SQ5 is in use now as a rolling test bed and demonstration platform for all manner of human-to-vehicle and vehicle-to-everything (V2X) possibilities. Tellingly, this test bed of today may be evidence that we are not far from a time when even “normal” production vehicles carry this level of sensing capability.

Sensory Overload

Most Tier 1 suppliers and OEMs assert that when it comes to ADAS and autonomous system development, there is much to do in a short amount of time – both within specific vehicle programs and in a much larger context. Many prefer testing on public roads for the real-world conditions it injects, so it will likely remain a cornerstone of verification methodology for years to come. However, the difficulties and potential perils of testing new systems on public roads are obvious. In some key areas, Driver-in-the-Loop (DIL) simulator labs have become mainstays of ADAS development programs, usually due to special safety and/or Human Machine Interface (HMI) requirements.

The use of DIL simulators is a logical approach for addressing “sensory overload” while reducing risk, since, by default, real people interact with the new technologies as they are being developed, but in a controlled setting. For example, a virtual test drive conducted inside the safe confines of a DIL simulator lab enables one to carefully measure and study both the human and vehicle responses to extreme situations such as sensor fault or unavailability – due to, say, temporary radar blindness caused by slush build-up on the front of the vehicle.
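A fault-injection run of this kind can be sketched in a few lines. The following is purely illustrative (the function names, timings, and fallback mode are invented, not any vendor's actual test harness): a radar feed is blanked for a window of the virtual drive, simulating slush build-up, and the logged supervisory state is checked for a graceful degradation.

```python
# Hypothetical sketch of a DIL-style fault-injection test. All names and
# numbers are illustrative assumptions, not a real simulator API.

def radar_available(t, fault_start=10.0, fault_end=14.0):
    """Return False during the injected radar-blindness window (seconds)."""
    return not (fault_start <= t < fault_end)

def system_state(t):
    """Toy supervisory logic: degrade to camera-only mode on radar loss."""
    return "radar+camera" if radar_available(t) else "camera-only (degraded)"

# Sweep a 20 s virtual test drive in 0.5 s steps and collect the log.
log = [(t / 2, system_state(t / 2)) for t in range(40)]
degraded = [t for t, state in log if "degraded" in state]
print(f"Degraded mode from t={degraded[0]}s to t={degraded[-1]}s")
# prints: Degraded mode from t=10.0s to t=13.5s
```

In a real lab the "log" would of course also capture the driver's response – steering, braking, gaze – which is precisely what the simulator setting makes safe to measure.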

Making Sense of Sensors

One special focus area for vehicle and system development is the tuning of human task intervention thresholds and severities. Adapting an intervention strategy to the demands of real-world scenarios is a common way of mitigating the risk of a false positive – e.g., issuing a brake pre-fill rather than a full brake application when a higher vehicle speed is sensed.
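The idea of grading severity by sensed conditions can be sketched as follows. The thresholds and confidence values here are invented for illustration and resemble no OEM's actual calibration:

```python
# Illustrative intervention-grading logic. The speed threshold (60 km/h)
# and confidence bands are made-up assumptions, not a real calibration.

def aeb_intervention(speed_kph: float, threat_confidence: float) -> str:
    """Map a sensed threat to an intervention level."""
    if threat_confidence < 0.5:
        return "none"
    if threat_confidence < 0.8:
        # Moderate confidence: pre-fill the brake lines at higher speeds so
        # that a later full application bites faster; warn-only below that.
        return "brake pre-fill" if speed_kph > 60 else "warning"
    return "full autonomous braking"

print(aeb_intervention(90, 0.6))    # prints: brake pre-fill
print(aeb_intervention(30, 0.6))    # prints: warning
print(aeb_intervention(100, 0.9))   # prints: full autonomous braking
```

The pre-fill branch is the mitigation described above: if the threat turns out to be a false positive, the driver feels little or nothing, but if it is real, the subsequent full application acts sooner.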

Moreover, different OEMs may take different approaches to tuning functions such as Automated Emergency Braking (AEB): more conservative companies look to minimize the risk of a false activation, whereas others may accept a higher rate of false activations in the name of more effective system performance. OEMs must walk a tightrope here, as evidenced by the emergence of customer complaints and lawsuits in their key markets from both sides of the system-performance fence – systems claimed to be too conservative and systems deemed to intervene too aggressively.

In the short term, ADAS systems are becoming more complex as Tier 1 suppliers and vehicle manufacturers fuse the complementary strengths of different sensor capabilities. Delphi’s own RACam unit, for example, seen on newer Volvos, combines radar and camera into one behind-the-windshield package with shared processing. So it’s not just about converting measurements into driving decisions – it’s about doing so in the most efficient and cost-effective way.
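One classic way to exploit complementary sensor strengths is inverse-variance weighting of overlapping measurements. The toy sketch below is not Delphi's actual RACam processing – all numbers are invented – but it shows the principle: radar typically measures range precisely, a camera coarsely, and the fused estimate leans toward whichever sensor is tighter.

```python
# Toy radar/camera range fusion via inverse-variance weighting.
# Not a real product's algorithm; variances and ranges are invented.

def fuse_range(r_radar: float, var_radar: float,
               r_camera: float, var_camera: float) -> float:
    """Minimum-variance combination of two independent range estimates."""
    w_radar = 1.0 / var_radar
    w_camera = 1.0 / var_camera
    return (w_radar * r_radar + w_camera * r_camera) / (w_radar + w_camera)

# Radar: 40.0 m with 0.25 m^2 variance; camera: 42.0 m with 1.0 m^2 variance.
fused = fuse_range(40.0, 0.25, 42.0, 1.0)
print(f"fused range: {fused:.1f} m")  # prints: fused range: 40.4 m
```

The result lands much nearer the precise radar reading, while the camera would still contribute what radar cannot – object classification – in a full fusion pipeline.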

Shifting Gears

The relationship between people and cars is shifting. On-board automotive sensing technology is undergoing a shift as well – at once cause and consequence of the first – from information-only or warning-only sensors and systems that augment the vehicle control task, towards fully autonomous functionality. As such, the accompanying functional safety management considerations are becoming increasingly complex.

With ISO 26262 functional safety requirements informing ADAS test programs, and Safety of the Intended Functionality (SOTIF) implications to consider, Tier 1 suppliers and OEMs are understandably drawn to development tools such as DIL simulators that can increase efficiency (as measured by both cost-per-test and tests-per-day) and/or directly reduce the consumable costs incurred by traditional on-road testing.

There can be little doubt that vehicle development concerns now extend beyond the ordinary, into areas such as examining human interactions with on-board functions and alerts, as well as proving out system performance, robustness, and reliability. Vehicle manufacturers realize that trust and effective communication pathways are required if any deployed system is to be accepted. For example, drivers might be asked to simultaneously believe that their car knows best, while also believing that their vigilance and skill might be required at any moment. Practically speaking, will this be possible? We’ll soon have a sense of it.

To learn more about how OEMs and Tier 1 suppliers use engineering-class Driver-in-the-Loop (DIL) simulators as product development tools, download our FREE white paper, Look Down the Road: Driving Simulator Technology & How Automotive Manufacturers will Benefit.
