| 3 min read

Real-time Models for Automotive Sensors

real time sensor model automotive autonomous

In the rush to innovate – to include complex on-board systems in vehicle programs and to conduct the required virtual sign-offs for those systems – vehicle constructors are naturally keen to integrate real-time sensor (software) models into their vehicle simulation programs.

Unfortunately, most sellers of silicon – suppliers of sensors, complete ECUs, and everything in between – do not typically have ready-to-go Software-in-the-Loop (SIL) models that allow their offerings to be included inside vehicle physics simulation environments.

Timing is Everything

Unlike, say, tires or dampers or other traditional hardware bits that have mathematical model / simulation-executable counterparts, automotive sensors (LiDAR, radar, cameras, etc.), and the logic that accompanies them, are typically on-boarded and released into the marketplace without having been vetted by simulation in the traditional sense.  Instead, most sensors are evaluated relatively late in the product development cycle, in parallel with prototype vehicle testing.  To date, this has caused surprisingly little concern.  But the cart has been put before the horse, so to speak, and the horse is now catching up.

See the World Differently

When SIL sensor models are not available, Hardware-in-the-Loop (HIL) tethering to real-time simulation environments is a possibility.  For example, a physical sensor residing in a HIL test bench (such as a dSPACE system) can be aimed at a miniature model, or fed information directly from an executable simulation environment.

If human participation is required (as is often the case, for virtual test driving, sign-off, etc.), sensor + HIL/SIL environments can be connected to engineering class Driver-in-the-Loop (DIL) simulator labs.  The “engineering-class” definition is key, because this allows co-execution of world-space content that is appropriate for “sensor eyes” as well as “human eyes.” (As discussed in previous articles, what is pleasing to the human eye may or may not be pleasing to HIL or SIL sensors!)

Should all engineering-class DIL simulator laboratories have communication bridges with HIL benches? At this slice in history, the answer is probably “yes.”  After all, a DIL simulator lab's primary purpose is to provide human participants – drivers / occupants / evaluators – with salient sensory information, so they will behave as if they are interacting with real cars.  And these days, cars are equipped with devices that can't be staged for evaluation in any other way.

Connect the Dots

Some may be unsure about how real people might connect, subjectively and objectively, with this brave new world of sensors and AI logic.  After all, where are the lines in the sand between hardware, software, and human involvement (in any vehicle experiment)?  It's a good question. 

Since DIL, HIL, and SIL simulation technologies tend to present themselves as moving targets, no one reputable should claim to know the answer.  But we can assert that human assessment of proposed technologies is mission critical for product development.  After all, cars, at this slice in history, are still, ultimately, consumer products.  And product success and brand identity ultimately depend upon customer acceptance / assessment in the marketplace.

Is there any comfort in that?  Probably not.  After all, customers can be fickle and unpredictable.  But perhaps there is some comfort for automotive engineers, vehicle constructors, and product developers in knowing that, in connectivity terms, tethering DIL simulation laboratories to complex sensor HIL benches is possible without too much fuss.  So it's entirely possible to evaluate new concepts subjectively and objectively, before any metal is cut.

For DIL simulators, the tasks are usually the same: exchange information about what the driver/occupant is doing, what the ego vehicle physics model is doing, and what the virtual surroundings are doing.  In technical terms, such software/hardware connectivity challenges are really no different than connecting, say, a CarSim vehicle model to a multi-projector graphics rendering application.
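That exchange loop can be pictured as a fixed-rate tick that reads driver inputs, advances the ego vehicle model, and publishes a world-state packet to any consumers (graphics renderers, HIL benches, SIL sensor models). The sketch below is purely illustrative – the class names, the toy longitudinal model, and the packet fields are all hypothetical stand-ins, not any vendor's actual API:

```python
from dataclasses import dataclass

@dataclass
class DriverInputs:
    steer: float     # steering wheel angle, rad (unused by this toy model)
    throttle: float  # 0..1
    brake: float     # 0..1

@dataclass
class EgoState:
    x: float = 0.0   # longitudinal position, m
    v: float = 0.0   # longitudinal speed, m/s

def step_vehicle(state: EgoState, inputs: DriverInputs, dt: float) -> EgoState:
    # Toy longitudinal model standing in for a full vehicle physics solver
    accel = 5.0 * inputs.throttle - 8.0 * inputs.brake - 0.02 * state.v
    return EgoState(x=state.x + state.v * dt,
                    v=max(0.0, state.v + accel * dt))

def simulation_tick(state: EgoState, inputs: DriverInputs, dt: float = 0.001):
    """One real-time frame: read the driver, advance the ego physics,
    and publish a world-state packet for graphics and HIL/SIL sensor feeds."""
    new_state = step_vehicle(state, inputs, dt)
    world_packet = {"ego_x": new_state.x, "ego_v": new_state.v}
    return new_state, world_packet

# Example: one second of full-throttle driving at a 1 kHz tick rate
state = EgoState()
for _ in range(1000):
    state, packet = simulation_tick(state, DriverInputs(0.0, 1.0, 0.0))
```

In a real lab the packet would go out over a deterministic transport (shared memory, reflective memory, UDP, etc.) at the simulator's frame rate, but the shape of the loop is the same.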

The Human Touch

Including sensors in DIL simulator environments, as either HIL or SIL (or both!), is a positive step towards ensuring the early-and-often human evaluation opportunities that are required for unleashing any proposed systems into the wild with confidence. Is the vehicle accelerating or decelerating too quickly in response to sensor information? Has a particular ADAS intervention been over-gained, thus making vehicle occupants uncomfortable?  Are there any environmental disruptions that encourage a driver to take actions that contradict on-board system intervention logic, resulting in a conflict? There is really no reason to wait to learn how these situations might unfold.

Engineering-class DIL simulators are already positioned to help validate new on-board systems at the earliest stages of product development.  To learn more about Ansible Motion's unique approaches and technologies, please consider downloading our FREE white paper, “10 Advantages of Ansible Motion DIL Simulators”.

Download the 10 Advantages of Ansible Motion's Driving Simulator