
How Immersive Driving Simulators Help Car Makers

Written by Phil Morse | Oct 23, 2018 11:26:00 AM

There’s a uniquely human aspect to testing and developing new cars. Despite all the quantifiable data available, a human driver is arguably the ultimate arbiter of how well a car handles and performs. Beyond subjective appeal, cars also have to be robust enough to withstand the unpredictable nature of human drivers. That’s a challenging element to model in a virtual world.

Driving simulators appear to offer the best of both worlds. They give engineers and test drivers the chance to sample a car’s behavior first-hand, and even to observe how everyday drivers react to new technologies, all within a controlled and safe laboratory environment. Historically, car companies and simulator manufacturers have struggled to unlock this potential, but thanks to new technology, the driving simulator may finally have come of age.

Manipulating the senses

The effectiveness of a simulator depends entirely on its ability to manipulate a driver’s senses. Professional test drivers and vehicle dynamics engineers are quick to tell real-world experiences from virtual ones; they can pick up on even very small amounts of latency, so it’s vital that all the sensory cues – the sights, sounds and motions – are delivered at the right time.
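
To make the timing requirement concrete, here is a minimal sketch of a cue-latency budget check. Everything in it – the pipeline stages, the millisecond figures and the 30 ms budget – is an illustrative assumption, not a measured value from Ansible Motion or any real simulator.

```python
# Hypothetical latency budget for a driver-in-the-loop cue pipeline.
# All stage names and millisecond figures below are illustrative
# assumptions, not measured values from any real simulator.

CUE_PIPELINES_MS = {
    "visual": {"physics_step": 2.0, "render": 8.0, "display_scanout": 8.0},
    "audio":  {"physics_step": 2.0, "synthesis": 3.0, "output_buffer": 5.0},
    "motion": {"physics_step": 2.0, "cueing_filter": 3.0, "actuation": 10.0},
}

BUDGET_MS = 30.0  # assumed end-to-end perceptual threshold for this sketch


def total_latency_ms(stages: dict[str, float]) -> float:
    """Sum per-stage delays to get a pipeline's end-to-end latency."""
    return sum(stages.values())


for cue, stages in CUE_PIPELINES_MS.items():
    latency = total_latency_ms(stages)
    status = "OK" if latency <= BUDGET_MS else "OVER BUDGET"
    print(f"{cue:>6}: {latency:5.1f} ms  [{status}]")

# The cues must also arrive *together*: the spread between the fastest
# and slowest pipeline is what breaks the illusion for a trained driver.
latencies = [total_latency_ms(s) for s in CUE_PIPELINES_MS.values()]
print(f"inter-cue skew: {max(latencies) - min(latencies):.1f} ms")
```

The last line matters as much as the totals: even if every cue pipeline is individually within budget, a large skew between the fastest and slowest cue is itself perceptible.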

This is harder than it sounds, and it’s not helped by the fact that traditional driving simulators tend to use hexapod motion machinery originally designed for the aircraft industry. The dynamics of road vehicles are characterized by short, sharp movements, as opposed to the low-frequency pillowing created by aircraft wings. Believe it or not, even the most agile fighter jet lags behind a family hatchback when it comes to directional changes.

In response to this, Ansible Motion has developed a unique ‘stratiform’ motion system: a multi-layered machine and control strategy designed expressly for ground vehicle dynamics. A dedicated X-Y stage provides the lateral and longitudinal movement, while the layers above generate the yaw, pitch, roll and vertical (Z-axis) ‘bounce’ motions.
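
As a rough illustration of the layered idea, the sketch below routes a six-degree-of-freedom motion demand to hypothetical layers. The data structure and the exact split of degrees of freedom are assumptions made for illustration; they mirror the description above, not Ansible Motion’s actual control allocation.

```python
from dataclasses import dataclass


@dataclass
class MotionDemand:
    """A 6-DOF platform demand (metres and radians; values hypothetical)."""
    x: float      # longitudinal (surge)
    y: float      # lateral (sway)
    z: float      # vertical (bounce / heave)
    roll: float
    pitch: float
    yaw: float


def allocate_stratiform(d: MotionDemand) -> dict[str, dict[str, float]]:
    """Route each degree of freedom to a layer of the machine.

    The split below (translation on an X-Y stage underneath, with
    rotation and heave layers above) follows the article's description;
    the exact allocation in a real stratiform system is an assumption.
    """
    return {
        "xy_stage":    {"x": d.x, "y": d.y},                # translation layer
        "yaw_layer":   {"yaw": d.yaw},                      # rotation about Z
        "tilt_layer":  {"roll": d.roll, "pitch": d.pitch},  # roll / pitch
        "heave_layer": {"z": d.z},                          # vertical bounce
    }


# Example: a combined braking-and-turn-in demand.
demand = MotionDemand(x=-0.15, y=0.05, z=-0.02,
                      roll=0.01, pitch=-0.03, yaw=0.04)
for layer, dofs in allocate_stratiform(demand).items():
    print(layer, dofs)
```

The appeal of a layered split is that each layer only has to be good at its own motions: the X-Y stage can be tuned for the short, sharp translational movements that road cars produce, independently of the rotation and heave machinery above it.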

Read the full article on Virtual Perception Magazine