Driver-in-the-Loop (DIL) simulators have become indispensable tools for modern automotive development. They allow engineers and human evaluators to assess vehicle attributes, control systems, and human-machine interactions in a controlled laboratory environment long before physical prototypes are available, and they also play a valuable role alongside physical testing once it’s underway.
As the complexity of modern vehicles increases – especially with the rise of advanced driver assistance systems (ADAS), electrification, onboard system advancements, and autonomous driving – the value of DIL simulation has grown dramatically.
However, the effectiveness of a modern DIL simulator depends not only on its fidelity as a sensory immersion platform (via motion, vision, audio, haptic systems, etc.), but also on its ability to integrate seamlessly with external software and hardware tools. Vehicle physics models, Software-in-the-Loop (SIL) and Hardware-in-the-Loop (HIL) systems, and sensor simulation frameworks must all interact in real time with the simulator. For vehicle engineering teams, a properly configured DIL simulator lab can become the hub of an expanded, interconnected virtual testing ecosystem.
We’ve touched upon this topic in previous articles such as Driving Simulator Hardware and Software Compatibility, but this article looks at toolset integration from a different perspective by exploring why open connectivity is essential for engineering-class DIL simulators. Along the way, we’ll highlight several widely used software and hardware tools that we have successfully integrated.
Driver-in-the-Loop Simulation in Modern Vehicle Development
DIL simulation connects a real human driver or participant with a simulated vehicle and environment. When we place the human at the centre of DIL, we see that all the simulated elements – motion systems, vision systems, physics models, world-space representations and virtual scenarios – are really meant to serve as a sensory immersion and behavioural interaction environment.
DIL simulation immersion is analogous to the human experience of a real car in the real world, but the systems in play are different. To help visualise this, let’s look at how we might consider attribute assessment of a real car (as shown on the left; image courtesy of AVL Vehicle Composer™), compared to an “attribute assessment” of a DIL simulator (as shown on the right):

Vehicle engineers are, of course, intimately familiar with the real car paradigm. For example, improvements in on-centre steering feel criteria are achieved by systematic tuning of physical components (tyres, steering racks, etc.) and vehicle control systems (EPAS mapping, controller gains, etc.). But the DIL simulator configuration can be a bit more mysterious. Namely, we might ask: What simulated systems need to be included in a DIL simulator environment to provide useful insights for the corresponding real-world assessment category? In the case of on-centre feel, a DIL simulator may need to communicate with specific tyre models, road surface representations, steering torque emulator settings, Hardware-in-the-Loop (HIL) test benches, and so on.
These DIL-integrated systems need to be identified and specified in order to allow engineers to evaluate specific vehicle behaviours and human evaluation criteria.
Bottom line: To be useful for engineering work, a DIL simulator itself needs to operate within a larger simulation workflow that includes appropriate physics modeling, controller development toolsets, and hardware validation systems. And this needs to occur in a way that is less like a “science project” and more like a “plug-and-play” scheme.
The Need for Open and Flexible Integration
Modern automotive development is highly distributed across multiple tools and teams. Vehicle dynamics engineers may use one set of tools, control engineers another, and hardware validation teams yet another. A DIL simulator that cannot interface easily with the various systems required by various functional groups quickly becomes isolated and limited in its usefulness.

In order to cross the divide, so to speak, and help meet the needs of all stakeholders, DIL simulator integration capabilities are critical in four key areas:
1. Vehicle and tyre physics modeling
2. Visual and environment modeling
3. Software-in-the-Loop
4. Hardware-in-the-Loop
The most useful DIL simulators are those that provide open co-simulation and I/O exchange capabilities that establish rich communication pathways with the above.
At Ansible Motion, we have developed our Distributed Data Bus (DDB) co-simulation environment to specifically address this. DDB is a powerful, flexible, synchronous real-time computing environment with open and modular software architecture that has proven itself in the field, time and again, over the last 15 years.
And it’s worth noting that DDB receives continuous development to meet the needs of our customers. At the time of this article, our DDB is evolving into a less distributed, lower latency implementation that will soon be re-badged as RTC+SCC. For those who are interested: RTC is our Real Time Controller that primarily serves as the data and communication backbone for our DIL simulators; SCC is our Simulation Control Centre that serves as a supervisory / operator interface for all the DIL-specific and externally-connected systems.
Stay tuned for more about this progression. But for now, let’s get back on track by taking a closer look at the four key integration areas identified above.
Integration with Vehicle and Tyre Physics Models
Accurate vehicle and tyre modeling is the foundation of any driving simulator. Some simulator providers are locked into specific physics modelling toolsets (or at least have “highly preferred” solutions). In keeping with our modular approach, Ansible Motion opens the door for its customers to use the modeling tools that best suit their needs. These can be commercial tools or in-house “black box” modeling environments.

The tyre modeling realm is extensive, covering a wide range of predictive performance attributes that significantly influence the overall DIL simulator driving experience. Most tyre models – including MF-Tire (with both Siemens and non-Siemens solvers), MF-Swift, Fraunhofer ITWM’s CDTire, Cosin FTire, and many more – are fully integrated with Ansible Motion’s DIL simulator environments, so specific virtual test driving sessions can be easily configured.
We’ll refer you to our previous articles such as 3 Approaches for Driving Simulator Tire Models that describe our integrations with advanced, real-time tyre models – including thermal, empirical, and physical, structure-based models – to simulate tyre-road-vehicle interactions.
In the realm of commercially-available vehicle modeling tools, there are many options that also smoothly integrate with Ansible Motion DIL simulators.
One widely used tool is CarSim, part of the broader VehicleSim toolset. CarSim provides high-fidelity, lumped-parameter models of suspension systems, tyres, powertrains, and vehicle dynamics, allowing engineers to simulate braking, handling, ride, and stability across a wide range of scenarios. Ansible Motion also connects seamlessly with IPG CarMaker and Hexagon’s Adams/VTD simulation environments.
AVL VSM™ is another established and trusted vehicle dynamics simulation tool, especially for the development and optimisation of vehicle energy and driving attributes from initial concept, through to testing and sign-off phases. AVL is also a market-leading supplier of automotive lab test equipment, so there is a convenient bridge to HIL integrations (which are discussed below).

Another widely-used tool is Simpack, a multi-body dynamics (MBD) simulation software from Dassault Systèmes. Simpack is often used for detailed mechanical system modeling, including suspension kinematics, drivetrain dynamics, and flexible body modeling. Because it can generate real-time capable models, integration with Ansible Motion DIL simulators is seamless, allowing high-fidelity representations of complex mechanical systems.
Typically, tools like the ones mentioned above support integration into our DIL simulator systems’ co-simulation framework through APIs and real-time interfaces. There are also standardised model exchange frameworks such as the Functional Mock-up Interface (FMI), by which models can be packaged as Functional Mock-up Units (FMUs) and re-used across multiple simulation environments in addition to Ansible Motion DIL simulators. This “model portability” can help maintain consistency between teams and across vehicle development stages.
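To make the FMI idea above more concrete, here is a minimal sketch of the co-simulation contract it standardises: a model is packaged behind a uniform interface (set inputs, advance one time step, read outputs), so any compliant master can schedule it. The class and method names below are illustrative stand-ins, not the actual FMI C API, and the vehicle-speed model is a deliberately simple assumption.

```python
# Toy "FMU": first-order vehicle speed response to a throttle input,
# exposed through a set-input / do-step / get-output interface in the
# spirit of FMI co-simulation. All names and parameters are illustrative.

class VehicleSpeedFmu:
    def __init__(self, time_constant=2.0, gain=30.0):
        self.tau = time_constant   # s, response time constant (assumed)
        self.gain = gain           # m/s reached at full throttle (assumed)
        self.speed = 0.0           # state: vehicle speed, m/s
        self.throttle = 0.0        # input: 0..1

    def set_input(self, name, value):
        if name == "throttle":
            self.throttle = value

    def do_step(self, dt):
        # Explicit Euler step of dv/dt = (gain * throttle - v) / tau
        self.speed += dt * (self.gain * self.throttle - self.speed) / self.tau

    def get_output(self, name):
        if name == "speed":
            return self.speed

def run_master(fmu, dt=0.01, t_end=10.0):
    """A trivial co-simulation master: fixed-step scheduling of one FMU."""
    t = 0.0
    while t < t_end:
        fmu.set_input("throttle", 0.5)  # constant demand for the demo
        fmu.do_step(dt)
        t += dt
    return fmu.get_output("speed")

final_speed = run_master(VehicleSpeedFmu())
```

Because the master only sees the generic interface, the same scheduling loop could drive any model packaged this way – which is exactly the portability benefit FMUs provide across simulation environments.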
Integration with Visualisation and Environment Simulations
Beyond vehicle and tyre physics, DIL simulators must also integrate with visualisation and environment simulation platforms. This is necessary to place the simulated vehicle (and human participant) into a content-rich, interactive, virtual world-space.
Historically, we’ve seen a number of game-engine technologies adopted for this purpose. For example, the widely-used Unreal Engine and Unity environments have evolved in recent years to provide high-fidelity rendering capabilities for roads, traffic, weather, and lighting conditions. Other, related tools have emerged as well – such as CARLA, an open-source environment simulation built atop Unreal Engine that is used primarily for autonomous vehicle simulations. CARLA provides detailed urban environments and sensor simulation capabilities, allowing engineers to test perception algorithms and autonomous driving stacks in realistic conditions.

One of Ansible Motion’s sister companies, rFpro, deserves special mention. rFpro provides engineering-class world-space simulation environments that are used exclusively for developing ground vehicles. Its solutions are used for the development, testing and validation of autonomous vehicles, ADAS, sensors, on-board control and hardware systems, vehicle dynamics and human factor studies.
rFpro specialises in geo-specific environment models (digital twins) that are in a class of their own when it comes to integration with DIL simulators. rFpro environments provide more than photo-realistic imagery across various weather and lighting conditions – they also deliver engineering-class road surface (terrain) representations that are critical for proper tyre model interactions, as well as the extremely low-latency image rendering capability that is necessary for the multi-channel display systems used in large, dynamic DIL simulator labs. rFpro has also developed a specialised environment simulation solution called AV elevate that enables the tuning of sensor systems, the training of perception and control algorithms, and testing of full AV technology stacks – all of which are important for closed-loop virtual perception testing and the creation of accurate synthetic training data for AI and sensor fusion systems.
Since Ansible Motion DIL simulators can be integrated with powerful visualisation and environment simulations such as these (and more), it means that engineering teams can use their simulator labs to replicate anything from standard proving ground tests to extremely complex driving scenarios.
Integration with Software-in-the-Loop (SIL)
SIL testing is typically aimed at evaluating control algorithms within simulation environments – both off-line and DIL – before code is compiled and deployed on real hardware. This step is essential for validating vehicle strategies such as stability control, torque vectoring, and autonomous driving functions.
One of the most common environments for SIL development is the MathWorks’ toolset, consisting of MATLAB and Simulink. These tools are widely used for model-based development and control system design in the automotive industry.
For instance, Simulink can integrate vehicle dynamics solvers such as those mentioned above (CarSim, Simpack, etc.) through block interfaces – which can be thought of as “co-simulation links.” In this configuration, vehicle physics can be executed inside the DIL simulator (or with an external solver), control algorithms can be executed within Simulink, and I/O signals can be exchanged between the various systems in real time.
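The split described above – physics in one solver, control logic in another, with I/O signals exchanged each tick – can be sketched as a simple closed loop. The plant model, the PI cruise controller, and all gains below are assumptions chosen for illustration; in practice the controller side would be a Simulink model and the plant would run inside the DIL simulator or an external solver.

```python
# Hedged sketch of a SIL-style co-simulation link: vehicle physics on
# one side, a control algorithm on the other, exchanging signals once
# per fixed time step. All models and gains are illustrative.

def plant_step(speed, drive_force, dt, mass=1200.0, drag=0.8):
    """'Simulator' side: longitudinal vehicle model with quadratic drag."""
    accel = (drive_force - drag * speed * speed) / mass
    return speed + accel * dt

def make_pi_controller(kp=2000.0, ki=400.0):
    """'Controller' side: PI cruise control, returned as a closure."""
    integral = 0.0
    def step(target, measured, dt):
        nonlocal integral
        error = target - measured
        integral += error * dt
        return kp * error + ki * integral
    return step

def cosim(target_speed=25.0, dt=0.005, t_end=30.0):
    speed = 0.0
    controller = make_pi_controller()
    t = 0.0
    while t < t_end:
        force = controller(target_speed, speed, dt)  # controller -> plant
        speed = plant_step(speed, force, dt)         # plant -> controller
        t += dt
    return speed

speed_at_end = cosim()
```

The essential point is the signal exchange at every tick: the controller never sees the plant’s internals, only its measured outputs, which mirrors how a Simulink model and an external physics solver communicate through a co-simulation interface.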

In addition, code-generation tools – such as dSPACE’s TargetLink – can then automatically convert validated Simulink models into production code for electronic control units (ECUs). This workflow enables engineers to move from DIL-simulator-assisted algorithm design to production-ready software while maintaining traceability and validation within the simulation environment(s).
Integration with Hardware-in-the-Loop (HIL)
While SIL validates software models, HIL evaluates real embedded hardware interacting with simulated vehicle systems and live DIL simulator drivers and participants.
In a HIL-connected DIL setup, the DIL simulator replaces the real vehicle, as usual, while an actual ECU implementation and/or mechanical test bench – with production or prototype components – contributes to the running of control software and/or logic elements. The DIL simulator might be responsible for generating vehicle responses, environmental inputs, and sensor signals in real time, while real hardware is called upon “in the loop.”
Ansible Motion has integration expertise that covers different virtual testing configurations that might be characterised as “HIL,” each requiring different levels of active communications with our DIL simulators. Broadly speaking, we term these Traditional HIL and Mechanical HIL (mHIL).
Our experience with Traditional HIL applications involves integrating our DIL simulators with a number of commercially available HIL platforms such as real-time systems from dSPACE, PXI-based systems from National Instruments, real-time target machines from Speedgoat, and real-time “simulators” from OPAL-RT Technologies – among many others.
We also have experience integrating our DIL simulators with specialised real-time computing platforms such as Concurrent Real-Time systems. These platforms are specifically designed for low-latency, deterministic simulation workloads. In a sense, a Concurrent RT system (or similar) replicates what we accomplish natively with our DDB (soon to be RTC+SCC) co-simulation environment. Meaning: The “master clock” governing all the DIL simulator systems and models can easily be shared and synchronised with external systems. This really calls direct attention to the extremely modular and flexible nature of our DIL simulator ecosystems! Many simulator providers simply do not (or cannot) provide this level of openness.
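The shared “master clock” idea above can be illustrated with a minimal fixed-step scheduler: one clock ticks, and every connected subsystem – native or external – steps at a rate derived from it, so everything stays in lockstep. The scheduler structure and the example rates are assumptions for illustration only.

```python
# Illustrative sketch of synchronised, fixed-step scheduling from one
# master clock. Subsystem names and rates are assumed for the example.

class Subsystem:
    def __init__(self, name, rate_hz):
        self.name = name
        self.rate_hz = rate_hz
        self.ticks = 0

    def step(self):
        self.ticks += 1  # a real subsystem would advance its model here

def run_synchronised(subsystems, master_rate_hz=1000, duration_s=1.0):
    """Tick each subsystem on master-clock multiples of its own period."""
    total_ticks = int(master_rate_hz * duration_s)
    for tick in range(total_ticks):
        for sub in subsystems:
            divisor = master_rate_hz // sub.rate_hz  # rates assumed to divide evenly
            if tick % divisor == 0:
                sub.step()

physics = Subsystem("vehicle physics", 1000)   # e.g. native DIL model
hil_rig = Subsystem("external HIL target", 500)  # e.g. a real-time test rig
vision = Subsystem("image generator", 100)
run_synchronised([physics, hil_rig, vision])
```

Sharing the clock with an external real-time platform amounts to letting that platform (or the DIL system) own the tick and having the other side step on it, rather than each free-running on its own timebase.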
One natural extension of this is our ability to integrate with Mechanical HIL (mHIL) test benches and lab equipment. The key to understanding how this is possible, is to view our DIL simulator ecosystems as we do: as ecosystems that we’ve strategically designed to invite attribute enhancements and extensions (as depicted in the leading image in this article), and ecosystems that are agnostic as to whether they are communicating with real hardware or virtual models, e.g., a real powertrain running on a chassis dyno is treated the same as a computer model of a powertrain.
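The hardware/model agnosticism described above is, in software terms, an interface question: the simulation loop talks to a powertrain through one contract and does not care whether the other end is a chassis dyno or a computer model. The class names, torque map, and dyno stub below are purely illustrative, not Ansible Motion’s actual interfaces.

```python
# Sketch of hardware/model agnosticism: the same simulator call works
# against a virtual powertrain model or a (stubbed) dyno-mounted one.

from abc import ABC, abstractmethod

class PowertrainSource(ABC):
    @abstractmethod
    def torque(self, throttle, rpm):
        """Return drive torque (Nm) for the current demand and speed."""

class PowertrainModel(PowertrainSource):
    """Virtual powertrain: a simple assumed torque map."""
    def torque(self, throttle, rpm):
        return throttle * max(0.0, 300.0 - 0.02 * rpm)

class DynoPowertrain(PowertrainSource):
    """Real powertrain on a dyno; here a stub standing in for the rig I/O."""
    def torque(self, throttle, rpm):
        # A real implementation would command the dyno and read back a
        # measured torque over the lab's real-time link.
        return self._read_measured_torque(throttle, rpm)

    def _read_measured_torque(self, throttle, rpm):
        return throttle * max(0.0, 300.0 - 0.02 * rpm)  # stub value

def simulator_tick(source: PowertrainSource, throttle, rpm):
    """The DIL loop calls the same method either way."""
    return source.torque(throttle, rpm)

virtual = simulator_tick(PowertrainModel(), 0.5, 3000)
real = simulator_tick(DynoPowertrain(), 0.5, 3000)
```

Swapping a model for real hardware then becomes a configuration change rather than a rewrite of the simulation loop.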

We work with a number of test bench and lab equipment specialists to integrate our DIL simulators with mHIL systems. As mentioned above, we work closely with AVL, so we can offer seamless DIL integrations with AVL SPECTRA™ series chassis dynos and other equipment. We also work closely with Mdynamix on real hardware integrations for steering systems, ECUs, and control software. These are but two examples of many.
One interesting footnote is that Ansible Motion DIL simulators also integrate with themselves in the same way. Meaning: We can easily connect several DIL simulators to create multi-sim configurations, which enables multiple people to participate in the same simulator session. For example, several people can drive their own virtual vehicles at the same time, and interact with everyone within the same virtual world-space.
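A multi-sim session of the kind just described can be sketched as a simple publish/read pattern: each simulator publishes its own vehicle state every tick and reads back everyone else’s, so all participants share one world-space. The session object and state fields below are assumptions for the example, not Ansible Motion’s actual protocol.

```python
# Illustrative multi-sim session: each simulator publishes its own
# vehicle state and queries the states of all other participants.

class MultiSimSession:
    def __init__(self):
        self.states = {}  # simulator id -> latest published vehicle state

    def publish(self, sim_id, state):
        self.states[sim_id] = state

    def other_vehicles(self, sim_id):
        return {k: v for k, v in self.states.items() if k != sim_id}

session = MultiSimSession()
# Two DIL simulators each publish their vehicle's pose for this tick...
session.publish("sim_a", {"x": 120.0, "y": 4.5, "heading_deg": 90.0})
session.publish("sim_b", {"x": 98.0, "y": 4.5, "heading_deg": 90.0})
# ...and each renders the other participant in its own world-space.
seen_by_a = session.other_vehicles("sim_a")
seen_by_b = session.other_vehicles("sim_b")
```

In a real lab this exchange would run over a low-latency network link at the simulation rate, but the contract – publish your state, receive everyone else’s – is the same.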
Ansible Motion goes the extra mile to ensure all connected models and systems can run in real time while exchanging synchronous signals with ECUs and mechanical hardware, motion and vision systems, sensor arrays, and anything else that might be necessary. This truly open architecture enables full closed-loop testing between human participants, software, hardware, and simulated vehicle behaviours in virtual environments.
Benefits of an Open Simulation Ecosystem
The ability to connect Ansible Motion DIL simulators with external tools provides several important advantages:
- Re-use of Models Across Development Phases – Vehicle models developed using any of the tools mentioned above (and others) can be re-used in SIL, HIL, and DIL environments. This ensures consistent behaviour across the development process and supports model correlation and validation.
- Faster Vehicle Development Cycles – DIL simulation allows engineers to actively test and evaluate control strategies and vehicle configurations long before physical prototypes are available, reducing cost, development time, and resource consumption. When simulation is used in parallel with physical testing, it removes much of the guesswork, allowing engineers to make the most of scheduled test sessions and pursue excellence criteria (as opposed to only being able to address minimal acceptance criteria due to testing time constraints).
- Early Hardware Validation – HIL system integrations allow real ECUs and other real hardware systems to interact with simulated vehicles and scenarios, enabling embedded software validation at early stages of development. In addition, DIL simulation allows failsafe and edge-case testing in a repeatable, controlled, safe environment.
- Cross-Domain Collaboration – Open interfaces allow different engineering teams – vehicle dynamics specialists, control engineers, software developers, and hardware validation teams – to connect with a unified simulation ecosystem while still being able to use their preferred tools.
This last bullet hints at the heart of the matter: DIL simulation labs, by default, invite human participation in virtual test-driving sessions, so DIL labs can naturally serve as collaboration hubs. But in order to serve vehicle development programmes in this capacity, DIL simulators must have the ability to connect seamlessly with a wide range of external software and hardware systems.
As automotive systems continue to grow in complexity, so too do the tools necessary to support their development. With the expansion of autonomous driving, electrification, software-defined vehicles, and other new technologies, it is clear that the importance of open, modular, and interoperable simulation architectures will only increase. Ansible Motion stands at the ready to support this, providing truly open and modular DIL simulation ecosystems supported by deep field experience that results in DIL simulators going far beyond what might have been considered “state of the art” just a few years ago.

