Real-World ADAS Testing: Measuring Driver Interaction with Dewesoft DAQ

May 15, 2026

Understanding ADAS in modern vehicles
Advanced Driver Assistance Systems (ADAS) have become a defining feature of modern automotive engineering. These systems are designed to enhance safety, improve driver comfort, and reduce human error by automating critical driving functions.
Early ADAS implementations included relatively simple features such as anti-lock braking systems (ABS) and rear-view cameras. Today, however, ADAS technologies have evolved into highly sophisticated systems with safety features such as lane departure warning, adaptive cruise control, collision warning, blind-spot detection, driver drowsiness detection, and emergency braking. At its most basic level, ADAS saves lives when it provides warnings that alert the driver to dangerous conditions. Higher-level ADAS actually takes control, applying the brakes or steering the car to avoid an accident.
These systems combine RADAR, LiDAR, ultrasonic, acceleration, force, position, and vision sensors with powerful computing to perform these complex tasks. As the automotive industry moves toward higher levels of autonomy, understanding how drivers interact with these systems has become just as important as validating them. This is where ADAS testing and driver behavior analysis play a critical role.
Why driver interaction matters in ADAS testing
One of the most complex challenges in ADAS validation is quantifying the interaction between the human driver and automated vehicle systems. Unlike purely mechanical or electrical systems, ADAS introduces a human-in-the-loop dynamic that is difficult to measure with traditional test-and-measurement approaches.
Historically, capturing this interaction in a repeatable, data-driven way was limited by the available technology. Today, however, modern automotive data acquisition (DAQ) systems enable engineers to capture synchronized, multi-domain data streams that provide a complete picture of both vehicle behavior and driver response. As ADAS capabilities evolve, empirical testing plays a key role in shaping future safety standards and ensuring that these systems behave predictably under real-world conditions.
By combining video, vehicle bus data, sensor inputs, and audio signals into a single time-synchronized dataset, engineers can empirically evaluate how drivers respond to ADAS interventions under actual driving conditions.
Common methods for empirically testing human interaction with ADAS technology include:
● Video of the driver to monitor where their eyes are looking and what their hands are doing
● Video of the dashboard to monitor what the driver is seeing
● Video of the exterior of the car to track its position
● GNSS/GPS data to track location and speed
● Sensor inputs from the steering wheel, accelerometers, and braking system
● Microphones to record interior sounds and alerts
● CAN (Controller Area Network) bus data to measure vehicle response and background processes
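To make the CAN bus item above concrete, the sketch below decodes a hypothetical vehicle-speed signal from a raw CAN frame. The message ID, byte layout, and 0.01 km/h scaling are illustrative assumptions, not the actual (proprietary) signal map of any production vehicle; real tests use the manufacturer's database file to interpret each frame.

```python
import struct

def decode_speed_frame(frame_id, data):
    """Decode a hypothetical vehicle-speed CAN frame.

    Assumed layout (illustrative only): message ID 0x1A0 carries the
    vehicle speed as an unsigned 16-bit little-endian value in bytes
    0-1, scaled by 0.01 km/h per bit. Returns None for other frames.
    """
    SPEED_FRAME_ID = 0x1A0  # assumed ID, not a real production message
    if frame_id != SPEED_FRAME_ID or len(data) < 2:
        return None
    (raw,) = struct.unpack_from("<H", data, 0)
    return raw * 0.01  # km/h

# Example: raw value 6500 decodes to 65.00 km/h
frame = bytes([0x64, 0x19, 0, 0, 0, 0, 0, 0])
speed = decode_speed_frame(0x1A0, frame)
```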
A multiphysics approach to ADAS measurement
To explore this interaction in practice, testing was conducted on a modified BMW XM platform equipped with a high-performance vehicle-testing DAQ system.
The measurement strategy focused on capturing several key parameters simultaneously. These included vehicle position, speed, and trajectory, as well as driver inputs such as braking behavior. Just as importantly, the system recorded what the driver was seeing and hearing through synchronized video and audio acquisition.
Multiple cameras were installed to monitor the road, the instrument cluster, and the driver’s environment. At the same time, high-precision GNSS positioning enabled accurate tracking of the vehicle’s movement. Microphones placed near the driver captured audible alerts, such as warning chimes and parking assistance signals.
All of this data was acquired and synchronized using Dewesoft SIRIUS DAQ systems and processed within DewesoftX, allowing engineers to correlate driver actions, vehicle responses, and ADAS behavior in a single unified environment.
To capture driver and ADAS interactions, we identified the following key metrics:
● Vehicle position on the road
● Gauge cluster display data
● Vehicle alerts and chimes
● Vehicle speed
● Vehicle location
● Driver interaction with the brake pedal
Real-world ADAS testing methodology
To ensure realistic results, the test scenario was designed around a familiar driving route near the Dewesoft USA headquarters in northern Ohio. The driver was experienced with the route but had not previously used the vehicle’s ADAS features.
Figure 2. Sensors installed in or on the BMW XM test vehicle:
Left: an external camera monitors lane-keeping. Center: a wide-angle camera records the dashboard instrument cluster. Right: a microphone records what the driver hears.
Analyzing ADAS performance in various driving scenarios
During assisted driving tests, the collected data provided insight into how the vehicle maintained lane position, responded to traffic conditions, and interacted with driver inputs.
By correlating GNSS position data with synchronized video feeds, engineers could visualize the vehicle’s trajectory relative to the road. At the same time, CAN bus data revealed how the ADAS system interpreted and responded to its environment.
This type of ADAS performance analysis is increasingly relevant not only for OEM validation but also for regulatory bodies such as the National Highway Traffic Safety Administration, which rely on real-world data to evaluate system safety and effectiveness.
With the versatile DewesoftX software, collecting this data and correlating it with video was streamlined and intuitive. We used the same SIRIUS 8xACC, SIRIUS 8xSTG, and SIRIUS-SBOX computer as in previous tests. The SIRIUS 8xACC provides eight high-dynamic-range inputs for IEPE accelerometers or voltages. The SIRIUS 8xSTG provides eight high-dynamic-range strain inputs that accommodate a wide variety of sensors, including strain gages of all popular types and configurations.
In addition, we added external cameras to monitor the vehicle’s position within the lane (one on each side) and an internal wide-angle camera to record the gauge cluster. The SBOX’s onboard 100 Hz GNSS/GPS receiver recorded the vehicle’s absolute multi-axis position during the test drives.
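Speed over ground can be derived from consecutive GNSS fixes like those the SBOX records. The sketch below shows the standard calculation: great-circle distance between fixes divided by the sample interval. It is a simplified illustration, not the receiver's internal algorithm, and the 100 Hz rate is just a parameter here.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS-84 fixes."""
    R = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def speeds_mps(fixes, rate_hz=100.0):
    """Speed over ground from consecutive (lat, lon) fixes sampled at a fixed rate."""
    dt = 1.0 / rate_hz
    return [haversine_m(*a, *b) / dt for a, b in zip(fixes, fixes[1:])]
```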
The test consisted of driving a known route around the Dewesoft USA headquarters and concluded with the driver parking in the parking lot. During open-road testing, storage was triggered only when the vehicle speed exceeded 35 mph. This was easily programmed in DewesoftX and automatically triggered, so the driver could focus on driving without worrying about operating the DAQ system.
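The 35 mph storage trigger described above can be expressed as simple threshold logic. DewesoftX configures this through its storing setup rather than code; the sketch below only mirrors the behavior, and the hysteresis band (disarm below 30 mph) is an assumption added to keep the recorder from chattering near the threshold.

```python
class SpeedTrigger:
    """Start/stop recording based on vehicle speed, with hysteresis.

    Illustrative thresholds: arm above 35 mph, disarm below 30 mph.
    This sketch models the trigger logic only; in practice DewesoftX
    handles this automatically in its storing setup.
    """
    def __init__(self, start_mph=35.0, stop_mph=30.0):
        self.start_mph = start_mph
        self.stop_mph = stop_mph
        self.recording = False

    def update(self, speed_mph):
        """Feed one speed sample; returns whether storage is active."""
        if not self.recording and speed_mph > self.start_mph:
            self.recording = True
        elif self.recording and speed_mph < self.stop_mph:
            self.recording = False
        return self.recording
```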
Assisted drive test
The screenshots below demonstrate how multiphysics and even multi-domain data can be displayed and correlated in DewesoftX. Video, CAN bus, latitude and longitude data from GNSS/GPS satellites, maps, and analog sensor data are displayed and recorded synchronously. The test results can be used to study real-world driving conditions and drivers' comfort with varying levels of driver assistance.
“Multi-domain” recording refers to processing data in both the time and frequency domains, as shown in the screen capture below, where the audio from the interior microphone is displayed in a CPB graph and a Y/T recorder graph.
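A CPB (constant percentage bandwidth) graph shows signal energy per fractional-octave band. As a minimal illustration of the underlying idea, and not DewesoftX's actual filter-based implementation, the sketch below sums FFT power into 1/3-octave bands whose edges are spaced by a factor of 2**(1/3); the band limits are assumptions for the example.

```python
import numpy as np

def third_octave_levels(signal, fs, fmin=100.0, fmax=5000.0):
    """Approximate 1/3-octave band levels (dB, arbitrary reference) via FFT.

    Simplified sketch: real CPB analyzers use standardized band filters;
    here we just bin FFT power between band edges fc / 2**(1/6) and
    fc * 2**(1/6) for centers fc spaced by 2**(1/3).
    """
    spectrum = np.fft.rfft(signal * np.hanning(len(signal)))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(spectrum) ** 2
    centers, levels = [], []
    fc = fmin
    while fc <= fmax:
        lo, hi = fc / 2 ** (1 / 6), fc * 2 ** (1 / 6)
        band = power[(freqs >= lo) & (freqs < hi)]
        if band.size:
            centers.append(fc)
            levels.append(10 * np.log10(band.sum() + 1e-20))
        fc *= 2 ** (1 / 3)
    return centers, levels

# A pure 1 kHz tone should dominate the band centered near 1 kHz
fs = 48000
t = np.arange(fs) / fs
centers, levels = third_octave_levels(np.sin(2 * np.pi * 1000 * t), fs)
```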
Users can design multiple display screens within the DewesoftX application. Screens can contain any combination of display widgets, allowing each one to focus on a different aspect of the test. These screens can be selected and even edited freely during and after recording. A sampling of the displays created for the assisted driving tests is shown in Figures 4-7.
Evaluating parking assist and driver feedback systems
In addition to open-road testing, parking assistance functions were evaluated to understand how ADAS systems communicate with the driver during low-speed maneuvers. Parking scenarios provide a unique opportunity to analyze human-machine interaction (HMI), particularly through audible alerts and visual feedback.
Using Dewesoft’s sound measurement capabilities, engineers quantified the intensity and frequency of the vehicle's warning chimes. These acoustic signals were then correlated with distance measurements, camera footage, and dashboard indicators. This allowed for a detailed assessment of how effectively the system communicates risk to the driver, an essential factor in both safety compliance and user experience design.
DewesoftX’s Sound Level software plugin allowed us to measure the intensity of the alert noises generated by the parking sensors. This same plugin can be used to measure the intensity of other noises in the vehicle, such as pre-collision alerts, seatbelt chimes, and more. We then processed the output from the sound level plugin using the Dewesoft Basic Statistics module to determine the maximum and average volume of the chimes. This data can then be used to validate that the vehicle’s safety systems meet regulatory requirements and that the vehicle’s user experience is positive.
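As a simplified illustration of that max/average reduction, and not the Sound Level plugin's actual algorithm (which applies frequency weighting and standardized time constants), the sketch below computes short-window RMS levels in dB and reduces them to maximum and mean, the way the Basic Statistics module reduces the plugin's output. The 125 ms window is an assumption for the example.

```python
import math

def window_levels_db(samples, fs, window_s=0.125, ref=1.0):
    """RMS level per fixed-size window, in dB relative to `ref`.

    Illustrative only: a real sound level meter applies A/C-weighting
    and fast/slow time constants; this sketch uses plain RMS over
    non-overlapping 125 ms windows.
    """
    n = max(1, int(fs * window_s))
    levels = []
    for i in range(0, len(samples) - n + 1, n):
        chunk = samples[i:i + n]
        rms = math.sqrt(sum(x * x for x in chunk) / n)
        levels.append(20 * math.log10(max(rms, 1e-12) / ref))
    return levels

def chime_stats(levels):
    """Maximum and mean level, mirroring the statistics reduction."""
    return max(levels), sum(levels) / len(levels)
```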
From data to insight: the value of synchronized measurement
One of the key advantages of modern vehicle testing systems is the ability to synchronize all measurement channels into a single timeline.
This enables engineers to answer complex questions, such as:
● How quickly does a driver respond to an ADAS warning?
● Does the system intervene before or after driver action?
● Are visual and audible alerts aligned with real-world conditions?
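The first of these questions reduces to measuring the gap between two events on the shared timeline: the warning onset and the first brake response. The sketch below shows that calculation on a synchronized brake channel; the pedal threshold of 5.0 is an illustrative value, not a calibrated figure.

```python
def reaction_time_s(warning_t, brake_times, brake_values, threshold=5.0):
    """Seconds from an ADAS warning to the first brake response.

    `brake_times`/`brake_values` represent a time-synchronized brake
    pedal channel; the threshold is an assumed illustrative value.
    Returns None if no response follows the warning.
    """
    for t, v in zip(brake_times, brake_values):
        if t >= warning_t and v > threshold:
            return t - warning_t
    return None

# Warning at t = 2.0 s; pedal crosses the threshold at t = 2.8 s
rt = reaction_time_s(2.0, [2.0, 2.4, 2.8, 3.0], [0.0, 1.0, 8.0, 9.0])
```

Because every channel shares one timebase in the recorded dataset, the same pattern works for any event pair, such as a chime onset versus a steering correction.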
By analyzing these interactions holistically, engineers can move beyond component-level validation and toward full system-level understanding.
The future of ADAS testing and automotive validation
As ADAS systems become more advanced, testing methodologies must evolve accordingly. High-precision positioning technologies, such as RTK-enabled GNSS and inertial measurement units (IMUs), are enabling centimeter-level accuracy in vehicle tracking.
When combined with advanced simulation environments and object-based testing frameworks, engineers can evaluate collision avoidance, sensor fusion, and autonomous decision-making in both controlled and real-world scenarios.
Modern DAQ platforms support this evolution by offering scalable, synchronized measurement across multiple domains, from electrical signals and vehicle networks to video and acoustics.
Conclusion
ADAS is transforming the automotive landscape, but this shift also introduces greater complexity into testing and validation. Understanding how drivers interact with these systems is essential for ensuring safety, reliability, and user acceptance.
By leveraging advanced automotive data acquisition systems, engineers can capture a complete, synchronized view of vehicle behavior and human response. This enables more accurate validation of ADAS features, supports regulatory compliance, and ultimately contributes to safer roads.
Solutions such as Dewesoft SIRIUS DAQ systems, combined with DewesoftX, provide the flexibility and performance required to meet the demands of modern ADAS testing and vehicle development.
Related articles and webpages:
Blogpost: What is ADAS?
Blogpost: ADAS Standards and ADAS Safety Protocols
Blogpost: Types of ADAS Sensors in Use Today
Blogpost: How Dewesoft USA Equipped BMW XM with the Ultimate Vehicle Testing DAQ
Blogpost: How Are ADAS Systems and Autonomous Vehicles Tested?
Webpage: ADAS Testing




