Friday, June 23, 2023
Haptic Rendering - A Virtual Reality Challenge
The word “haptic” means related to the sense of touch, particularly the perception and manipulation of objects using the sense of touch. A group of students at the Sapienza University of Rome and the National Institute of Applied Science (INSA) of Lyon launched a pilot project to evaluate the brain response in conjunction with mechanical signals - forces, friction, and vibrations - induced by exploring surface textures. Dewesoft helped them get the feel of the future.
The haptic or tactile rendering project is a joint venture researching tribology and dynamics. Tactile interaction generates mechanical stimuli between the skin and the explored surface. A great social and technological challenge is creating a tactile rendering, just as we can for sight and hearing.
In fact, visual and auditory rendering technology has been advancing for decades, and our daily lives are populated by ever more sophisticated displays and acoustic systems.
But are we able to simulate the sense of touch?
For example, imagine having tactile feedback integrated into the touchscreen of our smartphone and being able to perceive the texture of the object we want to buy online at a distance or having tactile gloves that simulate object textures in virtual environments - augmented reality, video games, movies, etc.
Tactile rendering would have applications in all fields of daily life, from entertainment to the medical and biomedical fields, to digital commerce, etc.
It is no coincidence that many international research groups are trying, in a multidisciplinary way, to better understand the mechanisms that underlie tactile perception, and then to master and recreate them artificially.
But let's take a step back.
Human beings learn about and interact with the world around them through the five senses: sight, hearing, touch, taste, and smell. The sense of sight is mediated by electromagnetic waves, hearing by pressure waves, and taste and smell by biochemical interactions.
The sense of touch is the most complex and least understood of the five senses: it requires direct interaction with the object of perception. It is based on a set of mechanical stimuli coming from the entire musculoskeletal system. How these stimuli are generated by interacting with surfaces, and how they act synergistically to form tactile perception, is still being studied.
The current project aims to study the mechanical signals underlying the tactile perception of surface textures and to develop a tactile rendering device capable of recreating the perception of surfaces at a distance.
The study is carried out in a collaboration between the Tribology and System Dynamics groups of the Department of Mechanical and Aerospace Engineering (DIMA) of the "La Sapienza" University of Rome and the Laboratory of Mechanics of Contacts and Structures (LaMCoS) of the National Institute of Applied Sciences (INSA) in Lyon, France.
The project is part of a broader, strongly international context in which Engineering, Neuroscience, and Psychology laboratories collaborate to understand and simulate the most complex of the five senses in a multidisciplinary way.
Measurement and analysis of tactile stimuli
Tactile interaction between the finger and the explored surface triggers mechanical stimuli such as contact forces, friction, vibrations, temperature, and skin deformation. These are detected by mechanoreceptors present in the skin, muscles, and ligaments. The deformation of the mechanoreceptors due to mechanical stimuli causes the creation of electric potential at the nerve endings connected to the receptors. A current is transmitted through the nervous system to the brain, where the stimuli are decoded.
An experimental setup - see Figure 1 - was implemented to measure the tactile stimuli generated on the finger exploring and sensing sample surfaces with known topography. The testing setup developed allows measuring both the contact forces and the Friction-Induced Vibrations (FIV), i.e., the vibrations that are generated by the sliding contact between the finger and the surface and which, according to recent studies, are among the mechanical stimuli most important for the perception and discrimination of fine surface textures.
The sample surface was glued onto a triaxial force transducer to measure the tangential and normal components of the contact forces. During exploration, an accelerometer was glued to the nail to measure the FIV globally acting on the finger.
An eight-channel SIRIUSi DAQ system with input and output channels was used to acquire the accelerometer signal and the three contact force components. The DualCore technology and the high isolation of the channels allowed acquiring the signals induced by tactile perception with extreme precision and very low background noise - in particular, the induced vibrations, which have very weak amplitudes.
Spectral analyses - FFT, PSD, and spectrograms - were performed on the acceleration signals to investigate the link between the spectral characteristics of the induced vibrations, the perception of textures, and the surface topographies. The coefficient of friction was also calculated from the three components of the contact force - see Figure 2.
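These analysis steps were performed in the Dewesoft software; purely as an offline illustration, the same computations can be sketched in Python with SciPy. The sampling rate and the signals below are placeholders standing in for the measured channels, not the actual project data.

```python
import numpy as np
from scipy import signal

fs = 10_000  # Hz, assumed sampling rate (placeholder)

# Placeholder records standing in for the measured channels:
# three contact-force components (N) and the nail accelerometer (m/s^2).
rng = np.random.default_rng(0)
n = fs * 2
fx, fy = 0.1 * rng.standard_normal((2, n))   # tangential force components
fz = 0.5 + 0.05 * rng.standard_normal(n)     # normal load
acc = rng.standard_normal(n)                 # friction-induced vibrations

# Coefficient of friction: ratio of tangential to normal force magnitude.
mu = np.sqrt(fx**2 + fy**2) / fz

# Power Spectral Density of the vibrations (Welch estimate).
f_psd, psd = signal.welch(acc, fs=fs, nperseg=1024)

# Spectrogram to follow the spectral content over the finger stroke.
f_spec, t_spec, sxx = signal.spectrogram(acc, fs=fs, nperseg=512)
```

With real data, `mu` would be inspected over the steady-sliding portion of the stroke, and `psd`/`sxx` compared across surface topographies.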
Haptic rendering based on friction-induced vibrations
A tactile rendering device has been developed to recreate the perception of textures from a distance. The aim is to use this device to simulate the tactile stimuli previously measured during the exploration of surfaces.
The device consists of an electro-active polymer piezoelectric actuator and the driving chain needed to control it. A methodology for processing the vibrational signal measured on surfaces has also been developed, essential for correctly simulating the mechanical stimuli with the actuator [1].
When the user touches the surface of the actuator, driven by the suitably processed signal, the device generates on the fingertip the same vibrational stimulus previously measured while exploring the actual surfaces.
A Texas Instruments electronic board was used to drive the piezoelectric actuator. As input, this board accepts an analog signal that can be generated by the Signal Generator integrated with Dewesoft – see Figure 3.
To correctly reproduce the measured acceleration signal with the tactile device, the transfer function of the electro-mechanical system and of the user's finger must be taken into account. To characterize it, a random signal is sent to the haptic device using the signal generator integrated with Dewesoft: an output channel is connected to the Texas Instruments board to drive the piezoelectric actuator, while the signal from the accelerometer positioned on the fingernail is acquired as input.
The Modal Testing plug-in available in Dewesoft allows you to simply calculate and display the Frequency Response Function between the random signal sent as input to the tactile device and the accelerometric signal acquired through SIRIUS – see Figure 4.
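The Modal Testing plug-in handles this inside Dewesoft; conceptually, the FRF between the random drive signal and the measured acceleration can be estimated with the standard H1 estimator (cross-spectrum over input auto-spectrum). The sketch below uses synthetic placeholder signals and an assumed sampling rate.

```python
import numpy as np
from scipy import signal

fs = 10_000  # Hz, assumed sampling rate (placeholder)

# Placeholder records: random voltage sent to the actuator board and the
# accelerometer response measured on the fingertip. Here the "response"
# is faked as a band-pass-filtered copy of the drive plus noise.
rng = np.random.default_rng(1)
n = fs * 4
drive = rng.standard_normal(n)
b, a = signal.butter(2, [50 / (fs / 2), 800 / (fs / 2)], btype="band")
response = signal.lfilter(b, a, drive) + 0.01 * rng.standard_normal(n)

# H1 estimator of the Frequency Response Function:
# FRF(f) = P_xy(f) / P_xx(f), averaged over segments (Welch method).
f, p_xy = signal.csd(drive, response, fs=fs, nperseg=2048)
_, p_xx = signal.welch(drive, fs=fs, nperseg=2048)
frf = p_xy / p_xx  # complex-valued: magnitude and phase of the system
```

The H1 estimator is the common choice when the noise is assumed to sit mainly on the output (here, the accelerometer) rather than on the drive signal.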
Once characterized, the Frequency Response Function of the system is used to process the FIVs previously measured during the exploration of real surfaces [1]. The accelerometric signal is divided, in the frequency domain, by the previously calculated system FRF. The resulting signal, transformed back to the time domain, is input to the tactile device to correctly reproduce the FIV on the user's finger through the piezoelectric actuator.
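This frequency-domain division is an inverse-filtering (deconvolution) step. A minimal sketch, using a placeholder FRF and FIV record, might look as follows; the small Tikhonov regularization term is an assumption added here to avoid amplifying bands where the FRF magnitude is tiny, and is not described in the original methodology.

```python
import numpy as np

fs = 10_000          # Hz, assumed sampling rate (placeholder)
n = 4096
rng = np.random.default_rng(2)
fiv = rng.standard_normal(n)  # measured FIV record (placeholder)

# Placeholder FRF: a smooth first-order complex response with no zeros.
freqs = np.fft.rfftfreq(n, d=1 / fs)
frf = 1.0 / (1.0 + 1j * freqs / 500.0)

# Divide the FIV spectrum by the system FRF (regularized inversion).
eps = 1e-3
spec = np.fft.rfft(fiv) * np.conj(frf) / (np.abs(frf) ** 2 + eps**2)

# Back to the time domain: this is the signal fed to the actuator so that
# the vibration reproduced on the fingertip matches the measured FIV.
drive = np.fft.irfft(spec, n=n)
```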
Using the Function Generator integrated with Dewesoft, it is possible to reproduce an arbitrary signal through the output channels of the SIRIUS system – see Figure 5. This way, it is possible to drive the actuator with the desired signal sent directly as an analog input to the Texas Instruments control board.
By comparing the acceleration signal (FIV) measured on the real surface with the one recorded by the accelerometer on the finger during the vibration generated by the actuator, it is possible to verify that the induced vibration is correctly reproduced by the device. The signals associated with the real surface and the simulated one superimpose correctly - see Figure 6. In conclusion, the piezoelectric tactile rendering device correctly reproduces, directly on the user's fingertip, the vibrations induced by the surfaces.
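Beyond visually superimposing the two time traces, one hedged way to quantify such agreement offline is the magnitude-squared coherence between the measured and reproduced vibrations; values near 1 across the band of interest indicate a faithful reproduction. The signals below are placeholders, not project data.

```python
import numpy as np
from scipy import signal

fs = 10_000  # Hz, assumed sampling rate (placeholder)
n = fs * 2
rng = np.random.default_rng(3)
real_fiv = rng.standard_normal(n)                      # FIV on the real surface
reproduced = real_fiv + 0.05 * rng.standard_normal(n)  # FIV on the actuator

# Magnitude-squared coherence: close to 1 at frequencies where the
# reproduced vibration linearly matches the original one.
f, coh = signal.coherence(real_fiv, reproduced, fs=fs, nperseg=1024)
```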
Following the development and verification of the tactile rendering device and of the methodology needed to obtain a correct signal simulation, we carried out discrimination campaigns - see Figure 7 - with groups of volunteers on sample surfaces with different topographies (periodic, isotropic…). The campaigns showed excellent results in discriminating both real surfaces and those simulated through the Friction-Induced Vibrations.
Perspectives: tactile stimuli and brain response
A new project is emerging from the collaboration between the engineers of "La Sapienza", INSA, and the Laboratory of Cognitive Neurosciences of the University of Marseille. It aims to combine tribological and neuroscientific analyses of tactile perception by simultaneously measuring and analyzing the mechanical stimuli and the electroencephalographic (EEG) response on real and simulated surfaces.
The aim is to reconstruct the chain of tactile perception starting from the surface texture, passing through mechanical stimuli, up to the brain response.
A pilot campaign was launched to evaluate the brain response in conjunction with mechanical signals (forces, friction, and vibrations) induced by exploring surface textures.
The experimental setup for the mechanical stimuli is the one previously described: the contact forces were monitored using a triaxial force transducer placed under the sample surface, while the friction-induced vibrations were measured with an accelerometer fixed on the nail of the finger – see Figure 8.
Again, Dewesoft was very helpful in obtaining precise measurements with low background noise. A custom 8-channel SIRIUSi system (four BNC input channels, three STG input channels, and a MULTI channel that can act, using the appropriate DSI adapters, as both input and output) was used to acquire and generate the analog signals.
To allow the marking of the electroencephalographic signals, the voltage signals from the triaxial force transducer were split through T-connectors, while the IEPE accelerometer signal was acquired as input and re-emitted through the SIRIUS output channel towards the EEG acquisition system.
References
[1] L. Felicetti, E. Chatelet, A. Latour, P.-H. Cornuault and F. Massi, “Tactile rendering of textures by an Electro-Active Polymer piezoelectric device: mimicking Friction-Induced Vibrations,” Biotribology, vol. 31, p. 100211, 2022.