Virtual reality and electroencephalography for modulation of emotional states through visual and auditory stimulus
January 11, 2022.

Combining virtual reality and EEG technologies, and adding visual and auditory stimuli, Iker López has created a software application for modulating emotional states with binaural beats and neurofeedback techniques.

As part of his graduation project as a Computer Science Engineer at the Escola Politècnica Superior of the Universitat de les Illes Balears (UIB), Iker López de Suso Sánchez developed a software application motivated by providing technological support for mental health care, combining Unity-based games with virtual reality. Iker worked under the supervision of Dr. Francisco José Perales López and Dr. José María Buades Rubio during the 2020/2021 academic year in Palma, Majorca, Spain.

Iker analyzed the brain waves captured by non-invasive EEG devices like the Muse 2 headband, which has four channels plus a reference and communicates over Bluetooth 4.2:

● Gamma (32-100 Hz)

○ High cognitive processing.

○ Learning.

○ Problem solving.

● Beta (13-32 Hz)

○ Concentration.

○ Decision making.

● Alpha (8-13 Hz)

○ Relaxation.

○ Well-being.

● Theta (4-8 Hz)

○ Imagination.

○ Internal processing.

○ Dreams (REM), fears.

● Delta (0.5-4 Hz)

○ Deep meditation.

○ Deep sleep (without dreaming).
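As a quick illustration (not part of Iker's application), the band edges listed above can be expressed as a small lookup, sketched here in Python with half-open intervals so each boundary frequency falls into exactly one band:

```python
def eeg_band(freq_hz):
    """Map a frequency in Hz to its EEG band name, using the edges listed above.

    Intervals are half-open [low, high), so e.g. 13 Hz counts as Beta.
    Returns None for frequencies outside the 0.5-100 Hz range.
    """
    bands = [
        ("Delta", 0.5, 4),
        ("Theta", 4, 8),
        ("Alpha", 8, 13),
        ("Beta", 13, 32),
        ("Gamma", 32, 100),
    ]
    for name, low, high in bands:
        if low <= freq_hz < high:
            return name
    return None

print(eeg_band(10))  # 10 Hz sits in the relaxation-related Alpha band
```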


A binaural beat is an auditory illusion perceived when two different pure-tone sine waves, both with frequencies lower than 1500 Hz, with less than a 40 Hz difference between them, are presented to a listener dichotically (one through each ear).

For example, if a 100 Hz pure tone is presented to a subject's right ear, while a 104 Hz pure tone is presented to the subject's left ear, the listener will perceive the auditory illusion of a third tone, in addition to the two pure tones presented to each ear. The third sound is called a binaural beat, and in this example would have a perceived pitch correlating to a frequency of 4 Hz, that being the difference between the 104 Hz and 100 Hz pure tones presented to each ear.
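The 100 Hz / 104 Hz example can be sketched numerically. The snippet below is illustrative only (the sample rate and duration are arbitrary choices, and real playback would need an audio library): it synthesizes one pure sine tone per ear, and the perceived beat is simply the difference between the two frequencies.

```python
import math

SAMPLE_RATE = 44_100  # samples per second (CD-quality audio)

def binaural_stereo(f_left, f_right, seconds):
    """Generate (left, right) sample lists, one pure sine tone per ear.

    Played dichotically, the listener perceives a binaural beat at
    |f_right - f_left| Hz, even though no tone at that frequency exists
    in either channel.
    """
    n = int(SAMPLE_RATE * seconds)
    left = [math.sin(2 * math.pi * f_left * t / SAMPLE_RATE) for t in range(n)]
    right = [math.sin(2 * math.pi * f_right * t / SAMPLE_RATE) for t in range(n)]
    return left, right

# The article's example: 104 Hz in the left ear, 100 Hz in the right.
left, right = binaural_stereo(104, 100, 0.01)
beat_hz = abs(104 - 100)
print(beat_hz)  # 4 -- a Theta-range beat frequency
```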

Figure: Binaural Beats

Binaural-beat perception originates in the inferior colliculus of the midbrain and the superior olivary complex of the brainstem, where auditory signals from each ear are integrated and precipitate electrical impulses along neural pathways through the reticular formation up the midbrain to the thalamus, auditory cortex, and other cortical regions.

Neurofeedback (NFB), also called neurotherapy, is a type of biofeedback that presents real-time feedback on brain activity in order to reinforce healthy brain function through operant conditioning. In this case, electrical activity from the brain is collected via sensors placed on the scalp using electroencephalography (the Muse 2 EEG headband), with feedback presented through video displays or sound.
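The operant-conditioning idea can be sketched as a simple mapping from measured band power to feedback intensity. This is not the actual application logic; the function name, the `baseline` reference value, and the `gain` parameter are assumptions for illustration only:

```python
def feedback_level(alpha_power, baseline, gain=1.0):
    """Map relative alpha-band power to a feedback intensity in [0, 1].

    Power above the user's relaxed baseline yields stronger positive
    feedback (e.g. louder pleasant sound, brighter scene), reinforcing
    the relaxed state via operant conditioning. 0.5 is neutral feedback.
    """
    delta = (alpha_power - baseline) / baseline
    level = 0.5 + gain * delta
    return max(0.0, min(1.0, level))  # clamp to the valid range

print(feedback_level(12.0, 10.0))  # above baseline -> stronger than neutral
```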

Figure: Neurofeedback

“There’s decades of innovations ahead. We’re at the very beginning, where it’s just at the stage where we can bring in consumers but there’s so much further to go from there,” said Brendan Iribe, former CEO of Oculus VR, the company behind the headset Iker used for the application.

In his work, Iker also cited Mark Zuckerberg, CEO of Facebook (now rebranded as Meta), on the potential of applications like these to enter the so-called Metaverse: “The incredible thing about the technology is that you feel like you’re actually present in another place with other people. People who try it say it’s different from anything they’ve ever experienced in their lives.”

Iker considered different reality technologies: Augmented reality (AR), Virtual reality (VR) and Mixed Reality (MR).

Augmented reality (AR) adds digital elements to a live view, often by using the camera on a smartphone. Examples of augmented reality experiences include Snapchat lenses and the game Pokémon Go.

Virtual reality (VR) implies a complete immersion experience that shuts out the physical world. Using VR devices such as HTC Vive, Oculus Rift or Google Cardboard, users can be transported into a number of real-world and imagined environments such as the middle of a squawking penguin colony or even the back of a dragon.

In a Mixed Reality (MR) experience, which combines elements of both AR and VR, real-world and digital objects interact. Mixed reality technology is just now starting to take off, with Microsoft’s HoloLens being one of the most notable early mixed reality devices.

The Oculus Quest 2 headset has a resolution per eye of 1832 × 1920 pixels, a refresh rate of up to 90 Hz, and a field of view (FOV) of roughly 90°.

Figure: Oculus Quest 2

Unity is a cross-platform game engine developed by Unity Technologies, first announced and released in June 2005. The engine has since been gradually extended to support a variety of desktop, mobile, console and virtual reality platforms. The engine can be used to create three-dimensional (3D) and two-dimensional (2D) games, as well as interactive simulations and other experiences.

Oculus Quest and Quest 2 deliver the freedom of wireless, standalone VR with industry-leading power and performance to drive immersive apps. Both devices include spatially tracked controllers, integrated open-ear audio, and support for Oculus Link, which enables users to access their Oculus Rift library of apps from a gaming-compatible PC.

For this application, the Oculus Quest 2 was integrated with Unity to create the VR environment: the scene, the game objects, the components defining game-object behavior, and the materials that add texture and color to objects.

For the integration of the EEG device with the software application, the Naxon Explorer API was used to retrieve the data at the right moment. With the Naxon Labs platform and an EEG device like Interaxon’s Muse, you can create a mark derived from an external event in a sequence of brain activity expressed in waves. Naxon Explorer is a useful tool and neurofeedback system for researchers in neuroscience, psychology and medicine. You can record brain data and obtain measurements and session data that enable machine learning and automatic pattern analysis. With the API, you can analyze brain behavior and its response to an external activity. In this application, Iker exposes the brain to visual and auditory stimuli and at the same time informs Naxon Explorer through the API so that it registers the moment accurately. With this, you can analyze the continuous brain waves and check what impact the external event had on the brain activity.
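The article does not show the Naxon Explorer API calls themselves, so the sketch below only illustrates the underlying idea: an event marker is a timestamp that lets you slice the continuous EEG stream around a stimulus and compare pre- and post-stimulus activity. The helper name, data layout, and window size are hypothetical:

```python
def samples_around_event(timestamps, values, event_t, window_s=1.0):
    """Return the EEG samples within +/- window_s seconds of an event marker.

    timestamps: sample times in seconds; values: the corresponding EEG
    readings. Slicing around the marker allows comparing brain activity
    before and after the external stimulus.
    """
    return [v for t, v in zip(timestamps, values)
            if event_t - window_s <= t <= event_t + window_s]

ts = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
vals = [10, 11, 12, 20, 21, 22, 23]
print(samples_around_event(ts, vals, event_t=1.5))  # [11, 12, 20, 21, 22]
```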

Figure: Data flow

In the application you can work with a session: its configuration, the course of the session, and a summary of results (graphics and information). You can also configure the environment, the binaural waves and the music.

As future directions for this work, Iker López indicated a statistical study to validate the effectiveness of the techniques used, increasing the stimuli perceived by the user (shaders, particles), expanding the neurofeedback techniques, developing a backend project including communication with a server and a database, including alternative input systems (head movement, voice control), and designing and developing an in-game tutorial.

This is one of the latest initiatives developed by Dr. Francisco Perales’ workgroup at the Universitat de les Illes Balears. The UIB research team has worked extensively on Brain-Computer Interfaces and VR/AR, with several projects that typically apply technology to therapeutic subjects. They have used these technologies with children with cerebral palsy (CP), autism, ADHD, and other conditions, and were recently involved in a national project on the elderly and social robots. One of the VR applications, as indicated above, was intended to assess the emotional state of the person and modulate it, for example to help control chronic pain. For this, the team uses physiological parameters (heart rate, EDA, etc.). The team has also used EEG, addressing the challenges of combining a VR headset with an EEG headband and the Naxon Explorer API.

The paper “Evaluation of a VR system for Pain Management using binaural acoustic stimulation,” written by Francisco J. Perales, Laia Riera, Silvia Ramis and Alejandro Guerrero, explains the process.

Abstract
The proposed system is oriented to evaluating the pain perceived by the user in a highly controlled virtual reality (VR) environment. A VR system is an implementation of a virtual world that the user perceives as the real one. The sensation of immersion affects the stimuli (visual, acoustic and haptic) perceived by the user, and it is able to promote changes in brainwave power and produce an activation of the Autonomic Nervous System (ANS). Electro-Dermal Activity (EDA) allows measuring the electrical properties of the skin through sweat activity. This work proposes a VR environment combined with binaural beats (binaural sounds) and visual stimuli to evaluate the perception that the user has, comparing their sensation with real physiological data. It is believed that the use of different binaural beats over a long period of time can help patients induce a relaxation state (mood) and consequently modulate the perception of pain. In this study we show two experiments. The first applies 8 types of acoustic stimuli (4 binaural and 4 monaural) in a standard simple VR scenario, and we ask the end users to select the feeling they experienced in each case; in parallel, using the Empatica wristband, we contrast the subjective users’ answers with the physiological values given by the device. In the second experiment, an immersive environment based on the whole VR application is proposed for control and real users to evaluate chronic pain. The users are immersed in three equal VR scenarios but with random sound stimulation. With the results obtained, we can conclude that binaural beats work better than non-binaural beats in terms of relaxation and meditation.