Naxon Labs: marking brain reactions to external stimuli through the new Naxon Explorer API
August 19, 2021.

With the Naxon Labs platform and an EEG device such as Interaxon’s Muse, you can mark the exact moment an external event occurs in a continuous sequence of brain activity expressed as waves. Naxon Explorer is a useful tool and neurofeedback system for researchers in Neuroscience, Psychology and Medicine: you can record brain data and collect measurement and session data suitable for machine learning and automatic pattern analysis. With the new API, you can analyze brain behavior and its response to external activity. For example, you can expose the brain to a certain stimulus and, at the same time, tell Naxon Explorer through the API to register that moment accurately. You can then inspect the continuous brain waves and check what impact the external event had on brain activity.

Through this API, Naxon Explorer can be integrated with many applications. One of them is Unity (https://unity.com/), a multiplatform game engine (smartphones, computers, consoles and the web) developed by Unity Technologies. It is one of the most widely used engines in the video game industry, by large studios and independents alike, because of how quickly it lets you prototype and how easily it lets you release games on every platform.

In the Naxon Explorer platform there are commands available, and the backend is ready to accept WebSocket connections, so an external application such as a Unity game can connect and start receiving the packages. On the Naxon Explorer side you only need to start recording a new session; a button then appears that opens a modal with all the commands needed to connect over WebSocket. From the external application, a little programming is enough not only to send Play / Pause (or end the recording), but also to receive the data that the server reads.
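As a minimal sketch of that connection, the TypeScript snippet below opens a WebSocket to the Explorer backend and logs everything that arrives. The URL is only a placeholder: the real endpoint for your session is the one shown in the modal.

// Minimal sketch: connect to the Naxon Explorer WebSocket backend.
// "wss://explorer.example.com/cable" is a placeholder; use the endpoint shown in the modal.
const socket = new WebSocket("wss://explorer.example.com/cable");

socket.onopen = () => {
  console.log("Connected to Naxon Explorer");
};

socket.onmessage = (event: MessageEvent) => {
  // Every packet the server reads (brain waves, status changes, pings) arrives here as JSON text.
  console.log("Received:", event.data);
};

socket.onclose = () => {
  console.log("Connection closed");
};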

Then, using a Chrome extension or any tool that supports WebSocket connections (for example Hoppscotch, https://hoppscotch.io/), you can connect to the backend, send the messages listed in the modal, and receive the readings, brain waves and events from the backend.

This is a great mechanism to invoke Naxon Explorer when an event occurs, and to test the Muse device together with a virtual reality headset like Oculus.

First you must connect, then subscribe, and from there you can simply start / pause / end the recording.

The ping is a control mechanism: it keeps the connection alive and lets you know whether you are still connected.
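The sketch below shows that sequence in TypeScript. The confirm_subscription reply further down suggests the backend speaks an Action Cable style protocol, so the payload shapes here follow that convention, but treat them as assumptions: the exact messages to send are the ones listed in the modal.

// Hedged sketch: subscribe to a session's channel and send a recording command.
// The identifier fields mirror the confirm_subscription reply; the command payloads are
// assumptions in Action Cable style, so copy the real ones from the Naxon Explorer modal.
const socket = new WebSocket("wss://explorer.example.com/cable"); // placeholder URL

const identifier = JSON.stringify({
  channel: "RecordingsChannel",
  room: "RecordingsRoom",
  session_id: "675", // use your own session id
});

socket.onopen = () => {
  // 1. Subscribe to the recording channel for this session.
  socket.send(JSON.stringify({ command: "subscribe", identifier }));

  // 2. Example recording command (hypothetical payload).
  socket.send(JSON.stringify({
    command: "message",
    identifier,
    data: JSON.stringify({ action: "change_recording_status", change_status: "start_recording" }),
  }));
};

// 3. Pings arrive periodically and only confirm that the connection is still alive.
socket.onmessage = (event: MessageEvent) => {
  const msg = JSON.parse(event.data);
  if (msg.type === "ping") return;
};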

When you send a subscribe command, the server returns a message like this:

{
  "identifier": "{\"channel\":\"RecordingsChannel\",\"room\":\"RecordingsRoom\",\"session_id\":\"675\"}",
  "type": "confirm_subscription"
}

When you send an end-recording command, the server returns a message like this:

{
  "identifier": "{\"channel\":\"RecordingsChannel\",\"room\":\"RecordingsRoom\",\"session_id\":\"675\"}",
  "message": {
    "message": {
      "change_status": "end_recording",
      "action": "change_recording_status"
    }
  }
}
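A small handler can tell these messages apart on the client side. The sketch below, again in TypeScript, assumes the packets arrive as JSON text on the same WebSocket connection.

// Sketch: distinguish pings, subscription confirmations and recording-status broadcasts.
const socket = new WebSocket("wss://explorer.example.com/cable"); // placeholder URL

socket.onmessage = (event: MessageEvent) => {
  const msg = JSON.parse(event.data);

  if (msg.type === "ping") {
    return; // keep-alive only
  }

  if (msg.type === "confirm_subscription") {
    console.log("Subscribed:", msg.identifier);
    return;
  }

  // Broadcasts such as the end-recording notification carry their payload under "message".
  const payload = msg.message?.message;
  if (payload?.action === "change_recording_status") {
    console.log("Recording status changed to", payload.change_status);
  }
};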

We tested the Oculus headset together with the Muse and everything works well. You just need to position the band a little higher than usual so that the Oculus fits properly.

This helps to mark events in the brain activity. From Naxon Explorer we open a specific WebSocket for a particular session, and the external application can then send packages over it with content that identifies what the event is, whether Naxon Explorer has to start recording, whether it has to stop, and so on. It is a kind of remote control for the session.
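As an illustration of that remote-control idea, the helper below sends an event mark over the session WebSocket. The field names in the payload (mark_event, label, timestamp) are purely hypothetical; the actual message contents are the ones Naxon Explorer provides in the modal.

// Hypothetical sketch of marking an external event over the session WebSocket.
// The payload fields below are illustrative, not the actual Naxon Explorer schema.
function sendEventMark(socket: WebSocket, identifier: string, label: string): void {
  socket.send(JSON.stringify({
    command: "message",
    identifier,
    data: JSON.stringify({ action: "mark_event", label, timestamp: Date.now() }),
  }));
}

// Example: mark the moment a visual stimulus is shown to the subject.
// sendEventMark(socket, identifier, "visual_stimulus_shown");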

Among the pioneers using this functionality we highlight a Spanish team from the UIB (Universitat de les Illes Balears), which works extensively in BCI and VR / AR.

They have several projects, and one of their Virtual Reality applications aims to evaluate a person's emotional state and modulate it, for example to control chronic pain in disease. To do this, they use physiological parameters (heart rate, EDA, etc.). They contacted Naxon Labs about their interest in using EEG as well, combining a VR headset with an EEG headband.

The UIB team developed MUSE 2 tests with Naxon Explorer to synchronize the EEG captures with the visual stimuli sent by the Unity VR application.

The team normally applies these technologies to therapeutic work: they have used them with children with cerebral palsy, autism, ADHD and other conditions, and were recently involved in a national project on the elderly and social robots. In this work they also used EEG, addressing the challenges of wearing a VR headset together with an EEG headband.

If you are interested in marking external events in brain wave activity, just get in touch and we will help you with the Naxon Explorer API: contact@naxonlabs.com