Measuring concentration and alertness with Naxon Emotions
June 26, 2022.

Naxon Emotions is a tool for objectively measuring and recording a person's emotions and cognitive states in real time, at low cost, using portable electroencephalography (EEG) headbands.

This real-time emotion recognition system is based on neurophysiological data from EEG, cloud computing and AI.

 

Measuring concentration and alertness

Naxon Emotions can be used to measure and record a person's state of concentration and alertness in real time. This record can be viewed on the platform or downloaded in Excel format for further analysis with other tools.

These records can be put to many uses, such as providing support and brain correlates for psychometric measures, evaluating clinical interventions, and conducting field research in neuromarketing, among others.
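For readers who prefer to continue the analysis outside the platform, here is a minimal sketch, in Python with pandas and matplotlib, of how a downloaded record might be loaded and plotted over time. The file name and the column names used here (“timestamp”, “concentration”, “alertness”) are assumptions for illustration only, so adjust them to the headers of your own export.

# Minimal sketch: loading an exported Naxon Emotions record and plotting it.
# The file name and column names are illustrative assumptions; check the
# headers of your own export before running this.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_excel("emotions_session.xlsx")        # hypothetical export file
df["timestamp"] = pd.to_datetime(df["timestamp"])  # parse the time column

fig, ax = plt.subplots(figsize=(10, 4))
for state in ("concentration", "alertness"):       # assumed state columns
    ax.plot(df["timestamp"], df[state], label=state)

ax.set_xlabel("Time")
ax.set_ylabel("Detected level")
ax.legend()
plt.tight_layout()
plt.show()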

 

How to use the Platform?

Get your EEG headband first

Muse S Headband for Naxon Emotions

Emotions uses portable EEG, in particular the Muse headsets by InteraXon Inc.: Muse 1, 2 and S. If you already own a Muse device, you are all set. If you don’t have a headband yet, you can get one by following this link: Get headband.

Under the “Products” section, choose the Muse 2 or Muse S headset, which are the models currently available and compatible with our platforms.

To learn how to set your headband properly, follow this link: How to fit Muse S.

 

Next steps

Naxon Emotions Sign-In & Sign-Up

The platform is accessed through the website: https://emotions.naxonlabs.com/

Remember that to access Emotions, and for the best experience, you will need a wide-screen device with a Bluetooth connection, such as a PC, Mac or tablet. You will also need to use Google’s Chrome browser.

Once you enter the platform, you can register or log in with your username and password. Select the “Start Free Trial” option to register, or log in by clicking the option in the upper right-hand corner.

Naxon Emotions Sign In

Next you will see your project folders and, within them, your recorded sessions, which you can open, analyze and modify as desired.

Naxon Emotions Main Screen

To record a new session, just press the “New Session” button. There you can enter a name and a description and predefine events.

Naxon Emotions New Session

Next, by clicking “New Session”, you can connect the device via Bluetooth. Make sure your laptop or tablet’s Bluetooth is enabled and the Muse device is powered on. The central part of the platform is then displayed, where you can visualize data in real time on a graph across time, segmented by emotion or state.

Naxon Emotions Visualization

While recording data you can post notes using the tools on the right-hand side, adding comments that are anchored to the point on the graph’s timeline where they were made.

Events: this tool lets you predefine events. For example, if a person is receiving a certain visual stimulus every 5 seconds, you can set the platform to automatically mark the name of that event on the graph every 5 seconds, so that the event can be correlated with the recorded states. You can do so using the options on the right-hand side.
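If you later want to reproduce such a schedule offline, for example to tag each exported sample with the most recent stimulus, the rough sketch below shows one way to do it in Python with pandas. The 5-second interval follows the example above; the file name, the “timestamp” column and the event label are assumptions, so adapt them to your own export.

# Rough sketch: rebuilding a "stimulus every 5 seconds" event schedule offline
# and tagging each exported sample with the most recent event before it.
# File name and column names are illustrative assumptions.
import pandas as pd

df = pd.read_excel("emotions_session.xlsx")
df["timestamp"] = pd.to_datetime(df["timestamp"])
df = df.sort_values("timestamp")

start, end = df["timestamp"].iloc[0], df["timestamp"].iloc[-1]
events = pd.DataFrame({"event_time": pd.date_range(start, end, freq="5s")})
events["event"] = "visual_stimulus"   # hypothetical event label

# Attach to each sample the last event that occurred at or before it.
tagged = pd.merge_asof(df, events, left_on="timestamp",
                       right_on="event_time", direction="backward")
print(tagged[["timestamp", "event"]].head())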

There is also an option to display an overlapped view, which lets you see the selected emotions or states in real time, on the same screen as the central graph.

Naxon Emotions Visualization Details

While a session is being recorded, you can also visualize changes in any selected emotion or state as a circle: the larger the circle, the more strongly that state is detected.

Naxon Emotions Circle Visualization

Once the session is recorded, the data can be accessed through the project and session folders and also downloaded in XLS format to combine with other analysis and visualization tools. You can filter the data bars according to the events marked.

Naxon Emotions Graph Bar
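As a hedged example of this kind of cross-analysis, the sketch below groups an exported session by its event marks and compares the mean level of each state per event. The column names (“event”, “concentration”, “alertness”) are assumptions; replace them with the headers of your own XLS file.

# Sketch: filtering and grouping an exported session by its event marks.
# Column names are illustrative assumptions, not the platform's exact headers.
import pandas as pd

df = pd.read_excel("emotions_session.xlsx")

# Keep only the rows recorded while a given event was marked.
stimulus_rows = df[df["event"] == "visual_stimulus"]

# Or summarize every state by event in one table.
summary = df.groupby("event")[["concentration", "alertness"]].mean()
print(summary)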

 

With these steps, you can use Naxon Emotions to measure and record a person's state of concentration and alertness in real time and then analyze the data.