Get the Headband
July 31, 2022.
Measuring concentration watching Tesla Armor Glass Demo
The Naxon Emotions platform is an emotion monitoring system that translates brain information into objective visual markers of states such as anxiety, relaxation, concentration, joy or sadness, among others. We exposed several participants to different videos to measure concentration levels using Naxon Emotions. In this video we show real-time monitoring of a participant's brain waves and her exact level of concentration while she watches a video of Elon Musk in the Tesla Armor Glass Demo. The device we use at Naxon Labs is a wearable EEG headset; it sends brain data via Bluetooth to our cloud platform, where the data is processed. The concentration readings spike when Elon breaks the window; the conclusions are evident.

At the bottom left, the raw brain wave data is shown. At the bottom right, you can see the concentration level coming from Naxon Emotions. At the top left, the test participant shows a concentration level that does not exceed the dotted line on the Naxon Emotions screen. When Elon Musk breaks the window, the concentration level rises sharply. Two moments of high variation in brain data were found.

This test was performed using the Muse II EEG device from Interaxon Inc.

Naxon Emotions can enhance objective patient records for diagnosis and treatment in mental health, allowing tracking of patient evolution, remote monitoring and therapy, improving the implementation of techniques for the regulation of mental disorders, and also allowing applications in various specific disorders.

You can start a free trial of Naxon Emotions in this link:

Naxon: A portable EEG technology company.
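Naxon does not publish the exact algorithm behind its concentration marker, but a common heuristic in EEG research approximates engagement as the ratio of beta-band power to combined alpha and theta power. The sketch below illustrates that idea (not Naxon's actual method) on synthetic data, using the 256 Hz sampling rate of Muse headsets:

```python
import numpy as np
from scipy.signal import welch

FS = 256  # Muse headsets stream EEG at 256 Hz

def band_power(freqs, psd, lo, hi):
    """Total PSD power inside a frequency band."""
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum() * (freqs[1] - freqs[0])

def engagement_index(eeg):
    """Beta / (alpha + theta) ratio, a common proxy for concentration."""
    freqs, psd = welch(eeg, fs=FS, nperseg=FS * 2)
    theta = band_power(freqs, psd, 4, 8)
    alpha = band_power(freqs, psd, 8, 13)
    beta = band_power(freqs, psd, 13, 30)
    return beta / (alpha + theta)

# Synthetic 10-second traces: a relaxed (alpha-dominant) signal, and the
# same signal with an added beta component, as during focused attention.
t = np.arange(0, 10, 1 / FS)
relaxed = np.sin(2 * np.pi * 10 * t)                # 10 Hz alpha
focused = relaxed + 2 * np.sin(2 * np.pi * 20 * t)  # plus 20 Hz beta
print(engagement_index(focused) > engagement_index(relaxed))  # True
```

A rising ratio during the window-break moment would show up as exactly the kind of spike visible in the video.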
June 26, 2022.
Measuring concentration and alertness with Naxon Emotions
Naxon Emotions is a tool to objectively measure and record a person's emotions and cognitive states in real time and at low cost using portable electroencephalography (EEG) headbands. This real-time emotion recognition system is based on neurophysiological data from EEG, cloud computing and AI.

Measuring concentration and alertness

Naxon Emotions can be used to measure and record in real time a person's state of concentration and alertness. This record can be viewed on the platform or downloaded in Excel format for further analysis with other tools. The possible uses of these records are many, such as providing support and brain correlates for psychometric measures, evaluating clinical interventions, and conducting field research in the area of neuromarketing, among others.

How to use the platform?

Get your EEG headband first. Emotions uses portable EEG, in particular the Muse headsets by Interaxon Inc: Muse 1, 2 and S. If you already own a Muse device, you are all set. If you don't have a headband yet, you can get one following this link: Get headband. Under the section "Products", choose the Muse 2 or Muse S headset, which are the ones currently available and compatible with our platforms. To learn how to fit your headband properly, follow this link: How to fit Muse S.

Next steps

The platform is accessed through the website: Remember that to access Emotions, and for the best experience, you will need a wide-screen device with a Bluetooth connection such as a PC, Mac or tablet. You will also need to use Google's Chrome browser. Once you enter the platform, you can register or log in with your username and password. Select the "Start Free Trial" option to register, or just log in by clicking the option in the upper right-hand corner. Next you will see your project folders and, within them, recorded sessions, which you can enter, analyze and modify as desired. To record a new session, just press the "New Session" button.
There you can set a name, a description and predetermined events. Next, by clicking "New Session" you can connect the device via Bluetooth. Make sure your laptop or tablet has its Bluetooth connection enabled and the Muse device powered on. The central part of the platform is then displayed, where you can visualize data in real time on a graph across time, segmented by emotion or state. While recording data you can post notes using the tools on the right-hand side, adding comments that are tied to the point on the graph's timeline where they were made.

Events: this tool allows you to predetermine events. For example, if a person receives a certain visual stimulus every 5 seconds, you can predetermine that every 5 seconds the name of that event is automatically marked on the graph, so that the event is correlated with the states. You can do so using the options on the right-hand side. There is also an option to display an overlapped view that lets you see, in real time and on the same screen as the central graph, the emotions or states selected. During the recording of a session, you can visualize changes in any selected emotion or state as a circle: the larger the circle, the more of that state is detected. Once the session is recorded, the data can be accessed through the project and session folders and also downloaded in XLS format to cross-reference with other analysis and visualization tools. You can filter the data bars according to the events marked.

With all these procedures, you can use Naxon Emotions to measure and record in real time a person's state of concentration and alertness and then analyze the data.
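Once downloaded, the XLS export can be analyzed with standard tools. As a minimal sketch (the column names here are hypothetical; the real export may differ), pandas can compare the recorded concentration during marked events against the rest of a session:

```python
import pandas as pd

# Hypothetical layout of an exported session; the real Naxon Emotions
# export may use different column names.
df = pd.DataFrame({
    "timestamp":     [0.0, 0.5, 1.0, 1.5, 2.0],
    "concentration": [0.31, 0.35, 0.72, 0.80, 0.44],
    "event":         ["", "", "stimulus", "stimulus", ""],
})
# With a real download you would instead do:
# df = pd.read_excel("session.xls")

# Mean concentration during marked events vs. the rest of the session
during = df.loc[df["event"] == "stimulus", "concentration"].mean()
rest = df.loc[df["event"] != "stimulus", "concentration"].mean()
print(round(during, 2), round(rest, 2))  # 0.76 0.37
```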
May 28, 2022.
Visualizing brain wave data in real time
Doing Research with Naxon is Easy

Naxon Explorer is an affordable, useful tool and neurofeedback system for researchers in Neuroscience, Psychology, Medicine, Engineering and Information Technology. It is a web platform dedicated to exploring brain data recorded with portable electroencephalographs (portable EEGs), where both an experienced researcher and a recently graduated professional can easily explore the brain. The central part of the platform displays brain wave data in real time on a graph of voltage over time, divided by channel.

How to work with Naxon Explorer's visualization tools?

As a first example, this is a comparison of two EEG electrodes on the scalp.

Before and during data recording we can modify various parameters:

Notch filter: removes the frequency band corresponding to the alternating current mains, so as not to generate "noise" in the brain data.

High-pass filter: passes signals with a frequency higher than a certain cutoff frequency and attenuates signals with frequencies lower than the cutoff.

Low-pass filter: passes signals with a frequency lower than a selected cutoff frequency and attenuates signals with frequencies higher than the cutoff.

Sensitivity: how many microvolts per scale division we see on the graph, that is, how large the wave appears.

Time window: how many seconds we see on the screen.

Blink and clench detector: detects and marks blinks and jaw-clench movements.

In addition, while recording data we can post notes using the tools on the right-hand side, adding comments that are tied to the point on the graph's timeline where they were made.

Events: another tool of great utility for research is the one that allows you to predetermine events.
That is, for example, if a person is receiving a certain visual stimulus every 5 seconds, we can predetermine that every 5 seconds the name of that event is automatically marked on the graph, so that the event is correlated with the brain waves. You can do so using the options on the right-hand side.

We also have the option of displaying a graph based on "frequency power" (in relative percentages) that allows us to see in real time, and on the same screen as the central graph, the powers of the different brain frequencies (delta, theta, alpha, beta and gamma) averaged over all channels.

One of the most important tools is the frequency power graph, which can be viewed and selected by channel, also in real time. This allows the different brain frequencies, already processed, to be correlated by topography to reach much more precise conclusions.

At any time you can pause and resume the session without having to start a new one. This makes it possible to maintain order and consistency within a session that requires breaks of any kind. Once the session is over, brain data can be accessed through our project and session folders and also downloaded in XLS format to cross-reference with other analysis and visualization tools.

At this point we can also make post-session notes that can be added to the recording. You can start a free trial of Naxon Explorer in this link; you will need a Muse device from Interaxon Inc such as the Muse 1, Muse 2 or Muse S.
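The filters and the frequency power view described above can also be reproduced offline once a session is exported. Here is a minimal sketch with scipy, assuming the Muse's 256 Hz sampling rate; the cutoff values are illustrative defaults, not necessarily the ones Naxon Explorer uses:

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch, welch

FS = 256  # Muse sampling rate in Hz
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 44)}

def clean(eeg, mains=50.0, lo=1.0, hi=40.0):
    """Notch out mains hum, then band-pass (high-pass at lo, low-pass at hi)."""
    b, a = iirnotch(mains, Q=30.0, fs=FS)
    eeg = filtfilt(b, a, eeg)
    b, a = butter(4, [lo, hi], btype="band", fs=FS)
    return filtfilt(b, a, eeg)

def relative_band_powers(eeg):
    """Relative power (%) per classic EEG band, as in the frequency power graph."""
    freqs, psd = welch(eeg, fs=FS, nperseg=FS * 2)
    powers = {name: psd[(freqs >= lo) & (freqs < hi)].sum()
              for name, (lo, hi) in BANDS.items()}
    total = sum(powers.values())
    return {name: 100 * p / total for name, p in powers.items()}

# Synthetic trace: 10 Hz alpha activity plus 50 Hz mains interference
t = np.arange(0, 10, 1 / FS)
raw = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)
rel = relative_band_powers(clean(raw))
print(max(rel, key=rel.get))  # alpha
```

A 50 Hz notch suits Europe; in the Americas the mains frequency is 60 Hz.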
April 18, 2022.
Using Naxon Explorer with Muse-S
Muse S is a multi-sensor meditation device that provides real-time feedback about your brain activity, heartbeat, breathing and movement. The device integrates with Naxon Explorer. Enas Aljohani and Muhannad Saber worked with the native Muse S functions and tested the device with Naxon Explorer. The design of the Muse S enables scenarios of use beyond those of the Muse 1 and Muse 2, fitting better for sleep and sports management.

By Enas Aljohani and Muhannad Saber

Naxon Labs' main focus is EEG technology and its possible uses. Naxon Labs has developed custom software and interfaces that use artificial intelligence to identify patterns and process the data recorded by such devices, in products like Naxon Explorer and Naxon Emotions. To achieve this goal, Naxon Labs uses a BCI (brain-computer interface), a direct communication pathway between an enhanced or wired brain and an external device; in this case a portable, wearable EEG device called Muse, the line of products of Interaxon Inc from Canada. The data obtained can be used in many fields and has many applications, such as neurofeedback, advertising, product development, medicine, research, self-improvement and control of other devices such as TVs or drones.

Naxon Labs uses the Muse device to obtain the data. Muse is an EEG device available for consumers to purchase and use to record their own brain waves in different settings such as sleep, periods of concentration, different emotions and many more. The device is light, portable, easy to set up and comes in different models and generations. The impressive results mitigate the limitations of a portable device, such as using fewer electrodes than conventional EEG systems or being more prone to interference. Naxon Labs' main product is a platform called Explorer.
Naxon Explorer is a platform integrated with machine learning tools and automatic pattern analysis, like Naxon Emotions, and the results can be used in research in many fields with many applications. The data recorded in the sessions can be downloaded to be processed in other tools for different analyses, including training machine learning models for specific use cases.

Muse S is a multi-sensor meditation device that provides real-time feedback on your brain activity, heartbeat, breathing and movement. The device integrates perfectly with Naxon Explorer.

Muse S specifications

Example of a sleep session using the Muse-S app

Session A
In the first session we did not check the Muse S battery before starting to record. As a result, the device was able to record only 2h30min of a 6h23min sleep cycle. It had also been a long day with insomnia, which explains the short time spent in deep sleep and the frequent position changes.

Session B
In the second session we managed to fully charge the battery, so the device recorded the full 5h46min sleep session. However, although the analysis showed a lower heart rate and a longer deep sleep stage, it also showed decreased stillness and increased intermittent waking time.

Recordings with Muse-S and Naxon Explorer

First session: In the first interactions with Naxon Explorer, you can see the brain wave patterns, and you can tell if you have not adjusted the sensors properly in the headband. The Muse S device has four electrodes: two placed on the forehead and two behind the ears. In this first session, the ear electrodes' (TP9 + TP10) signals were chaotic due to positioning artifacts. To avoid this, it is better to adjust the electrodes before starting the recording. You may also have to tie back your hair and wipe the electrodes to get the best wave quality.

Second session: In the second session we adjusted the electrodes correctly.
In this case meditation was practiced for approximately 1.5 minutes. In this recording you are able to see the low-frequency waves that appear with stillness and calmness.

Third session: In this session we examined the waves associated with various random actions, including clenching, blinking and talking. The highest frequencies were associated with clenching, talking and blinking, respectively.

For an advanced use of Naxon Explorer and EEG techniques in sleep, you can refer to the blog post "The use of portable EEG technology in sleep disorders". Leandro Castelluccio, CEO of Naxon Labs, explains that "one of the areas of gradual development of Naxon Labs refers to applications that evaluate the quality of sleep and can be used personally and individually by users of portable EEG devices and by professionals in the areas of Clinical Psychology and Sleep Medicine. The advantages of applying this technology to this area are multiple, for example, remote monitoring and better patient follow-up. At Naxon Labs, through the development of our integrated software platform with pattern analysis tools, we seek one of the applications to be the staging of sleep and the completion of preliminary analysis of the person's sleep quality". In the article Castelluccio shows the current problems in the area and how this new technology is being introduced to it, discussing sleep disorders in today's world, polysomnography, portable EEG applications, and sleep staging with portable EEG: new data processing techniques.

Enas and Muhannad joined Naxon Labs as part of the Virtual Internships program in Saudi Arabia. This program aims to be the leading provider of virtual internship programs, partnering with universities and other academic institutions around the globe, and is committed to fostering a community of global career-ready graduates with tangible skill sets and international competencies.
Raksha Kini is the Head of Company Account Management at Virtual Internships, and remarked on the vision and mission of "aiming to bridge the gap between university and the workplace, reducing barriers and widening participation, while at the same time helping to prepare students and graduates for the future of work". Raksha believes in the democratization of the global workforce through talent pipelines and access to jobs worldwide.

About MiSK and Virtual Internships

On April 25th, 2016, the Saudi crown prince Mohammed Bin Salman (and founder of MiSK) announced Saudi Vision 2030, a strategic framework to reduce Saudi Arabia's dependence on oil, diversify its economy, and develop public service sectors such as health, education, infrastructure, recreation and tourism. Since then, my life as a Saudi woman was never the same. As women's empowerment was an important aspect of the National Transformation Project, the opportunities for women to actively participate in public life expanded markedly. Saudi women finally had the freedom to drive their cars, travel alone, achieve their goals and help their communities. Virtual Internships (est. 2018) is a program provided by MiSK, a non-profit foundation that aims to empower young people and help them get the support, guidance and training they need to start their careers while embracing their values, their culture and, of course, their country. This internship program helps graduates of Saudi universities work with international startups and companies in various domains. Such internships definitely build and shape important skills. Working internationally with people from different cultures and backgrounds is a lifetime experience for a young, freshly graduated local university student, such as Enas Aljohani.
Enas and Muhannad were introduced to brain-computer interface concepts through the internship, and assumed that many of their colleagues in Saudi Arabia had not heard of them either. Therefore, Enas prepared a simple Arabic brochure explaining the brain-computer interface concept and the Naxon Labs software.

About Enas Aljohani

Senior Medical Student, College of Medicine, Medina, KSA (expected to graduate in 2022).

Among many trainings attended in the field of medicine, Enas participated in the "Mental Health First Aid" course held by the National Committee for Mental Health Promotion (NCMH), Medina, 2018. She was the leader of a team that organized the "Mental Health First Aid" course in association with the NCMH, held by the Taibah medical club, 2019. She was also a member of the "I Understand You" team for mental health awareness, a team of the NCMH, 2018. Enas participated in many research activities, including data collection for "Prevalence of depression, anxiety, and stress among infertile women in Madinah Maternity and Children Hospital and associated risk factors, Saudi Arabia: A case control study", 2019, and data collection for "Knowledge and awareness of Tourette's syndrome among the general population in Saudi Arabia", 2020. As clinical experience, Enas joined the Summer Training at the Addiction and Psychiatric Hospital, Medina, 2018.

About Muhannad Saber

Bachelor of Medicine & Surgery (MBBS), King Khalid University, Abha, Saudi Arabia. United States Medical Licensing Examination (USMLE).

Worked in the UCI Department of Emergency Medicine, Irvine, CA, USA (2018-2019), shadowing Dr. Shadi Lahham in the Emergency Room. Also worked at South Loop Urgent Care, Chicago, USA, and at Kaplan Test Prep, Chicago, as an MD. Muhannad also worked at King Abdulaziz General Hospital, Jeddah, SA, as an MD.
April 11, 2022.
Brain week in France: a view from Lyon
Organized every year in March since 1999, Brain Week is coordinated in France by the Society of Neurosciences. This international event, organized simultaneously in a hundred countries and in more than 120 cities in France, aims to raise public awareness of the importance of brain research. It is an opportunity for many volunteer researchers, doctors and students to meet the public and share with them the advances obtained in neuroscience research laboratories, to present the challenges in our knowledge of the brain and the implications for our society. Throughout the week, the general public was able to meet researchers to get to know the brain better and learn about current research. It is a spectacular event in its national and international dimension, in the number of people mobilized, in its public success, and in the quality of its programming.

From March 14th to 20th, 2022, the neuroscience ecosystem met, discussed and debated the brain, and more broadly "neuro-info", as part of the 24th edition of Brain Week. Brain Week is the mobilization of the entire neuroscience community in more than 100 countries around the world.

Dr. Annie Andrieux, President of the Neuroscience Society in France, remarked that "in France, with 800 researchers and research actors within nearly 40 local committees throughout the territory, people could attend evenings with entertainment, conferences, debates or even shows. In 2021, most of the events took place virtually; this year, the exchanges took place either face-to-face or digitally". "The brain has always fascinated: it participates in all our vital functions (eating, sleeping, etc.) but also in other more 'superfluous' activities such as laughing, feeling, loving... This fascinating brain can be approached in so many different ways".

"The inaugural national conference took place in Grenoble this year, on the theme 'Brain, electro musician'.
Music is universal; it is present in all cultures, wherever there are humans. During this evening, classical music, improvised music and the detection of brain waves were skillfully mixed. This musical experience of real-time brainwave music created a unique concert", indicated Dr. Annie Andrieux.

In Lyon there were many activities, which we present in the following sections, as the city prepares for NeuroFrance 2023, on May 24th-26th, 2023. After 2021's virtual event, Lyon will host a full in-person conference with a varied scientific programme, mentoring sessions for young researchers, interventions from clubs and affiliated research groupings, a trade exhibition and a social programme that will allow attendees to continue their conversations in a relaxed atmosphere.

BRAIN WEEK ON STAGE: Nose to nose - The 5 senses

Smells have unsuspected powers; they act on our brain at the same time as they strike our heart. Through this creation, the poet and the scientist become the instigators of a new contemporary ritual that puts man face to face with his mystery amid the clouds and vapors of his imagination. Nose to nose, an olfactory experience, is a show for the senses in which smells play the leading role. They invite themselves into the stage space and infiltrate the audience. This is the starting point of an intimate and introspective journey. Smells awaken a world populated by memories and fantasies and offer a space of freedom for our imaginations. This Center Imaginaire show was created in close collaboration with the Lyon Neuroscience Research Center. It was held at CCO La Rayonne, 24B Rue Alfred de Musset, 69100 Villeurbanne, and the speakers were Nathalie Buonviso and Alexandra Veyrac, CNRS researchers at the Lyon Neuroscience Research Center.

At the end of the performance, the scientists involved (Nathalie and Alexandra) and the members of the company offered a time for discussion.
"THE NOSE-TO-NOSE MULTI-SENSORY EXPERIENCE RESTORES THE EMBLEM OF THIS SENSE, SO UNKNOWN IN MAN. EMOTIONS ARE EVOKED BY A SCENARIO WHERE SMELL IS CHALLENGED BY SOUNDS, MUSIC, LIGHTS, BREATHS."

THE 7TH ART IN BRAIN WEEK: Eternal Sunshine of the Spotless Mind: is memory a hard disk?

"Joël and Clémentine only see the bad sides of their tumultuous love story, to the point that Clémentine has all traces of this relationship erased from her memory. Devastated, Joël contacts the inventor of the Lacuna process, Dr. Mierzwiak, so that he can also eradicate from his memory everything that linked him to Clémentine."

Eternal Sunshine of the Spotless Mind is a film by Michel Gondry (2004) starring Jim Carrey, Kate Winslet and Kirsten Dunst. The screening at the Aquarium Ciné Café, 10 rue Dumont, 69004 (La Croix-Rousse, Lyon) was followed by a discussion with Hanna Chainay, professor of neuropsychology and cognitive psychology at Lyon 2 University and a specialist in the link between memory and emotions. She is head of the "Memory, emotion, attention" team in the Study of Cognitive Mechanisms laboratory.

WHY ARE YOU LOOKING AT ME LIKE THAT? THE CONTRIBUTION OF RESEARCH TO THE SERVICE OF PSYCHIATRY

A conference about understanding the brain, held at Lugdunum, the Roman museum and theatre, 17 Rue Cléberg, 69005 Lyon.

Perceiving and detecting the gaze of others is an essential factor in social interactions, right from birth. What do we know fundamentally about the development of gaze perception and detection in neuro-typical development? How do atypicalities lead to behavioral particularities, and can they be the first symptoms of psychiatric disorders?
Conducted by:

Marine Fabrowski, psychiatrist at GénoPsy – Reference Center for Rare Diseases (Le Vinatier Hospital Centre), ADIS university hospital center (Autism and Intellectual Disabilities), iMIND Center

Marie-Noëlle Babinet, neuropsychologist at GénoPsy – Reference Center for Rare Diseases (Le Vinatier Hospital Centre), ADIS university hospital center (Autism and Intellectual Disabilities), iMIND Center, and doctoral student in the Study of Cognitive Mechanisms laboratory

HALLUCINATIONS: BETWEEN NORMAL PERCEPTION AND PATHOLOGIES

A round table and debate about understanding the brain, held at CNRS Rhône Auvergne, 2 avenue Albert Einstein, 69100 Villeurbanne.

What do we know about hallucinations? These manifestations are associated with many pathologies, and understanding them contributes to better care for those concerned. But their study also allows us to better understand the functioning of conscious, perceptual and subjective human experience. So why do we hallucinate?

Conducted by:

Sara Salgae, doctoral student in cognitive psychology at the Laboratory of Cognitive Mechanisms

Marie-Noëlle Babinet, neuropsychologist at GénoPsy – Reference Center for Rare Diseases (Le Vinatier Hospital Centre), ADIS university hospital center (Autism and Intellectual Disabilities), iMIND Center, and doctoral student in the Study of Cognitive Mechanisms laboratory

Priscille Perraud, student in the Master 2 in Cognitive Psychology of Learning, University Lumière Lyon 2

CHILDREN'S ATTENTION IN 2022: FRAGILE ATTENTION!

A conference about brain and society, held at the town hall of the 8th arrondissement, Espace Citoyen, 12 avenue Jean Mermoz, 69008 Lyon.

In a hyper-connected and information-overloaded world, staying attentive to a task becomes almost mission impossible. Children are at the forefront of this societal transformation, with particular consequences for their learning.
This conference invited the audience to reclaim attention, with two specialists in attention disorders in children. It discussed in particular the main characteristics of this function, essential to the regulation of our behavior and the development of learning, as well as the different brain networks that underlie it. The speakers explained how attention, which relies on a delicate balance, can be undermined by our way of life, emphasizing in particular the importance of sleep. So many elements to try to answer this big question: is it possible to be attentive in 2022? Marine Thieux and Vania Herbillon presented a research project carried out in collaboration between the Hôpital Femme Mère-Enfant and the Lyon Neuroscience Research Center, which aims to capture the micro-fluctuations of vigilance in order to explore their impact on attentional functioning during the day.

Conducted by:

Marine Thieux, doctoral student in neuroscience at the Lyon Neuroscience Research Center

Vania Herbillon, psychologist specializing in neuropsychology in the Epilepsy, Sleep and Functional Neuropediatric Explorations department of the Woman Mother-Child Hospital, member of the Lyon Neuroscience Research Center

HISTORY OF NEUROSCIENCE: THE DOCTOR FACING PAIN, 16TH-18TH CENTURIES

An exhibition at the media library of Vaise, place Valmy, 69009 Lyon.

Pain management is sometimes perceived as a novelty, a practice neglected in the past. However, pain was already a subject of concern in the 16th-18th centuries. Even if the medicine of the time was partly powerless to remedy it, doctors often mentioned it and always sought to relieve it. Dwelling on the 16th-18th centuries allows us to shift our gaze on this problem: the detour through the past contributes to the renewal of current questions and practices.
This exhibition, under the scientific direction of Raphaële Andrault and Ariane Bayle, members of the Institute for the History of Representations and Ideas in Modernity, included interviews with neurologists and pain specialists. It can also be discovered online in a web documentary version.

This web documentary is the result of multidisciplinary research work (literature, language, history and philosophy). It was born from an observation: when one reads the medical texts of the 16th-18th centuries, one is struck by the omnipresence of the problem of pain, which goes against the widespread idea that pain was not a real concern for physicians and philosophers before the end of the 18th century.

By crossing various sources (texts, images, music), the researchers bring to light recurring questions which undo some of our prejudices and which sometimes resonate more strongly with today's medicine than one could imagine.

An exhibition at the Rockefeller Health BU (Lyon), presented to the public in the winter of 2020-2021, showed an initial state of this research. It is developed in this web documentary, consisting of a virtual exhibition, works from the period (to read and listen to), and interviews with neurologists. These filmed interviews confront old and contemporary conceptions of pain: they show in particular that the problem of the signs of pain, and of the language used to express it, remains an object of exploration for both the medical sciences and the humanities.

OTHER EVENTS

BRAIN AND SOCIETY: BRAIN AND COGNITION: THEIR ROLES IN MOBILITY

The research of the Ergonomics and Cognitive Sciences for Transport Laboratory (LESCOT) aims to understand humans in situations of displacement, to enable mobility adapted to their needs.
This visit allowed attendees to discover various equipment used by the scientists: functional near-infrared spectroscopy, which measures a person's brain activity, and the driving simulator.

Venue: Gustave Eiffel University, Lyon Campus, City of Mobility, 25 avenue François Mitterrand, 69500 Bron

BRAIN AND ART: SECRETS OF THE GLASS ARMONICA

A conference at the Confluence Museum, 86 quai Perrache, 69002 Lyon.

An ancient instrument as rare as it is mysterious, the glass armonica raises many questions. Discover the secrets of this fascinating musical object, from the way it produces sounds to the way our brain perceives them. The conference was followed by an exceptional concert, a unique opportunity to hear the glass armonica in rarely performed classical works.

Conducted by:

Sébastien Ollivier, teacher-researcher at the University Claude Bernard Lyon 1 and member of the Laboratory of Fluid Mechanics and Acoustics (LMFA)

Nicolas Grimault, CNRS research director and member of the Lyon Neuroscience Research Center (CRNL), with the complicity of Thomas Bloch, musician

HOW THE BRAIN WORKS: THE BRAIN MAKES ITS WORLD: THE ILLUSION OF REALITY

A conference at the Confluence Museum, 86 quai Perrache, 69002 Lyon.

Is the world really as we see it? Through amazing optical illusions, this lecture reveals some secrets about our brain: how our perceptual abilities are built from our personal experience, and how much this affects our relationships with each other...

Conducted by Yves Rossetti, teacher-researcher at the University Claude Bernard Lyon 1 and member of the Lyon Neuroscience Research Center.

EMOTIONS: IS IT ALL IN MY HEAD?

A conference at the Part-Dieu municipal library, 30 boulevard Marius Vivier-Merle, 69003 Lyon, conducted by Sylvain Delplanque, researcher at the Interfaculty Center for Affective Sciences, University of Geneva.
The fact that we sometimes cry with joy at the happy ending of a romantic comedy, that we have difficulty falling asleep because of stress, that we feel serene during a walk in the forest... emotions are at the heart of our daily lives, and sometimes feel like a roller coaster. The key to our emotional states lies at the heart of our most complex organ, the brain. How can we identify our emotions, describe them, study them? What strategies can be put in place to regulate them?

SLEEP: UPDATE ON CHILDREN'S SLEEP IN 2022!

On the occasion of the 22nd Sleep Day, an afternoon of meetings was offered. Child snorers, insomnia, the impact of screens on sleep... many thematic presentations provided an overview of current topics of interest in children's sleep.

Held at 59 boulevard Pinel, 69500 Bron, conducted by:

Patricia Franco, pediatric neurologist, Sleep Unit, ESEFNP, HCL/HFME & INSERM 1028, Lyon Neuroscience Research Center, Lyon 1 University

Claude Gronfier, neurobiologist, INSERM researcher, Lyon Neuroscience Research Center

Priscille Bierme, pneumopediatric allergist, Woman Mother Child Hospital

Florian Lecuelle, psychologist, Woman Mother Child Hospital, Lyon Neuroscience Research Center

Stéphanie Mazza, professor of neuropsychology, Reshape laboratory (Research on Healthcare Performance), Lyon 1 University

It included the following sessions:

The sleep of children aged 6 months to 10 years and their parents – results of the 2022 INSV/OpinionWay survey in France. C. Gronfier, neurobiologist, INSERM researcher, specialist in circadian rhythms

What to do about a snoring child? P. Bierme, pneumopediatric allergist, Woman Mother Child Hospital, Lyon

Insomnia in children: therapeutic approaches. F. Lecuelle, psychologist, cognitive-behavioral therapy specialist, HFME, Lyon

The impact of screens on sleep. P. Franco, neuropediatrician, child sleep specialist, Woman Mother Child Hospital, Lyon

Sleep, children and school. S.
Mazza, Professor of Neuropsychology, Reshape laboratory (Research on Healthcare Performance) U1290, University Lyon 1   What does the French Longitudinal Study from Childhood (ELFE) teach us about the sleep of young French children? S. Plancoulaine, Public Health Physician, Sleep Physician, Epidemiologist, Research Team on the Early Determinants of Health (EAROH), U1153, INSERM, University of Paris   THE SICK BRAIN: BRAIN STIMULATION IN PSYCHIATRY, AN INNOVATIVE THERAPY: MEETING THE PSYR² RESEARCH TEAM   The PsyR² team of the Lyon Neuroscience Research Center opened its doors for two half-days on the theme of brain stimulation in psychiatry: how stimulating the brain can help treat and better understand psychiatric illnesses. Several workshops were offered to present their work: demonstrations of techniques, testimonials, scientific speed-dating.   At the Centre Hospitalier Le Vinatier, building 416, 1st floor, 95 boulevard Pinel, 69500 Bron, conducted by:   Marine Mondino, neuroscience researcher, member of the Lyon Neuroscience Research Center   Frédéric Haesebaert, neuroscience researcher, member of the Lyon Neuroscience Research Center   Jérôme Brunelin, lecturer and hospital practitioner, neuroscience researcher, member of the Lyon Neuroscience Research Center   Delphine Janin, clinical research nurse, member of the Lyon Neuroscience Research Center   Leslie Wallart, executive assistant   Ondine Adam, Laure Fivel and Laëtitia Imbert, doctoral students at the Lyon Neuroscience Research Center   BRAIN AND ARTIFICIAL INTELLIGENCE: FIVE NEWS FROM THE BRAIN   As researchers gradually uncover the mysteries of the human brain, the race is on between human intelligence and artificial intelligence. Five Brain Stories takes us to the heart of today's science, exploring the work of five scientists at the crossroads of the brain, consciousness and artificial intelligence.   
At the Comoedia cinema, 13 avenue Berthelot, 69007 Lyon, a film directed by Jean-Stéphane Bron was screened (cinema release March 16, 2022). The screening of the documentary was followed by a discussion with Emanuelle Reynaud, teacher-researcher in cognitive sciences, and Amélie Cordier, researcher and president of the Lyon-iS-Ai association, both specialists in artificial intelligence.   Emanuelle Reynaud, lecturer at Lyon 2 University and member of the Cognitive Mechanisms Study laboratory   Amélie Cordier, lecturer at Lyon 1 University, scientific director of OFA and president of the Lyon-iS-Ai association       THE 5 SENSES: FROM SOUND PERCEPTION TO AUDITORY ILLUSIONS   Sound perception involves mechanisms that transcribe acoustic vibrations into information relevant to our senses. Under certain conditions, for example when there is a lot of noise, the acoustic information received may be imperfect or degraded, and other cognitive mechanisms then take over to fill this information gap. Our brain also constructs a mental representation of the perceived sound: what is its nature, where does it come from in space, what does it mean?   This event was intended for inmates of the Villefranche-sur-Saône Penitentiary Center; it was not open to the general public (260 Rue Lavoisier, 69400 Villefranche-sur-Saône).   Two scientists from the Lyon Neuroscience Research Center gave all the keys to understanding what happens in our brain when we perceive a sound, showing that what we hear is the result of our experience and does not always correspond to the reality of the sounds emitted. Amazing visual and auditory illusions were presented.   
Conducted by:   Nicolas Grimault, CNRS research director and member of the Lyon Neuroscience Research Center   Fabien Perrin, lecturer at Lyon 1 University and member of the Lyon Neuroscience Research Center     HISTORY OF NEUROSCIENCE: PAIN AND THE BRAIN: HISTORICAL PERSPECTIVES / CURRENT PERSPECTIVES   Media library of Vaise, place Valmy, 69009 Lyon   Raphaële Andrault, philosopher and historian of science, Institute for the History of Representations and Ideas in Modernity   Luis Garcia-Larrea, neurobiologist, Lyon Neuroscience Research Center   How is pain transmitted to the brain? What exactly is going on in the brain when my hand hurts? These questions, studied today by neurophysiology, have a long history. In the 16th century, for example, surgeons, physicians and philosophers wondered about the role of the brain in pain, starting from a surprising observation: some people with amputated hands felt pain in that same hand, even though it had been lost during the operation. The speakers put ancient and modern knowledge about pain into perspective, describing both persistent misunderstandings and brilliant intuitions about the role of the brain in the complexity of pain sensation, and showing how certain current ideas respond to the discourse of our predecessors.   UNDERSTANDING THE BRAIN: COMA, WHEN REALITY EXCEEDS FICTION   From hordes of zombies roaming the streets of a devastated city in slow motion to patients leaping from their hospital beds the instant they wake after years in a vegetative state, from a brain speaking from a jar to a perfect consciousness locked in an inert body, the representation of coma, and more generally of consciousness, in the cinema very often stems from a fantasized imagination. What really happens in the brain during a coma? How can scientific research help intensive care physicians improve patient care? 
Based on extracts from films and series, two neuroscientists offered a journey between fiction and reality, exploring how our brain functions during these episodes of altered consciousness.   Florent Gobert, neuro-intensive care physician at the HCL (Neurological Hospital) and researcher at the Lyon Neuroscience Research Center   Maude Beaudoin, post-doctoral researcher at the Lyon Neuroscience Research Center   At the CNRS Rhône Auvergne, 2 Avenue Albert Einstein, 69100 Villeurbanne     BRAIN AND SPORT: ANTS IN THE LEGS: MOVING IS GOOD FOR HEALTH   Claude Bernard Museum, 414 route du Musée, 69640 Saint-Julien   The benefits of physical activity, even light activity, no longer need to be proven. Every movement in daily life has an impact on health. This event was an invitation to share a bucolic stroll around the Claude Bernard Museum, in the heart of the Beaujolais vineyard, and to enjoy local products.   Valérie Gaveau, lecturer at the University Claude Bernard Lyon 1 and member of the Lyon Neuroscience Research Center, conducted the conference-debate "The thousand and one movements to become one with our environment".     NEUROSURGERY: 30 MINUTES HEALTH SHOW. PARKINSON, TREMBLING, OCD: THE EXPLOITS OF SURGERY   RADIO-TV SHOW, Paris: a program hosted by Paul de Brem and offered on the occasion of Brain Week, March 14 to March 20, 2022   Pierre Jannin, Inserm Research Director, Medical Imaging Researcher, LTSI Laboratory, MediCIS Team, Inserm UMR 1099 - University of Rennes 1   Claire Haegelen, neurosurgeon at the Hospices Civils de Lyon, specialist in deep brain stimulation   Daniel Quatreboeufs, patient   To see the tremor of his hands stop while he is awake, thanks to a high-frequency current delivered by an electrode into a very precise region deep in his brain, is a miracle! For 15 years, Pierre Jannin and his team at the Signal and Image Processing laboratory in Rennes have been developing computer-assisted neurosurgery software tools. 
For this program, he was joined by Claire Haegelen, neurosurgeon at the Lyon University Hospital, and a patient who has benefited from deep brain stimulation technology.     BRAIN AND ART: BRAIN AND MUSIC   ROUND TABLE at the Pasteur Institute, 28 Rue du Docteur Roux, 75015 Paris   Music accompanies us everywhere and almost always; it takes the most varied forms, transports us, permeates our brain, moves us and makes our bodies dance. It is one of the most fascinating art forms and continues to attract the interest of neuroscientists. Our brain is irreducibly musical.   How are musical sound elements created, and how do they travel through our brain? Does listening to and playing music influence brain development and that of our cognitive functions such as language, memory and empathy? What if music were more than just entertainment for most of us?   And what about the 4 to 5% of the world's population with amusia, for whom rhythms, melodies and harmonies reveal no meaning, even though these same people show no hearing deficits, language disorders or general behavioral problems? Would humanity and its cultural diversity have taken the same paths without music?   Just as it unites and brings people together, music mobilizes a set of brain circuits underlying various functions: the auditory, motor, visual and tactile areas, emotions and memories. Finally, for several years there has been particular interest in seeing music as a potential ally in the care of certain patients (premature children, people with neurodegenerative diseases, autistic disorders, etc.).   The conference-debate 'Neurosciences and Music' was held at the Institut Pasteur on March 19th, 2022 and brought together various specialists in neuroscience, listening and musical practice around the many questions that music raises about our brain.   
Emmanuel Bigand, Burgundy Franche-Comté University   Séverine Feron, director of the 6th Jean-Philippe Rameau conservatory, musicologist and associate researcher at the Center Georges Chevrier of the University of Burgundy, Paris   Laura Ferreri, Lumière Lyon 2 University   Boris Gourevitch, Hearing Institute, Paris   Pierre Legrain, Pasteur Institute, Paris   Nicolas Michalski, Hearing Institute, Paris   Alain Perez, journalist     SLEEP: WHY DO WE DREAM AND MEDITATE?   Why do we have to sleep? How is sleep triggered? Why do we dream, and what purpose do our dreams serve even if they are sometimes forgotten when we wake up? Are the states of mindfulness meditation related to those recognized in our sleep? Finally, what are their impacts on our well-being?   These are the many questions addressed during this evening debate as part of Brain Week 2022, in the presence of two speakers from the Lyon Neuroscience Center, Pierre-Hervé Luppi and Antoine Lutz, at 10 rue de Concy, 91330 Yerres.     THE SICK BRAIN: STIMULATING THE BRAIN TO TREAT AND BETTER UNDERSTAND HALLUCINATIONS IN SCHIZOPHRENIA   CONFERENCE at the Tours City Hall, 1 to 3 rue des Minimes, 37000 Tours   Dr Marine Mondino, Le Vinatier Hospital Center, Lyon Neuroscience Research Center, INSERM U1028 / CNRS UMR5292 / Claude Bernard Lyon 1 University   Hearing voices, also called auditory hallucinations, is common in people with schizophrenia. The voices heard are often experienced as distressing or threatening. In one out of four patients, these voices are not reduced, or not sufficiently reduced, by existing treatments. Their work aims to better understand the mechanisms involved in the appearance of these symptoms, in order to develop new therapeutic strategies and improve the overall care of patients with schizophrenia. In particular, the team studies what happens in the brain when these voices arise. 
They are also studying how this phenomenon is associated with confusing imagination or thoughts with real events. To do this, they use techniques that make it possible to modify brain activity in a transient, safe and painless way. During this conference, these stimulation techniques were presented, along with how they are used in psychiatry, both to study the functioning of the brain and to modify its dysfunctions in order to reduce symptoms.   Winner of the Young Researcher Prize of the Thérèse and René Planiol Foundation, Marine Mondino received her prize before the conference.     EMOTIONS AND SOCIAL BEHAVIOR   What is the link between emotion recognition and social behavior? Children's ability to recognize emotions (transmitted facially and vocally) is an essential factor for social interactions, especially in the context of genetic pathologies.   So what do we know about the links between emotion recognition, social behavior and psychiatric pathologies?   Laboratory involved: Study of Cognitive Mechanisms laboratory (EMC – Lyon 2 University)   Speaker: Marie-Noëlle Babinet, neuropsychologist at GénoPsy – CRMR (Centre Hospitalier Le Vinatier) and doctoral student in the Study of Cognitive Mechanisms laboratory (EMC – Lyon 2 University)   Location: Gerland Library, 34 Rue Jacques Monod, Lyon     POP'SCIENCES MAG – "UNDER THE INFLUENCE OF EMOTIONS"   Emotions exalt our daily lives and constitute the cornerstone of the exchanges between our brain, our body and what surrounds us.   Emotions are today the subject of numerous research works which aim to identify their mechanisms, their origins and the way in which they influence our actions.   In light of recent advances in the field, we are better prepared than ever to manage and master them. But aren't we also better equipped to counterfeit them, create them, even manipulate them?   
For Brain Week, the University of Lyon invited the public to question the mechanics of our emotions, and those we share with others, through a new edition of its Pop'Sciences Mag "Under the Influence of Emotions", available online or in print.   Pop'Sciences Mag aims to decipher major societal and topical issues through the lens of different scientific perspectives. This issue, published on the occasion of Brain Week, was produced by the Pop'Sciences team and the local steering committee of the event, under the scientific direction of Rémi Gervais, professor emeritus of the University Claude Bernard Lyon 1 and member of the Lyon Neuroscience Research Center.       The mysteries of the human brain deciphered in Lyon   Rémi Gervais, professor emeritus of neurosciences in Lyon, was the guest of 6 minutes chrono. As scientific advisor to Brain Week in Lyon, he detailed the event and underlined the importance of neuroscience. An article by GUILLAUME LAMY           References:  
Blog Image
April 07, 2022.
Presenza brought Naxon Labs to the Campus Party of Punta del Este 2022
Last Friday, April 1st, Naxon Labs attended the Campus Party in Punta del Este as a featured tool at the booth of Presenza. Campus Party is the largest technology event in the world, with 83 editions held in 15 countries, where young people, companies, communities and various institutions come together to live an incredible experience of innovation and creativity. The event took place from March 31st to April 2nd, 2022, at the Convention Center in Punta del Este, Uruguay, and gathered about 9,000 people.   Naxon Labs was present at the booth of Presenza, an ecosystem that aims to democratize access to psychotherapy. This space provides tools for psychologists and patients to grow according to their needs and preferences, offering services such as psychotherapy, psychodiagnosis, vocational guidance and neuropsychological evaluation.   In this edition of Campus Party, Presenza's representatives received countless visits at the stand, where those interested were able to take part individually in a one-minute guided meditation, reaching deep levels of relaxation. In addition, thanks to the Muse headbands and the Naxon Explorer platform, they were able to analyze the participants' levels of dispersion and relaxed attention. Federica and Lucia, co-founders of Presenza, said: “We truly feel that marrying psychology and technology ultimately benefits greatly all the people getting treatment. Tools like Naxon Labs’ empower people by giving them factual information about their integral health status and providing treatment suggestions and mindfulness tips”. They also talked about mental health and neuroscience, and explained the application of technology in these areas: how biofeedback is obtained, what its benefits are, and how it impacts rights and accessibility. 
Federica and Lucia added: “Naxon Labs shares our love for wellness and democratizing mental health accessibility, so working with them was a no-brainer, and we were fascinated by people's response to Muse’s headbands and Naxon’s user-friendly interpretative software. It was amazing to see people connect over mental health and we hope to keep doing it over and over”. The event allowed young entrepreneurs to discover Naxon Labs’ platforms, raising awareness of our software products, which are supported by ANII, one of the official partners of Campus Party.    
Blog Image
March 05, 2022.
Communication through brain-computer interface technology for people with disabilities
Naxon Fabulari is a new initiative for developing forms of communication and control through brain-computer interface technology for people with disabilities. With the help of a low-cost EEG device, which in addition to capturing brain waves can detect facial movements, a person can associate a pattern with a message or action. In this way a person can communicate a phrase, or control connected external devices, through a gesture. Within the framework of an agreement with ORT University (Montevideo, Uruguay), a team of computer science engineering students (Diego Klappenbach, Juan Ruiz and Mauricio Pastorino) developed an advanced proof of concept for this initiative. The app can be used with devices from the Interaxon Muse line (Muse 1, Muse 2 and Muse S).  The main screen of the application is the Dashboard. This view reminds the user which sentence is assigned to each gesture, shaping the communication they want. Six combinations have currently been implemented, and as the number of actions expands, this summary will become even more relevant. For situations where the headband is not available to execute gestures, playback is offered through a button placed to the right of the cards that represent the association between a gesture and its sentence. With the headband in place and linked to the application, the user can perform muscle activations or cervical movements to produce the voice output of the sentences associated with each gesture. Playback is done through the speaker. Cervical extension (tilting the head backwards, looking up) is reserved as a direct shortcut to profile selection. Without navigating to another module, the user can, in addition to reproducing sentences with gestures, browse the available profiles, select the desired one, and operate with it.  Carousel Support is a plugin for activating gestures from the Dashboard. 
On the main page, up to six sentences can be activated, and from the same place the available profiles can be exchanged by interacting through the headband. Each user can create up to six profiles per account, meaning that 36 sentences can be easily accessed from the Dashboard. Depending on the use case, the combination of sentences per profile can depend on the context the user is in. For example, the set of sentences used in a clinic may differ from those used in an educational center or at a social event: of the total number of sentences available, not all will be applicable in every context. An alternative is to modify profiles online, adapting the sentences to the moment. The combination of Dashboard and Support Carousel allows the user to create very specific profiles while still enjoying the generality of everyday expressions. The Support Carousel is subdivided into three carousels. The first brings together the sentence subjects: I, you, he, she, we, and they. Next, in a vertical arrangement, is the carousel of predicates. It is worth noting that, depending on the chosen pronoun, the application automatically conjugates the words presented in the following carousels to preserve the syntax and, therefore, the comprehensibility of what is stated. Finally, the last carousel fulfills a double function. Based on the chosen predicate, its elements can take one of two forms: verbs or adverbs. This distinction is resolved automatically by the application, without the user needing to intervene. If the user composes a sentence that allows modification of the verb, the third and last carousel updates its elements with adverbs to enrich the statement. Otherwise, it collects and presents a set of verbs.   
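To make the Dashboard idea concrete, the gesture-to-sentence mapping with per-context profiles could be sketched as below. This is only an illustration, not Fabulari's actual code: the profile names, gesture names, sentences and the `speak` placeholder are all hypothetical.

```python
# Hypothetical sketch: each profile maps up to six gestures to sentences;
# a detected gesture triggers speech playback for the active profile.
PROFILES = {
    "clinic": {
        "blink": "I am in pain.",
        "jaw_clench": "I need assistance.",
        "head_left": "Yes.",
        "head_right": "No.",
    },
    "school": {
        "blink": "I have a question.",
        "jaw_clench": "I need a break.",
    },
}

def speak(sentence):
    # Placeholder for text-to-speech playback through the speaker.
    return f"[TTS] {sentence}"

def handle_gesture(profile, gesture):
    """Look up the sentence assigned to a gesture in the given profile."""
    sentence = PROFILES.get(profile, {}).get(gesture)
    return speak(sentence) if sentence else None

print(handle_gesture("clinic", "blink"))  # [TTS] I am in pain.
```

Switching the active profile (in the app, via cervical extension) then amounts to looking gestures up in a different dictionary, which is how six profiles yield 36 directly accessible sentences.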
If you are interested in exploring Naxon Fabulari and using EEG for communication, just connect with us and we will help you:  
Blog Image
January 25, 2022.
FIBRAS: United we promote projects of social impact and technology
Naxon Labs is part of #Fibras as a founding member! Fibras is an ecosystem made up of several organizations in Uruguay joining forces to create #socialimpact through different disciplines. On November 29th, 2021, Fibras became an official Civil Association.   Purpose   Fibras is an ecosystem where people, companies and organizations come together to support and accelerate projects and ideas using technology as a platform to generate social impact in the following dimensions:   About Fibras   It is a multidisciplinary collective with the ability to unite technology and humanism. Fibras combines knowledge and experience in technology, medicine, neuroscience, psychology, education and research. Fibras’ pillars: synergy and collaboration in sync towards a common purpose.   Projects   The projects that weave the fabric of Fibras cross and intertwine the different dimensions aligned with the sustainable development goals defined by the United Nations. The projects being promoted by Fibras and its members can be explored at
Blog Image
January 11, 2022.
Virtual reality and electroencephalography for modulation of emotional states through visual and auditory stimulus
Combining virtual reality and EEG technologies with visual and auditory stimuli, Iker López created a software application for the modulation of emotional states using binaural waves and neurofeedback techniques. As part of his graduation work as a Computer Science Engineer at the Escola Politècnica Superior of the Universitat de les Illes Balears (UIB), Iker López de Suso Sánchez developed a software application motivated by the goal of providing technological support for mental health care, combining games built with Unity and virtual reality. Iker worked under the supervision of Dr. Francisco José Perales López and Dr. José María Buades Rubio in the 2020/2021 program in the city of Palma, Majorca, Spain. Iker analyzed brain waves from non-invasive EEG devices like the Muse 2 headband, which has 4 channels plus one reference channel and communicates over Bluetooth 4.2:   ● Gamma (32-100 Hz): high cognitive processing, learning, problem solving.   ● Beta (13-32 Hz): concentration, decision making.   ● Alpha (8-13 Hz): relaxation, well-being.   ● Theta (4-8 Hz): imagination, internal processing, dreams (REM), fears.   ● Delta (0.5-4 Hz): deep meditation, deep sleep (without dreaming).   A binaural beat is an auditory illusion perceived when two different pure-tone sine waves, both with frequencies lower than 1500 Hz and with less than a 40 Hz difference between them, are presented to a listener dichotically (one through each ear). For example, if a 100 Hz pure tone is presented to a subject's right ear while a 104 Hz pure tone is presented to the subject's left ear, the listener will perceive the auditory illusion of a third tone in addition to the two pure tones presented to each ear. The third sound is called a binaural beat, and in this example it has a perceived pitch correlating to a frequency of 4 Hz, the difference between the 104 Hz and 100 Hz pure tones presented to each ear. 
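The 100 Hz / 104 Hz example above is easy to reproduce numerically. A minimal Python/NumPy sketch (the sample rate and duration are arbitrary choices, not values from Iker's application):

```python
import numpy as np

def binaural_beat(carrier_hz=100.0, beat_hz=4.0, seconds=2.0, rate=44100):
    """Generate a stereo signal whose two channels differ by `beat_hz`.

    The left channel carries the 100 Hz tone and the right channel the
    104 Hz tone; played dichotically, the listener perceives a 4 Hz beat
    (in the theta band), as described in the text above.
    """
    t = np.arange(int(seconds * rate)) / rate
    left = np.sin(2 * np.pi * carrier_hz * t)
    right = np.sin(2 * np.pi * (carrier_hz + beat_hz) * t)
    return np.stack([left, right], axis=1)  # shape: (samples, 2)

signal = binaural_beat()
print(signal.shape)  # (88200, 2)
```

Writing `signal` to a stereo WAV file and listening with headphones (one channel per ear) is what produces the dichotic presentation the illusion requires.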
Figure: Binaural Beats   Binaural-beat perception originates in the inferior colliculus of the midbrain and the superior olivary complex of the brainstem, where auditory signals from each ear are integrated and precipitate electrical impulses along neural pathways through the reticular formation up the midbrain to the thalamus, auditory cortex, and other cortical regions. Neurofeedback (NFB), also called neurotherapy, is a type of biofeedback that presents real-time feedback from brain activity in order to reinforce healthy brain function through operant conditioning. In this case, electrical activity from the brain is collected via sensors placed on the scalp using electroencephalography (the EEG Muse 2 headband), with feedback presented through video displays or sound.   Figure: Neurofeedback   “There’s decades of innovations ahead. We’re at the very beginning, where it’s just at the stage where we can bring in consumers but there’s so much further to go from there” said Brendan Iribe, CEO of Oculus Rift, maker of the device Iker used for the application. In his work, Iker also cited Mark Zuckerberg, CEO of Facebook (now rebranded as Meta), in connection with bringing these applications into the so-called Metaverse: “The incredible thing about the technology is that you feel like you’re actually present in another place with other people. People who try it say it’s different from anything they’ve ever experienced in their lives.” Iker considered different reality technologies: augmented reality (AR), virtual reality (VR) and mixed reality (MR). Augmented reality adds digital elements to a live view, often by using the camera on a smartphone; examples of augmented reality experiences include Snapchat lenses and the game Pokemon Go. Virtual reality implies a complete immersion experience that shuts out the physical world. 
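The neurofeedback loop described above reduces to a measure-compare-reinforce cycle. The following is a minimal runnable sketch of that idea, not Iker's implementation: the `read_alpha_power` function is a simulated stand-in for a real read-out of alpha-band power from an EEG headband, and the threshold is arbitrary.

```python
import random

def read_alpha_power():
    # Placeholder for a real EEG read-out (e.g. alpha-band power from a
    # Muse headband); here we simulate values so the sketch is runnable.
    return random.uniform(0.0, 1.0)

def neurofeedback_step(threshold=0.6):
    """One iteration of the loop: measure, compare, reinforce.

    Returning "reward" stands in for positive feedback (e.g. brighter
    visuals or calmer sound in the VR scene), which reinforces the
    desired brain state via operant conditioning.
    """
    power = read_alpha_power()
    return "reward" if power >= threshold else "neutral"

random.seed(1)
out = [neurofeedback_step() for _ in range(5)]
print(out)
```

In a real application this loop would run continuously while the VR scene adapts its stimuli to the returned feedback.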
Using VR devices such as the HTC Vive, Oculus Rift or Google Cardboard, users can be transported into a number of real-world and imagined environments, such as the middle of a squawking penguin colony or even the back of a dragon. In a mixed reality (MR) experience, which combines elements of both AR and VR, real-world and digital objects interact. Mixed reality technology is just now starting to take off, with Microsoft’s HoloLens one of the most notable early mixed reality devices. The Oculus Quest 2 has a resolution per eye of 1920 x 1832 pixels, a refresh rate of 90 Hz, and a field of view (FOV) of 90°.   Figure: Oculus Quest 2   Unity is a cross-platform game engine developed by Unity Technologies, first announced and released in June 2005. The engine has since been gradually extended to support a variety of desktop, mobile, console and virtual reality platforms. It can be used to create three-dimensional (3D) and two-dimensional (2D) games, as well as interactive simulations and other experiences. The Oculus Quest and Quest 2 deliver the freedom of wireless, standalone VR with industry-leading power and performance. Both devices include spatially tracked controllers, integrated open-ear audio, and support for Oculus Link, which lets users access their Oculus Rift library of apps from a gaming-compatible PC. For this application, the Oculus Quest 2 was integrated with Unity to create the VR environment: the scene, the game objects, the components defining game-object behavior, and the materials that add texture and color to objects. For the integration of the EEG device with the software application, the Naxon Explorer API was used to get the data at the right moment. With the Naxon Labs platform and an EEG device like Interaxon’s Muse, you can create a mark derived from an external event in a sequence of brain activity expressed in waves. 
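The idea of marking an external event in the EEG stream can be sketched as below. This is only an illustration of the concept: the endpoint URL and the payload fields are hypothetical, and the real Naxon Explorer API may use different names and parameters.

```python
import json
import time

# Hypothetical endpoint; the actual Naxon Explorer API may differ.
NAXON_MARK_URL = "https://api.example.com/explorer/marks"

def make_event_mark(session_id, label):
    """Build a marker tying an external stimulus to a moment in the EEG stream.

    The timestamp lets the platform align the mark with the continuous
    brain-wave recording, so the response to the stimulus can be analyzed.
    """
    return {
        "session": session_id,              # recording session identifier
        "label": label,                     # e.g. "binaural_stimulus_on"
        "timestamp_ms": int(time.time() * 1000),
    }

mark = make_event_mark("session-42", "binaural_stimulus_on")
payload = json.dumps(mark)
print(payload)
# In the application, this payload would then be POSTed to the marks
# endpoint at the instant the visual/auditory stimulus is presented.
```

Aligning such marks with the recorded waves is what makes it possible to compare brain activity before and after each stimulus.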
Naxon Explorer is a useful tool and neurofeedback system for researchers in neuroscience, psychology and medicine. You can record brain data and obtain measurements and session data suitable for machine learning and automatic pattern analysis. With the API, you can analyze brain behavior and its response to an external activity. In this application, Iker exposes the brain to visual and auditory stimuli and at the same time informs Naxon Explorer through the API, so that it registers the moment accurately. With this, you can analyze the continuous brain waves and check the impact of the external event on brain activity.   Figure: Data flow   In the application you can work with the session, its configuration, the course of the session, and a summary of results (graphics and information). You can also configure the environment, the binaural waves and the music. As future directions for this work, Iker López indicated a statistical study to validate the effectiveness of the techniques used, increasing the stimuli perceived by the user (shaders, particles), expanding the neurofeedback techniques, developing a backend including communication with a server and a database, including alternative input systems (head movement, voice control), and designing an in-game tutorial. This is one of the latest initiatives developed by Dr. Francisco Perales' workgroup at the Universitat de les Illes Balears. The UIB research team has worked extensively on brain-computer interfaces and VR/AR. They have several projects in which they apply technology to therapeutic subjects, and they have used these technologies with children with cerebral palsy (CP), autism, ADHD and other conditions. Recently they were involved in a national project on the elderly and social robots. One of the applications of VR, as indicated above, was intended to assess the emotional state of the person and modulate it, for example to control chronic pain in disease. 
For this, the team uses physiological parameters (heart rate, EDA, etc.). The team also used EEG, addressing the challenges of combining a VR headset with an EEG headband through the Naxon Explorer API. The process is explained in the paper "Evaluation of a VR system for Pain Management using binaural acoustic stimulation", written by Francisco J. Perales, Laia Riera, Silvia Ramis and Alejandro Guerrero.   Abstract: The proposed system is oriented to evaluating the pain perceived by the user in a highly controlled virtual reality (VR) environment. A VR system is an implementation of a virtual world that the user perceives as the real one. The sensation of immersion affects the stimuli (visual, acoustic and haptic) perceived by the user, and is able to promote changes in brainwave power and produce an activation of the Autonomic Nervous System (ANS). Electro-Dermal Activity (EDA) allows measuring the electrical properties of the skin through sweat activity. This work proposes a VR environment combined with binaural beats (binaural sounds) and visual stimuli to evaluate the user's perception, comparing their sensations with real physiological data. It is believed that the use of different binaural beats over a long period of time can help patients induce a relaxed state (mood) and consequently modulate their perception of pain. The study presents two experiments. The first applies 8 types of acoustic stimuli (4 binaural and 4 monaural) in a standard, simple VR scenario; end users are asked to select the feeling they experienced in each case, while in parallel, using the Empatica wristband, the subjective answers are contrasted with the physiological values given by the device. In the second experiment, an immersive environment based on the whole VR application is proposed for control and real users to evaluate chronic pain. Users are immersed in three identical VR scenarios but with random sound stimulation. With the results obtained, the authors conclude that binaural beats work better than non-binaural beats for relaxation and meditation.