Read your emotions and cognitive states
Will a clinical or occupational psychologist, a talent and human resources consultant, or a researcher one day be able to tell what emotional state a person is in just by looking at a mobile phone, tablet or computer? Yes, and here is how:
We are developing a next-generation tool that detects and records emotions and cognitive states in real time. We believe it can drive meaningful, positive change as a technology-supported assessment in mental health, talent search and human resources management, and general research, supporting the professional during sessions, therapies or evaluations with a practical, intuitive visualization.
The product is a software application for computer, cellphone or tablet that works together with portable EEG devices and offers valuable, real-time information about the different mental states the client is experiencing. The professional can also enter a specific mental-state mode and access a live feedback visualization to work with the client and aid in regulating that state through whatever technique the professional prefers. Sessions can be recorded and reviewed later, displaying how the different mental states evolved over time.
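To make the real-time readout concrete, here is a minimal sketch of how a mental-state estimate can be derived from a window of raw EEG. It uses a simple alpha/beta band-power heuristic purely for illustration; the sampling rate, band limits and state labels are assumptions, not Naxon's actual algorithm, which would rely on trained classifiers.

```python
import numpy as np

FS = 256  # assumed sampling rate (Hz) of the portable EEG headset


def band_power(signal, fs, low, high):
    """Average spectral power of `signal` in the [low, high] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()


def estimate_state(window):
    """Map a one-second EEG window to a coarse mental-state label.

    The alpha/beta comparison is illustrative only; production systems
    train classifiers on labelled recordings instead of fixed rules.
    """
    alpha = band_power(window, FS, 8, 12)   # band linked to relaxation
    beta = band_power(window, FS, 13, 30)   # band linked to focus
    return "relaxed" if alpha > beta else "focused"


# Synthetic one-second window dominated by a 10 Hz (alpha) oscillation.
t = np.arange(FS) / FS
alpha_wave = np.sin(2 * np.pi * 10 * t)
print(estimate_state(alpha_wave))  # a pure 10 Hz tone reads as "relaxed"
```

Streaming windows like this one per second is what would let the professional watch a state evolve live during a session.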
Doing research with Naxon is easy
We are developing a practical, affordable and useful tool for researchers in Neuroscience, Psychology, Medicine, Engineering and Information Technology. We want to open the possibilities of brain research to the world. With this tool, both an experienced researcher and a recently graduated professional will be able to explore the mind.
The product consists of a software and data platform with integrated tools to configure and run experiments in different scenarios, built-in pattern detection instruments (including evoked potentials, brain-wave measurements and mental-state detection algorithms) and machine learning tools for working with recorded data. We simplify the process of extracting meaningful responses to stimuli, such as evoked potentials, with fully automated signal processing tools. Additionally, each user will be able to save sessions to our data platform and access them later. The platform will be offered in a Software-as-a-Service format.
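The standard way evoked potentials are extracted is by averaging many EEG epochs time-locked to the stimulus, which cancels activity unrelated to the stimulus and leaves the stimulus-locked response. The sketch below shows that technique on synthetic data; the sampling rate and epoch windows are assumptions for illustration.

```python
import numpy as np

FS = 256  # assumed sampling rate (Hz)


def average_evoked_potential(eeg, event_samples, pre=0.1, post=0.4, fs=FS):
    """Average EEG epochs time-locked to stimulus onsets.

    `pre`/`post` are seconds of signal kept before/after each onset.
    Averaging N epochs shrinks uncorrelated noise by roughly 1/sqrt(N),
    so the evoked response emerges from the background activity.
    """
    pre_n, post_n = int(pre * fs), int(post * fs)
    epochs = [eeg[s - pre_n:s + post_n]
              for s in event_samples
              if s - pre_n >= 0 and s + post_n <= len(eeg)]
    return np.mean(epochs, axis=0)


# Synthetic demo: a fixed evoked waveform buried in random noise.
rng = np.random.default_rng(0)
evoked = np.hanning(int(0.5 * FS))          # 0.5 s stimulus response
eeg = rng.normal(0.0, 1.0, FS * 60)         # 60 s of background noise
events = np.arange(2 * FS, 58 * FS, FS)     # one stimulus per second
for s in events:
    eeg[s:s + len(evoked)] += evoked

erp = average_evoked_potential(eeg, events)
# In `erp`, the pre-stimulus samples stay near zero while the
# Hanning-shaped response stands out after stimulus onset.
```

Automating exactly this kind of pipeline (epoching, artifact handling, averaging) is what removes the manual signal-processing work from the researcher.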
We want to take mind control of devices to another level by combining mental commands and slight facial movements with the reading of emotional states. In this way the conditions of a video game can change depending on the player's mental state, and complex objects such as a drone or a robotic hand can be controlled.
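Once mental commands, facial movements and emotional states are detected, device control reduces to a mapping layer from detected events to actions. The sketch below illustrates that idea; every event name and action string here is a placeholder invented for the example, not part of any shipped API.

```python
# Hypothetical mapping from detected mental/facial events to device
# commands; names are illustrative placeholders only.
COMMAND_MAP = {
    "push": "drone.ascend",                  # mental command
    "pull": "drone.descend",                 # mental command
    "blink": "game.jump",                    # slight facial movement
    "stressed": "game.lower_difficulty",     # emotional-state input
}


def dispatch(event):
    """Translate a detected event into a device action, or do nothing."""
    return COMMAND_MAP.get(event, "noop")


print(dispatch("push"))     # a detected "push" command moves the drone up
print(dispatch("smile"))    # unmapped events fall through to "noop"
```

Keeping this mapping separate from the detection layer is what lets the same mental-state readings drive a game, a drone or a robotic hand.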