Researchers from Meta and the Basque Center on Cognition, Brain and Language in Spain have made a significant advance in decoding brain activity: they reconstructed typed sentences solely from non-invasive brain recordings. This opens up new possibilities for human-computer interaction and could, in the long term, help people with communication impairments.
In two studies, the scientists investigated the neural processes involved in typing. Thirty-five participants typed sentences while their brain activity was recorded using magnetoencephalography (MEG) and electroencephalography (EEG), which capture the brain's magnetic and electrical activity, respectively, without requiring any surgical intervention.
A specially trained AI system analyzed the recorded brain signals and learned to reconstruct the typed sentences, achieving up to 80 percent accuracy at the letter level. In many cases, the AI reconstructed complete sentences from brain activity alone.
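Letter-level accuracy is usually reported as the complement of the character error rate: how few single-character edits separate the decoded text from what was actually typed. The sketch below is an illustration of that idea, not Meta's evaluation code — the Levenshtein-based metric and the example strings are assumptions chosen for clarity.

```python
# Illustrative sketch of letter-level decoding accuracy as
# 1 - character error rate (CER). Assumption: the score is
# edit-distance based; this is not Meta's actual pipeline.

def levenshtein(a: str, b: str) -> int:
    """Minimum number of single-character edits turning a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def letter_accuracy(reference: str, decoded: str) -> float:
    """1 - CER, clamped to [0, 1]; reference is the typed sentence."""
    if not reference:
        return 1.0 if not decoded else 0.0
    cer = levenshtein(reference, decoded) / len(reference)
    return max(0.0, 1.0 - cer)

# Hypothetical example: one wrong letter in an 11-character sentence.
print(letter_accuracy("the cat sat", "the bat sat"))
```

Under this metric, a decoded sentence with one wrong letter out of eleven scores roughly 0.91, i.e. about 91 percent letter-level accuracy; "up to 80 percent" would correspond to one error in every five letters.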
The second study examined how the brain translates thoughts into complex movement sequences. Because mouth and tongue movements distort the measurement of brain signals, the researchers analyzed MEG recordings during typing rather than speaking. With a sampling rate of 1,000 samples per second, they could track the moment a thought becomes words, syllables, and letters.
The results show that the brain initially works with abstract representations of meaning and then gradually converts them into concrete finger movements. A special "dynamic neural code" allows the brain to represent multiple words and actions simultaneously and coherently.
Despite the impressive results, the technology still faces challenges. MEG requires participants to sit still in a magnetically shielded room, and further studies with patients who have brain lesions are needed to demonstrate clinical benefit.
Decoding the neural code of language remains a central challenge for AI research and neuroscience. However, a deeper understanding of the structure of language in the brain could lead to further advances in AI and enable the development of neuroprosthetics for people with communication disorders. Meta emphasizes that the research is still in its early stages, but the potential of this technology is enormous.
Meta's underlying AI technology is already being applied in healthcare. For example, the French company BrightHeart uses Meta's open-source model DINOv2 to detect congenital heart defects in ultrasound images, and the US company Virgo uses it to analyze endoscopy videos.