Scientists at the ATR Computational Neuroscience Labs in Japan have created an AI system capable of performing deep image reconstruction from human brain activity.
In simple terms, it is a form of mind reading, though the system cannot literally see inside our heads or view the things we are picturing. Instead, it uses functional MRI (fMRI) scans of brain activity to infer what a person is seeing or imagining, and reconstructs an image from that data: for instance, a letter or a simple shape.
To train the AI, the researchers fed it fMRI recordings of human subjects' brain activity captured while the subjects viewed images. Over the course of 10 weeks, they collected brain activity data both in real time, as subjects looked at images, and afterwards, as subjects imagined images they had seen in the past.
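To give a rough sense of how image content can be learned from brain data, here is a toy sketch. It is not the authors' actual pipeline; it assumes a simple ridge-regression decoder that maps simulated fMRI voxel patterns to image-feature vectors, with all data randomly generated for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: trials seen during training, fMRI voxels
# recorded per trial, and image-feature values per shown image.
n_trials, n_voxels, n_features = 200, 50, 10

# Simulated ground truth: each image's features drive voxel responses
# through an unknown linear mapping, plus measurement noise.
W_true = rng.normal(size=(n_features, n_voxels))
features = rng.normal(size=(n_trials, n_features))
voxels = features @ W_true + 0.1 * rng.normal(size=(n_trials, n_voxels))

# Training: closed-form ridge regression from voxels back to features.
lam = 1.0
A = voxels.T @ voxels + lam * np.eye(n_voxels)
W_dec = np.linalg.solve(A, voxels.T @ features)

# Decoding: predict the features of a held-out image from its
# simulated brain response, then compare to the true features.
test_feat = rng.normal(size=(1, n_features))
test_vox = test_feat @ W_true + 0.1 * rng.normal(size=(1, n_voxels))
decoded = test_vox @ W_dec

corr = np.corrcoef(decoded.ravel(), test_feat.ravel())[0, 1]
print(f"correlation between decoded and true features: {corr:.2f}")
```

In the toy setup the decoder recovers the held-out image's features with high correlation; the real system works with far noisier data and reconstructs full images rather than small feature vectors.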
There is still scope for improvement, as the AI is far from perfect at reading our minds. But in the future, more advanced systems could open up new modes of communication that require neither speech nor hand gestures. The potential applications are many; for example, such technology could help people who have difficulty speaking.
You can read more about the mind-reading AI in the research paper.