Many of us, at some point, have wished for a gadget that could carry out tasks without requiring us to type or even speak. That wish might just come true: researchers at MIT have built a computer interface that can read your thoughts.
They have developed a headset named AlterEgo that can interpret words spoken silently in the mind, without the user actually vocalizing them. This sets it apart from virtual assistants like Siri and Alexa, which require audible voice commands to trigger a response.
AlterEgo is a wearable headset that wraps around the user’s ear and jaw, and the computing system integrated in the device processes the data picked up by its sensors.
When we speak silently in our minds, subtle neuromuscular signals are generated in the jaw and face. These signals are fed to AlterEgo's machine-learning system, which learns to correlate particular signal patterns with particular words.
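AlterEgo's actual models and signal features have not been published, but the general idea of calibrating on a user's signals and then mapping new signals to words can be illustrated with a toy nearest-centroid classifier. Everything here, including the feature vectors and the `calibrate`/`classify` helpers, is hypothetical:

```python
# Illustrative sketch only: AlterEgo's real signal processing is not public.
# We pretend each subvocalized word produces a small feature vector measured
# at the jaw/face sensors, average a user's calibration samples into one
# "template" per word, and label new signals by the nearest template.
import math

def calibrate(samples):
    """samples: {word: [feature_vector, ...]} from a calibration session.
    Returns one averaged template vector per word."""
    templates = {}
    for word, vecs in samples.items():
        n = len(vecs)
        templates[word] = [sum(v[i] for v in vecs) / n for i in range(len(vecs[0]))]
    return templates

def classify(signal, templates):
    """Return the word whose template is closest (Euclidean) to the signal."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda w: dist(signal, templates[w]))

# Hypothetical per-user calibration data: each word yields slightly
# different neuromuscular readings, which is why every wearer must calibrate.
calibration = {
    "yes": [[0.9, 0.1, 0.2], [1.0, 0.0, 0.3]],
    "no":  [[0.1, 0.8, 0.7], [0.2, 0.9, 0.6]],
}
templates = calibrate(calibration)
print(classify([0.95, 0.05, 0.25], templates))  # → yes
```

The per-user calibration step in this sketch mirrors why each AlterEgo wearer must train the device individually: the same word produces slightly different signals on different faces.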
The idea was to develop a computing platform that “melds human and machine” in a certain way and acts as “an internal extension of our own cognition,” said Arnav Kapur, the lead researcher on this project.
The headset requires calibration for every individual user, because each wearer produces slightly different neuromuscular signals. Even so, AlterEgo is impressive: it can perform arithmetic calculations and even play chess through subvocal commands.
In the future, this silent interface could prove useful in noisy environments or in places where silence is required. It could also allow people with speech disabilities to communicate, provided they can still use their facial and jaw muscles.