Texting should come with a health warning. It has caused motorists to drive off cliffs, and ruined dates and dinner table conversations. It even has a couple of physical ailments named after it (text neck or texting thumb, anyone?). But what if you could text with your brain — composing and sending text messages just by thinking about them?
So-called "text by thinking" technology is in its early stages, but it could become a natural way of interacting with augmented reality (AR) glasses, which some predict will replace smartphones around 2025. Key to the technology is what's called a brain-computer interface (BCI). BCIs are communication systems that analyze patterns of brain activity to determine user intent. They translate thought into action so that users can move cursors, control computers and more — using their mind. Interest in BCIs, and in neurotechnology in general, has surged in recent years. It has spawned a host of BCI startups such as Kernel, Paradromics and Elon Musk's Neuralink, as well as academic collaborations such as the BrainGate project.
Tech giants such as Facebook are also going all in on BCIs. Last year, Facebook said it had a team of 60 engineers focused on building a BCI that will let users type by thinking, without invasive implants. The long-range goal: to enable people to type at 100 words per minute just using their mind (that's about five times faster than regular texting). Facebook wants to achieve this by optically reading brain activity from outside the skull — scanning the brain 100 times per second to detect the user speaking silently in their head, then translating it into text. But some neuroscientists say such brain-texting speeds will only be possible with devices implanted inside the skull.
Look Ma — No Thumbs!
Having already shown that people with paralysis can type using implanted sensors, the BrainGate researchers now want to adapt the system so that BCIs can control commercial computers, phones and tablets — opening up a world of possibilities for people with neurological disorders. But the use of surgical implants in humans has been very limited due to health and safety concerns.
Non-invasive BCIs, which don't require surgery, have shown promise as a means of translating user intent into commands for a wide range of devices. They have been used to demonstrate control of cursors, wheelchairs, robotic arms, drones, humanoid robots and even brain-to-brain communication. Non-invasive BCIs are commonly based on electroencephalography (EEG), which uses electrodes placed on the scalp to measure the electrical activity produced by currents flowing through neurons. But because they read brain activity from outside the skull, it's harder for them to separate the signal from the noise.
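To make the signal-versus-noise problem concrete: a common first step in EEG-based BCIs is to measure how much power a recording carries in a given frequency band (for example, the 8–12 Hz alpha band) and compare it against other bands. The sketch below is purely illustrative — the synthetic "EEG," sampling rate and band choices are assumptions, not any particular product's pipeline — but it shows the basic idea of pulling a rhythm out of noisy scalp data.

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Average power of `signal` in the [lo, hi] Hz band, via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].sum() / len(signal)

# Synthetic "EEG": a 10 Hz alpha rhythm buried in noise.
fs = 256                      # samples per second (a typical EEG rate)
t = np.arange(fs) / fs        # one second of data
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(fs)

alpha = band_power(eeg, fs, 8, 12)   # band containing the rhythm
beta = band_power(eeg, fs, 13, 30)   # comparison band
print(alpha > beta)                  # the alpha band dominates
```

Real systems layer spatial filtering and machine-learning classifiers on top of features like these, precisely because scalp recordings are so much noisier than implanted ones.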
A concern around text by thinking is privacy. Facebook CEO Mark Zuckerberg has taken heat recently in part over his company's handling of data privacy. What privacy issues might arise if Facebook were to succeed in developing a BCI that can capture speech at the source, as it were — literally getting into people's heads?
Last month, researchers at MIT Media Lab (of which Cisco is a member company) announced a non-invasive wearable that addresses such privacy concerns while allowing users to converse silently with machines and people. Known as AlterEgo, the wearable captures the electrical signals generated by the muscles used in silent speech — as when a person talks to herself. It attaches to the user's jaw, chin and ear. Bone-conduction earphones allow the user to "hear" responses.
The researchers' goal with AlterEgo is to build a seamless interface between humans and computers. The high-bandwidth, natural-language interface is also capable of texting at speeds similar to those of actual speaking — all without the user opening her mouth or vocalizing.
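The MIT team's actual decoder is a neural network trained on multi-electrode recordings; the details aren't given here. But the overall shape of such a system — extract simple features from each electrode's signal, then match them against per-word templates — can be sketched with a toy nearest-centroid classifier on synthetic data. Every name, electrode count and parameter below is an illustrative assumption, not AlterEgo's design.

```python
import numpy as np

def features(emg):
    """Crude per-channel features: mean absolute value and zero-crossing rate."""
    mav = np.mean(np.abs(emg), axis=1)
    zc = np.mean(np.abs(np.diff(np.sign(emg), axis=1)) > 0, axis=1)
    return np.concatenate([mav, zc])

class NearestCentroid:
    """Classify a feature vector by its closest per-word average."""
    def fit(self, X, y):
        self.labels = sorted(set(y))
        self.centroids = {w: np.mean([x for x, l in zip(X, y) if l == w], axis=0)
                          for w in self.labels}
        return self

    def predict(self, x):
        return min(self.labels,
                   key=lambda w: np.linalg.norm(x - self.centroids[w]))

# Simulate two silently "spoken" words that activate different muscles.
rng = np.random.default_rng(1)

def sample(word):
    emg = 0.1 * rng.standard_normal((4, 200))        # 4 electrodes, 200 samples
    emg[0 if word == "yes" else 1] += rng.standard_normal(200)  # active channel
    return features(emg)

X = [sample(w) for w in ["yes", "no"] * 20]
y = ["yes", "no"] * 20
clf = NearestCentroid().fit(X, y)
print(clf.predict(sample("yes")))  # → yes
```

The point of the sketch is the pipeline, not the classifier: because neuromuscular signals are only generated when the user deliberately subvocalizes, the system decodes chosen words rather than raw thoughts — which is exactly the privacy property the researchers emphasize.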
"The AlterEgo interface is a sweet spot between thinking and speaking because it delivers the privacy that thinking would bring, but it also feels like you have control over what input you're transmitting to a computing device or another person," says Arnav Kapur, a graduate student at the MIT Media Lab who led development of the new system.
Pattie Maes, a professor in MIT's Program in Media Arts and Sciences, says AlterEgo takes a radically different approach from other BCI systems, including the one Facebook is developing.
"Several teams are trying to read thoughts directly using brain computer interfaces, but our approach has the advantage that the user's thoughts remain private, in that the user has control over what words are communicated with the system," Maes says.
Mixed Reality by Thinking
BCIs and wearables like AlterEgo are part of a larger quest for the holy grail of a mind-machine meld — with potential applications far beyond brain texting.
Last year, a startup called Neurable unveiled what it called the world's first BCI for virtual reality (VR). Using VR goggles and EEG sensors placed on the scalp, the prototype immerses users in a virtual world where they must escape from a cell using just their mind. The game was built to showcase the capabilities of a neural interface for VR. Players engage in tasks such as selecting objects, typing and teleporting.
"The historical speed for BCIs before Neurable was 10-20 seconds per selection," says Adam Molnar, Neurable's director of sales and marketing. "With Neurable, it's below 1 second per selection."
The prototype game hints at a brave new world in which people will be able to interact with VR/AR environments (aka mixed reality, or MR) using only their brain activity. Neurable predicts that AR/VR headset companies will eventually integrate brain sensors directly into their products. In time, the company says, high-performance, non-invasive, intuitive BCIs will unleash a "productivity revolution" in which users can scroll menus, select items, launch applications, manipulate objects and even input text using only their brain activity.
"BCIs will have impact in the enterprise space, medical, research, gaming space and many more," Molnar says. "VR/AR are right there on the horizon, and we still try to interface with these modalities as we do with existing systems. Neurable is building a toolkit that all verticals will be able to use and adopt, and that links mixed reality to the ultimate computing platform — the human brain."