The mouse that runs out of battery. The trackpad that stops responding to a gliding finger. The “sent from my iPhone, please excuse any errors” note.
You might view these inconveniences as unavoidable consequences of an otherwise advanced digital era. Thomas Reardon perceives them as evidence that our interaction with computers is inherently flawed.
Reardon, a neuroscientist who developed one of the first gateways to the Internet—Microsoft’s Internet Explorer—imagines a future where machines respond directly to our thoughts.
His startup, CTRL-Labs, based in New York City, is developing an external body sensor that interprets thoughts to control a computer. Reardon spoke with National Geographic about his work on a brain-machine interface and its potential impact on the future of computing.
The idea of a brain-machine interface revolves around interpreting signals from motor neurons. Could you clarify what that entails?
The neurons that convey your intentions—your ability to take action in the world—are motor neurons.
When your brain decides to act, it activates or deactivates muscles. That may sound somewhat deflating, but it is the only thing your brain can actually do. Your brain can recognize and comprehend a multitude of things in the world, but when it comes to action, its sole ability is to turn muscles on and off. It does this through motor neurons, which are the neurons we work with in our research.
What is the actual process? Where are the sensors placed?
We don’t disclose much detail since we are actively securing intellectual property at the moment. [However] our firm is not concerned about the specific location of the sensors on your body. One location could be the wrist, but they could also be on your neck, in your ear, or around your ankle. What matters to us is accessing the electrical signals generated by motor neurons as they connect to muscles.
When you consider all the ways you move through life and how motor neurons drive those actions, your hands are the most densely innervated part of your body. Everything we do with our hands is skilled, especially in how we adapt. A simple task like lifting a glass of water to your lips involves incredible skill. It doesn’t happen effortlessly. Your brain devotes an enormous amount of effort and processing to that task, something you typically take for granted.
And how does this lead to an enhancement in our interactions with computers today?
Currently, you perform all that interaction through a device, whether it’s a mouse, keyboard, or joystick. Only by eliminating the device and directly interpreting the nerve signals can we achieve a new type of interaction between humans and machines.
You can absorb a tremendous amount of information into your brain and process it almost instantly. You operate as an extraordinary processing unit. The limitation comes when you try to output information, because you have to act through biomechanical muscles. That’s where everything gets complicated, and that’s the challenge we aim to address.
All previous research on brain-machine interfaces has focused primarily on clinical populations: people with nerve disorders such as ALS or the various muscular dystrophies. Our key insight was that nearly all of that work targets people who lack a functional motor system. So we asked: how would you tackle the brain-machine interface problem for someone whose motor system works? That question became the core motivation for our company: instead of circumventing the motor neuron system, let’s engage with it directly.
There are numerous potential applications for this technology. Are there one or two that excite you particularly?
The first challenge we want to tackle is text input on phones. From my viewpoint, when the iPhone launched in 2007, we as a species took a step backward. We didn’t enhance our communication skills; we limited them, because typing and texting on a phone can be incredibly frustrating. I want to make those experiences more satisfying on a phone than they currently are at a keyboard, even for proficient typists. I want to rethink them so we can start to recover from the regressions of the last decade.
I aspire to see computing advance to a new level of ease and usefulness for everyone. That might seem vague, but I really think about questions like, why must I use a keyboard to input text into a device? Why can’t I compose a message to my wife while my hand remains in my pocket?
Consider how you try to highlight text while editing today. Reflect on your use of a mouse and how slow that process can be. What if you could type and select text at the same time, without moving your hand to the mouse? Everything becomes much richer and more intuitive once we shift away from devices and decode the actual nerve signals. There is truly no interaction with any machine today—be it a computer or robot—that this technology won’t fundamentally transform.
Some may argue that we have already become too connected to machines. Do any thoughts about the path we’re on give you pause?
We’ve taken significant steps backward with the mobile Internet. Given the constant flow of information right in front of us and a frustratingly limited ability to engage with it, I believe humanity is regressing. We seem to have lost control, as if the machines are increasingly programming us.
The reason we chose the name CTRL-Labs for our company is that we aim to give people control in advancing the interactions between humans and machines. Let’s make machines tools again. I want us to be the ones programming machines at the most fundamental level.
Picture the ability to control your computer merely by thinking. It might sound far-fetched, but real progress is being made in the realm of brain-computer interfaces. An increasing number of researchers and companies are venturing into this field. However, significant challenges persist, ranging from user training to the realities of invasive brain implant methods.
Elon Musk is currently testing sensors implanted in pig brains. While he is best known for Tesla and SpaceX, he is also the founder of Neuralink, a company dedicated to transforming brain-computer interfaces. These devices promise to let humans operate computers using just their thoughts. Neuralink is currently experimenting with the technology on pigs. At a September press conference, Musk showcased a pig whose brain implant tracked neural activity from her snout.
Though this may seem like science fiction or just hype, there is potential in this research area. Brain-computer interfaces, or BCIs, might soon assist patients with brain injuries or motor impairments to recover or better interact with their environment. For instance, someone with limited motor abilities could direct a motorized wheelchair with their mind or even control household appliances and devices like a TV or thermostat without any physical movement, thus enhancing their independence. Ultimately, it might contribute to boosting people’s cognitive abilities. However, for now, there are several technological and human obstacles to overcome.
Such hurdles are the focus of Dr. Fabien Lotte, Research Director at Inria Bordeaux-Sud-Ouest in France. “Most brain-computer interfaces function, but they don’t function effectively,” he stated.
There are primarily two categories of BCIs: non-invasive and invasive. Non-invasive BCIs are the most prevalent type, consisting simply of sensors attached to the head, resembling a high-tech hat with wires. These sensors measure brain activity and relay that data to a computer. Invasive BCIs, on the other hand, involve placing sensors inside the skull, which is what Neuralink is investigating.
A BCI might aim to move a mouse pointer left or right based on the user’s brain activity. Dr. Lotte notes that BCIs, on average, are accurate about 60% to 80% of the time, though this varies with the number of mental commands involved. A system that only directs a cursor left or right involves just two mental commands and achieves a higher accuracy of around 70% to 80%. Even so, the system still errs roughly once in every three to five attempts. “If a computer mouse had that rate of mistakes, you wouldn’t use it,” Dr. Lotte remarked.
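To make that error rate concrete, here is a minimal sketch assuming a synthetic two-command system with 75% accuracy (the midpoint of the range Dr. Lotte cites) and independent attempts; it is an illustration only, not anyone’s actual BCI software.

```python
# Toy illustration: how often a 75%-accurate two-command BCI errs in practice.
# Each "attempt" is an independent left/right command; we simply count mistakes.
import numpy as np

rng = np.random.default_rng(0)
accuracy = 0.75          # assumed: midpoint of the 70-80% range quoted above
attempts = 20            # assumed: a short session of cursor commands

correct = rng.random(attempts) < accuracy
errors = int(np.count_nonzero(~correct))
print(f"{errors} errors in {attempts} attempts")
# Typically around 5 errors in 20 attempts -- roughly one every four commands,
# which is why the same error rate would make an ordinary mouse unusable.
```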
However, in Dr. Lotte’s view, the challenge might not solely lie with the technology but also with the users of BCIs. “Controlling a BCI is a skill that must be learned,” he said. “We don’t just need advanced technology; we also need well-trained users.”
Dr. Lotte directs a research initiative called BrainConquest that aims to improve training for users of non-invasive BCIs. The researchers give users exercises such as playing a video game with their brain, in which the person imagines an action that should then be carried out on the screen. The team is also developing better feedback systems, such as tactile gloves that deliver vibrations to the user’s hand.
They are also experimenting with social feedback, including encouragement. They even developed an artificial companion named PEANUT, resembling an adorable cartoon robot with a screen as its face. “It’s quite challenging to have a human instructor that is consistent,” Dr. Lotte explained, arguing that an artificial companion can respond to the user’s brain activity more consistently while still providing a valuable feedback experience.
The study is still ongoing but already shows clear improvements for some users. A blend of tactile and visual feedback raises accuracy by an average of 5% across the whole test group. PEANUT helps people who prefer working collaboratively: without it their accuracy averages 63%, and with it accuracy rises by 5% to 10% depending on the individual. Conversely, users who prefer working alone perform worse when PEANUT is involved.
On the other hand, the technology itself continues to present challenges. Dr. Aaron Schurger, an assistant professor at Chapman University in the US, contends that the way BCIs analyze their data could be improved. Traditionally, BCIs focus solely on the data from moments when users intend to take action. For instance, they gather extensive brain data when a user aims to move a mouse pointer left, and use that information to better detect when the user intends that action.
However, Dr. Schurger suggests it is essential to consider a broader range of data, which includes information collected while the brain is at rest. This is a notion he has previously investigated in the research project ACTINIT. “We are currently analyzing all the data,” stated Dr. Schurger. “This includes data not just from right before a movement.”
Dr. Schurger draws a parallel with weather forecasting, where meteorologists analyze vast amounts of weather data to make predictions. “If you want to predict when it might rain, only examining rainy days will lead to inadequate forecasts. You’ll overlook a significant portion of the information.”
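As a rough illustration of the difference (a toy sketch under invented assumptions, not the ACTINIT project’s actual analysis), compare a training set built only from windows just before known movements with one that also keeps windows far from any movement, labeled as rest:

```python
# Toy sketch: epoch a continuous recording two ways.
# "Pre-movement only": keep just the one-second window before each movement.
# "All data": also keep windows far from any movement, labeled as rest, so a
# decoder can learn what baseline brain activity looks like.
import numpy as np

fs = 250                                         # assumed sampling rate, Hz
rng = np.random.default_rng(0)
signal = rng.standard_normal(fs * 600)           # 10 minutes of fake one-channel data
movement_onsets = np.arange(10, 590, 15) * fs    # pretend movement every 15 seconds
win = fs                                         # one-second analysis windows

pre_movement = [signal[t - win:t] for t in movement_onsets]

rest = []
for t in range(win, len(signal), win):
    # keep only windows at least three seconds away from every movement onset
    if np.all(np.abs(movement_onsets - t) > 3 * fs):
        rest.append(signal[t - win:t])

X = np.array(pre_movement + rest)                        # feature windows
y = np.array([1] * len(pre_movement) + [0] * len(rest))  # 1 = pre-movement, 0 = rest
print(X.shape, f"{y.mean():.2f} of windows are pre-movement")
```

The rest-labeled windows give a decoder a baseline class to compare against, which is the broader pool of data Dr. Schurger argues should not be thrown away.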
Nevertheless, if BCIs are to overcome their current limitations, more drastic measures than user training or improved data analysis may be necessary. That means looking beyond non-invasive methods. The main non-invasive technique is EEG, or electroencephalography, in which electrodes placed on the scalp measure the electrical activity generated by neurons in the brain. “EEG detects microcurrents that signify brain activity,” explained Dr. Lotte.
When a person performs, or even just imagines, an action, countless neurons fire together, producing an electrical current strong enough to be detected at the scalp. Software then works to interpret this data and link it to a specific action or thought.
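A classic non-invasive decoding pipeline, sketched very loosely below, band-pass filters the scalp signal, summarizes each short window as power in a few frequency bands, and hands those features to a simple classifier. The sampling rate, bands, window length, and classifier here are generic assumptions, not the setup of any system described in this article.

```python
# Loose sketch of a conventional EEG decoding pipeline:
# band-pass filter -> band-power features per window -> linear classifier.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

fs = 250  # assumed sampling rate in Hz

def bandpower_features(window, bands=((8, 12), (13, 30))):
    """Average power in each frequency band (alpha, beta) for one window."""
    feats = []
    for low, high in bands:
        b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, window)
        feats.append(np.mean(filtered ** 2))
    return feats

# Placeholder training data: 100 one-second windows per mental command.
# In a real system these would be recorded while the user imagines each command.
rng = np.random.default_rng(1)
windows = rng.standard_normal((200, fs))
labels = np.array([0] * 100 + [1] * 100)

X = np.array([bandpower_features(w) for w in windows])
clf = LinearDiscriminantAnalysis().fit(X, labels)

# Decoding then reduces to one prediction per incoming window of signal.
print(clf.predict(X[:1]))
```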
Dr. Schurger believes that EEG technology has hit a plateau. “Researchers have been tackling this issue for three to four decades, and we haven’t seen significant advancements for a long time,” he remarked.
A crucial aspect of this challenge is the thickness of the skull. While it provides excellent protection for our brain, it simultaneously complicates our understanding of the activity happening beneath it.
“The brain signals are exceedingly faint,” stated Dr. Schurger. “Imagine trying to capture a single conversation in a crowded football stadium using a few microphones placed overhead. You might note when a goal is scored, but isolating that one conversation is quite difficult.”
The answer is to get closer to the action, akin to entering the stadium. For BCIs, this necessitates drilling into the skull and attaching sensors directly to the brain. This method improves signal clarity, and invasive BCIs have been implemented in humans since the late 1970s in experimental cases aimed at restoring partial sight to blind patients and granting control of prosthetics to those who are paralyzed. However, it comes with a variety of medical concerns.
First, physicians must persuade patients and regulators to approve implanting a device inside a person’s skull. There can also be health complications: the body may build up immune tissue around the sensor or reject it entirely, which can degrade the signal quality of the device or put the patient’s health at risk. “There is a foreign object within your skull,” Dr. Schurger noted. “The body tends to respond by rejecting it.”
Due to these factors, the more futuristic possibilities of merging human capabilities with machines to enhance cognitive functions might need to be delayed. For the time being, medical applications are likely to remain at the forefront of this field, according to Dr. Schurger.
However, even BCI systems that don’t operate perfectly can still be beneficial. Dr. Lotte points out that non-invasive BCIs can support stroke patients in their rehabilitation, a topic he has also examined with the Pellegrin hospital in Bordeaux. A stroke patient already needs to engage the damaged parts of their brain by, for instance, visualizing a specific action. A BCI could assist these patients by providing feedback on their mental exercises, though it is too early in the study to report specific results regarding efficacy.
“In this context, the reliability of the system is not critical,” Dr. Lotte said. “The goal isn’t to control something but to relearn how to utilize that area and aid in recovery.”