By the 1970s, Licklider’s vision inspired DARPA’s first research into human-machine interactions facilitated by direct neural interfaces. An early set of experiments explored how well noninvasive sensors could measure responses to sensory stimuli experienced while performing tasks. At the time, the enabling technology for meaningfully interacting with the brain did not yet exist, and so the research results were marginal. But that situation began to change by the late 1990s with the accumulation of advances in information systems, materials science, and sensors for studying brain structure and function at a new level of detail.

By the early 2000s, DARPA began investing heavily in neurotechnology. The agency established the Brain-Machine Interfaces program to record patterns of neural activity in animal models and decode the neural states associated with memory formation, sensory perception, and motor intent. When the research moved to humans, a user’s ability to directly observe neural decoder outputs in the form of a moving cursor or robotic arm proved critical. That visual feedback allowed the user’s brain to adapt, essentially altering its own function to help the neural decoder achieve the task. Subsequent development of more advanced decoders opened the way for iterative co-adaptation between the system’s algorithms and the user’s neural activity, which further accelerated a user’s mind-based motion control.

Today, researchers expect that the added ability to convey near-natural touch sensations will further improve feedback-driven learning. During the in-human studies conducted under the HAPTIX and Revolutionizing Prosthetics programs, study participants were so highly engaged with the work that they effectively became part of the research team. That dynamic made it possible to tailor the system’s performance to the needs and wants of each participant.
For instance, Jan, a woman living with quadriplegia, quickly achieved the goal of feeding herself a chocolate bar with a prosthetic arm controlled by a direct interface to her central nervous system. Jan next determined to project herself beyond the confines of her wheelchair and to enter a simulated cockpit. Despite being unable to move from the neck down, Jan used her neural interface to fly a virtual aircraft simply by looking at the plane on a monitor and visualizing it moving in one direction or another. “I could raise the nose of the plane up and down. Then I could bank it right or left,” Jan explained. “I was lost so quickly in that world because I was up in the clouds, and I was flying. And I was out of my chair. I was out of my broken body. I was flying!”

DARPA has invested more than $500 million in support of the White House BRAIN Initiative since it was announced in 2013. This federal investment has accelerated the development of innovative neurotechnologies with the potential to improve human health and change the way people live, work, and play. Among the breakthroughs realized with DARPA funding was the development of the CLARITY method by researchers at Stanford University, which allows intact brain tissue to be studied in rich, three-dimensional detail and makes possible a better understanding of how brain processes work.

Nathan was similarly able to extend his abilities. Adding more sensors to his prosthetic arm and hand enabled him to detect infrared (IR) signals. Nathan used his brain signals to move the hand over a surface that emitted invisible IR signals only in a specific location. When the prosthesis crossed the target, the sensors converted the IR signal into electrical pulses delivered to Nathan’s somatosensory cortex, enabling him to “feel” infrared radiation. Nathan reported an immediate, touch-like perception of the IR field. Still unknown is whether users’ brains, after long-term use of a bidirectional interface with novel sensors like the IR ones, will adapt to the new type of input and ultimately come to experience it as a “sixth sense.”

Prosthetic movement and sensation are currently the best-studied applications for neural interfaces. Through the Neural Engineering System Design (NESD) program, DARPA has extended its aspirations even further, pursuing higher-resolution versions of such systems that could potentially restore hearing and vision to people with sensory deficits. What’s more, the