The year 2017 is shaping up to be the one in which mind-controlled computing research gathers momentum.
According to DARPA, the Neural Engineering System Design (NESD) program aims to “enable rich two-way communication with the brain at a scale that will help deepen our understanding of that organ’s underlying biology, complexity, and function”. If successful, NESD will “support potential future therapies for sensory restoration”.
So manipulating human brains and altering senses, including “vision, hearing, and speech”, is on the cards, and it sounds particularly scary. DARPA says it wants to create “an implantable package”: a device that can be put directly into the brains of those selected for the sensory rewiring. One of the proposed interfaces would involve “up to 100,000 untethered, submillimeter-sized ‘neurograin’ sensors implanted onto or into the cerebral cortex.”
Once the device is implanted in the brain, a “relay station transceiver worn on the head” would “wirelessly power and communicate with the implanted device”. It certainly adds a whole new dimension to “hearing voices in your head”.
A team from the University of California, Berkeley is attempting “to create quantitative encoding models to predict the responses of neurons to external visual and tactile stimuli, and then apply those predictions to structure photo-stimulation patterns that elicit sensory percepts in the visual or somatosensory cortices, where the device could replace lost vision or serve as a brain-machine interface for control of an artificial limb”.
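To make the “encoding model” idea concrete, here is a minimal sketch in Python of the general fit-then-invert technique. This is not the Berkeley team’s actual method; the data, shapes, and variable names are all invented for illustration. A linear model is fitted to predict a neuron’s response from stimulus features, then inverted to pick a stimulation pattern predicted to evoke a target response.

```python
import numpy as np

# All data here is simulated; shapes and names are invented for the example.
rng = np.random.default_rng(0)

n_trials, n_features = 500, 16            # stimuli described by 16 features
stimuli = rng.normal(size=(n_trials, n_features))
true_weights = rng.normal(size=n_features)
# Simulated neural responses: linear tuning plus noise.
responses = stimuli @ true_weights + 0.1 * rng.normal(size=n_trials)

# 1. Fit the encoding model (least squares): predict response from stimulus.
weights, *_ = np.linalg.lstsq(stimuli, responses, rcond=None)

# 2. Invert the model: pick a stimulation pattern whose predicted response
#    matches a target, by scaling along the fitted weight vector.
target_response = 2.0
pattern = weights * target_response / np.dot(weights, weights)

print("predicted response to chosen pattern:", float(pattern @ weights))
```

Real encoding models are far richer, nonlinear and fitted to recorded neural data, but the fit-then-invert loop is the core of the approach the quote describes.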
“Predict the responses of neurons” means the Pentagon-controlled brain will send messages to the controllers, alerting them to thoughts or actions about to enter the person’s conscious mind.
Some academics have voiced their support for the project. A paper entitled “The Brain Activity Map Project and the Challenge of Functional Connectomics” includes predictions of the development of “techniques for wireless, noninvasive readout of the activity of neuronal populations”. Such research would allow those in control of the project to wirelessly access and control the brains of target populations.
An excerpt from the academic study notes: “This emergent level of understanding could also enable accurate diagnosis and restoration of normal patterns of activity to injured or diseased brains, foster the development of broader biomedical and environmental applications, and even potentially generate a host of associated economic benefits.”
As revealed in the academic study, NESD is part of a broader initiative called the BRAIN Initiative, short for Brain Research through Advancing Innovative Neurotechnologies. All of the projects are broadly concerned with decoding neural data, which would aid in developing techniques to artificially manipulate humans.
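As a rough illustration of what “decoding neural data” means in practice, the toy sketch below, which assumes nothing about any specific DARPA project, trains a nearest-centroid classifier to guess which of two simulated conditions produced a pattern of spike counts.

```python
import numpy as np

# Simulated spike counts; no relation to any real recordings.
rng = np.random.default_rng(1)
n_neurons = 32

mean_a = rng.uniform(2.0, 10.0, n_neurons)         # firing rates, condition A
mean_b = mean_a + rng.normal(0.0, 2.0, n_neurons)  # shifted rates, condition B
mean_b = np.clip(mean_b, 0.1, None)                # Poisson rates must be > 0

def trials(rates, n):
    """Draw n trials of Poisson spike counts for the given mean rates."""
    return rng.poisson(rates, size=(n, n_neurons))

train_a, train_b = trials(mean_a, 100), trials(mean_b, 100)
test_b = trials(mean_b, 20)                 # held-out condition-B trials

# Nearest-centroid decoder: assign each trial to the closer class mean.
cent_a, cent_b = train_a.mean(axis=0), train_b.mean(axis=0)
closer_to_b = (np.linalg.norm(test_b - cent_b, axis=1)
               < np.linalg.norm(test_b - cent_a, axis=1))
print(f"decoding accuracy on held-out trials: {closer_to_b.mean():.0%}")
```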
The financial magazine Forbes reported that DARPA employs scientists at Carnegie Mellon University to develop “an artificial intelligence system that can watch and predict what a person will ‘likely’ do in the future using specially programmed software designed to analyze various real-time video surveillance feeds”. The system can automatically identify and notify officials if it detects “anomalous behaviors”, suggesting a surveillance world such as the one portrayed in the film Minority Report.
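The Forbes report gives no technical detail on how the system works. A common approach to “anomalous behavior” detection is simply to flag observations that deviate sharply from the statistics of recent data; the sketch below shows that general idea with entirely made-up behavioral features, and bears no relation to the actual Carnegie Mellon system.

```python
import numpy as np

# Toy anomaly detection: flag feature vectors far from the sample mean.
rng = np.random.default_rng(2)

def anomaly_scores(features: np.ndarray) -> np.ndarray:
    """Largest per-feature z-score of each observation vs. the sample mean."""
    mean = features.mean(axis=0)
    std = features.std(axis=0) + 1e-9       # avoid division by zero
    return np.abs((features - mean) / std).max(axis=1)

normal = rng.normal(0, 1, size=(200, 4))    # e.g. speed, heading, dwell time
odd = rng.normal(6, 1, size=(3, 4))         # a few out-of-pattern observations
stream = np.vstack([normal, odd])

flags = anomaly_scores(stream) > 4.0        # threshold chosen for the demo
print("flagged observations:", np.flatnonzero(flags))
```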
In its announcement, DARPA named “Detection and Computational Analysis of Psychological Signals (DCAPS)” as one of the primary areas of focus in its BRAIN Initiative work. A separate entry on another part of the DARPA website reveals more about DCAPS and how it could be used:
“DCAPS tools will be developed to analyze patterns in everyday behaviors to detect subtle changes associated with post-traumatic stress disorder, depression and suicidal ideation. In particular, DCAPS hopes to advance the state-of-the-art in extraction and analysis of “honest signals” from a wide variety of sensory data inherent in daily social interactions. DCAPS is not aimed at providing an exact diagnosis, but at providing a general metric of psychological health.
“DCAPS also aims to develop novel algorithms for detecting distress cues from users who opt in to provide data such as text and voice communications, daily patterns of sleeping, eating, social interactions and online behaviors, and nonverbal cues such as facial expression, posture and body movement. The outcomes of these analytical algorithms would be correlated with distress markers from neurological sensors for improved understanding of distress cues.”
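DARPA’s description stays high-level, but the “general metric of psychological health” it mentions can be pictured as a weighted score over everyday behavioral signals. The sketch below is a deliberately simple illustration of that idea; the features, baselines, and weights are invented placeholders, not validated clinical markers and not DCAPS’s actual algorithms.

```python
# Invented daily behavioral signals for one person (placeholder values).
features = {
    "hours_slept":         6.0,   # per day
    "messages_sent":       4.0,   # daily text/voice contacts
    "social_interactions": 1.0,   # face-to-face encounters
    "negative_word_rate":  0.08,  # fraction of negative words in messages
}

# Baselines and weights are placeholders, not validated distress markers.
baseline = {"hours_slept": 7.5, "messages_sent": 12.0,
            "social_interactions": 4.0, "negative_word_rate": 0.03}
weights = {"hours_slept": -0.5, "messages_sent": -0.2,
           "social_interactions": -0.8, "negative_word_rate": 20.0}

# Score deviations from baseline; a higher score means more distress cues.
score = sum(weights[k] * (features[k] - baseline[k]) for k in features)
print(f"distress-cue score: {score:.2f}")
```

Here, sleeping less, communicating less, and using more negative language than one’s baseline all push the score upward, matching the quote’s framing of a metric rather than a diagnosis.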
Meanwhile, a team from Fondation Voir et Entendre will try to link an artificial retina worn over the eyes with neurons in the visual cortex using optogenetics. In a similar vein, the John B. Pierce Laboratory team aims to create an all-optical prosthesis for the visual cortex using neurons modified to bioluminesce and respond to optogenetic stimulation.
After the first year, the program will move into Phase II, which will look towards human experiments.