Synthetic telepathy

Synthetic telepathy is a term used to describe a proposed brain-computer interface process in which human thought (treated as electromagnetic radiation) is intercepted, processed by computer, and a return signal generated that is perceptible to the human brain.
History
In 1967, Edmond M. Dewan published a paper in Nature demonstrating voluntary control of alpha waves, turning them on and off to produce Morse code. Using an EEG machine, Dewan and his fellow researchers were able to send words and phrases by thought alone.
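The idea behind Dewan's demonstration can be illustrated with a toy decoder. This is a hypothetical sketch, not Dewan's actual protocol: it assumes alpha-wave bursts have already been detected and timed, and that short bursts stand for dots and long bursts for dashes. The durations and threshold are made-up values for illustration.

```python
# Hypothetical sketch: decoding a train of alpha-wave bursts into Morse code.
# Assumes burst detection has already happened; only durations are used.

MORSE = {".-": "A", "-...": "B", "-.-.": "C", "...": "S", "---": "O"}

def bursts_to_morse(burst_durations, dash_threshold=1.0):
    """Map each alpha burst to a dot (short) or a dash (long)."""
    return "".join("." if d < dash_threshold else "-" for d in burst_durations)

def decode_letter(symbols):
    """Look up a Morse symbol string in the (partial) code table."""
    return MORSE.get(symbols, "?")

# Example: three short bursts, then three long bursts -> "SO"
letters = [decode_letter(bursts_to_morse(b))
           for b in ([0.3, 0.3, 0.3], [1.5, 1.4, 1.6])]
print("".join(letters))  # SO
```

A real system would additionally need burst detection over noisy EEG and inter-letter timing, but the mapping from on/off alpha activity to symbols is the core of the technique.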
In 1976, Robert G. Malech was awarded United States Patent 3951134 for remotely monitoring and altering brainwaves using radio. The patent describes demodulating the returned waveform, displaying it to an operator for viewing, and passing it to a computer for further analysis.
In 1988, Farwell and Donchin published a paper describing a method of transmitting linguistic information using the P300 response. The system matched observed P300 responses against a set of candidate stimuli, in this case allowing a subject to select a letter of the alphabet by thought alone. In theory, any input could be used and a lexicon constructed.
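The selection idea behind the Farwell-Donchin speller can be sketched as follows. This is a simplified illustration, not their published implementation: it assumes a letter grid whose rows and columns are flashed in turn, and that averaged P300 amplitudes for each flash are already available (the numbers below are invented). The letter at the intersection of the strongest row and column responses is the one the subject attended to.

```python
# Simplified sketch of row/column letter selection via the P300 response.
# The grid layout and the amplitude values are illustrative assumptions.

GRID = [list("ABCDEF"), list("GHIJKL"), list("MNOPQR"),
        list("STUVWX"), list("YZ1234"), list("56789_")]

def argmax(scores):
    """Index of the largest value in a list."""
    return max(range(len(scores)), key=scores.__getitem__)

def select_letter(row_scores, col_scores):
    """Pick the letter where the strongest row and column responses intersect."""
    return GRID[argmax(row_scores)][argmax(col_scores)]

# Simulated averaged P300 amplitudes per flashed row and column
rows = [0.2, 0.1, 0.9, 0.3, 0.2, 0.1]  # row 2 stands out
cols = [0.1, 0.8, 0.2, 0.1, 0.3, 0.2]  # column 1 stands out
print(select_letter(rows, cols))  # N
```

In practice many flash repetitions are averaged before the argmax step, which is what makes the P300 separable from background EEG noise.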
Theory
Approaches to synthetic telepathy can be categorized into two major groups, passive and active. As with sonar, the receiver can either broadcast a signal or passively listen.
Passive reception is the ability to "read" a signal without first broadcasting one. This can be roughly equated to tuning into a radio station: the brain generates electromagnetic radiation which can be received at a distance. That distance is determined by the sensitivity of the receiver, the filters used, and the bandwidth required. Most university laboratories, working within limited budgets, use receivers such as EEG machines and similar devices.
Malech's approach requires a modulated signal to be broadcast at the target. The brain's activity interferes with this active signal, so the return signal can be used to infer the original brainwave. This approach exposes the transmitter, but is ultimately required for generating return signals that the brain can process.
The research of Farwell and Donchin is the first public demonstration that could lead to a generic lexicon being developed, although this possibility is already implied in Malech's 1976 patent.
In 2009, Tom Mitchell of Carnegie Mellon University was interviewed by 60 Minutes about his work on "thought identification" using fMRI. The segment showed associate producer Meghan Frank being asked to think of a series of ten images while in an fMRI scanner, with the software correctly identifying each of these thoughts.
Quite apart from linguistic information, images have been extracted from the brain. In 2008, researchers at Japan's ATR Computational Neuroscience Laboratories were able to reconstruct images that a subject was currently viewing and display them on a computer screen, after training the system on a series of small black-and-white images. The process worked only with simple images; further goals of the project were to view both retinal and imagined images in real time, including dreams.
Computer mediation
Computer mediation falls into two basic categories, interpretative and interactive.
Interpretative mediation is the passive analysis of signals coming from the human brain. A computer "reads" the signal, then compares it against a database of signals and their meanings. Using statistical analysis and repetition, false positives are reduced over time.
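The compare-against-a-database step described above amounts to template matching with trial averaging. The following is a minimal sketch under that reading; the "signals" are synthetic toy vectors rather than real brain recordings, and the yes/no labels are invented for illustration.

```python
# Minimal sketch of interpretative mediation: average repeated trials to
# suppress noise, then classify against a database of labelled templates.
# All signals here are synthetic toy vectors, not real EEG data.

def average(trials):
    """Average repeated trials of the same signal to reduce noise."""
    n = len(trials)
    return [sum(vals) / n for vals in zip(*trials)]

def classify(signal, database):
    """Return the label whose template is closest (least squares) to the signal."""
    def distance(template):
        return sum((s - t) ** 2 for s, t in zip(signal, template))
    return min(database, key=lambda label: distance(database[label]))

database = {"yes": [1.0, 0.0, 1.0], "no": [0.0, 1.0, 0.0]}
trials = [[0.9, 0.1, 1.1], [1.1, -0.1, 0.9], [1.0, 0.2, 1.0]]
print(classify(average(trials), database))  # yes
```

Averaging before classification is what the text means by "repetition" reducing false positives: uncorrelated noise shrinks as more trials are combined, while the repeated underlying signal remains.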
Interactive mediation can be in a passive-active mode, or active-active mode. In this case, passive and active denote the method of reading and writing to the brain and whether or not they make use of a broadcast signal. Interactive mediation can also be performed manually or via artificial intelligence.
Manual interactive mediation involves a human operator producing return signals such as speech or images. Automated mediation leverages the cognitive system of the subject itself to identify images, pre-speech, objects, sounds and other artifacts, rather than developing software routines to perform such activities. Computer-based systems may incorporate natural language processing interfaces that produce sensations, mental impressions, humor and conversation to provide a mental picture of a computerized personality.
Military uses
As of 2010, research is being driven by the military toward "covert speech" technology, which would allow troops to communicate over radio without having to physically speak.
A major reason for continued research appears to be silent communication with battlefield troops. $4 million was provided to DARPA for the fiscal year 2009/2010 to develop such a system called "Silent Talk". Some of the research is being conducted at The Cognitive NeuroSystems Lab at UC Irvine.
A further $4 million was allocated by the Army to the University of California to investigate computer-mediated "synthetic telepathy".