Ever since it was discovered that the brain works by generating electrical impulses, which give off brain waves, the idea of reading those waves has been tantalizing. For decades, progress was modest, but recently the ability to record brain activity and put that information to practical use has been accelerating.
What began in the late 1950s as the ability to record a single neuron at a time has progressed to today's ability to record the activity of hundreds of neurons simultaneously.
In a meta-analysis of 56 studies conducted since the 1950s that recorded the activity of neurons in animals or humans, researchers at the Rehabilitation Institute of Chicago discovered that the number of simultaneously recorded single neurons has doubled every seven years, thanks to continued improvements in technology and data analysis.
This increase is significant in the move toward reading the mind, since more data points offer a better understanding of what's happening at any given moment in the brain.
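The growth law reported in the meta-analysis is simple to state: if simultaneously recorded neurons double every seven years, the count at any year follows an exponential curve. The sketch below illustrates that arithmetic; the 1957 starting point of a single neuron is an assumption for illustration, not a figure from the study.

```python
# Illustrative projection of the "neuron doubling" trend: the number of
# simultaneously recorded neurons doubles roughly every seven years.
# The 1957 single-neuron baseline is an assumed starting point.

def projected_neurons(year, base_year=1957, base_count=1, doubling_years=7.0):
    """Projected count of simultaneously recorded neurons in a given year."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for y in (1957, 1985, 2010, 2025):
    print(y, round(projected_neurons(y)))
```

Run over a few sample years, the projection shows how a seven-year doubling time turns one neuron in the late 1950s into hundreds today, matching the trend the researchers describe.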
Scientists at the Rehabilitation Institute are themselves involved in ground-breaking research to restore connections in the brain that are lost due to stroke or spinal cord injury.1 Using data from neurons, researchers are working on ways to reestablish those connections through cutting-edge technologies such as brain-machine interfaces, functional electrical stimulation, and virtual reality.
Other work is proving that it is indeed possible to record information from the brain and then use it to mimic natural neuronal activity. Researchers at Tel Aviv University have successfully implanted a robotic cerebellum in a brain-damaged rodent, restoring its capacity for movement.2
The chip in the robotic cerebellum was designed to receive, interpret, and transmit sensory information from the brain stem, providing the communication link between brain and body. The early results are simple blinking movements in response to stimuli; without the functioning robotic cerebellum, these movements were not possible.
When we hear about research into mind reading, it is difficult not to think in nefarious terms, picturing someone eavesdropping on our thoughts or stealing our ideas. The research today is much more humanitarian in its vision, from making life more convenient to helping those who are paralyzed regain the capacity for movement.
The goal is to harness the power of brain waves to deliver a positive outcome. For example, researchers at the University of California, Berkeley, have conducted research into ways to help patients with brain damage speak again.3 The results showed that it is possible to decode the complex patterns of electrical activity that the brain forms from audible words, and then translate those patterns back into a very close approximation of the original words.
As volunteers listened to 5 to 10 minutes of conversation, brain activity was recorded using electrocorticography, in which electrodes are placed on the surface of the brain. Then, when patients heard a single word, a computer program analyzed their brain activity and researchers were able to deduce which word the volunteers had heard.
Neuroscientists have long suspected that the brain translates spoken words into patterns of electrical activity. The study shows that it is possible to translate these patterns back into the original sounds, which is an important step toward helping stroke victims regain the ability to speak.
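In spirit, decoders like the one in the Berkeley study learn a mapping from recorded neural features to the spectrogram of the heard sound, then invert new recordings through that mapping. The sketch below is a minimal illustration of that idea using a plain least-squares fit on synthetic data; the dimensions and signal model are assumptions, not the study's actual method.

```python
# Minimal sketch of speech reconstruction from neural activity: fit a
# linear map from recorded features to a sound spectrogram, then apply
# it to new recordings. Data here are synthetic stand-ins for ECoG.
import numpy as np

rng = np.random.default_rng(0)

n_samples, n_electrodes, n_freq_bins = 200, 16, 8
true_W = rng.normal(size=(n_electrodes, n_freq_bins))   # unknown brain-to-sound map
X = rng.normal(size=(n_samples, n_electrodes))          # neural features per time step
Y = X @ true_W + 0.01 * rng.normal(size=(n_samples, n_freq_bins))  # "observed" spectrogram

# Training: least-squares estimate of the decoding weights
W_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Decoding: reconstruct the spectrogram for new neural activity
X_new = rng.normal(size=(5, n_electrodes))
Y_rec = X_new @ W_hat
print(np.allclose(W_hat, true_W, atol=0.05))
```

With enough training samples, the estimated weights recover the underlying map almost exactly, which is why a reconstructed spectrogram can come so close to the original sound.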
* * *
Perhaps just as remarkable is the progress being made on using thoughts to control the motions of a car. At the AutoNOMOS innovation labs of Freie Universität Berlin, a computer-controlled vehicle was linked through an interface to new commercially available sensors that measure brain waves.4
The computer was trained to interpret bioelectrical wave patterns from the brain as commands for "left," "right," "accelerate," or "brake." The goal of the researchers is to create the autonomous vehicles of the future, and today's brain-sensor controlled model is what they refer to as a hybrid control approach, where people and machines work together.
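One simple way such a system can translate brain waves into discrete commands is template matching: during training, record a characteristic feature vector for each command, then classify each incoming reading by its nearest template. The feature values and vector length below are illustrative assumptions, not the Berlin team's actual parameters.

```python
# Minimal sketch of mapping brain-wave feature vectors to driving
# commands via nearest-template classification. The templates and the
# 4-dimensional feature space are hypothetical, for illustration only.
import math

# Per-command feature templates, as if learned during a training session
templates = {
    "left":       [0.9, 0.1, 0.2, 0.1],
    "right":      [0.1, 0.9, 0.1, 0.2],
    "accelerate": [0.2, 0.1, 0.9, 0.1],
    "brake":      [0.1, 0.2, 0.1, 0.9],
}

def classify(features):
    """Return the command whose template is nearest (Euclidean distance)."""
    return min(templates, key=lambda cmd: math.dist(templates[cmd], features))

print(classify([0.85, 0.15, 0.25, 0.05]))  # nearest template is "left"
```

A real interface adds filtering, calibration per user, and confidence thresholds before a command ever reaches the steering, but the core decision step can be this simple.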
* * *
In another new study, at the Washington University School of Medicine in St. Louis, scientists demonstrated that a cursor on a computer screen could be controlled by people speaking words out loud or thinking them in their heads.5
Volunteers quickly learned how to control a computer cursor by thinking or saying specific words, which generated brainwave patterns that an interface had been programmed to recognize.
* * *
According to Eric C. Leuthardt, MD, of Washington University, "We can distinguish both spoken sounds and the patient imagining saying a sound, so that means we are truly starting to read the language of thought. This is one of the earliest examples, to a very, very small extent, of what is called 'reading minds': detecting what people are saying to themselves in their internal dialogue."
* * *
In a similar study, researchers at the Duke University Center for Neuroengineering trained two monkeys to move a virtual avatar hand using only brain activity.6 But in this study, a significant element was added: feedback.
The texture of virtual objects was fed to the monkeys' brains as patterns of electrical signals, which enabled the monkeys to identify and differentiate the virtual objects. According to the study, this was the first time there was a bi-directional link between a brain and a virtual body.
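The "bi-directional" part of the Duke result means the loop runs both ways: decoded motor activity drives the virtual hand, and the texture of whatever the hand touches is encoded back to the brain as a stimulation pattern. The stub below sketches that round trip; the intent names, texture labels, and stimulation frequencies are all hypothetical placeholders.

```python
# Minimal sketch of a bidirectional brain-machine loop: motor intent
# moves a virtual hand (brain -> machine), and the touched object's
# texture comes back as a stimulation frequency (machine -> brain).
# All names and values are illustrative assumptions.

# Hypothetical texture -> stimulation-frequency encoding (pulses/sec)
TEXTURE_CODES = {"smooth": 50, "coarse": 200}

def decode_motor(intent):
    """Stub motor decoder: map a decoded intent to the object reached."""
    return {"reach_A": "object_A", "reach_B": "object_B"}[intent]

def feedback_for(obj, textures):
    """Encode the touched object's texture as a stimulation frequency."""
    return TEXTURE_CODES[textures[obj]]

textures = {"object_A": "smooth", "object_B": "coarse"}
touched = decode_motor("reach_B")            # brain -> machine
stim_hz = feedback_for(touched, textures)    # machine -> brain
print(touched, stim_hz)  # object_B 200
```

Because identical-looking objects carry different feedback codes, the subject can tell them apart by "touch" alone, which is exactly what the monkeys in the study learned to do.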
These developments could be life-changing for victims of injury and stroke. The results of the Washington University study, for example, hold great possibilities for enabling a person who has lost mobility to move a robotic arm using the same portion of the brain that used to serve that function.
The Duke study could lead to a robotic exoskeleton that would not only allow severely paralyzed patients to move using their thoughts, but would also offer feedback from their surrounding world regarding the texture, shape, and temperature of objects.
Echoing the work being done at Duke, a new study from the University of Chicago revealed that by adding kinesthetic feedback (information about movement and position in space) to a robotic arm, monkeys using a brain-machine interface greatly improved their ability to control the arm.7
This type of feedback has great implications for the future of mind control over robotic prosthetics. With kinesthetic sensing, tasks such as buttoning a shirt or even walking will be much easier to achieve than with visual cues alone.
In another noteworthy effort, a team of scientists at the University of Maryland has developed a brain cap that offers a non-invasive way to read brain waves.8 For some time, it was believed that the human skull was too thick for useful brain activity to be detected through it, and that sensors needed to be placed directly on the brain.
The Maryland study achieved decoding results that rivaled those obtained from implanted electrodes. The team envisions their sensor-lined cap, paired with interface software, soon controlling computers, robotic prosthetic limbs, motorized wheelchairs, and even digital avatars. The health and comfort benefits of using a non-invasive method for detecting brain waves are compelling.
* * *
Another minimally invasive, low-power approach for connecting the brain to external devices is being developed at the University of Michigan.9 An implant is positioned under the skin but does not penetrate the cortex. The body's skin acts as a conductor to wirelessly transmit neural signals from the brain to control a computer.
* * *
Given this trend, we provide the following forecasts:
First, just as increasing processor speeds ushered in the computing revolution in the last century, the increasing ability to simultaneously record single neurons marks the beginning of a neuroscientific revolution.
Over the next few years, as more results are obtained from our improved ability to read the brain, we will gain a better understanding of the functional principles, development, and operation of the brain. Insight will lead to even greater insight, causing knowledge of the brain to grow exponentially. With this increased knowledge will come new and better ways to help those who are disabled by neural disorders or injuries. Methods will be developed for rerouting messages to parts of the body that have been cut off from the brain by injury or stroke. It will become common to restore functionality to patients who have suffered paralysis.
Second, in the not-too-distant future, keyboards, trackpads, and computer mice will be a thing of the past, and we'll interface with devices through thought.
The primitive interfaces we use today will initially be made obsolete by voice recognition technology, but mind-machine interfaces will ensure they are gone forever. This will also make speech recognition unnecessary, since our devices will pick up directly on our thoughts. We will use our brain activity not only to interface with computers, but also to control and link to virtually all of our devices, from household appliances to cell phones. To call someone, we'll simply think of the person's name; to search the Internet, we'll simply think about what we want to find.
References:
1. Ian H. Stevenson and Konrad P. Kording, "How Advances in Neural Recording Affect Data Analysis," Nature Neuroscience, February 2011, Vol. 14, No. 2. © 2011 Nature Publishing Group, a division of Macmillan Publishers Limited. http://www.nature.com
2. For more information about research on a robotic brain-body interface, visit the ABC News website: http://abcnews.go.com
3. Brian N. Pasley, et al., "Reconstructing Speech from Human Auditory Cortex," PLoS Biology, January 2012. © 2012 Public Library of Science. http://www.plosbiology.org
4. Daniel Bates, "Look, No Hands (or Feet): Scientists Develop Car That Can Be Driven Just by THINKING," Daily Mail, February 22, 2011. © 2011 Associated Newspapers, Ltd. http://www.dailymail.co.uk
5. Eric C. Leuthardt, et al., "Using the Electrocorticographic Speech Network to Control a Brain-Computer Interface in Humans," Journal of Neural Engineering, June 2011, Vol. 8, No. 3. © 2011 IOP Publishing. http://iopscience.iop.org
6. Miguel A.L. Nicolelis, et al., "Active Tactile Exploration Using a Brain-Machine-Brain Interface," Nature, November 10, 2011, Vol. 479, No. 7372. © 2011 Nature Publishing Group, a division of Macmillan Publishers Limited. http://www.nature.com
7. Nicholas G. Hatsopoulos, et al., "Incorporating Feedback from Multiple Sensory Modalities Enhances Brain-Machine Interface Control," The Journal of Neuroscience, December 15, 2010, Vol. 30, No. 50. © 2010 Society for Neuroscience. http://www.jneurosci.org
8. Jose Luis Contreras-Vidal, et al., "Neural Decoding of Treadmill Walking from Noninvasive Electroencephalographic Signals," Journal of Neurophysiology, October 2011. © 2011 American Physiological Society. http://jn.physiology.org
9. For more information about a minimally invasive brain implant, visit the University of Michigan website: http://ns.umich.edu