One of the most dramatic consequences of local or extensive paralysis involving the oral motor system is the loss of speech production. This inability to communicate verbally weakens the patient's social ties and can leave them isolated. When the paralysis affects most of the motor system, as in locked-in syndrome, communication can at best find a substitute in a code based on eye and eyelid movements.

The Brain-Machine Interface

A recent approach, the BMI (Brain-Machine Interface), instead considers wiring the brain to a computer via electrodes implanted in the cortex, recording the signals that pass through it and interpreting them as commands for the computer.
Since the early 1990s, this approach has shown impressive results in the motor control of artificial limbs and in the control of a cursor on a screen. Cursor control has also helped partially restore communication: the patient can form words by selecting letters, moving the cursor by thought alone (Wolpaw et al., 2000-2004)(1). Other techniques, combining the detection of peaks of visual attention with a heuristic classification of letters, have also been tested. This type of communication, however, is particularly laborious, and far from matching the richness and speed of ordinary conversation.
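To give a feel for why letter-by-letter selection is so laborious, here is a small illustrative sketch (not the actual Wolpaw et al. interface) of a linear-scanning speller: the cursor sweeps through the alphabet until the user confirms a letter. Ordering the alphabet by letter frequency, as in the Bauby eye-blink code, reduces the number of scan steps. The alphabet ordering and the cost model are assumptions made for illustration only.

```python
# Illustrative model of a scanning speller: for each letter, the cursor
# sweeps the ordering from the start and the user confirms on a match.
# Orderings and cost model are illustrative assumptions, not a real BMI.

# Approximate English letters ordered by frequency (most frequent first).
FREQ_ORDER = "ETAOINSHRDLCUMWFGYPBVKJXQZ"
ALPHA_ORDER = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def scan_steps(message, ordering):
    """Total scan steps needed to spell `message` with a linear sweep."""
    return sum(ordering.index(ch) + 1 for ch in message)

msg = "HELLO"
print(scan_steps(msg, FREQ_ORDER))   # frequency ordering: fewer steps
print(scan_steps(msg, ALPHA_ORDER))  # alphabetical ordering: more steps
```

Even with the better ordering, every letter still costs several selection steps, which is why such systems remain orders of magnitude slower than natural speech.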
Return to speech

An international team(2) chose to approach the restoration of communication differently: rather than reorganizing a communication system around the selection of letters, they worked directly with sound. The aim was not to reinvent a system of communication, but to rebuild a direct link between the cerebral motor command and the production of sound.
A neurotrophic** electrode was implanted in the cortex of a 26-year-old volunteer suffering from locked-in syndrome caused by a brain-stem stroke. The implantation site, at the bottom of the precentral gyrus (the left ventral premotor cortex), is suspected to be central to the motor planning of speech movements(3). After an adjustment period during which neurites colonized the cones of the electrode, the training began.
The brain activity recorded by the electrode was transmitted by radio to a receiver connected to a computer running a voice synthesizer. The latency between the recording of cortical activity and its synthetic vocalization was about 50 ms: the patient therefore received direct auditory feedback almost as fast as if the sounds had been his own speech.
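The feedback loop described above can be sketched as a simple record → decode → synthesize cycle that must fit within the ~50 ms budget. In the sketch below, the decoder and synthesizer are placeholders: the coefficients, the two-unit input, and the formant ranges are assumptions for illustration only (the actual system of Guenther et al. decoded neural activity into formant frequencies driving a speech synthesizer; the details here are not theirs).

```python
# Minimal sketch of the record -> decode -> synthesize feedback loop.
# All numeric mappings below are illustrative placeholders, not the
# decoder actually used in the experiment.

import random
import time

LATENCY_BUDGET_S = 0.050  # ~50 ms from cortical recording to audible sound

def read_firing_rates(n_units=2):
    """Stand-in for the radio receiver: one window of unit firing rates."""
    return [random.uniform(0.0, 1.0) for _ in range(n_units)]

def decode_formants(rates):
    """Placeholder linear decoder mapping unit activity to the first two
    formant frequencies (Hz). Real coefficients would be fit per patient."""
    f1 = 300.0 + 500.0 * rates[0]    # F1 confined to a plausible 300-800 Hz
    f2 = 900.0 + 1500.0 * rates[1]   # F2 confined to a plausible 900-2400 Hz
    return f1, f2

def synthesize(f1, f2):
    """Placeholder for the formant synthesizer; returns a description."""
    return f"vowel-like sound at F1={f1:.0f} Hz, F2={f2:.0f} Hz"

for _ in range(3):  # three feedback cycles
    t0 = time.perf_counter()
    rates = read_firing_rates()
    sound = synthesize(*decode_formants(rates))
    elapsed = time.perf_counter() - t0
    assert elapsed < LATENCY_BUDGET_S  # the loop must fit in the budget
    print(sound)
```

The point of the latency check is that auditory feedback slower than a few tens of milliseconds no longer feels like one's own voice, which is why the 50 ms figure matters for learning.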
Tested over five months in imitation tasks (of vowel sounds), the patient showed a marked improvement in the control of sounds, from 40% of sounds correctly reproduced to 75%, with a peak of 89% in the last session. Note also that the electrode itself generates the radio signal, limiting the risk of infection: no hole in the scalp was needed to "connect" brain and machine. The machine itself was just software running on an ordinary computer. This experiment is therefore a major step toward neural prostheses that could allow deeply paralyzed people to "speak" almost normally through a simple laptop.
* The title of this article refers, of course, to the memoir by Jean-Dominique Bauby and the film of the same name, The Diving Bell and the Butterfly, which describe life with locked-in syndrome from the inside. These works also present the art of communicating through eye and eyelid movements, using an alphabet arranged in order of letter frequency in the language.

** The neurotrophic electrode is specifically designed to be implanted in the cortex. It carries several peaks, or cones, intended to make contact with neurons, whose extensions can grow into them. In this experiment the electrode contained very few cones; other experiments, using more sophisticated electrodes implanted in the cortex of monkeys, have shown that motor control of a synthetic prosthesis improves as the number of sensors increases. This suggests a reasonable prospect of refining the technique discussed here.
(1) Wolpaw J, McFarland D, Vaughan T (2000) Brain-computer interface research at the Wadsworth Center. IEEE Trans Rehabil Eng 8: 222-226.
Wolpaw JR, McFarland DJ (2004) Control of a two-dimensional movement signal by a noninvasive brain-computer interface in humans. Proc Natl Acad Sci USA 101: 17849-17854.
(2) Guenther FH, Brumberg JS, Wright EJ, Nieto-Castanon A, Tourville JA, et al. (2009) A wireless brain-machine interface for real-time speech synthesis. PLoS ONE 4(12): e8218. doi:10.1371/journal.pone.0008218
(3) Guenther FH, Hampson M, Johnson D (1998) A theoretical investigation of reference frames for the planning of speech movements. Psychol Rev 105: 611-633.