In a recent study published in Nature Communications, scientists from the Wyss Center for Bio and Neuroengineering in Geneva, Switzerland shared remarkable new details about a human being using a brain-computer interface to ask if he can have a beer and listen to TOOL.
The patient is a 36-year-old man who has been in a "completely locked-in state" for several years as a result of amyotrophic lateral sclerosis; in his case, the paralysis includes the muscles in his eyes. Most previous brain-interface experiments with people dealing with paralysis have relied on signals from places such as the eyes, where the system can pick up on the electrical impulses of muscle movements. But that didn't work in this case, as the researchers explain:
None of these [previous] studies has demonstrated communication at the level of voluntary sentence formation in CLIS [completely locked-in state] individuals, who lack stable and reliable eye-movement/muscle control or have closed eyes, leaving the possibility open that once all movement – and hence all possibility for communication – is lost, neural mechanisms to produce communication will concurrently fail.
[…]
This participant was implanted with intracortical microelectrode arrays in two motor cortex areas. The patient, who is in home care, then employed an auditory-guided neurofeedback-based strategy to modulate neural firing rates to select letters and to form words and sentences using custom software. Before implantation, this person was unable to express his needs and wishes through non-invasive methods, including eye-tracking, visual categorization of eye-movements, or an eye movement-based BCI-system.
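To make the strategy described above a little more concrete, here is a toy sketch (this is emphatically not the study's actual custom software, and all the names, numbers, and thresholds are hypothetical) of how an auditory neurofeedback speller can work: letter groups are presented as audio, the patient answers "yes" or "no" by pushing neural firing above or below a baseline, and halving the candidate set each time eventually selects a single letter.

```python
# Hypothetical, simplified sketch of an auditory neurofeedback speller.
# The real system in the study is far more sophisticated; this only
# illustrates the yes/no binary-selection idea.

ALPHABET = list("abcdefghijklmnopqrstuvwxyz")

def answer_is_yes(firing_rate_hz, baseline_hz=10.0):
    """A 'yes' is signaled by firing above baseline (simplified model)."""
    return firing_rate_hz > baseline_hz

def select_letter(get_firing_rate):
    """Binary search over the alphabet driven by yes/no neural answers."""
    candidates = ALPHABET[:]
    while len(candidates) > 1:
        half = candidates[: len(candidates) // 2]
        # The speller asks (as audio): "Is your letter in this group?"
        if answer_is_yes(get_firing_rate(half)):
            candidates = half
        else:
            candidates = candidates[len(half):]
    return candidates[0]

# Simulate a patient spelling "t": fire high when the presented
# group contains the target letter, low otherwise.
target = "t"
simulated_rate = lambda group: 15.0 if target in group else 5.0
print(select_letter(simulated_rate))  # prints "t"
```

Repeating this letter by letter is slow, which is why sentences in the study took shape over many sessions.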
After three months of trials, the researchers found a way to translate the patient's communications — essentially (as I understand it) by tracking the electrical impulses as the patient tried to match different auditory frequencies. They gradually built up his communication skills until finally, this happened:
The patient also participated in social interactions and asked for entertainment ('come tonight [to continue with the speller]', day 203, 247, 251, 294, 295, 'wili ch tool balbum mal laut hoerenzn' – 'I would like to listen to the album by Tool [a band] loud', day 245, 'und jetwzt ein bier' – 'and now a beer', day 247 (fluids have to be inserted through the gastro-tube), 251, 253, 461.
After rocking out with a brewski, he got the chance to bond with his son, too:
He interacted with his 4-year-old son and wife, '(son's name) ich liebe meinen coolen (son's name)' – 'I love my cool son' on day 251; '(son's name) willst du mit mir bald disneys robin hood anschauen' – 'Do you want to watch Disney's Robin Hood with me' on day 253; 'alles von den dino ryders und brax autobahnund alle aufziehautos' – 'everything from dino riders and brax and cars' on day 309;
By day 462, he could even make detailed food requests, asking for curry with potato and then Bolognese and potato soup the next day.
Never underestimate a TOOL fan.
Spelling interface using intracortical signals in a completely locked-in patient enabled via auditory neurofeedback training [Ujwal Chaudhary, Ioannis Vlachos, Jonas B. Zimmermann, Arnau Espinosa, Alessandro Tonin, Andres Jaramillo-Gonzalez, Majid Khalili-Ardali, Helge Topka, Jens Lehmberg, Gerhard M. Friehs, Alain Woodtli, John P. Donoghue & Niels Birbaumer / Nature Communications]
Paralysed man communicates first words in months using brain implant: 'I want a beer' [Anthony Cuthbertson / The Independent]