Imagine waking up one day and not being able to move any part of your body. You try to call for help but discover you cannot speak. You attempt to look around the room to see if anyone is nearby, but your eyes won’t move either. Your mind is working perfectly, but you are trapped in your own body.
People with this rare, horrific condition, known as “locked-in syndrome”, can become this way almost instantly from a brainstem stroke or severe trauma, or more gradually from amyotrophic lateral sclerosis (ALS, also known as Lou Gehrig’s disease). In some cases, people retain control of their eyes, allowing them to use assistive technologies such as eye-tracking devices to communicate through a computer. Unfortunately, for those who lose eye control, communication is significantly more challenging and solutions are severely limited.
Our Approach
There’s no success more gratifying than helping a locked-in man communicate with his family for the first time in years.
Archinoetics developed brain-machine interfaces (BMI, also known as brain-computer interfaces or BCI) that enable people to interact with and control machines through our custom-designed functional brain imaging system. These systems, which use functional near-infrared imaging (fNIR), monitor the brain’s activity in real time to detect what type of mental task a person is performing. By giving the person a choice of a few mental tasks to select from, the system lets them create a signal that our software can interpret, allowing them to manipulate basic computer interfaces.
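The article does not describe Archinoetics’ actual decoding algorithm, but the general idea of turning a mental-task signal into a selection can be sketched in a few lines. The following is a hypothetical toy example: it compares average activity in a “task” window against a resting baseline and maps the result to a binary yes/no choice. The function name, window sizes, and threshold are all illustrative assumptions, not the real system.

```python
# Hypothetical sketch of binary mental-task decoding, NOT Archinoetics' code.
# Idea: activity (e.g., fNIR-measured oxygenation) rises while the person
# performs a mental task; comparing a task window against a resting baseline
# yields a one-bit "yes/no" selection.

def decode_choice(samples, baseline_end, threshold):
    """Return 'yes' if mean activity in the task window exceeds the
    baseline mean by more than `threshold`, else 'no'."""
    baseline = samples[:baseline_end]   # resting-state samples
    task = samples[baseline_end:]       # samples during the mental task
    baseline_mean = sum(baseline) / len(baseline)
    task_mean = sum(task) / len(task)
    return "yes" if task_mean - baseline_mean > threshold else "no"

# Simulated signal in arbitrary units: rest, then elevated task activity.
rest = [0.10, 0.12, 0.11, 0.09]
active = [0.30, 0.35, 0.33, 0.32]
print(decode_choice(rest + active, baseline_end=4, threshold=0.1))  # "yes"
```

Chaining such one-bit selections (for example, stepping through letters or menu items) is one simple way a binary signal like this could drive a communication interface.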
For the first time in years, he communicated!
After our initial success, we continued refining our communication software to improve its reliability and the speed at which someone could communicate. In parallel, we worked on some fun applications to give locked-in people a way to entertain themselves while practicing the mental tasks required to control the system for communication. These applications included video games and painting programs. A screenshot of the video game appears here and shows a dolphin that the person controls in an attempt to eat the fish that swim by. The painting application is discussed further below.
Brain Painting

For Brain Painting, Archinoetics worked closely with the late artist, Peggy Chun, whose tropical watercolor paintings made her a household name in Hawaii. Peggy was diagnosed with ALS in 2002, but never let the disease stop her from painting. As she became paralyzed, she switched to painting with her left hand, then by holding the paintbrush in her teeth. Even when she was only able to move her eyes, Peggy used an eye-tracking system to communicate and paint. At Archinoetics, we helped Peggy become the world’s first ‘brain painter’ (see her most famous brain painting on the laptop screen in the photo, entitled ‘Navajo Nightfall’). Sadly, Peggy passed away in 2008, but her memory and spirit live on in her beautiful paintings.
To view or purchase Peggy’s artwork, please visit her website at www.peggychun.com.

Support
This research is in collaboration with the University of Virginia and Georgia Tech, and has received support from the National Science Foundation under Grant Nos. 0705804 and 0512003. This article is from the Archinoetics website.