
March 12, 2012

NeuroSky to develop iOS assistive technology apps

NeuroSky, which specializes in mass-market brain-computer interface (BCI) technology, says it is taking its brain-based user interface to the next level with a new wave of "killer mobile apps" for patients whose conditions severely limit communication, including Traumatic Brain Injury (TBI), Cerebral Palsy (CP), Multiple Sclerosis (MS), Amyotrophic Lateral Sclerosis (ALS) and others.

These conditions can leave people with otherwise healthy minds trapped in their own bodies, unable to communicate. NeuroSky is launching a campaign on the crowdfunding site IndieGogo to raise US$50,000. Proceeds will pay "top-tier, disruptive thinking" developers to create a new type of user interface for assistive technology applications on mobile devices.

Crowdfunding is a relatively new funding model that invites the general public to back projects they believe in or products they would like to see made. As the assistive technology applications are completed, they will be made available free of charge on the iTunes Store, NeuroSky.com, and other locations.

NeuroSky's MindWave Mobile (pictured) will cost $129 and be compatible with iOS and Android mobile devices. By combining brainwave input (such as blinks, attention, meditation, and raw EEG) with standard mobile device features such as geolocation, accelerometers, and video, developers can provide new and unique capabilities for Android and iOS devices, according to Head of Communication Tansy Brook.
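To make the idea of "brainwave input" a bit more concrete, here is a minimal sketch of how a program might read the headset's attention and meditation (eSense) values. It assumes the desktop ThinkGear Connector's JSON-over-TCP stream on localhost; the port, handshake and field names are assumptions for illustration rather than NeuroSky's official mobile SDK.

```python
# Minimal sketch: read NeuroSky eSense values (attention/meditation) from the
# ThinkGear Connector's JSON-over-TCP stream. The endpoint, handshake and
# field names are assumptions based on the desktop connector, not the
# official mobile SDK.
import json
import socket

HOST, PORT = "127.0.0.1", 13854  # assumed ThinkGear Connector endpoint

def read_esense():
    with socket.create_connection((HOST, PORT)) as sock:
        # Ask the connector for JSON-formatted output (assumed handshake).
        sock.sendall(json.dumps({"enableRawOutput": False, "format": "Json"}).encode())
        buffer = b""
        while True:
            data = sock.recv(4096)
            if not data:
                break
            buffer += data
            *lines, buffer = buffer.split(b"\r")
            for line in lines:
                try:
                    msg = json.loads(line)
                except ValueError:
                    continue  # skip partial or non-JSON chunks
                esense = msg.get("eSense")
                if esense:
                    print("attention:", esense.get("attention"),
                          "meditation:", esense.get("meditation"))

if __name__ == "__main__":
    read_esense()
```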

July 28, 2011

The Walk Again Project

Over the past decade, neuroscientists at the Duke University Center for Neuroengineering (DUCN) have developed the field of brain-machine interface (BMI) into one of the most exciting—and promising—areas of basic and applied research in modern neuroscience. By creating a way to link living brain tissue to a variety of artificial tools, BMIs have made it possible for non-human primates to use the electrical activity produced by hundreds of neurons, located in multiple regions of their brains, to directly control the movements of a variety of robotic devices, including prosthetic arms and legs.
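In these experiments, the activity of the recorded population is turned into movement commands by a decoder fitted to data. The sketch below illustrates that general technique with a simple linear least-squares decoder on simulated firing rates; it is only an illustration, not the DUCN lab's actual algorithm.

```python
# Illustrative sketch: decode 2-D hand velocity from the firing rates of a
# neuronal population with a linear least-squares decoder. Purely simulated
# data; not the Walk Again Project's actual method.
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_samples = 200, 5000

# Simulated "true" velocities and neurons with random preferred directions.
velocity = rng.standard_normal((n_samples, 2))
tuning = rng.standard_normal((2, n_neurons))
rates = velocity @ tuning + 0.5 * rng.standard_normal((n_samples, n_neurons))

# Fit a linear decoder so that velocity is approximated by rates @ W.
W, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

decoded = rates @ W
err = np.mean(np.linalg.norm(decoded - velocity, axis=1))
print(f"mean decoding error: {err:.3f}")
```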

As a result, BMI research raises the hope that in the not-too-distant future, patients suffering from a variety of neurological disorders that lead to devastating levels of paralysis may be able to recover their mobility by harnessing their own brain impulses to directly control sophisticated neuroprostheses.
The Walk Again Project, an international consortium of leading research centers around the world, represents a new paradigm for scientific collaboration among the world’s academic institutions, bringing together a global network of scientific and technological experts, distributed across all the continents, to achieve a key humanitarian goal.

The project’s central goal is to develop and implement the first BMI capable of restoring full mobility to patients suffering from a severe degree of paralysis. This lofty goal will be achieved by building a neuroprosthetic device that uses a BMI as its core, allowing the patients to capture and use their own voluntary brain activity to control the movements of a full-body prosthetic device. This “wearable robot,” also known as an “exoskeleton,” will be designed to sustain and carry the patient’s body according to his or her mental will.

In addition to proposing new technologies aimed at improving the quality of life of millions of people worldwide, the Walk Again Project also innovates by creating a completely new paradigm for global scientific collaboration among leading academic institutions. According to this model, a worldwide network of leading scientific and technological experts, distributed across all the continents, comes together in a major, non-profit effort to make a fellow human being walk again, based on their collective expertise. These world-renowned scholars will contribute key intellectual assets as well as provide a base for the continued fundraising and capitalization of the project, setting clear goals to establish fundamental advances toward restoring full mobility for patients in need.

Walk Again Project homepage

July 25, 2011

Scientists differentiate brain activity associated with grasping

Quickly grabbing a cup of coffee is an everyday action for most of us. For people with severe paralysis, however, this task is unfeasible - yet not "unthinkable". Interfaces between the brain and a computer can, in principle, detect these "thoughts" and transform them into steering commands. Scientists from Freiburg have now found a way to distinguish between different types of grasping on the basis of the accompanying brain activity.

In the current issue of the journal "NeuroImage", Tobias Pistohl and colleagues from the Bernstein Center Freiburg and the University Medical Centre describe how they succeeded in differentiating the brain activity associated with a precise grip and a grip of the whole hand. Ultimately, the scientists aim to develop a neuroprosthesis: a device that receives commands directly from the brain, and which can be used by paralysed people to control the arm of a robot - or even their own limbs.

One major problem concerning arm movements had so far remained unresolved. In our daily lives, it is important to handle different objects in different ways - a feather and a brick, for example. The researchers from Freiburg have now found aspects of the brain's activity that distinguish a precise grip from a grip with the whole hand.

To this end, Pistohl and his collaborators made use of signals that are measured on the surface of the brain. The big advantage of this approach is that no electrodes have to be implanted directly into this delicate organ. At the same time, the obtained signals are much more precise than those that can be measured on the skull's surface.

The scientists conducted a simple experiment with patients who were not paralysed, but who had electrodes implanted for medical reasons. The task was to grab a cup, either with a precise grip formed by the thumb and the index finger, or with the whole hand. At the same time, a computer recorded the electrical changes at the electrodes. The scientists were indeed able to find signals in the brain's activity that differed depending on the type of grasp, and a computer was able to attribute these signals to the different hand positions with great reliability. The next challenge will be to identify these kinds of signals in paralysed patients as well - with the aim of eventually putting a more independent life back within their reach.
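The final step described above - a computer attributing signals to the two grip types - is essentially a supervised classification problem. Here is an illustrative sketch that trains a linear discriminant classifier on simulated per-trial features; the feature extraction and classifier choice are assumptions made for illustration, not the method reported in the NeuroImage paper.

```python
# Illustrative sketch: classify "precision grip" vs "whole-hand grip" from
# per-trial brain-signal features with linear discriminant analysis.
# Simulated data; the study's actual features and classifier may differ.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_trials, n_channels = 120, 32

labels = rng.integers(0, 2, n_trials)                 # 0 = precision, 1 = whole hand
offsets = np.where(labels[:, None] == 1, 0.8, -0.8)   # class-dependent mean shift
features = offsets + rng.standard_normal((n_trials, n_channels))

clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, features, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```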

Source: Bernstein Center Freiburg

June 2, 2011

Archinoetics - helping patients with locked-in syndrome

The Challenge

Imagine waking up one day and not being able to move any part of your body. You try to call for help but discover you cannot speak. You attempt to look around the room to see if anyone is nearby, but your eyes won’t move either. Your mind is working perfectly, but you are trapped in your own body.
People with this rare, horrific condition, known as “locked-in syndrome”, can become this way almost instantly from a brainstem stroke or severe trauma, or more gradually from amyotrophic lateral sclerosis (ALS, otherwise known as Lou Gehrig’s disease). In some cases, people continue to maintain control over their eyes, allowing them to use assistive technologies like eye-tracking devices to communicate through a computer. Unfortunately, for those who lose eye control, communication is significantly more challenging and solutions are severely limited.

Our Approach

There’s no success more gratifying than helping a locked-in man communicate with his family for the first time in years.
Archinoetics developed brain-machine interfaces (BMI, also known as brain-computer interfaces or BCI) that enable people to interact with and control machines through our custom-designed functional brain imaging system. These systems, which use functional near-infrared imaging (fNIR), monitor the brain’s activity in real time to detect what type of mental task a person is performing. By giving the subject a choice of a few tasks to select from, the person is able to create a signal that can be interpreted by our software, thereby allowing them to manipulate basic computer interfaces. In our research lab, testing the system on healthy people, everything appeared to function perfectly. The real test came when we visited a man who, because he was locked-in, had not been able to communicate with his family in years. The Archinoetics team looked on anxiously as the sensors were placed on his head and the computer started receiving data. As with many studies involving human subjects, our first tests did not work. But over the course of several days, we worked through a number of challenges and were able to help this man answer several yes-or-no questions that his family wanted to ask him.
For the first time in years, he communicated!
After our initial success, we continued to refine our communication software to improve its reliability and the speed with which someone could communicate. In parallel, we also worked on some fun applications to give locked-in people a way to entertain themselves while practicing the mental tasks required to control the system for communication. The applications included video games and painting applications. A screenshot of the video game appears here and shows a dolphin that the person controls in an attempt to eat the fish that swim by. The painting application is discussed below.
Brain Painting
Archinoetics has developed a BCI called “brain painting”. This application allows someone to paint through consciously modifying the level of activity in a region of his or her brain. Typically this means either “singing in your head” or repeating nonsense syllables in your head (such as “la la la”). The first activity activates the language area, thereby raising the signal measured by OTIS, whereas the second activity lowers the signal. In addition to being a fun creative tool, brain painting also helps people learn the skills necessary to use a BCI effectively for communication.
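Both the yes/no communication and brain painting come down to the same primitive: detecting whether the person is deliberately raising or lowering the signal from a brain region. Below is a minimal sketch of such a threshold detector on a simulated fNIR-style signal; the baseline window and threshold are illustrative assumptions, not the behaviour of Archinoetics' OTIS software.

```python
# Minimal sketch: turn a slowly varying fNIR-style signal into a binary
# "yes"/"no" decision by comparing the task-window average to a baseline.
# Window lengths and threshold are illustrative assumptions.
import numpy as np

def decide(signal, baseline_len=50, threshold=0.5):
    """Return 'yes' if the task window rises clearly above baseline."""
    baseline = np.mean(signal[:baseline_len])
    task = np.mean(signal[baseline_len:])
    return "yes" if (task - baseline) > threshold else "no"

rng = np.random.default_rng(2)
quiet = rng.normal(0.0, 0.2, 100)                    # resting: no activation
singing = np.concatenate([rng.normal(0.0, 0.2, 50),  # baseline period
                          rng.normal(1.0, 0.2, 50)]) # "singing in your head"

print(decide(quiet))    # expected: no
print(decide(singing))  # expected: yes
```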
For Brain Painting, Archinoetics worked closely with the late artist Peggy Chun, whose tropical watercolor paintings made her a household name in Hawaii. Peggy was diagnosed with ALS in 2002, but never let the disease stop her from painting. As she became paralyzed, she switched to painting with her left hand, then by holding the paintbrush in her teeth. Even when she was only able to move her eyes, Peggy used an eye-tracking system to communicate and paint. At Archinoetics, we helped Peggy become the world’s first ‘brain painter’ (see her most famous brain painting, entitled ‘Navajo Nightfall’, on the laptop screen in the photo). Sadly, Peggy passed away in 2008, but her memory and spirit live on in her beautiful paintings.
To view or purchase Peggy’s artwork, please visit her website at www.peggychun.com.

Support

This research is in collaboration with the University of Virginia and Georgia Tech, and has received support from the National Science Foundation under Grant No. 0705804 and Grant No. 0512003.
Source: the Archinoetics website

May 31, 2011

CorTec GmbH - a bridge between ideas and action

Brain machine interfaces that are able to read a paralysed patient’s desired movement from his or her brain and convert it into actual movement might be available in a few years’ time, if everything goes to plan.

After many years of intensive research, CorTec GmbH, a spin-off company of the University of Freiburg, now has a technology platform that is able to measure and interpret a person’s brain activity and drive muscles or artificial prostheses. Why and how was the company founded? How far has the technology come since the initial idea was first mooted?
A person is involved in an accident and suffers extensive injuries; the neurons linking the spinal cord and the extremities are completely severed. Although the patient can still imagine his hand grasping a cup of tea, his body no longer does what he wants. Can applied neuroscience help? Researchers from the Brain Machine Interface Initiative (BMII) at the University of Freiburg firmly believe that it can. They have spent more than ten years investigating how to drive and control muscles or prostheses using brain activity. They hope that one day they will be able to bridge the severed connections using sensors, electrodes and computer chips, and to develop a platform that directly connects the brain with the machine. Is this the kind of science fiction that we know from William Gibson’s cyberpunk novels?

Completely different structures are needed

“We are currently working on the BRAINCON technology platform and hope to be able to use it for such purposes in a few years’ time,” said Dr. Jörn Rickert, the managing director of CorTec GmbH, which was spun out of the University of Freiburg last year. “Some individual components will soon be granted marketing authorisation.” The researchers are already able to measure and record brain activity using electrodes and deduce from this the type of movement a patient wants to make. Using specific software and hardware, information from the brain will be translated into commands for the control of a prosthesis, or of leg or arm muscles. In experiments, volunteers have been able to move a cursor on a computer screen just through thought. The researchers believe that a paralysed patient might be able to learn to write and communicate again using the BRAINCON technology.
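Cursor control of this kind is usually a closed loop: a feature is extracted from each window of brain signal, mapped to a velocity, and integrated into a cursor position. The sketch below illustrates that loop with simulated data; the single band-power feature and fixed gains are assumptions for illustration and are not the actual BRAINCON software.

```python
# Illustrative sketch of a closed control loop: extract a crude band-power
# feature from each window of (simulated) brain signal, map it to a cursor
# velocity with fixed gains, and integrate to a cursor position.
# Gains, feature and resting level are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
fs = 250                      # sampling rate in Hz (assumed)
gain = np.array([0.4, 0.0])   # maps the feature to x/y velocity (assumed)

def band_power(window):
    """Crude band-power estimate: variance of the first-difference signal."""
    return np.var(np.diff(window))

cursor = np.zeros(2)
for step in range(20):
    window = rng.standard_normal(fs)      # one second of simulated signal
    feature = band_power(window)
    velocity = gain * (feature - 2.0)     # 2.0 ~ assumed resting level
    cursor += velocity * 1.0              # integrate over the 1 s window
    print(f"step {step:2d}  cursor position: {cursor.round(2)}")
```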
So science fiction becomes reality. Nevertheless, there is still a long way to go before BRAINCON can be commercialised. The technology needs to be tested in preclinical studies and then in clinical studies. So why has CorTec already been established? “Turning an idea that resulted from basic research into a leading product requires structures quite different from those in a university research group,” said Rickert. Rickert did his doctorate in the Department of Neurobiology and Biophysics at the University of Freiburg, under the supervision of Prof. Dr. Ad Aertsen, on the representation of movement direction in the motor cortex, and has been working on technology transfer issues for around five years. Supported by the university’s Technology Transfer Office, Rickert and his colleagues Carsten Mehring and Tonio Ball filed their first patent application in 2006. In 2005, the three had already received a grant for the development of a brain machine interface under the EXIST-Seed programme run by the German Ministry of Education and Research (BMBF).

Sophisticated quality management and final negotiations

Supported by EXIST-Seed funds and, from 2007 onwards, with additional funds from the BMBF’s GO-Bio programme, the team was able to further develop the technology and prepare to set up a company. While his colleagues worked with medical doctors, physicists, mathematicians, computer scientists, biologists and material researchers at the University of Freiburg and the Freiburg University Medical Centre on the scientific aspects of the project, Rickert focused on setting up the team, creating structures enabling interdisciplinary communication between the project partners, and preparing a project plan, a business plan and patent management. “Creating the structure to set up a company requires investors to provide financial injections,” said Rickert. “It is also necessary to put in place a sophisticated quality management structure, comply with standards and carefully compile all the required documents.”

The company was officially founded in September 2010. The interdisciplinary team has since introduced a top-notch quality management structure in the laboratory of Prof. Dr. Thomas Stieglitz from the Department of Biomedical Microtechnology at the University of Freiburg’s Institute of Microsystems Technology. The business plan prepared by Rickert and his colleagues was shortlisted in the final round of the Science4Life funding initiative competition for the best business concept. In early 2011, the future entrepreneurs received a financial injection from the BMBF to establish their company. At present, the CorTec team is in contact with investors and hopes to commence preclinical studies with BRAINCON in a few months’ time. It is planned to commence the clinical phase in about two to three years. Some components of the technology platform will most likely be placed on the market sometime in 2011.

Being among the first

Technology transfer starts well before a prototype, let alone a product, exists. It is a process that needs to be well prepared and can only be managed with financial and non-material support. Rickert and his colleagues are very grateful for the support they received from the Technology Transfer Office at the University of Freiburg and through the many non-university funding programmes. In the coming months, seven scientists will move from the university to the company, and two open positions still need to be filled.

The company will soon relocate to its own premises at the University of Freiburg. The close contact with research partners is also expected to bear fruit in the future. “We have gained a technology lead in the last few years and we hope to be able to keep this despite the work being done by our competitors, in the USA and elsewhere,” said Rickert. CorTec GmbH is aiming to be among the first companies to offer permanent neurotechnological systems that will enable paralysed people to turn ideas into actions.

Source: BIOPRO Baden-Württemberg GmbH

May 28, 2011

Listing of Brain Controlled Devices for sale now!

Welcome to the world of brain-controlled devices, where a simple thought controls machines. I have been fascinated by these for some time now, and with the whirlwind of research going on in this area I thought you might like to get brought up to speed on most of what's available today. I currently own two of these devices, which my kids and I enjoy frequently.

1) Emotiv EPOC - Based on the latest developments in neurotechnology, the Emotiv EPOC is a revolutionary new personal interface for human-computer interaction. There are developer and researcher kits, or you can buy it for yourself to play games on your PC with the Mastermind program.

2) Star Wars Science Force Trainer - a game for kids ages 8+. A headband with sensors controls a fan motor, which in turn controls the height of a ball. The object of the game is to concentrate on specific things or to relax your mind. You can check out my full review of it here.

3) OCZ NIA (Neural Impulse Actuator) - OCZ's Neural Impulse Actuator marks a new era in gaming. Rather than being a substitute for a mouse, the NIA is a pioneering peripheral to be used in conjunction with your mouse for a more immersive gaming experience. The NIA is compatible with any PC game using keyboard input... past, present, or future. Predefined profiles allow gamers to develop their own NIA memory to launch the desired behavior of their character and shoot with the "blink of an eye", without lifting a finger (see the sketch after this list for how such event-to-keypress mapping can work).

4) Mindflex Game - another thought-based game for kids. Similar to the Star Wars Force Trainer in that thoughts control the height of a ball via an air fan, but different in the object of the game, which is to maneuver the ball through an obstacle course; it can also be played by two people. There is a sequel, the Mindflex Duel, in which two people battle it out simultaneously on the game board for control of the ball.

5) PLX X-wave for iPhone - The X-wave is a thought-control headband that links to your iPhone. Games for the X-wave on the iPhone and iPad by MindGames include W.I.L.D. and Tug of Mind.

6) NeuroSky MindWave - very similar to the Emotiv EPOC. The MindWave can be bought for under $100 and has over 50 games available for it. There are also kits for researchers.

7) MYND - NeuroFocus’s new device, which it calls “Mynd,” has a few key features. It claims to get “full-brain coverage with dense-array EEG” sensors, yielding data “within seconds” of switching the device on. It can also network with any Bluetooth-enabled mobile device, like an iPhone or iPad. Unlike other EEG devices you may be familiar with, Mynd doesn’t need to use gel (that’s what’s meant by calling the device “dry”). And since the device isn’t too heavy and can be linked to a wireless device, that basically makes it a mobile brain scanner. (See our earlier take on a “wearable” PET scanner for rats, here.) NeuroFocus envisions research panels conducted at home with the device; its CEO Dr. A.K. Pradeep tells Fast Company those might happen within the next eight months.

8) Mind Technologies - exclusively develops apps for the Emotiv headset. Games created include Mastermind, MindMouse, and Think Tac Toe.

9) InteraXon - has a headset and creates apps for the iPad.

10) 3D glasses by InteraXon - the glasses give you the ability to watch a movie or televised content while sensing the state you're in, whether scared, excited, or bored.

11) OpenViBE - a software platform for BCIs. There is a 10-minute video explaining exactly how it works here.
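As mentioned in the NIA entry above, the common trick behind these gaming peripherals is mapping a detected brain (or muscle) event to an ordinary key press, so that any existing game can respond. Here is a minimal sketch of that idea, assuming a placeholder attention_value() feed standing in for whichever vendor SDK is in use, and the third-party pynput library for synthesizing key events:

```python
# Minimal sketch: press and release the space bar whenever a headset's
# attention reading crosses a threshold. attention_value() is a placeholder
# for a real vendor SDK read; pynput synthesizes the key events.
import random
import time
from pynput.keyboard import Controller, Key

keyboard = Controller()
THRESHOLD = 70  # on an assumed 0-100 attention scale

def attention_value():
    """Placeholder for a real headset read; returns a random 0-100 value."""
    return random.randint(0, 100)

while True:
    if attention_value() > THRESHOLD:
        keyboard.press(Key.space)    # e.g. "jump" or "fire" in a game
        keyboard.release(Key.space)
    time.sleep(0.1)
```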