February 28, 2013

Rat Brain Flight Simulator Experiments

THE GIST

A glass dish contains a "brain" -- a living network of 25,000 rat brain cells connected to an array of 60 electrodes.
- An electrode grid was placed at the bottom of a glass dish and then covered with rat neurons.

- This gradually formed a neural network -- a brain.

- The research could lead to tiny, brain-controlled prosthetic devices and unmanned airplanes flown by living computers.

A University of Florida scientist has created a living "brain" of cultured rat cells that now controls an F-22 fighter jet flight simulator.

Scientists say the research could lead to tiny, brain-controlled prosthetic devices and unmanned airplanes flown by living computers.

And if scientists can decipher the ground rules of how such neural networks function, the research also may result in novel computing systems that could tackle dangerous search-and-rescue jobs and perform bomb damage assessment without endangering humans.


Additionally, the interaction of the cells within the lab-assembled brain also may allow scientists to better understand how the human brain works. The data may one day enable researchers to determine causes and possible non-invasive cures for neural disorders, such as epilepsy.

For the recent project, Thomas DeMarse, a University of Florida professor of biomedical engineering, placed an electrode grid at the bottom of a glass dish and then covered the grid with rat neurons. The cells initially resembled individual grains of sand in liquid, but they soon extended microscopic lines toward each other, gradually forming a neural network -- a brain -- that DeMarse says is a "living computational device." The brain then communicates with the flight simulator through a desktop computer.

"We grow approximately 25,000 cells on a 60-channel multi-electrode array, which permits us to measure the signals produced by the activity each neuron produces as it transmits information across this network of living neurons," DeMarse told Discovery News. "Using these same channels (electrodes) we can also stimulate activity at each of the 60 locations (electrodes) in the network. Together, we have a bidirectional interface to the neural network where we can input information via stimulation. The network processes the information, and we can listen to the network's response."

The brain can learn, just as a human brain learns, he said. When the system is first engaged, the neurons don't know how to control the airplane; they don't have any experience.


But, he said, "Over time, these stimulations modify the network's response such that the neurons slowly (over the course of 15 minutes) learn to control the aircraft. The end result is a neural network that can fly the plane to produce relatively stable straight and level flight."

At present, the brain can control the pitch and roll of the F-22 in various virtual weather conditions, ranging from hurricane-force winds to clear blue skies.
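To make the closed loop concrete, here is a minimal, purely illustrative Python sketch of the record-decode-stimulate cycle DeMarse describes: activity is read from the 60 electrodes, mapped to pitch and roll commands, and the simulator's flight error is fed back as stimulation. The helper functions, electrode groupings, simulator interface, and scaling constants are assumptions for illustration, not the lab's actual software.

import numpy as np

N_ELECTRODES = 60

def read_firing_rates():
    # Hypothetical stand-in for the multi-electrode array readout:
    # returns a spikes-per-second estimate for each of the 60 channels.
    return np.random.poisson(5.0, N_ELECTRODES).astype(float)

def stimulate(channel, frequency_hz):
    # Hypothetical stand-in for delivering a pulse train on one electrode.
    pass

def decode_control(rates):
    # Toy decoder: compare mean activity between electrode groups and squash
    # the differences into pitch and roll commands in [-1, 1].
    pitch = np.tanh((rates[:15].mean() - rates[15:30].mean()) / 10.0)
    roll = np.tanh((rates[30:45].mean() - rates[45:].mean()) / 10.0)
    return pitch, roll

def control_step(simulator):
    # One pass of the loop: record, decode, fly, then feed the flight error
    # back into the dish as stimulation so the network can adapt over time.
    rates = read_firing_rates()
    pitch_cmd, roll_cmd = decode_control(rates)
    pitch_err, roll_err = simulator.apply(pitch_cmd, roll_cmd)  # hypothetical simulator API
    stimulate(channel=0, frequency_hz=min(40.0, abs(pitch_err) * 40.0))
    stimulate(channel=1, frequency_hz=min(40.0, abs(roll_err) * 40.0))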

This brain-controlled plane may sound like science fiction, but it is grounded in work that has been taking place for more than a decade. A breakthrough occurred in 1993, when a team of scientists created a Hybrot, which is short for "hybrid robot."

The robot consisted of hardware, computer software, rat neurons, and incubators for those neurons. The computer, programmed to respond to the neuron impulses, controlled a wheel underneath a machine that resembled a child's toy robot.

Last year, U.S. and Australian researchers used a similar neuron-controlled robotic device to produce a "semi-living artist." In this case, the neurons were hooked up to a drawing arm outfitted with different colored markers. The robot managed to draw decipherable pictures -- albeit bad ones that resembled a child's scribbles -- but that technology led to today's fighter plane simulator success.

Steven Potter, an assistant professor of biomedical engineering at Georgia Tech who directed the living artist project, believes DeMarse's work is important, and that such studies could lead to a variety of engineering and neurobiology research goals.

"A lot of people have been interested in what changes in the brains of animals and people when they are learning things," Potter said. "We're interested in getting down into the network and cellular mechanisms, which is hard to do in living animals. And the engineering goal would be to get ideas from this system about how brains compute and process information."

Though the "brain" can successfully control a flight simulation program, more elaborate applications are a long way off, DeMarse said.

"We're just starting out. But using this model will help us understand the crucial bit of information between inputs and the stuff that comes out," he said. "And you can imagine the more you learn about that, the more you can harness the computation of these neurons into a wide range of applications."


February 27, 2013

Mobile EEG for cell phones is finally here

NeuroSky, a leading brain-computer interface company, has recently launched the Brainwave Starter Kit, a MindWave Mobile EEG package that offers an introduction to the power of the human brain through fun, interactive applications.

Simply slip on the MindWave Mobile EEG headset and see your brainwaves displayed on screen in the colorful Brainwave Visualizer. Watch your relaxation levels change in real time as you listen to your favorite music, or monitor your attention levels as you do a series of math problems in Speed Math!

“We’re excited to begin offering the Brainwave Starter Kit and look forward to introducing our technology to a new generation,” said Stanley Yang, NeuroSky CEO. “This new package, targeted at younger users, delivers an easy, fun way to discover your own brainwave patterns.”

The device consists of a headset, an ear clip, and a sensor arm. The headset’s electrodes are on the ear clip and on the sensor arm, which rests on the forehead above the eye.

As the world's first comprehensive brainwave-reading device for iOS and Android, the MindWave Mobile headset is designed for today's mobile user. It differs from its sister product, the MindWave, by transferring the user's raw brainwave data via Bluetooth rather than radio frequency.
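For the curious, here is a rough sketch of what reading the headset's data stream over a Bluetooth serial port could look like in Python. The packet layout follows NeuroSky's published ThinkGear serial protocol as I understand it, and the port name, baud rate, and pyserial dependency are assumptions; check the official developer documentation before relying on any of this.

import serial  # pyserial; the headset pairs as a Bluetooth serial device

PORT = "/dev/rfcomm0"  # assumed port name; yours will differ

def read_esense(port=PORT):
    with serial.Serial(port, 57600, timeout=1) as conn:
        while True:
            # Each packet starts with two 0xAA sync bytes.
            if conn.read(1) != b"\xaa" or conn.read(1) != b"\xaa":
                continue
            header = conn.read(1)
            if not header:
                continue
            payload = conn.read(header[0])
            i = 0
            while i < len(payload):
                code = payload[i]
                if code >= 0x80:           # multi-byte value (e.g. raw wave): skip it
                    i += 2 + payload[i + 1]
                elif code == 0x04:         # attention level, 0-100
                    print("attention:", payload[i + 1]); i += 2
                elif code == 0x05:         # meditation level, 0-100
                    print("meditation:", payload[i + 1]); i += 2
                else:                      # other single-byte values (e.g. signal quality)
                    i += 2

if __name__ == "__main__":
    read_esense()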

Whether it is blowing up a can of Red Bull on the iPad or levitating a cupcake on an Android device, MindWave Mobile lets users train their brains to focus and relax no matter where they are.

You can get the NeuroSky MindWave Mobile Myndplay Bundle on Amazon for $125.

February 19, 2013

Superhuman Vision Experiments

Coupling infrared levels with a microstimulator in a rat's brain. On each trial, the IR light (red) turns on, which activates the IR detector mounted on the rat's head. Processing converts the detected IR level into a stimulation frequency. This value is sent to the microstimulator, which produces the desired current pulses, resulting in perception. (Credit: Nicolelis Lab)
How would you like to have superhuman vision and be able to see or sense infrared, radio, or magnetic waves? It reminds me of comic book and sci-fi movie characters like Geordi La Forge in Star Trek, who was born blind but had his vision partially restored with infrared sight. There are legitimate experiments proving it is possible now.

An experiment done on rats at Duke University shows that it is possible to fix existing vision problems or add extra capabilities on top of our normal vision.

Researchers at the Nicolelis Lab attached a head-mounted infrared (IR) sensor to rats and connected it to the whisker area of the brain (somatosensory cortex), using electrical microstimulation.

The rats were able to distinguish between the whisker and IR senses going to the same brain area without any problems.

The researchers note that, in principle, any novel stimulus (for example, magnetic or radio waves) could be used instead of infrared light.
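To illustrate the processing step described above (the detected IR level converted into a stimulation frequency sent to the microstimulator), here is a minimal sketch of one plausible mapping. The detector range, frequency ceiling, and linear transfer function are assumptions for illustration, not values taken from the published study.

IR_MIN, IR_MAX = 0.0, 1.0        # normalized detector output (assumed range)
FREQ_MIN, FREQ_MAX = 0.0, 400.0  # stimulation pulse frequency in Hz (assumed ceiling)

def ir_to_stim_frequency(ir_level):
    # Clamp the reading to the detector range, then map it linearly
    # onto the allowed stimulation-frequency band.
    ir = min(max(ir_level, IR_MIN), IR_MAX)
    fraction = (ir - IR_MIN) / (IR_MAX - IR_MIN)
    return FREQ_MIN + fraction * (FREQ_MAX - FREQ_MIN)

# Example: a half-intensity IR reading would request a 200 Hz pulse train.
print(ir_to_stim_frequency(0.5))  # -> 200.0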

Sample trials from a session with “blank” trials interleaved. On random trials the IR light is activated but is uncoupled from the somatosensory cortex stimulation. Trial types (“stim” versus “no-stim” trials) are indicated before each trial.

Based on an article in Nature Magazine

February 17, 2013

Advanced Ocular Prosthesis receives FDA approval



Artificial retina device, consisting of a glasses-mounted camera and a microchip surgically implanted on the retina (credit: Dr. Wentai Liu)


Argus II (credit: Second Sight)

Argus II is the first approved ocular prosthesis to restore limited vision to those blinded by retinitis pigmentosa.

In an historic move, the U.S. Food and Drug Administration (FDA) has granted market approval to an artificial retina technology, the first bionic eye to be approved for patients in the U.S.

The device, called the Argus II Retinal Prosthesis System, from Second Sight Medical Products, transmits images from a small, eye-glass-mounted camera wirelessly to a microelectrode array implanted on a patient’s damaged retina. The array sends electrical signals via the optic nerve, and the brain interprets a visual image.

The FDA approval currently applies to individuals who have lost sight as a result of severe to profound retinitis pigmentosa (RP), an ailment that affects one in every 4,000 Americans. The implant allows some individuals with RP, who are completely blind, to locate objects, detect movement, improve orientation and mobility skills and discern shapes such as large letters.
How it works

Argus II components (credit: FDA)
The Argus II design consists of an external video camera system matched to the implanted retinal stimulator, which contains a microelectrode array that spans 20 degrees of visual field.

An external camera system, built into a pair of glasses, streams video to a belt-worn computer, which converts the video into stimulus commands for the implant.

The belt-worn video processing unit (computer) encodes the commands into a wireless signal that is transmitted to the implant, which has the necessary electronics to receive and decode both wireless power and data.

Based on those data, the implant stimulates the retina with small electrical pulses. The electronics are hermetically packaged and the electrical stimulus is delivered to the retina via a microelectrode array.
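As a rough illustration of that camera-to-implant pipeline, the sketch below downsamples a grayscale video frame to one stimulation amplitude per electrode and packs the result for the wireless link. The grid size, value ranges, and function names are assumptions for illustration, not Second Sight's actual design or protocol.

import numpy as np

GRID_ROWS, GRID_COLS = 6, 10  # assumed electrode layout

def frame_to_stimulus(frame):
    # Downsample a grayscale camera frame (2-D array of 0-255 values) to one
    # stimulation amplitude per electrode: brighter region -> stronger pulse.
    h, w = frame.shape
    bh, bw = h // GRID_ROWS, w // GRID_COLS
    amplitudes = np.zeros((GRID_ROWS, GRID_COLS))
    for r in range(GRID_ROWS):
        for c in range(GRID_COLS):
            amplitudes[r, c] = frame[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw].mean() / 255.0
    return amplitudes

def encode_for_wireless(amplitudes):
    # Stand-in for the video processing unit's encoding step: pack the
    # per-electrode amplitudes into bytes for transmission to the implant.
    return (amplitudes * 255).astype(np.uint8).tobytes()

# Each video frame becomes one stimulation pattern sent to the implant.
frame = np.random.randint(0, 256, (480, 640)).astype(float)
packet = encode_for_wireless(frame_to_stimulus(frame))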
In 1998, Robert Greenberg founded Second Sight to develop the technology for the marketplace. While under development, the Argus I and Argus II systems have won wide recognition.

This ocular prosthesis technology has received early and continuing support from the National Science Foundation (NSF), the National Institutes of Health and the Department of Energy, with grants totaling more than $100 million. The private sector’s support nearly matched that of the federal government.

The NSF BMES ERC also developed a prototype system with an array of more than 15 times as many electrodes and an ultra-miniature video camera that can be implanted in the eye. However, this prototype is many years away from being available for patient use.

Article courtesy of Kurzweil.Ai

February 13, 2013

Robocalypse Alert: Defense Contract Awarded to Scary BigDog Robot

You remember the BigDog Robot, don't you? It's that loud all-terrain prototype robot quadruped that peopled your dreams with Terminator-esque nightmares when you saw the video. DARPA just awarded a $32 million contract to build it.

The contract's been won by maker Boston Dynamics, which has just 30 months to turn the research prototype machines into a genuine load-toting, four-legged, semi-intelligent war robot--"first walk-out" of the newly-designated LS3 is scheduled in 2012.

LS3 stands for Legged Squad Support System, and that pretty much sums up what the device is all about: It's a semi-autonomous assistant designed to follow soldiers and Marines across the battlefield, carrying up to 400 pounds of gear and enough fuel to keep it going for 24 hours over a march of 20 miles.

LS3 is a direct descendant of the BigDog robot, and it'll be battle-hardened and clever enough to use GPS and machine vision to either yomp along behind a pack of troops, or navigate its own way to a pre-programmed assembly point. Yup, that's right, LS3 is smart enough to trot off over the horizon all on its lonesome. That opens up all sorts of amazing military possibilities, like resupply of materiel to troops who are deployed in difficult remote locations, as well as the standard "If LS3 can offload 50 pounds from the back of each soldier in a squad, it will reduce warfighter injuries and fatigue and increase the combat effectiveness of our troops" as described by BD's president Marc Raibert.
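As a toy illustration of that follow-or-navigate behavior, here is a hypothetical sketch of the mode selection: follow the squad when machine vision can see a leader, otherwise fall back to GPS navigation toward a pre-programmed assembly point. The names and logic are invented for illustration and have nothing to do with Boston Dynamics' actual software.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Waypoint:
    lat: float
    lon: float

def choose_behavior(leader_visible: bool, assembly_point: Optional[Waypoint]) -> str:
    # Follow the squad when the leader is in view; otherwise navigate
    # autonomously to the programmed rendezvous point, if one exists.
    if leader_visible:
        return "FOLLOW_LEADER"
    if assembly_point is not None:
        return "GOTO_WAYPOINT"
    return "HOLD_POSITION"

# Example: leader lost from view, but a rendezvous point is programmed.
print(choose_behavior(False, Waypoint(34.05, -118.25)))  # -> GOTO_WAYPOINT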

And it's clear that these, and other, potential benefits have been proven to DARPA enough that it's prepared to fund what seems to be an extremely future-focused piece of military hardware. But LS3, of course, stands for much more than its simple "squad support" label would suggest. It's placing artificially-intelligent robots right next to soldiers on the battlefield, which is a natural extension of the way robots are currently used in combat--essentially as smart remote control units for situations too dangerous for a human to risk. And in that sense, LS3 is a significant piece of kit. Because it won't be too long before someone considers the benefits of replacing its 400-pound load with a heavy gun, and LS3 becomes an AI-equipped armed battlefield robot. More Terminator-dog than K9, you see.

Here's BigDog auto-tracking a human, just to give you an extra robocalyptic chill.

Article courtesy of Fast Company

February 10, 2013

DoD’s new Android Biofeedback App connects to wearable devices



The National Center for Telehealth and Technology (T2), an agency of the Department of Defense, has been introducing online and mobile health tools for people in the military, veterans, and their families since 2008. Their newest offering, BioZen, is an effort to get ahead of the trend of personal sensors and provide a free mobile tool to help people use those sensors to improve their health through the practice of biofeedback in app form.

“One of the things we do here at T2 is constantly look for ways to innovate mobile health. Wearable technology is one of our interests,” David Cooper, a psychologist at T2, told MobiHealthNews. “One thing we didn’t see a lot of was people using biosensors for biofeedback. But we found a way to integrate some biosensors through the mobile phone using Bluetooth.”

Biofeedback is a therapy wherein patients wear sensors while experiencing a problem like chronic pain or migraines. By seeing exactly how their physiological response changes throughout the experience, patients can gain control over those responses and, therefore, over their symptoms. It can be used in conjunction with other therapies or by itself: sometimes the awareness itself is therapeutic, other times biofeedback is a tool for determining what’s the most effective way of dealing with a condition.

Biofeedback is an appealing area for mobile health because traditional biofeedback technology is cumbersome, requiring the patient to be hooked up to a lot of wires with limited mobility. Mobile biofeedback, which only requires some wireless sensors and a smartphone, not only allows a patient to take the therapy home and use it more often and more casually, it can also be used in the clinic, lowering the price point and effort involved.

Cooper says there is a stigma attached to mental health for many returning servicemembers. Part of the idea behind a biofeedback app is to provide a discreet personal treatment option for people who don’t want to take medication or don’t want others to know they’re in therapy.

“That’s why we like developing for mobile devices,” said Cooper. “People don’t know if you’re checking Facebook or learning about your PTSD.”

For that reason, the app is designed to work in the context of therapy, but it also includes tutorials so people can learn to do biofeedback on their own. It includes a highly informational graph-based interface, but it also includes a single-variable pictorial interface where, for instance, a picture of a tree becomes brighter and more detailed as your gamma waves increase.

The BioZen Android app works with a variety of sensors. Signals it can read include electroencephalogram (EEG), electrocardiogram (ECG), electromyography, galvanic skin response, respiratory rate, and skin temperature, from either a single sensor or a whole suite. The app currently supports brainwave sensors from NeuroSky and BrainAthlete, as well as physiological sensors from Zephyr Technology and Shimmer Research, which Cooper said were chosen because they are commercially available and have open APIs.
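As a small illustration of the single-variable feedback idea described above (the tree that brightens as a signal increases), here is a hedged sketch that scales one physiological reading into a brightness value. The helper function, units, and thresholds are assumptions for illustration, not part of BioZen or any vendor API.

def read_gamma_power():
    # Hypothetical: return the latest gamma-band power reading from an
    # EEG sensor (the units and range here are assumed).
    return 25.0

def to_brightness(value, low=0.0, high=50.0):
    # Map a reading onto 0.0 (dim, bare tree) to 1.0 (bright, detailed tree).
    value = min(max(value, low), high)
    return (value - low) / (high - low)

# In a feedback loop, the app would redraw the picture with this brightness
# each time a new sample arrives, so users can watch their own signal change.
print(to_brightness(read_gamma_power()))  # -> 0.5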

“We’re trying to move into a more open framework when it comes to health sensors,” Cooper said. “We know that’s an important consumer area, we know it’s going to be important in mHealth coming up. So we’re trying to incorporate as many of these APIs in our future products as we can.”

Cooper said T2 likes to make its apps multiplatform, but because of the difficulty in interfacing with Bluetooth, he doesn’t think an iOS version of BioZen is likely.

T2 has 10 Apple and Android health apps available, including a mood tracker, an app for managing PTSD, and an app to help combat stress with breathing exercises. T2 recently updated the mood tracker app with a new capability: users can now export and share their mood records with friends or care providers. In addition, the agency maintains two online communities to help people in the service and their families deal with the wide range of issues affecting them: afterdeployment.org and militarykidsconnect.org.

Cooper said the DoD is careful to only release apps based on research-driven practices, but they don’t wait until the apps themselves are clinically proven to release them. T2 has published 75 papers since 2008, however, and is planning to do efficacy studies on BioZen in the future.

 By: Jonah Comstock

February 4, 2013

Free Subscription to Life Extension Magazine

I just found out that freebizmag.com is offering a free 1 year subscription to Life Extension Magazine.


Life Extension magazine is the monthly publication of the Life Extension Foundation. Members of Life Extension receive the magazine for free in addition to their other membership benefits.

The magazine was the first to report new discoveries involving nutrition, hormones, and anti-aging supplements, including such news-making items as CoQ10 and omega-3 fatty acids. In addition, Life Extension reports innovative findings concerning the diseases that threaten many of us, such as atherosclerosis, cancer, and diabetes. With each issue, subscribers to Life Extension magazine receive potentially life-saving information they won't find in most other publications.

Go here to get it now

February 3, 2013

Transhuman Technology You Can Use Now!

Brain Computer Interfaces
Many brain-computer interfaces have come to market over the last five years, and they are progressing quickly. I believe that as the technology develops, everyone will want one. Examples include the NeuroSky MindWave and the Emotiv EPOC.


Cybernetics
Lost an arm or a leg? How about your vision or hearing? There are already many people using bionic prostheses. Look no further than your local prosthetist for the latest models. The newer ones available this year can be surgically connected to your nervous system, which means thought control over them.



Clothing
A lot of electronics is being built into clothing to augment our bodies: shoes that help us jump higher or run faster, and watches and wristbands that monitor your body for problems such as blood clots. Google will be releasing Google Glass, which has an electronic display unit inside the glasses and a projector on one of the arms. Recently the APL Concept 1 shoes were banned by the NBA for giving players an unfair advantage by letting them jump higher.


Superhearing
There are a number of spy-tech devices that are being miniaturized and openly marketed. The iPod has an app for this right now that you can get. The Super Mini personal sound amplifier is not really for the hearing impaired but is meant to augment your hearing.


Biohacking
This is not really mainstream yet, but a group of transhumanists have taken it upon themselves to experiment with meshing their bodies with electronics. A recent post last week was about a guy who wanted to implant magnets in his fingertips. There are some interesting experiments being carried out, some with success as well. You can check it out here.

Portable MicroDevices
Every year devices like the iPod get more powerful, with more bells and whistles. Now they are being equipped with lifesaving apps that monitor your heart and vitals, serve as a pocket CPR and first-aid guide, and track blood pressure. They will offer tests you can run on yourself to screen for illnesses. There is already an app that connects them to NeuroSky's MindWave Mobile BCI.

Tattoo Technology
Right now at ThinkGeek you can get a programmable tattoo system, the moodINQ. You get an electronic grid that you can program to show whatever tattoo you want. The applications for something like this are vast: it could monitor blood sugar levels or other vitals for diabetics or for people who require frequent medication, and remind them. Versions that use blood for the display could test you for diseases right away.

Life Extension
There are many products coming to market every year and tons of research happening in this area. Science is advancing every year; it's only a matter of time. Cryonic preservation is available now as a backup plan. Managing your lifestyle through diet, exercise, and stress reduction has been credited with adding 20 to 30 years to lifespans. Books like "The Blue Zones" and "The China Study," or a book on caloric restriction, are a good place to start learning about this.

If you have any ideas about things I may have missed, let me know and I'll add them to this article.
J5un

February 1, 2013


 Autonomous Medical Robots cleared for use this year



Hospital Robots: FDA Clears First Autonomous Telemedicine Robot

RP-VITA
RP-VITA™, by iRobot and InTouch Health, enables doctors to provide patient care from anywhere in the world via a telemedicine solution

BEDFORD, Mass., January 24, 2013 – iRobot Corp. (NASDAQ: IRBT), a leader in delivering robotic solutions, announced that the RP-VITA Remote Presence Robot has received 510(k) clearance by the U.S. Food and Drug Administration (FDA) for use in hospitals. RP-VITA is the first autonomous navigation remote presence robot to receive FDA clearance.

RP-VITA is a joint effort between two industry leaders, iRobot and InTouch Health. The robot combines the latest in autonomous navigation and mobility technologies developed by iRobot with state-of-the-art telemedicine and electronic health record integration developed by InTouch Health. RP-VITA allows remote doctor-to-patient consults, ensuring that the physician is in the right place at the right time and has access to the necessary clinical information to take immediate action. The robot has unprecedented ease of use. It maps its own environment and uses an array of sophisticated sensors to autonomously move about a busy space without interfering with people or other objects. Using an intuitive iPad® interface, a doctor can visit a patient, and communicate with hospital staff and patients with a single click, regardless of their location.

The FDA clearance specifies that RP-VITA can be used for active patient monitoring in pre-operative, peri-operative and post-surgical settings, including cardiovascular, neurological, prenatal, psychological and critical care assessments and examinations.

RP-VITA is being sold into the healthcare market by InTouch Health as its new flagship remote presence device. iRobot will continue to explore adjacent market opportunities for robots like RP-VITA and the iRobot Ava™ mobile robotics platform.

“FDA clearance of a robot that can move safely and independently through a fast-paced, chaotic and demanding hospital environment is a significant technological milestone for the robotics and healthcare industries,” said Colin Angle, chairman and CEO of iRobot. “There are very few environments as difficult to maneuver as that of a busy ICU or emergency department. Having crossed this technology threshold, the potential for self-navigating robots in other markets, and for new applications, is virtually limitless.”

“Remote presence solutions have proven their worth in the medical arena for quite some time,” said Yulun Wang, chairman and CEO of InTouch Health. “RP-VITA has undergone stringent testing, and we are confident that the robot’s ease of use and unique set of capabilities will enable new clinical applications and uses.”

Now that hospital robots are being employed, I can't wait until they have an R2-D2 or C-3PO version available.

About iRobot Corp.

iRobot designs and builds robots that make a difference. The company’s home robots help people find smarter ways to clean, and its defense & security robots protect those in harm’s way. iRobot’s consumer and military robots feature iRobot Aware® robot intelligence systems, proprietary technology incorporating advanced concepts in navigation, mobility, manipulation and artificial intelligence. For more information about iRobot, please visit www.irobot.com.