Category: BCI

Researchers take major step forward in Artificial Intelligence

By Hugo Angel,

The long-standing dream of using Artificial Intelligence (AI) to build an artificial brain has taken a significant step forward, as a team led by Professor Newton Howard from the University of Oxford has successfully prototyped a nanoscale, AI-powered, artificial brain in the form factor of a high-bandwidth neural implant.
Professor Newton Howard (pictured above and below) holding parts of the implant device
In collaboration with INTENT LTD, Qualcomm Corporation, Intel Corporation, Georgetown University and the Brain Sciences Foundation, Professor Howard’s Oxford Computational Neuroscience Lab in the Nuffield Department of Surgical Sciences has developed the proprietary algorithms and the optoelectronics required for the device. Testing in rodents is on target to begin soon.
This achievement caps over a decade of research by Professor Howard at MIT’s Synthetic Intelligence Lab and the University of Oxford, work that resulted in several issued US patents on the technologies and algorithms that power the device:
  • the Fundamental Code Unit of the Brain (FCU)
  • the Brain Code (BC)
  • the Biological Co-Processor (BCP)

Together these form the latest foundations for any eventual merger between biological and machine intelligence. Ni2o (pronounced “Nitoo”) is the entity that Professor Howard licensed to further develop, market and promote these technologies.

The Biological Co-Processor is unique in that it uses advanced nanotechnology, optogenetics and deep machine learning to intelligently map internal events, such as neural spiking activity, to external physiological, linguistic and behavioral expression. The implant contains over a million carbon nanotubes, each of which is 10,000 times smaller than the width of a human hair. Carbon nanotubes provide a natural, high-bandwidth interface, as they conduct heat, light and electricity, instantaneously updating the neural lace. They adhere to neuronal constructs and even promote neural growth. Qualcomm team leader Rudy Beraha commented, ‘Although the prototype unit shown today is tethered to external power, a commercial Brain Co-Processor unit will be wireless and inductively powered, enabling it to be administered with a minimally invasive procedure.’
The device uses a combination of methods to write to the brain, including 
  • pulsed electricity
  • light and 
  • various molecules that stimulate or inhibit the activation of specific neuronal groups.
These can be targeted to stimulate a desired response, such as releasing chemicals in patients suffering from a neurological disorder or imbalance. The BCP is designed as a fully integrated system to use the brain’s own internal systems and chemistries to pattern and mimic healthy brain behavior, an approach that stands in stark contrast to the current state of the art, which is to simply apply mild electrocution to problematic regions of the brain. 
Therapeutic uses
The Biological Co-Processor promises to provide relief for millions of patients suffering from neurological, psychiatric and psychological disorders as well as degenerative diseases. Initial therapeutic uses will likely be for patients with traumatic brain injuries and neurodegenerative disorders, such as Alzheimer’s, as the BCP will strengthen the weakened connections responsible for lost memories and skills. Once implanted, the device provides a closed-loop, self-learning platform able to both determine and administer the right balance of pharmaceutical, electroceutical, genomeceutical and optoceutical therapies.
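The closed-loop, self-learning behavior described above can be illustrated, at a very high level, with a simple feedback loop. Everything below — the function names, the proportional controller, the gain and baseline values — is an invented sketch for illustration; the BCP’s actual algorithms are not public.

```python
# Toy closed-loop neuromodulation cycle: read a neural activity measure,
# compare it to a healthy baseline, and apply stimulation in proportion
# to the deficit. All names and numbers are illustrative.

def choose_stimulation(activity, baseline, gain=0.5):
    """Proportional controller: stimulate in proportion to the deficit."""
    deficit = baseline - activity
    return max(0.0, gain * deficit)  # never apply negative stimulation

def closed_loop_step(read_activity, apply_stim, baseline):
    """One read-decide-write cycle of the loop."""
    stim = choose_stimulation(read_activity(), baseline)
    apply_stim(stim)
    return stim

# Simulated run: activity starts low and recovers toward a baseline of 5.0.
state = {"activity": 2.0}

def read_activity():
    return state["activity"]

def apply_stim(stim):
    state["activity"] += 0.8 * stim  # simplified response to stimulation

for _ in range(20):
    closed_loop_step(read_activity, apply_stim, baseline=5.0)
```

In this toy model the controller closes a fixed fraction of the gap between measured and baseline activity on each cycle, so the simulated activity converges toward the healthy baseline rather than being driven open-loop.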
Dr Richard Wirt, a Senior Fellow at Intel Corporation and Co-Founder of INTENT, the company partnering with Ni2o to bring the BCP to market, commented on the device, saying, ‘In the immediate timeframe, this device will have many benefits for researchers, as it could be used to replicate an entire brain image, synchronously mapping internal and external expressions of human response. Over the long term, the potential therapeutic benefits are unlimited.’
Rather than simply disrupting neural circuits, the machine learning systems within the BCP are designed to interpret these signals and intelligently read and write to the surrounding neurons. These capabilities could be used to repair degenerative or trauma-induced damage and perhaps write the affected memories and skills to other, healthier areas of the brain.
One day, these capabilities could also be used in healthy patients to radically augment human ability and proactively improve health. As Professor Howard points out: ‘The brain controls all organs and systems in the body, so the cure to nearly every disease resides there.’ Speaking more broadly, Professor Howard sees the merging of man with machine as our inevitable destiny, claiming it to be ‘the next step on the blueprint that the author of it all built into our natural architecture.’
With the resurgence of neuroscience and AI-enhanced machine learning, there has been renewed interest in brain implants. This past March, Elon Musk and Bryan Johnson independently announced that they are focusing on and investing in the brain-computer interface domain.
When asked about these new competitors, Professor Howard said he is happy to see all these new startups and established names getting into the field – he only wonders what took them so long, stating: ‘I would like to see us all working together, as we have already established a mathematical foundation and software framework to solve so many of the challenges they will be facing. We could all get there faster if we could work together – after all, the patient is the priority.’
© 2017 Nuffield Department of Surgical Sciences, John Radcliffe Hospital, Headington, Oxford, OX3 9DU
2 June 2017 

A Brain-Computer Interface That Lasts for Weeks

By admin,

Photo: John Rogers/University of Illinois
Brain signals can be read using soft, flexible, wearable electrodes that stick onto and near the ear like a temporary tattoo and can stay on for more than two weeks even during highly demanding activities such as exercise and swimming, researchers say.
The invention could be used for a persistent brain-computer interface (BCI) to help people operate prosthetics, computers, and other machines using only their minds, scientists add.
For more than 80 years, scientists have analyzed human brain activity non-invasively by recording electroencephalograms (EEGs). Conventionally, this involves electrodes stuck onto the head with conductive gel. The electrodes typically cannot stay mounted to the skin for more than a few days, which limits widespread use of EEGs for applications such as BCIs.
Now materials scientist John Rogers at the University of Illinois at Urbana-Champaign and his colleagues have developed a wearable device that can help record EEGs uninterrupted for more than 14 days. Moreover, the invention survived showering, bathing, and sleeping, and it did so without irritating the skin. The two weeks might be “a rough upper limit, defined by the timescale for natural exfoliation of skin cells,” Rogers says.
The device consists of a soft, foldable collection of gold electrodes only 300 nanometers thick and 30 micrometers wide mounted on a soft plastic film. This assemblage stays stuck to the body using electric forces known as van der Waals interactions—the same forces that help geckos cling to walls.
The electrodes are flexible enough to mold onto the ear and the mastoid process behind the ear. The researchers mounted the device onto three volunteers using tweezers. Spray-on bandage was applied once or twice a day to help the electrodes survive normal daily activities.
The electrodes on the mastoid process recorded brain activity while those on the ear were used as a ground wire. The electrodes were connected to a stretchable wire that could plug into monitoring devices. “Most of the experiments used devices mounted on just one side, but dual sides is certainly possible,” Rogers says.
The device helped record brain signals well enough for the volunteers to operate a text-speller by thought, albeit at a slow rate of 2.3 to 2.5 letters per minute.
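For context, those speller rates can be converted into a rough information transfer rate. Assuming error-free selection from 26 equally likely letters (an assumption; the study’s speller design isn’t detailed here), each letter carries log2(26) ≈ 4.7 bits:

```python
import math

def itr_bits_per_minute(letters_per_minute, alphabet_size=26):
    """Upper-bound information transfer rate for error-free,
    equally likely selections from the given alphabet."""
    return letters_per_minute * math.log2(alphabet_size)

low = itr_bits_per_minute(2.3)   # about 10.8 bits/min
high = itr_bits_per_minute(2.5)  # about 11.8 bits/min
```

So 2.3 to 2.5 letters per minute corresponds to roughly 11 bits per minute — modest, but achieved with electrodes worn continuously for two weeks rather than a gel-based lab setup.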
According to Rogers, this research: 
…could enable a persistent BCI that one could imagine might help disabled people, for whom mind control is an attractive option for operating prosthetics… It could also be useful for monitoring cognitive states—for instance, 

  • to see if people are paying attention while they’re driving a truck, 
  • flying an airplane, or 
  • operating complex machinery. 

It could also help monitor patterns of sleep to better understand sleep disorders such as sleep apnea, or for monitoring brain function during learning.

The scientists hope to improve the speed at which people can use this device to communicate mentally, which could expand its use into commercial wearable electronics. They also plan to explore devices that can operate wirelessly, Rogers says. The researchers detailed their findings online March 16 in the journal Proceedings of the National Academy of Sciences.
By Charles Q. Choi
16 Mar 2015 

Building Mind-Controlled Gadgets Just Got Easier

By admin,

By Eliza Strickland
11 Aug 2014
A new brain-computer interface lets DIYers access their brain waves
Photo: Chip Audette
Engineer Chip Audette used the OpenBCI system to control a robot spider with his mind.
The guys who decided to make a mind-reading tool for the masses are not neuroscientists. In fact, they’re artists who met at Parsons the New School for Design, in New York City. In this day and age, you don’t have to be a neuroscientist to muck around with brain signals.
With Friday’s launch of an online store selling their brain-computer interface (BCI) gear, Joel Murphy and Conor Russomanno hope to unleash a wave of neurotech creativity. Their system enables DIYers to use brain waves to control anything they can hack—a video game, a robot, you name it. “It feels like there’s going to be a surge,” says Russomanno. “The floodgates are about to open.” And since their technology is open source, the creators hope hackers will also help improve the BCI itself.

Photo: OpenBCI
The OpenBCI board takes in data from up to eight electrodes.

Their OpenBCI system makes sense of an electroencephalography (EEG) signal, a general measure of electrical activity in the brain captured via electrodes on the scalp. The fundamental hardware component is a relatively new chip from Texas Instruments, which takes in analog data from up to eight electrodes and converts it to a digital signal. Russomanno and Murphy used the chip and an Arduino board to create OpenBCI, which essentially amplifies the brain signal and sends it via Bluetooth to a computer for processing. “The big issue is getting the data off the chip and making it accessible,” Murphy says. Once it’s accessible, Murphy expects makers to build things he hasn’t even imagined yet.
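The dataflow Murphy describes — framed multi-channel samples arriving over a serial link and being made accessible to software — can be sketched with a small parser. The framing below (a start byte followed by eight 24-bit big-endian signed channel values) is a hypothetical stand-in for illustration, not OpenBCI’s documented wire format:

```python
# Hypothetical framing for an 8-channel EEG sample stream (the real
# OpenBCI wire format differs in detail): one start byte, then eight
# 24-bit big-endian signed integers, one raw ADC count per electrode.

START = 0xA0
N_CHANNELS = 8

def parse_sample(packet):
    """Decode one framed sample into eight signed channel values."""
    if len(packet) != 1 + 3 * N_CHANNELS or packet[0] != START:
        raise ValueError("malformed packet")
    values = []
    for i in range(N_CHANNELS):
        chunk = packet[1 + 3 * i : 4 + 3 * i]
        # Sign-extend each 24-bit big-endian value to a Python int.
        values.append(int.from_bytes(chunk, "big", signed=True))
    return values

# Example packet: channel 0 reads +1, channel 1 reads -1, the rest 0.
pkt = (bytes([START])
       + (1).to_bytes(3, "big", signed=True)
       + (-1).to_bytes(3, "big", signed=True)
       + bytes(18))
```

Once samples are decoded into plain integers like this, downstream software — filtering, visualization, or the hacks Murphy anticipates — can treat the hardware as a generic data source.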
The project got its start in 2011, when Russomanno was a student in Murphy’s physical computing class at Parsons and told his professor he wanted to hack an EEG toy made by Mattel. The toy’s EEG-enabled headset supposedly registered the user’s concentrated attention (which in the game activated a fan that made a ball float upward). But the technology didn’t seem very reliable, and since it wasn’t open source, Russomanno couldn’t study the game’s method of collecting and analyzing the EEG data. He decided that an open-source alternative was necessary if he wanted to have any real fun.
Happily, Russomanno and his professor soon connected with engineer Chip Audette, of the New Hampshire R&D firm Creare, who already had a grant from the U.S. Defense Advanced Research Projects Agency (DARPA) to develop a low-cost, high-quality EEG system for “nontraditional users.” Once the team had cobbled together a prototype of their OpenBCI system, they decided to offer their gear to the world with a Kickstarter campaign, which ended in January and raised more than twice the goal of US $100,000.
Murphy and Russomanno soon found that production would be more difficult and take longer than expected (as is the case with so many Kickstarter projects), so they had to push back their shipping date by several months. Now, though, they’re in business—and Russomanno says that shipping a product is only the beginning. “We don’t just want to sell something; we want to teach people how to use it and also develop a community,” he says. OpenBCI wants to be an online portal where experimenters can swap tips and post research projects.
So once a person’s brain-wave data is streaming into a computer, what is to be done with it? OpenBCI will make some simple software available, but mostly Russomanno and Murphy plan to watch as inventors come up with new applications for BCIs.
Audette, the engineer from Creare, is already hacking robotic “battle spiders” that are typically steered by remote control. Audette used an OpenBCI prototype to identify three distinct brain-wave patterns that he can reproduce at will, and he sent those signals to a battle spider to command it to turn left or right or to walk straight ahead. “The first time you get something to move with your brain, the satisfaction is pretty amazing,” Audette says. “It’s like, ‘I am king of the world because I got this robot to move.’”
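Audette’s decoding scheme isn’t described in detail, but the general idea — distinguishing a few reproducible EEG rhythms and mapping each to a robot command — can be sketched with a toy band-power classifier. The band edges, thresholds, and command names here are invented for illustration:

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Power of `signal` between f_lo and f_hi Hz via a naive DFT."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        if f_lo <= k * fs / n <= f_hi:
            re = sum(s * math.cos(2 * math.pi * k * t / n)
                     for t, s in enumerate(signal))
            im = sum(-s * math.sin(2 * math.pi * k * t / n)
                     for t, s in enumerate(signal))
            power += (re * re + im * im) / n
    return power

def classify(signal, fs=128):
    """Map the dominant rhythm to one of three robot commands."""
    alpha = band_power(signal, fs, 8, 12)   # e.g. relaxed, eyes-closed rhythm
    beta = band_power(signal, fs, 18, 26)   # e.g. concentration rhythm
    if alpha > 2 * beta:
        return "left"
    if beta > 2 * alpha:
        return "right"
    return "forward"

# Pure tones as stand-ins for EEG rhythms: 10 Hz lands in the alpha band,
# 22 Hz in the beta band.
fs, n = 128, 128
tone10 = [math.sin(2 * math.pi * 10 * t / fs) for t in range(n)]
tone22 = [math.sin(2 * math.pi * 22 * t / fs) for t in range(n)]
```

A real system would filter continuously and calibrate thresholds per user, but the mapping from “which rhythm dominates” to “which way the spider turns” is the same in spirit.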
In Los Angeles, a group is using another prototype to give a paralyzed graffiti artist the ability to practice his craft again. The artist, Tempt One, was diagnosed with Lou Gehrig’s disease in 2003 and gradually progressed to the nightmarish “locked in” state. By 2010 he couldn’t move or speak and lay inert in a hospital bed—but with unimpaired consciousness, intellect, and creativity trapped inside his skull. Now his supporters are developing a system called the BrainWriter: They’re using OpenBCI to record the artist’s brain waves and are devising ways to use those brain waves to control the computer cursor so Tempt can sketch his designs on the screen.
Another early collaborator thinks that OpenBCI will be useful in mainstream medicine. David Putrino, director of telemedicine and virtual rehabilitation at the Burke Rehabilitation Center, in White Plains, N.Y., says he’s comparing the open-source system to the $60,000 clinic-grade EEG devices he typically works with. He calls the OpenBCI system robust and solid, saying, “There’s no reason why it shouldn’t be producing good signal.”
Putrino hopes to use OpenBCI to build a low-cost EEG system that patients can take home from the hospital, and he imagines a host of applications. Stroke patients, for example, could use it to determine when their brains are most receptive to physical therapy, and Parkinson’s patients could use it to find the optimal time to take their medications. “I’ve been playing around with these ideas for a decade,” Putrino says, “but they kept failing because the technology wasn’t quite there.” Now, he says, it’s time to start building.

Researcher controls colleague’s motions in first human brain-to-brain interface (w/ Video)

By admin,

ORIGINAL: MedicalXPress
by Doree Armstrong & Michelle Ma


University of Washington researcher Rajesh Rao, left, plays a computer game with his mind. Across campus, researcher Andrea Stocco, right, wears a magnetic stimulation coil over the left motor cortex region of his brain. Stocco’s right index

 (Medical Xpress)—University of Washington researchers have performed what they believe is the first noninvasive human-to-human brain interface, with one researcher able to send a brain signal via the Internet to control the hand motions of a fellow researcher.

Using electrical brain recordings and a form of magnetic stimulation, Rajesh Rao sent a brain signal to Andrea Stocco on the other side of the UW campus, causing Stocco’s finger to move on a keyboard.

While researchers at Duke University have demonstrated brain-to-brain communication between two rats, and Harvard researchers have demonstrated it between a human and a rat, Rao and Stocco believe this is the first demonstration of human-to-human brain interfacing.

“The Internet was a way to connect computers, and now it can be a way to connect brains,” Stocco said. “We want to take the knowledge of a brain and transmit it directly from brain to brain.”

The researchers captured the full demonstration on video recorded in both labs. The version available at the end of this release has been edited for length.

Rao, a UW professor of computer science and engineering, has been working on brain-computer interfacing (BCI) in his lab for more than 10 years and just published a textbook on the subject. In 2011, spurred by the rapid advances in BCI technology, he believed he could demonstrate the concept of human brain-to-brain interfacing. So he partnered with Stocco, a UW research assistant professor in psychology at the UW’s Institute for Learning & Brain Sciences.

On Aug. 12, Rao sat in his lab wearing a cap with electrodes hooked up to an electroencephalography machine, which reads electrical activity in the brain. Stocco was in his lab across campus wearing a purple swim cap marked with the stimulation site for the transcranial magnetic stimulation coil that was placed directly over his left motor cortex, which controls hand movement.

This image shows the cycle of the experiment. Brain signals from the “Sender” are recorded. When the computer detects imagined hand movements, a “fire” command is transmitted over the Internet to the TMS machine, which causes an upward 

The team had a Skype connection set up so the two labs could coordinate, though neither Rao nor Stocco could see the Skype screens.

Rao looked at a computer screen and played a simple video game with his mind. When he was supposed to fire a cannon at a target, he imagined moving his right hand (being careful not to actually move his hand), causing a cursor to hit the “fire” button. Almost instantaneously, Stocco, who wore noise-canceling earbuds and wasn’t looking at a computer screen, involuntarily moved his right index finger to push the space bar on the keyboard in front of him, as if firing the cannon. Stocco compared the feeling of his hand moving involuntarily to that of a nervous tic.
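The plumbing of such a demonstration — decode an imagined-movement signal on one machine, send a trigger across the network, fire a stimulator on the other — can be sketched with plain TCP sockets. The detection threshold, port number, and the list standing in for a TMS pulse are all invented; the UW team’s actual software is not described here:

```python
import socket
import threading

PORT = 50007  # arbitrary local port for this demo

def receiver(triggered, ready):
    """Stocco's side: wait for a "fire" message, then pulse the (stub) TMS coil."""
    with socket.socket() as srv:
        srv.bind(("127.0.0.1", PORT))
        srv.listen(1)
        ready.set()
        conn, _ = srv.accept()
        with conn:
            if conn.recv(16) == b"fire":
                triggered.append("TMS pulse")  # stand-in for coil discharge

def sender(motor_imagery_score, threshold=0.8):
    """Rao's side: if the decoded imagined-movement score crosses the
    threshold, transmit the trigger over the network."""
    if motor_imagery_score >= threshold:
        with socket.socket() as cli:
            cli.connect(("127.0.0.1", PORT))
            cli.sendall(b"fire")

triggered, ready = [], threading.Event()
t = threading.Thread(target=receiver, args=(triggered, ready))
t.start()
ready.wait()
sender(motor_imagery_score=0.93)  # decoder output above threshold
t.join()
```

Here both ends run in one process for demonstration; in the real experiment the trigger traveled across campus over the Internet to the machine driving the TMS coil.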

“It was both exciting and eerie to watch an imagined action from my brain get translated into actual action by another brain,” Rao said. “This was basically a one-way flow of information from my brain to his. The next step is having a more equitable two-way conversation directly between the two brains.”

The technologies used by the researchers for recording and stimulating the brain are both well-known. Electroencephalography, or EEG, is routinely used by clinicians and researchers to record brain activity noninvasively from the scalp. Transcranial magnetic stimulation, or TMS, is a noninvasive way of delivering stimulation to the brain to elicit a response. Its effect depends on where the coil is placed; in this case, it was placed directly over the brain region that controls a person’s right hand. By activating these neurons, the stimulation convinced the brain that it needed to move the right hand.

Computer science and engineering undergraduates Matthew Bryan, Bryan Djunaedi, Joseph Wu and Alex Dadgar, along with bioengineering graduate student Dev Sarma, wrote the computer code for the project, translating Rao’s brain signals into a command for Stocco’s brain.

“Brain-computer interface is something people have been talking about for a long, long time,” said Chantel Prat, assistant professor in psychology at the UW’s Institute for Learning & Brain Sciences, and Stocco’s wife and research partner who helped conduct the experiment. “We plugged a brain into the most complex computer anyone has ever studied, and that is another brain.”

At first blush, this breakthrough brings to mind all kinds of science fiction scenarios. Stocco jokingly referred to it as a “Vulcan mind meld.” But Rao cautioned this technology only reads certain kinds of simple brain signals, not a person’s thoughts. And it doesn’t give anyone the ability to control your actions against your will.

Both researchers were in the lab wearing highly specialized equipment and working under ideal conditions. They also had to obtain approval for, and follow, a stringent set of international human-subject testing rules to conduct the demonstration.

“I think some people will be unnerved by this because they will overestimate the technology,” Prat said. “There’s no possible way the technology that we have could be used on a person unknowingly or without their willing participation.”

Stocco said years from now the technology could be used, for example, by someone on the ground to help a flight attendant or passenger land an airplane if the pilot becomes incapacitated. Or a person with disabilities could communicate his or her wish, say, for food or water. The brain signals from one person to another would work even if they didn’t speak the same language.

Rao and Stocco next plan to conduct an experiment that would transmit more complex information from one brain to the other. If that works, they then will conduct the experiment on a larger pool of subjects.


Provided by University of Washington
