First bilateral brain implant gives quadriplegic a two-handed sense of touch

Brain-connected machines that capture and translate electrical signals are showing great promise across a number of areas, but one with massive potential is the world of prosthetics. Scientists exploring these possibilities at Johns Hopkins University are now reporting a big breakthrough, demonstrating a system that enables a quadriplegic man to control two prosthetic arms at once using only his thoughts, and also to feel a sense of touch coming back the other way.

The team at Johns Hopkins University has been making some exciting progress in this area through its Revolutionizing Prosthetics program, which was launched by DARPA in 2006. In 2016, we saw a double amputee use his brain to control two of the team’s Modular Prosthetic Limbs (MPLs), bilateral shoulder-level prosthetics that enabled him to do things like move cups between shelves, a first for this kind of research.

This system worked via custom sockets that both supported the artificial limbs and connected them to nerves in the patient’s torso, which, following a treatment regimen, had been trained to provide specific control movements for the prosthetic limbs. Five years on, the team has made further advances.

Are You Ready for the Future of Transhumanism?

Are you ready for the future? A Transhumanist future? One where everyone around you—friends, family, and neighbors—has dipped into the transhumanist punch bowl. A future of contact lenses that see in the dark, endoskeleton-attached artificial limbs that lift a half-ton, and brain chip implants that read your thoughts and instantly communicate them to others. Sound crazy? Indeed, it does. Nevertheless, it’s coming soon. Very soon. In fact, much of the technology already exists. Some of it is being sold commercially at your local superstore or being tested right now in laboratories around the world.

We’ve all heard about driverless test cars on the roads and how doctors in France are replacing people’s hearts with permanent robotic ones, but did you know there’s already a multi-billion dollar market for brain wave reading headsets? Using electroencephalography (EEG) sensors that pick up and monitor brain activity, NeuroSky’s MindWave can attach to Google Glass and allow you to take a picture and post it to Facebook and Twitter just by thinking about it. Other headsets allow you to play video games on your iPhone with nothing but your thoughts. In fact, well over a year ago now, the first mind-to-mind communication took place. A researcher in India projected a thought to a colleague in France, and using their headsets, they understood each other. Telepathy went from science fiction to reality.

The history of transhumanism—the burgeoning field of science and radical tech encompassing robotic implants, prosthetics, and cyborg-like enhancements to human beings and their experience—has come a long way since scientists began throwing around the term a half century ago. What a difference a generation or two makes. Today a thriving pro-cyborg medical industry is setting the stage for trillion-dollar markets that will remake the human experience. Five million people in America suffer from Alzheimer’s, but a new surgery that involves installing brain implants is showing promise in restoring people’s memory and improving lives. The use of medical and microchip implants, whether in the brain or not, is expected to surge in the coming years. Some experts surmise as many as half of Americans will have implants by 2020. I already have one in my hand. It’s truly a new age for humans.

Moving Beyond Mind-Controlled Limbs to Prosthetics That Can Actually ‘Feel’

Brain-machine interface enthusiasts often gush about “closing the loop.” It’s for good reason. On the implant level, it means engineering smarter probes that only activate when they detect faulty electrical signals in brain circuits. Elon Musk’s Neuralink, among other players, is readily pursuing these bidirectional implants that both measure and zap the brain.
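To make that “measure and zap” loop concrete, here is a minimal sketch of the general idea: an implant monitors a signal and delivers a stimulation pulse only when that signal looks abnormal. The detection rule (a simple amplitude-versus-baseline threshold), the class name, and all parameter values are illustrative assumptions for explanation, not Neuralink’s or any vendor’s actual algorithm.

```python
# Toy closed-loop stimulator: stimulate only when the measured signal deviates
# strongly from its recent baseline. All names and numbers are illustrative.
from collections import deque


class ClosedLoopStimulator:
    def __init__(self, window=50, threshold=1.8):
        self.recent = deque(maxlen=window)   # sliding window of recent sample magnitudes
        self.threshold = threshold           # how far above baseline counts as "faulty"

    def process_sample(self, value):
        """Record one electrode sample; return True if a stimulation pulse fires."""
        self.recent.append(abs(value))
        baseline = sum(self.recent) / len(self.recent)
        if abs(value) > self.threshold * max(baseline, 1e-9):
            self.deliver_pulse()
            return True
        return False

    def deliver_pulse(self):
        # Placeholder for the "zap" half of a bidirectional implant.
        pass


# Example: a sudden large excursion triggers a pulse, steady activity does not.
stim = ClosedLoopStimulator()
for sample in [0.1, 0.12, 0.11, 0.09, 2.5, 0.1]:
    print(sample, "->", "pulse" if stim.process_sample(sample) else "quiet")
```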

But to scientists laboring to restore functionality to paralyzed patients or amputees, “closing the loop” has broader connotations. Building smart mind-controlled robotic limbs isn’t enough; the next frontier is restoring sensation in offline body parts. To truly meld biology with machine, the robotic appendage has to feel “at one” with the body.

This month, two studies from Science Robotics describe complementary ways forward. In one, scientists from the University of Utah paired a state-of-the-art robotic arm, the DEKA LUKE, with electrical stimulation of the remaining nerves above the attachment point. Using artificial zaps to mimic the skin’s natural response patterns to touch, the team dramatically increased the patient’s ability to identify objects. Without much training, he could easily discriminate between small and large objects, and between soft and hard ones, while blindfolded and wearing headphones.
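As a rough illustration of that encoding idea, the sketch below maps a prosthetic fingertip’s pressure reading to a nerve-stimulation pulse rate, with an extra boost when pressure changes, loosely mirroring how skin mechanoreceptors respond most strongly at contact onset and offset. The function name, gains, and the mapping itself are assumptions made for explanation, not the published Utah algorithm.

```python
# Illustrative "skin-like" encoder: sustained pressure drives a modest pulse
# rate, while changes in pressure (touch onset/offset) drive a strong, brief
# response. Names and constants are assumptions, not the study's parameters.

def biomimetic_pulse_rate(pressure, prev_pressure,
                          static_gain=60.0,    # Hz per unit of sustained pressure
                          dynamic_gain=200.0,  # Hz per unit of pressure change
                          max_rate=300.0):     # ceiling on stimulation rate (Hz)
    """Return a stimulation pulse rate (Hz) for the current sensor sample."""
    sustained = static_gain * pressure                         # slow "hold" component
    transient = dynamic_gain * abs(pressure - prev_pressure)   # fast "change" component
    return min(max_rate, sustained + transient)


# Example: a tap produces a large transient response; a steady hold settles lower.
samples = [0.0, 0.4, 0.5, 0.5, 0.5, 0.0]   # normalized fingertip pressure over time
prev = 0.0
for p in samples:
    print(f"pressure={p:.1f} -> {biomimetic_pulse_rate(p, prev):.0f} Hz")
    prev = p
```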

Researchers build a soft robot with neurologic capabilities

In work that combines a deep understanding of the biology of soft-bodied animals such as earthworms with advances in materials and electronic technologies, researchers from the United States and China have developed a robotic device containing a stretchable transistor that allows neurological function.

Cunjiang Yu, Bill D. Cook Associate Professor of Mechanical Engineering at the University of Houston, said the work represents a significant step toward the development of prosthetics that could directly connect with nerves in biological tissues, as well as toward advances in soft neurorobots capable of thinking and making judgments. Yu is corresponding author for a paper describing the work, published in Science Advances.

He is also a principal investigator with the Texas Center for Superconductivity at the University of Houston.

Sensitive synthetic skin makes for hug-safe humanoid robot

Back in 2011 we looked at an array of small hexagonal plates created to serve as an electronic skin that endows robots with a sense of touch. The team responsible had placed 31 of these hexagonal “skin cells” on a small robot, but now they’ve gone a lot further, equipping a human-sized robot with 1,260 cells to create what they claim is the first autonomous humanoid robot with artificial skin covering its entire body – even the soles of its feet.

In the eight years since the original touchy-feely robot, Professor Gordon Cheng and his team at the Technical University of Munich (TUM) have refined the look of the individual sensor cells, but they still boast the same basic capabilities. They’re still hexagonal in shape, allowing them to be placed in a honeycomb arrangement, and they can still measure proximity, pressure, temperature and acceleration.

But the main hurdle the team faced in expanding the number of cells so as to fully cover a human-sized robot was computing power, and it’s here that the team is claiming a breakthrough. Continuously processing data from more than a few hundred sensors quickly overloaded previous systems, so the team took inspiration from an approach employed by the human nervous system.
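One common way to borrow from the nervous system here is event-driven readout: a skin cell stays silent unless one of its readings changes meaningfully, so the controller only processes the handful of cells where contact actually changed instead of streaming all 1,260 cells every cycle. The sketch below illustrates that general idea under those assumptions; the thresholds, field names, and function are illustrative, not the TUM implementation.

```python
# Event-driven skin readout sketch: a cell reports only the measurements that
# changed by more than a per-channel threshold since its last transmission.
# Thresholds, field names, and values are illustrative assumptions.

THRESHOLDS = {"proximity": 0.05, "pressure": 0.02, "temperature": 0.5, "acceleration": 0.1}


def changed_readings(cell_id, current, last_reported):
    """Return an event for this cell, or None if nothing changed enough to send."""
    events = {}
    for key, threshold in THRESHOLDS.items():
        if abs(current[key] - last_reported.get(key, 0.0)) > threshold:
            events[key] = current[key]
    if events:
        last_reported.update(events)        # remember what was last transmitted
        return {"cell": cell_id, "events": events}
    return None                             # silent cell: nothing for the controller to do


# Example: first contact generates an event; near-identical follow-up readings do not.
state = {}   # last transmitted values for one cell
frame1 = {"proximity": 0.2, "pressure": 0.0, "temperature": 24.0, "acceleration": 0.0}
frame2 = {"proximity": 0.21, "pressure": 0.0, "temperature": 24.1, "acceleration": 0.0}
print(changed_readings(7, frame1, state))   # -> event with proximity and temperature
print(changed_readings(7, frame2, state))   # -> None (changes below threshold)
```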