Adam Savage has made bulletproof Iron Man armor out of 3D-printed titanium, paired with a flying jet suit from Gravity Industries.

More precisely, it is a real-life Titanium Man (a comic-book enemy of Iron Man).

The US military's Special Operations Command recently canceled its attempt to build a real-life Iron Man exoskeleton with strength enhancement. It is now looking to reuse components of the system to boost the strength of soldiers' joints and to extend lightweight armor protection to many more soldiers.

Kyle Reese: The Terminator’s an infiltration unit, part man, part machine. Underneath, it’s a hyperalloy combat chassis — microprocessor-controlled, fully armored. Very tough. But outside, it’s living human tissue — flesh, skin, hair, blood, grown for the cyborgs…

3D bioprinting is the automated fabrication of multicellular tissue via spatially defined deposition of cells. The ability to spatially control deposition in the x, y and z axes allows for creation of tissue-specific patterns or compartments, with in vivo-like architecture that mimics key aspects of native biology.

3D bioprinted tissues exhibit a microenvironment better suited to in vivo-like cellular function than traditional 2D monocultures (or monolayer co-cultures), and they maintain a more defined architecture than self-aggregated co-culture models.
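As a toy illustration of what “spatially defined deposition” means in practice, the sketch below builds a voxel map for a hypothetical two-compartment (core/shell) tissue pattern and emits layer-by-layer deposition coordinates per cell type. The grid size, cell types, and geometry are all made up for illustration; no real bioprinter API is shown.

```python
# Hypothetical sketch of spatially defined deposition: build a voxel map
# for a two-compartment (core/shell) pattern and emit layer-by-layer
# deposition coordinates per cell type. All names and geometry are
# illustrative, not taken from any real printer's interface.

import math

NX, NY, NZ = 20, 20, 5        # voxel grid (x, y, z layers)
VOXEL_MM = 0.2                # voxel pitch in millimetres
CORE_RADIUS_MM = 1.0          # radius of the inner compartment

def cell_type_at(ix, iy):
    """Assign a cell type by distance from the pattern centre."""
    cx, cy = (NX - 1) / 2, (NY - 1) / 2
    r_mm = math.hypot(ix - cx, iy - cy) * VOXEL_MM
    return "hepatocyte" if r_mm <= CORE_RADIUS_MM else "endothelial"

def build_deposition_plan():
    """Return (z_layer, x_mm, y_mm, cell_type) tuples, one per voxel."""
    plan = []
    for iz in range(NZ):
        for iy in range(NY):
            for ix in range(NX):
                plan.append((iz, ix * VOXEL_MM, iy * VOXEL_MM,
                             cell_type_at(ix, iy)))
    return plan

if __name__ == "__main__":
    plan = build_deposition_plan()
    core = sum(1 for p in plan if p[3] == "hepatocyte")
    print(f"{len(plan)} voxels planned, {core} in the core compartment")
```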

Read more

The protocol, dubbed Walk Again Neuro-Rehabilitation (WA-NR), first uses EEG to record brain activity and control virtual avatars and robotic exoskeleton walkers while the patient wears a “tactile shirt” that offers them sensory feedback. This stimulation theoretically teaches damaged nerves to reroute their motor functions through healthy ones. After following the program for just three years, the patients—some paralyzed for decades—dramatically regained sensation in their lower limbs. They could feel where their legs were in space and gained better control over them. Some even reported feeling normal, welcome pain after a sharp jab.

The current study, published in Scientific Reports, takes neurorehab a step further. The team further trained two patients from the original cohort and examined their neuro-recovery in detail. Patient P1 was a middle-aged man who had been paralyzed for 4.5 years at the onset of the study; P2, a 32-year-old, had been paralyzed for a decade. Although trained with WA-NR, both patients scored on the low end of overall movement, at most able to extend their knees.

For each training session, the patients wore an EEG cap to measure movement intent and had eight electrodes placed on the skin of each leg to stimulate the muscles. Simultaneously, they wore a haptic shirt, which gave them a sense of their body in space by stimulating their forearms.
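The study's actual signal processing isn't reproduced here, but a common way to decode movement intent from EEG is to watch for mu-band (8–13 Hz) desynchronization over motor cortex. The sketch below strings that idea into the closed loop described above; the decoder, thresholds, and device interfaces are all hypothetical.

```python
# Illustrative closed-loop sketch of a WA-NR-style training cycle:
# decode movement intent from an EEG window, trigger surface stimulation
# on the corresponding leg, and mirror the event to the haptic shirt.
# The decoder, thresholds, and device calls are hypothetical; the
# published protocol's actual pipeline is not reproduced here.

import numpy as np

FS = 250                 # EEG sampling rate in Hz (assumed)
WINDOW_S = 1.0           # decoding window length in seconds
MU_BAND = (8, 13)        # sensorimotor mu rhythm band (Hz)

def band_power(window, fs, band):
    """Average spectral power of one channel in a frequency band."""
    freqs = np.fft.rfftfreq(len(window), d=1 / fs)
    power = np.abs(np.fft.rfft(window)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return power[mask].mean()

def decode_intent(eeg_window, baseline_power, threshold=0.7):
    """Movement intent = mu-band power dropping below baseline."""
    return band_power(eeg_window, FS, MU_BAND) < threshold * baseline_power

def training_step(eeg_window, baseline_power, leg, stimulator, haptics):
    """One cycle: decode, stimulate the leg, give tactile feedback."""
    if decode_intent(eeg_window, baseline_power):
        stimulator.pulse(leg=leg)        # hypothetical stimulator driver
        haptics.vibrate(forearm=leg)     # map leg event to a forearm cue

if __name__ == "__main__":
    class _Stub:                         # stand-in for real hardware drivers
        def pulse(self, leg): print(f"stimulate {leg} leg muscles")
        def vibrate(self, forearm): print(f"haptic cue on {forearm} forearm")

    rng = np.random.default_rng(0)
    baseline = band_power(rng.standard_normal(int(FS * WINDOW_S)), FS, MU_BAND)
    quiet_mu = 0.5 * rng.standard_normal(int(FS * WINDOW_S))  # suppressed mu
    training_step(quiet_mu, baseline, "left", _Stub(), _Stub())
```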

Read more

A new open-source, artificially intelligent prosthetic leg designed by researchers at the University of Michigan and Shirley Ryan AbilityLab is now available to the scientific community.

The leg’s free-to-copy design and programming are intended to improve the quality of life of patients and accelerate scientific advances by offering a unified platform to fragmented research efforts across the field of bionics.

“Our Open-Source Bionic Leg will enable investigators to efficiently solve challenges associated with controlling bionic legs across a range of activities in the lab and out in the community,” said lead designer Elliott Rouse, core faculty at U-M’s Robotics Institute and assistant professor of mechanical engineering. “In addition, we hope our bionic leg will unite researchers with a common hardware platform and enable new investigators from related fields to develop innovative control strategies.”
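The Open-Source Bionic Leg's published controller isn't reproduced here, but finite-state impedance control is a common strategy for powered prosthetic legs and gives a flavor of what such control code looks like. In the sketch below, each gait phase gets its own stiffness, damping, and equilibrium angle; all gains, thresholds, and sensor names are made up.

```python
# Minimal sketch of finite-state impedance control, a common strategy for
# powered prosthetic legs. Gains, thresholds, and sensor names are made up;
# this is not the Open-Source Bionic Leg's actual controller.

# Each gait phase gets its own impedance parameters:
# torque = -k * (angle - theta_eq) - b * velocity
IMPEDANCE = {
    "stance":     dict(k=6.0, b=0.15, theta_eq=0.05),   # stiff support
    "swing_flex": dict(k=2.0, b=0.05, theta_eq=1.10),   # flex the knee
    "swing_ext":  dict(k=2.5, b=0.08, theta_eq=0.10),   # extend for landing
}

def next_phase(phase, load_N, knee_angle_rad):
    """Advance the gait state machine from simple sensor thresholds."""
    if phase == "stance" and load_N < 20:          # foot leaves the ground
        return "swing_flex"
    if phase == "swing_flex" and knee_angle_rad > 1.0:
        return "swing_ext"
    if phase == "swing_ext" and load_N > 50:       # heel strike
        return "stance"
    return phase

def knee_torque(phase, angle_rad, velocity_rad_s):
    """Impedance law for the current phase."""
    p = IMPEDANCE[phase]
    return -p["k"] * (angle_rad - p["theta_eq"]) - p["b"] * velocity_rad_s

if __name__ == "__main__":
    phase = "stance"
    for load, angle, vel in [(300, 0.1, 0.0), (5, 0.3, 2.0), (5, 1.2, 0.5)]:
        phase = next_phase(phase, load, angle)
        print(phase, round(knee_torque(phase, angle, vel), 2))
```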

Read more

When Elon Musk and DARPA both hop aboard the cyborg hype train, you know brain-machine interfaces (BMIs) are about to achieve the impossible.

BMIs, already the stuff of science fiction, facilitate crosstalk between biological wetware and external computers, turning human users into literal cyborgs. Yet mind-controlled robotic arms, microelectrode “nerve patches,” and “memory Band-Aids” are still purely experimental medical treatments for people with nervous system impairments.

With the Next-Generation Nonsurgical Neurotechnology (N3) program, DARPA is looking to expand BMIs to the military. This month, the project tapped six academic teams to engineer radically different BMIs to hook up machines to the brains of able-bodied soldiers. The goal is to ditch surgery altogether—while minimizing any biological interventions—to link up brain and machine.

Read more

This post by Prof. Kevin Warwick originally appeared at OpenMind.

Article from the book There’s a Future: Visions for a Better World

If you could improve yourself by implanting a chip in your brain to expand your nervous system through the Internet, ‘updating yourself’ and partially becoming a machine, would you? What Kevin Warwick, professor of cybernetics at the University of Reading, poses may sound like science fiction, but it is not: he has had several chips implanted, which makes him a cyborg: half man, half machine. In this fascinating article, Warwick explains the steps that have been taken to grow neurons in a laboratory that can then be used to control robots, and how chips implanted in our brains can also move the muscles of our body at will. It won’t be long before we also have robots whose brains, built from human neurons, have the same kinds of skills as human brains. Should they, then, have the same rights as us?
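As a rough sketch of the closed loop Warwick describes, spike activity recorded from a cultured neural network on a multi-electrode array (MEA) can be mapped to robot wheel speeds, with the robot's sonar fed back to the culture as stimulation. Every interface, name, and mapping below is hypothetical.

```python
# Illustrative sketch of the "cultured neurons drive a robot" loop:
# spike counts from a multi-electrode array (MEA) are mapped to wheel
# speeds, and the robot's sonar reading is fed back as stimulation.
# All interfaces and mappings here are hypothetical.

def mea_to_wheel_speeds(spike_counts, gain=0.5):
    """Map firing on left/right electrode groups to differential drive."""
    left = gain * sum(spike_counts["left_electrodes"])
    right = gain * sum(spike_counts["right_electrodes"])
    return left, right

def sonar_to_stimulation(distance_cm, max_rate_hz=50):
    """Closer obstacles -> stronger stimulation of the culture."""
    distance_cm = max(distance_cm, 1)
    return min(max_rate_hz, 100.0 / distance_cm)

if __name__ == "__main__":
    counts = {"left_electrodes": [3, 5, 2], "right_electrodes": [1, 0, 4]}
    print(mea_to_wheel_speeds(counts))      # (5.0, 2.5)
    print(sonar_to_stimulation(10))         # 10.0 Hz stimulation
```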

Read more

Wearing a sensor-packed glove while handling a variety of objects, MIT researchers have compiled a massive dataset that enables an AI system to recognize objects through touch alone. The information could be leveraged to help robots identify and manipulate objects, and may aid in prosthetics design.

The researchers developed a low-cost knitted glove, called the “scalable tactile glove” (STAG), equipped with about 550 tiny sensors across nearly the entire hand. Each sensor captures pressure signals as humans interact with objects in various ways. A neural network processes the signals to “learn” a dataset of pressure-signal patterns related to specific objects. The system then uses that dataset to classify objects and predict their weights by feel alone, with no visual input needed.

In a paper published in Nature, the researchers describe a dataset they compiled using STAG for 26 common objects—including a soda can, scissors, tennis ball, spoon, pen, and mug. Using the dataset, the system predicted the objects’ identities with up to 76 percent accuracy. The system can also predict the correct weights of most objects within about 60 grams.
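The paper's exact model isn't reproduced here, but the flavor of the task is easy to show: treat each pressure frame from the glove as a small grayscale image and classify it with a compact convolutional network. The input shape, layer sizes, and everything else in the sketch below are assumptions, not the published architecture.

```python
# Rough illustration of classifying objects from tactile pressure maps,
# in the spirit of the STAG work described above. Input shape, layer
# sizes, and training details are assumptions, not the paper's model.

import torch
import torch.nn as nn

N_CLASSES = 26          # 26 common objects, per the dataset above

class TactileNet(nn.Module):
    """Small CNN over a single 32x32 pressure frame (assumed shape)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(32 * 8 * 8, N_CLASSES)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

if __name__ == "__main__":
    model = TactileNet()
    frames = torch.rand(4, 1, 32, 32)   # batch of fake pressure frames
    logits = model(frames)
    print(logits.shape)                 # torch.Size([4, 26])
```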

Read more