The research, published today in the Journal of Neuroengineering by researchers from the University of Arizona, could help us understand how human babies learn to walk.
The robot’s legs are made from plastic using a 3D printer, and the “muscles” consist of motors that pull on Kevlar straps to bend and straighten the legs. Force sensors in the straps endow the robot with proprioception, the human body’s sense of its limb positions and movements.
To better mimic walking in humans, the computer that controls the legs acts like a central pattern generator, or CPG, the neural network in our spinal cords responsible for the rhythmic, mindless quality of walking. The researchers used a simple “half-center CPG” consisting of two signals firing alternately to flex and then extend each leg. The controller also makes slight, reflex-like adjustments to the movement in response to sensory feedback.
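A half-center oscillator of this kind is often modeled as two mutually inhibiting "neurons" with fatigue: the active side tires, letting the other take over, which produces the alternating flex/extend rhythm. The sketch below is a minimal illustration using the well-known Matsuoka oscillator equations; it is not the researchers' actual controller, and all parameter values here are illustrative assumptions.

```python
def half_center_cpg(steps=4000, dt=0.01,
                    tau=0.25, tau_a=0.5, beta=2.5, w=2.0, drive=1.0):
    """Toy half-center CPG: two mutually inhibiting Matsuoka neurons.

    Mutual inhibition (weight w) keeps one side silent while the other
    fires; adaptation (beta, tau_a) makes the active side fatigue so
    activity switches, yielding an alternating flexor/extensor rhythm.
    All parameters are illustrative, not taken from the article.
    """
    x = [0.1, 0.0]   # membrane states (small asymmetry breaks the tie)
    a = [0.0, 0.0]   # adaptation ("fatigue") states
    flexor, extensor = [], []
    for _ in range(steps):
        y = [max(0.0, xi) for xi in x]   # rectified firing rates
        for i, j in ((0, 1), (1, 0)):
            dx = (-x[i] - w * y[j] - beta * a[i] + drive) / tau
            da = (y[i] - a[i]) / tau_a
            x[i] += dx * dt              # simple Euler integration
            a[i] += da * dt
        flexor.append(max(0.0, x[0]))
        extensor.append(max(0.0, x[1]))
    return flexor, extensor
```

In a robot like this one, each output would drive a motor pulling on a strap, and sensory feedback from the force sensors could be added as a small extra term in `dx` to nudge the rhythm in a reflex-like way.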