The U.S. military is reportedly testing a new breed of war robots, designed to go out in the field with human soldiers and, like their flesh-and-blood brethren, respond to gestures and voice commands. They're also capable of carrying – and using – lethal weapons like grenade launchers and machine guns on command [source: Sanborn].
The 350-pound (159-kilogram) Modular Advanced Armed Robotic System (MAARS) machines run about $300,000 a pop, but proponents say that cost is easily justified if the robots can eventually be used in place of human soldiers. Not only might that cut down on physical risks, but it may also help soldiers avoid some of the mental and emotional issues – anxiety, post-traumatic stress – that can come with a tour of duty [sources: Dubiel, Dean].
Also in development is a pack animal-esque robot prototype designed to make human soldiers better fighters by lightening their loads. The Legged Squad Support System (LS3) is a headless, four-legged machine that looks something like a mechanical bull or pack horse. These robots are more a complement to human boots on the ground than a replacement for them, lugging gear and serving as a mobile auxiliary power source. The goal is for each semiautonomous machine to be able to "carry 400 pounds [181 kilograms] of a squad's load, follow squad members through rugged terrain, and interact with troops in a natural way, similar to a trained animal and its handler," according to the Defense Advanced Research Projects Agency (DARPA), the technology's developer [sources: Madrigal, DARPA].
Meanwhile, researchers at Johns Hopkins University are working on the next generation of bomb-disposal robots. This one features a two-wheel torso that makes the machine more agile, plus prosthetic limbs, like those designed for humans, that can curl up to 50 pounds (23 kilograms) and pinch with up to 20 pounds (9 kilograms) of force. In addition to remote control, the bots can be operated via telepresence gloves that let users move the machine's arms and hands simply by moving their own, and a motion-tracking headset that allows the user to see what the robot sees [source: Tarantola].