ETH Zurich engineers demonstrated a system enabling a robot to control a marionette. Although a robotic puppeteer is pretty damn cool, that's not the point of the research.
"Our long term goal is to enable robots to manipulate various types of complex physical systems – clothing, soft parcels in warehouses or stores, flexible sheets and cables in hospitals or on construction sites, plush toys or bedding in our homes, etc – as skillfully as humans do," they write in their technical paper. "We believe the technical framework we have set up for robotic puppeteering will also prove useful in beginning to address this very important grand-challenge."
(via IEEE Spectrum)
The fifth annual World Robot Conference opened to the public in Beijing last Thursday, August 22, and this bionic flying bird based on a herring gull was one of the more spectacular sights.
Other robots on show at the annual event in China included robo-superheroes and Taiji-playing robots.
A rough cut of the Reuters video is here (no reporter narration).
The term exoskeleton usually brings to mind the hulking Power Loader worn by Sigourney Weaver in Aliens. But Harvard University researchers have developed a much lighter, more minimal exoskeleton that reduces the energy needed to run or walk. One breakthrough in this exosuit design is that it can tell if the wearer is walking or running and adjusts the robotic assistance accordingly.
“After wearing the system for 15 minutes or so, you start to question if it’s really helping at all, because you just feel like you’re walking,” David Perry, a robotics engineer at the Wyss Institute for Biologically Inspired Engineering at Harvard University, told Scientific American. “Once you shut it off, however, your legs suddenly feel heavy, and you realize how much it was helping. It’s a lot like stepping off the end of one of those moving sidewalks at the airport.”
Not surprisingly, the research was sponsored by the Defense Advanced Research Projects Agency (DARPA)'s former Warrior Web program. From Harvard:
(The exosuit) assists the wearer via a cable actuation system. The actuation cables apply a tensile force between the waist belt and thigh wraps to generate an external extension torque at the hip joint that works in concert with the gluteal muscles. The device weighs 5kg in total with more than 90% of its weight located close to the body’s center of mass.
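The hip-extension torque described in the quote above is, at its simplest, cable tension times effective moment arm. Here's a minimal sketch of that relationship; the tension and moment-arm values are invented for illustration, not the exosuit's actual specifications:

```python
def hip_extension_torque(cable_tension_n: float, moment_arm_m: float) -> float:
    """Torque (in newton-meters) about the hip joint produced by a cable
    pulling with the given tension at the given effective moment arm."""
    return cable_tension_n * moment_arm_m

# Illustrative numbers only: 300 N of cable tension acting at an
# effective 0.1 m moment arm yields about 30 N·m of assistive torque,
# working in concert with the gluteal muscles as described above.
print(hip_extension_torque(300.0, 0.1))
```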
More: "Reducing the metabolic rate of walking and running with a versatile, portable exosuit" (Science)
Aquanaut is an autonomous submarine developed by Houston Mechatronics Inc. that transforms into a humanoid robot -- well, the upper half anyway -- to service underwater oil and gas rigs. IEEE Spectrum's Evan Ackerman took a dive with Aquanaut in a massive swimming pool that NASA uses to help train astronauts for microgravity. From IEEE Spectrum:
The HMI engineers, who often joke that building a Transformer has been one of their long-term career objectives, are convinced that it can be done. Aquanaut has been designed primarily for servicing subsea oil and gas installations. The companies that own and operate this infrastructure spend vast sums of money to inspect and maintain it. They rely on robotic technologies that haven’t fundamentally changed in decades, largely because of the challenge of working in such an extreme environment. For HMI, however, that’s not a problem: Of its 75 employees, over two dozen used to work for NASA. Extreme environments are what they’re best at.
HMI cofounder and chief technology officer Nic Radford spent 14 years working on advanced robotics projects at NASA’s Johnson Space Center, in Houston. “I’ll grant you that getting into space is harder than getting underwater,” he says. “But space is a pristine environment. Underwater, things are extraordinarily dynamic. I haven’t decided yet whether it’s 10 times harder or 50 times harder for robots working underwater than it is in space..."
Aquanaut will not require a tether or a support ship. It will travel in submarine mode to its deepwater destination, where it’ll transform into its humanoid form, unfolding its powerful arms.
Trying to see the world through someone else's eyes is a great way to build empathy and understanding between people. Turns out, this approach -- when taken literally -- also works with robots. Researchers from the University of Bourgogne, University of Trento, and their colleagues used a head-mounted display to put people "inside" a robot and then studied their "likeability and closeness towards the robot."
"We have demonstrated that by 'beaming' a participant into a robot we can change his or her attitude towards the robot," says University of Trento psychologist Francesco Pavani.
"By 'beaming', we mean that we gave the participants the illusion that they were looking through the robot's eyes, moving its head as if it were their own, looking in the mirror and seeing themselves as a robot."
"Unlike exercises in which the participants couldn't move the robot's head, or couldn't do so in coordination with their other body movements, in our study the experience of walking in the shoes of a robot led the participants to adopt a friendlier attitude toward it, and to perceive it as socially closer."
From the abstract of their scientific paper published in Scientific Reports:
When participants’ and robot’s head movements were correlated, participants felt that they were incorporated into the robot, with a sense of agency. Critically, the robot they embodied was judged more likeable and socially closer. Remarkably, we found that the beaming experience with correlated head movements, and the corresponding sensation of embodiment and social proximity, was independent of the robot’s humanoid appearance.
Researchers from the University of Chicago and Sony are developing a wearable electrical muscle stimulation system that boosts your physical reaction time without making it feel like you've lost control of your body. The latter is particularly important when considering the development of exoskeletons and other systems that bring us physically closer to machines for augmenting human capabilities. The system essentially zaps your muscles into contracting at precisely the right time while making it seem as if you're still controlling the movement. From IEEE Spectrum:
The typical reaction time for a human is about 250 milliseconds—meaning it takes you about a quarter of a second after you see something to physically react to it. But the researchers explain that "our conscious awareness of intention takes a moment to arise, around 200 ms." In other words, it takes your brain about 200 milliseconds to turn sensory input into a decision to do something like move a muscle, and then another 50 or so milliseconds for that muscle to actually start moving. The researchers suggest that this 50-ish millisecond gap between intention and action is a window they can exploit to make humans react more quickly while still feeling that the action they take is under their control.
The video below shows a series of experiments that demonstrate how reflexes can be usefully accelerated without decreasing the sense of control, or agency, that the user experiences. It turns out that an EMS-driven improvement in reflexes of up to 80 milliseconds is possible while still maintaining the user's sense of agency, which is the difference between success and failure in these particular experiments.
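The timing argument above is simple enough to sketch in a few lines. This back-of-the-envelope version uses the article's approximate round numbers, not measured data:

```python
# All values in milliseconds, taken from the approximations quoted above.
REACTION_TOTAL = 250   # stimulus -> muscle starts to move
INTENTION = 200        # stimulus -> conscious intention arises
ACTUATION = REACTION_TOTAL - INTENTION   # ~50 ms: intention -> movement

# The EMS system fires inside this gap: after the brain has committed to
# an action, but before the muscle would begin moving on its own, so the
# movement still feels self-initiated.
print(f"exploitable window: ~{ACTUATION} ms")

# The experiments reported reflex improvements of up to 80 ms while
# preserving the user's sense of agency.
print(f"best-case accelerated reaction: ~{REACTION_TOTAL - 80} ms")
```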
This is Digit, a new bipedal bot from Agility Robotics, out for a stroll in its hometown of Albany, Oregon. Next year, you'll be able to order your own Digit, but the price hasn't been announced yet. From Agility Robotics:
Although still in testing, Digit is strong enough to pick up and stack boxes weighing up to 40 lb (18 kg), as well as durable enough to catch itself during a fall using its arms to decelerate. In addition to the physical changes, the control system for Digit has been overhauled to enable advanced behaviors such as stair climbing and footstep planning, all controlled through a robust API that can be accessed both onboard the robot and via a wireless link... Out-of-the-box, Digit will be up and walking within five minutes, even for users who are not legged locomotion control researchers.
One in 500 people is born with polydactyly, extra fingers or toes. Researchers at the University of Freiburg in Germany, Imperial College London, and Université de Lausanne / EPFL in Switzerland studied two people with well-formed, usable sixth fingers between the thumb and index finger on both hands to understand how their brains deal with the "extra workload" of controlling those digits. According to Imperial College bioengineer Etienne Burdet, high-resolution functional magnetic resonance imaging (fMRI) revealed that "the polydactyl individuals' brains were well adapted to controlling extra workload, and even had dedicated areas for the extra fingers. It's amazing that the brain has the capacity to do this seemingly without borrowing resources from elsewhere." From Imperial College London:
Polydactyl participants also performed better at many tasks than their non-polydactyl counterparts. For instance, they were able to perform some tasks, like tying shoelaces, with only one hand, where two are usually needed... (See video above.)
The international team of authors say the findings might serve as a blueprint for developing artificial limbs and digits to expand our natural movement abilities. For example, giving a surgeon control over an extra robotic arm could enable them to operate without an assistant...
However, (lead author Carsten Mehring of Freiburg University) warned that people with robotic extra limbs may not achieve as good control as observed in the two polydactyl subjects. Any robotic digits or limbs wouldn’t have dedicated bone structure, muscles, tendons or nerves.
In addition, subjects would need to learn to use extra fingers or limbs, much like how an amputee learns how to use a prosthetic arm.
Salto is a single-legged, hopping robot that its UC Berkeley inventors compare to a "hyper-aggressive pogo-stick." Previously, Salto was constrained to a highly structured indoor environment with a motion capture system. Now, though, roboticists Justin Yim and Eric Wang have imbued Salto with the onboard smarts to bounce freely through the world, albeit still under human control. From UC Berkeley:
Salto’s single, powerful leg is modeled after those of the galago, or Senegalese bush baby. The small, tree-dwelling primate’s muscles and tendons store energy in a way that gives the spry creature the ability to string together multiple jumps in a matter of seconds. By linking a series of quick jumps, Salto also can navigate complex terrain — like a pile of debris — that might be impossible to cross without jumping or flying.
“Unlike a grasshopper or cricket that winds up and gives one jump, we’re looking at a mechanism where it can jump, jump, jump, jump,” (UC Berkeley robotics professor Ronald) Fearing said. “This allows our robot to jump from location to location, which then gives it the ability to temporarily land on surfaces that we might not be able to perch on.”
From IEEE Spectrum:
...The researchers expect that “higher precision estimation and control can enable jumping on more finely varied surfaces like stairs, furniture, or other outcroppings” as well as “soft substrates like upholstery or natural foliage.”
The researchers tell us that Salto’s hardware is capable enough at this point that aside from potentially upgrading the motor or battery for more jumping power or run time, the focus now will be on new behaviors, although they’re toying with the idea of adding some kind of gripping foot so that Salto can launch from, and land on, tree branches (!).
Researchers from the University of Zurich's Robotics and Perception Group designed an event camera system for drones. In the video above, the fun starts at 1:25. As explained by IEEE Spectrum, "These are sensors that are not good at interpreting a scene visually like a regular camera, but they’re extremely sensitive to motion, responding to changes in a scene on a per-pixel basis in microseconds. A regular camera that detects motion by comparing one frame with another takes milliseconds to do the same thing, which might not seem like much, but for a fast-moving drone it could easily be the difference between crashing into something and avoiding it successfully."
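A toy model makes that latency difference concrete. This is not a sensor model, just the frame-clock arithmetic; the frame rate and per-event latency below are assumed, order-of-magnitude values:

```python
import math

FRAME_INTERVAL_MS = 33.3   # assumed ~30 fps conventional camera
EVENT_LATENCY_MS = 0.02    # assumed per-pixel event latency (tens of microseconds)

def frame_detection_delay(t_ms: float) -> float:
    """Motion starting at t_ms is first detectable when the next frame
    is captured and compared with the previous one."""
    next_frame = math.ceil(t_ms / FRAME_INTERVAL_MS) * FRAME_INTERVAL_MS
    return next_frame - t_ms

def event_detection_delay(t_ms: float) -> float:
    """An event camera reports the per-pixel change within microseconds,
    with no frame clock to wait on."""
    return EVENT_LATENCY_MS

# Worst case for the frame camera: motion begins just after a frame fires.
print(frame_detection_delay(0.1))   # nearly a full frame interval (~33 ms)
print(event_detection_delay(0.1))   # ~0.02 ms
```

For a drone closing on an obstacle at several meters per second, tens of milliseconds of extra sensing delay translate into tens of centimeters of travel, which is the crash-versus-avoid margin the article describes.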
We previously posted about a robot that solved a Rubik's Cube in .637 seconds.
A few days ago, two little robots arrived at the International Space Station to help astronauts with simple tasks. Called Astrobees, the cube bots are 12" x 12" x 12" and propelled around the microgravity environment by small fans. The bots are named Honey and Bumble. A third, Queen, remains on Earth. From NASA:
Working autonomously or via remote control by astronauts, flight controllers or researchers on the ground, the robots are designed to complete tasks such as taking inventory, documenting experiments conducted by astronauts with their built-in cameras or working together to move cargo throughout the station. In addition, the system serves as a research platform that can be outfitted and programmed to carry out experiments in microgravity - helping us to learn more about how robotics can benefit astronauts in space.
Remember UC Berkeley researcher Pieter Abbeel's fantastic towel-folding robot? Now, Abbeel and his team have prototyped a new kind of robot arm design meant for the home and other human environments. Compared to robot arms common in factories, this manipulator, called Blue, is less expensive.
Toyota Engineering Society's CUE 3 is a 6'3" humanoid robot that reportedly hits free throws with nearly 100 percent accuracy. From the AP:
(The robot) computes, as a three-dimensional image, where the basket is, using sensors on its torso, and adjusts motors inside its arm and knees to give the shot the right angle and propulsion for a swish...
Stanford University Professor Oussama Khatib, who directs the university's robotics lab, said Cue 3 demonstrates complex activities such as using sensors and nimble computation in real-time in what he called "visual feedback."
To shoot hoops, the robot must have a good vision system, be able to compute the ball's path then execute the shot, he said in a telephone interview.
"What Toyota is doing here is really bringing the top capabilities in perception with the top capabilities in control to have robots perform something that is really challenging," Khatib said.
"Toyota robot can’t slam dunk but it shoots a mean 3-pointer" (AP/Asahi Shimbun)
I-Wei Huang (aka Crabfu) makes all sorts of cool steam-powered mini-robots. In this video, he explains how he made a walking robot.
Several years ago, I wrote a feature for Bloomberg Businessweek about soft robotics, "in which steel skeletons and power-hungry motors make way for textiles." The idea is that soft robots, often powered by compressed air in pneumatic "muscles," are more flexible, lighter weight, and much safer for their human workmates. Above is video of automation robotics firm Festo's BionicSoftArm. From their description:
Whether free and flexible movements or defined sequences, thanks to its modular design, the pneumatic lightweight robot can be used for numerous applications. In combination with various adaptive grippers, it can pick up and handle a wide variety of objects and shapes. At the same time, it is completely compliant and poses no danger to the user even in the event of a collision.
The United States Food and Drug Administration issued a warning Thursday about the use of surgical robots in breast cancer surgery. The FDA cautions against using the robotic medical devices in mastectomy, lumpectomy, and related surgeries, citing "preliminary" evidence that they may be linked to lower long-term survival.