A few days ago, two little robots arrived at the International Space Station to help astronauts with simple tasks. Called Astrobees, the cube bots are 12" x 12" x 12" and propelled around the microgravity environment by small fans. The bots are named Honey and Bumble. A third, Queen, remains on Earth. From NASA:
Working autonomously or via remote control by astronauts, flight controllers or researchers on the ground, the robots are designed to complete tasks such as taking inventory, documenting experiments conducted by astronauts with their built-in cameras or working together to move cargo throughout the station. In addition, the system serves as a research platform that can be outfitted and programmed to carry out experiments in microgravity - helping us to learn more about how robotics can benefit astronauts in space.
Read the rest
Remember UC Berkeley researcher Pieter Abbeel's fantastic towel-folding robot? Now, Abbeel and his team have prototyped a new kind of robot arm design meant for the home and other human environments. Compared to the robot arms common in factories, this manipulator, called Blue, is less expensive. Read the rest
Toyota Engineering Society's CUE 3 is a 6'3" humanoid robot that reportedly hits free throws with nearly 100 percent accuracy. From the AP:
(The robot) computes as a three-dimensional image where the basket is, using sensors on its torso, and adjusts motors inside its arm and knees to give the shot the right angle and propulsion for a swish...
Stanford University Professor Oussama Khatib, who directs the university's robotics lab, said Cue 3 demonstrates complex activities such as using sensors and nimble computation in real-time in what he called "visual feedback."
To shoot hoops, the robot must have a good vision system, be able to compute the ball's path then execute the shot, he said in a telephone interview.
"What Toyota is doing here is really bringing the top capabilities in perception with the top capabilities in control to have robots perform something that is really challenging," Khatib said.
"Toyota robot can’t slam dunk but it shoots a mean 3-pointer" (AP/Asahi Shimbun)
Read the rest
I-Wei Huang (aka Crabfu) makes all sorts of cool steam-powered mini-robots. In this video, he explains how he made a walking robot. Read the rest
Several years ago, I wrote a feature for Bloomberg Businessweek about soft robotics, "in which steel skeletons and power-hungry motors make way for textiles." The idea is that soft robots, often powered by compressed air in pneumatic "muscles," are more flexible, lighter weight, and much safer for their human workmates. Above is video of automation robotics firm Festo's BionicSoftArm. From their description:
Whether free and flexible movements or defined sequences, thanks to its modular design, the pneumatic lightweight robot can be used for numerous applications. In combination with various adaptive grippers, it can pick up and handle a wide variety of objects and shapes. At the same time, it is completely compliant and poses no danger to the user even in the event of a collision.
Read the rest
The United States Food and Drug Administration issued a warning Thursday about the use of surgical robots in breast cancer surgery. The FDA cautions against use of the robotic medical devices in mastectomy, lumpectomy, and related surgeries because of "preliminary" evidence that it may be linked to lower long-term survival.
Read the rest
A Chinese high schooler purchased a handwriting robot to draw Chinese characters for her homework. When her mother discovered the machine, she destroyed it. Read the rest
This is the Android Kannon, a robotic manifestation of the Buddhist bodhisattva associated with mercy. Read the rest
MIT researchers developed a robot that can play Jenga based on a novel approach to machine learning that synthesizes sight and touch. From MIT News:
Alberto Rodriguez, the Walter Henry Gale Career Development Assistant Professor in the Department of Mechanical Engineering at MIT, says the robot demonstrates something that’s been tricky to attain in previous systems: the ability to quickly learn the best way to carry out a task, not just from visual cues, as it is commonly studied today, but also from tactile, physical interactions.
“Unlike in more purely cognitive tasks or games such as chess or Go, playing the game of Jenga also requires mastery of physical skills such as probing, pushing, pulling, placing, and aligning pieces. It requires interactive perception and manipulation, where you have to go and touch the tower to learn how and when to move blocks,” Rodriguez says. “This is very difficult to simulate, so the robot has to learn in the real world, by interacting with the real Jenga tower. The key challenge is to learn from a relatively small number of experiments by exploiting common sense about objects and physics.”
He says the tactile learning system the researchers have developed can be used in applications beyond Jenga, especially in tasks that need careful physical interaction, including separating recyclable objects from landfill trash and assembling consumer products.
“In a cellphone assembly line, in almost every single step, the feeling of a snap-fit, or a threaded screw, is coming from force and touch rather than vision,” Rodriguez says.
Read the rest
The 3D-printed robot above weighs just one milligram and is only 2.5mm at its longest point. Designed by University of Maryland mechanical engineer Ryan St. Pierre and his colleagues, it is likely the smallest walking robot in the world. Video of the microbot scurrying along is below. From IEEE Spectrum:
Like its predecessors, this robot is far too small for traditional motors or electronics. Its legs are controlled by external magnetic fields acting on tiny cubic magnets embedded in the robot’s hips. Rotating magnetic fields cause the magnets to rotate, driving the legs at speeds of up to 150 Hz. With all of the magnets installed into the hips in the same orientation, you get a pronking gait, but other gaits are possible by shifting the magnets around a bit. Top speed is an impressive 37.3 mm/s, or 14.9 body lengths per second, and somewhat surprisingly, the robot seems to be quite durable—it was tested for 1,000,000 actuation cycles “with no signs of visible wear or decreased performance.”
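The relative-speed figure quoted above checks out; here's a quick back-of-the-envelope calculation, using only the numbers from the article:

```python
# Sanity check of the figures quoted above (values taken from the article).
body_length_mm = 2.5    # the robot's longest dimension
top_speed_mm_s = 37.3   # reported top speed

# Relative speed in body lengths per second
relative_speed = top_speed_mm_s / body_length_mm
print(f"{relative_speed:.1f} body lengths per second")  # -> 14.9
```

For comparison, a cheetah at full sprint covers roughly 16 body lengths per second, so pound for pound (or milligram for milligram) this is respectable.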
Read the rest
Continuing the quest to design robots that could travel through our bodies to deliver drugs and cure disease, researchers at EPFL and ETH Zurich demonstrated tiny shape-shifting microrobots that swim through blood vessels. Made from hydrogel nanocomposites, the microbots can fold into various shapes to pass easily through tight spaces and to move with dense, viscous, or fast-moving liquids. The microbots are peppered with magnetic nanoparticles so that they can be "steered" with an external magnetic field. From EPFL:
“Our robots have a special composition and structure that allow them to adapt to the characteristics of the fluid they are moving through. For instance, if they encounter a change in viscosity or osmotic concentration, they modify their shape to maintain their speed and maneuverability without losing control of the direction of motion,” says (EPFL researcher Selman) Sakar.
These deformations can be “programmed” in advance so as to maximize performance without the use of cumbersome sensors or actuators. The robots can be either controlled using an electromagnetic field or left to navigate on their own through cavities by utilizing fluid flow. Either way, they will automatically morph into the most efficient shape.
"Smart microrobots that can adapt to their surroundings" (EPFL)
Read the rest
Released in 1998 by Tiger Electronics, the Furby sold more than 40 million units in its first three years of life. What made this bizarre animatronic toy so damn popular? Read the rest
State-owned TV network Russia-24 ran a story about an impressive humanoid robot named Boris that wowed attendees at a youth technology conference. Turns out, Boris the Robot was actually a man inside a commercially-available, high-end robot costume. From The Guardian:
A photograph published by MBKh Media, the news agency founded by the Vladimir Putin opponent Mikhail Khodorkovsky, appeared to show the actor in the robot suit ahead of the forum on Tuesday in Yaroslavl, a city about 150 miles north-east of Moscow.
The organisers of the Proyektoria technology forum, held each year for the “future intellectual leaders of Russia”, did not try to pass off the robot as real, the website reported.
But whether by mistake or design, the state television footage did just that. “It’s entirely possible one of these [students] could dedicate himself to robotics,” an anchor reported. “Especially as at the forum they have the opportunity to look at the most modern robots.”
Read the rest
Elowan is a "plant-robot hybrid" that uses its own bio-electromechanical signaling to drive itself around toward light sources. From an explanation by researcher Harpreet Sareen and his colleagues at the MIT Media Lab:
In this experimental setup, electrodes are inserted into the regions of interest (stems and ground, leaf and ground). The weak signals are then amplified and sent to the robot to trigger movements to respective directions.
Such symbiotic interplay with the artificial could be extended further with exogenous extensions that provide nutrition, growth frameworks, and new defense mechanisms.
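Read literally, the setup above is a closed loop: weak signals from electrodes in the plant are amplified and turned into motor commands that steer the robot toward light. A minimal sketch of that loop, assuming a hypothetical two-electrode steering scheme with made-up gains and thresholds (none of these specifics appear in the Media Lab's description):

```python
# Illustrative sketch of a plant-signal-to-motion loop. The gain, deadband,
# and two-electrode steering scheme are assumptions for illustration only;
# Elowan's actual hardware and thresholds are not described in this post.

def amplify(raw_signal_uv, gain=1000):
    """Amplify a weak bioelectrochemical signal (microvolts in, millivolts out)."""
    return raw_signal_uv * gain / 1000.0

def steer(left_mv, right_mv, deadband_mv=5.0):
    """Turn toward the side with the stronger signal; go straight within a deadband."""
    diff = left_mv - right_mv
    if abs(diff) < deadband_mv:
        return "forward"
    return "left" if diff > 0 else "right"

# Example: stronger activity on the left-side electrode steers the robot left
print(steer(amplify(12.0), amplify(4.0)))  # -> left
```

The real system presumably does considerably more signal conditioning, but the shape of the loop (sense, amplify, map to a movement) is as described.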
Elowan: A plant-robot hybrid Read the rest
Astronauts on board the International Space Station have switched on CIMON (Crew Interactive Mobile CompanioN), a new AI companion robot built by German space agency DLR, Airbus, and IBM. CIMON is an interface for IBM's WATSON AI system. From Space.com:
Marco Trovatello, a spokesman of the European Space Agency's Astronaut Centre in Cologne, Germany, told Space.com that CIMON could respond within a couple of seconds after a question was asked, no slower than in ground-based tests.
A data link connects CIMON with the Columbus control center in Germany; from there, the signal travels first to the Biotechnology Space Support Center at the Lucerne University in Switzerland, where CIMON's control team is based. Then, the connection is made over the internet to the IBM Cloud in Frankfurt, Germany, Bernd Rattenbacher, the team leader at the ground control centre at Lucerne University, said in the statement...
"CIMON is a technology demonstration of what a future AI-based assistant on the International Space Station or on a future, longer-term exploration mission would look like," Trovatello said. "In the future, an astronaut could ask CIMON to show a procedure for a certain experiment, and CIMON would do that."
Read the rest
Researchers at Melbourne, Australia's RMIT University devised these bizarre chest-mounted "third arm" robots to experiment with what they call "playful eating." For science. Video below. From RMIT University's Exertion Games Lab:
In this experience, all three arms (the person’s own two arms and the “third” arm, the robotic arm) are used for feeding oneself and the other person. The robotic arm (third arm) is attached to the body via a vest. We playfully subverted the functioning of the robotic arm so that its final movements (once it has picked up the food), i.e. whether to feed the wearer or the partner, are guided by the facial expressions of the dining partner...
We hoped that mapping the partner's "more positive" facial expression to feeding the partner (via the wearer's third arm) would elicit joy, laughter, and a sense of sharing, since feeding one another is associated with positive emotions; however, this could also result in a perceived loss of agency over what one eats. When the third arm senses a "neutral" facial expression from the dining partner, its ambiguous to-and-fro movements in the air give the diners an opportunity to express their reactions more vividly, as facial expressions become a key element of engaging with a partner while eating.
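The mapping the researchers describe boils down to a few lines of decision logic. The expression labels and the fallback branch below are illustrative assumptions; the lab's actual facial-expression classifier and any other cases are not shown in the excerpt above:

```python
# Illustrative mapping of the dining partner's facial expression to the third
# arm's action, as described above. The label names and the fallback branch
# are assumptions; Arm-A-Dine's actual classifier is not shown here.

def third_arm_action(partner_expression):
    """Decide what the robotic third arm does with the food it has picked up."""
    if partner_expression == "positive":
        return "feed partner"        # "more positive" expression: food goes to the partner
    if partner_expression == "neutral":
        return "hover ambiguously"   # neutral: to-and-fro movements in the air
    return "feed wearer"             # assumed default for other expressions

print(third_arm_action("neutral"))  # -> hover ambiguously
```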
"Arm-A-Dine: Towards Understanding the Design of Playful Embodied Eating Experiences" (PDF)
More at IEEE Spectrum: "Feed Your Friends With Autonomous Chest-Mounted Robot Arms"
Read the rest
This is the new version of Affetto, the robot child head that's a testbed for synthetic facial expressions. According to the Osaka University researchers who birthed Affetto, their goal is to "offer a path for androids to express greater ranges of emotion, and ultimately have deeper interaction with humans." From Osaka University:
The researchers investigated 116 different facial points on Affetto to measure its three-dimensional movement. Facial points were underpinned by so-called deformation units. Each unit comprises a set of mechanisms that create a distinctive facial contortion, such as lowering or raising of part of a lip or eyelid. Measurements from these were then subjected to a mathematical model to quantify their surface motion patterns.
While the researchers encountered challenges in balancing the applied force and in adjusting the synthetic skin, they were able to employ their system to adjust the deformation units for precise control of Affetto’s facial surface motions.
“Android robot faces have persisted in being a black box problem: they have been implemented but have only been judged in vague and general terms,” study first author Hisashi Ishihara says. “Our precise findings will let us effectively control android facial movements to introduce more nuanced expressions, such as smiling and frowning.”
Read the rest