This robot plays Jenga to demonstrate the future of manufacturing

MIT researchers developed a robot that learns to play Jenga using a novel machine-learning approach that synthesizes sight and touch. From MIT News:

Alberto Rodriguez, the Walter Henry Gale Career Development Assistant Professor in the Department of Mechanical Engineering at MIT, says the robot demonstrates something that’s been tricky to attain in previous systems: the ability to quickly learn the best way to carry out a task, not just from visual cues, as it is commonly studied today, but also from tactile, physical interactions.

“Unlike in more purely cognitive tasks or games such as chess or Go, playing the game of Jenga also requires mastery of physical skills such as probing, pushing, pulling, placing, and aligning pieces. It requires interactive perception and manipulation, where you have to go and touch the tower to learn how and when to move blocks,” Rodriguez says. “This is very difficult to simulate, so the robot has to learn in the real world, by interacting with the real Jenga tower. The key challenge is to learn from a relatively small number of experiments by exploiting common sense about objects and physics.”

He says the tactile learning system the researchers have developed can be used in applications beyond Jenga, especially in tasks that need careful physical interaction, including separating recyclable objects from landfill trash and assembling consumer products.

“In a cellphone assembly line, in almost every single step, the feeling of a snap-fit, or a threaded screw, is coming from force and touch rather than vision,” Rodriguez says.

Read the rest

This walking microrobot is smaller than an ant's head

The 3D-printed robot above weighs just one milligram and is only 2.5 mm at its longest point. Designed by University of Maryland mechanical engineer Ryan St. Pierre and his colleagues, it is likely the smallest walking robot in the world. Video of the microbot scurrying along is below. From IEEE Spectrum:

Like its predecessors, this robot is far too small for traditional motors or electronics. Its legs are controlled by external magnetic fields acting on tiny cubic magnets embedded in the robot’s hips. Rotating magnetic fields cause the magnets to rotate, driving the legs at speeds of up to 150 Hz. With all of the magnets installed into the hips in the same orientation, you get a pronking gait, but other gaits are possible by shifting the magnets around a bit. Top speed is an impressive 37.3 mm/s, or 14.9 body lengths per second, and somewhat surprisingly, the robot seems to be quite durable—it was tested for 1,000,000 actuation cycles “with no signs of visible wear or decreased performance.”
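Those figures check out. Here's a quick back-of-the-envelope sketch in Python, using only the numbers quoted above; the assumption that top speed coincides with the 150 Hz drive rate is mine, not the article's:

```python
# Back-of-the-envelope check of the figures quoted above.
body_length_mm = 2.5     # longest dimension of the robot
top_speed_mm_s = 37.3    # quoted top speed
leg_rate_hz = 150        # quoted maximum leg-drive frequency

relative_speed = top_speed_mm_s / body_length_mm
print(f"{relative_speed:.1f} body lengths per second")   # -> 14.9, as quoted

# Advance per actuation cycle, assuming top speed occurs at the 150 Hz
# drive rate (an assumption; the article doesn't say which gait or rate).
stride_mm = top_speed_mm_s / leg_rate_hz
print(f"~{stride_mm * 1000:.0f} micrometres of travel per leg cycle")
```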

Read the rest

Ancient civilizations' fascination with AI, robots, and synthetic life

Stanford folklorist and science historian Adrienne Mayor has a fascinating-sounding new book out, titled "Gods and Robots: Myths, Machines, and Ancient Dreams of Technology." It's a survey of how ancient Greek, Roman, Indian, and Chinese myths imagined and grappled with visions of synthetic life, artificial intelligence, and autonomous robots. From Mayor's interview at Princeton University Press:

Who first imagined the concept of robots?

Most historians of science trace the first automatons to the Middle Ages. But I wondered, was it possible that ideas about creating artificial life were thinkable long before technology made such enterprises possible? Remarkably, as early as the time of Homer, ancient Greek myths were envisioning how to imitate, augment, and surpass nature, by means of biotechne, “life through craft”—what we now call biotechnology. Beings described as fabricated, “made, not born,” appeared in myths about Jason and the Argonauts, the sorceress Medea, the bronze robot Talos, the ingenious craftsman Daedalus, the fire-bringer Prometheus, and Pandora, the female android created by Hephaestus, god of invention. These vivid stories were ancient thought experiments set in an alternate world where technology was marvelously advanced.

Modern sci-fi movies pop up in several chapters. How do they relate to ancient myths?

Some 2,500 years before movies were invented, ancient Greek vase painters illustrated popular stories of the bronze robot warrior Talos, the techno-wizard Medea, and the fembot Pandora dispatched to earth on an evil mission, in ways that seem very “cinematic...” Movies and myths about imagined technology are cultural dreams. Like contemporary science fiction tales, the myths show how the power of imagination allows humans to ponder how artificial life might be created—if only one possessed sublime technology and genius.

Read the rest

Shapeshifting microrobots to travel through your bloodstream

Continuing the quest to design robots that could travel through our bodies to deliver drugs and cure disease, researchers at EPFL and ETH Zurich demonstrated tiny shape-shifting microrobots that swim through blood vessels. Made from hydrogel nanocomposites, the microbots can fold into various shapes to slip through tight spaces and to move with dense, viscous, or fast-moving liquids. The microbots are peppered with magnetic nanoparticles so that they can be "steered" with an external magnetic field. From EPFL:

“Our robots have a special composition and structure that allow them to adapt to the characteristics of the fluid they are moving through. For instance, if they encounter a change in viscosity or osmotic concentration, they modify their shape to maintain their speed and maneuverability without losing control of the direction of motion,” says (EPFL researcher Selman) Sakar.

These deformations can be “programmed” in advance so as to maximize performance without the use of cumbersome sensors or actuators. The robots can be either controlled using an electromagnetic field or left to navigate on their own through cavities by utilizing fluid flow. Either way, they will automatically morph into the most efficient shape.

"Smart microrobots that can adapt to their surroundings" (EPFL)

Read the rest

Walking car design proposed

Hyundai developed a "walking car" and "unveiled" it at the CES trade show yesterday. As far as I can tell it's just a mockup video, but the idea is timely. Traditionally, only flimsy and obviously impractical spider-legged robots could handle rough terrain, while two- and four-legged ones were too unstable to get far. Will Boston Dynamics soon have serious competition? Add your thoughts in the comments, increasingly useless waterlogged meatbags! Read the rest

This pop-up cafe in Japan has a staff of robots controlled by people with disabilities

The host of Life Where I'm From, a YouTube channel about life as a foreigner in Japan, visited a pop-up cafe with robot waiters that are remotely controlled by people with disabilities. The workers control the robots from their homes, and can speak with customers using a microphone.

In December of 2018 I visited a temporary robot cafe, but it's not the type of Japanese robot cafe that comes to mind for most people. Rather than a robotic show, this was a cafe where the robot waiters were in fact avatars for people with disabilities, who remotely controlled them from their homes.

This cafe was part of an initiative by The Nippon Foundation, Ory Lab Inc., and ANA Holdings Inc. It allowed people with ALS and other severe disabilities to work and interact with the world.

Read the rest

A cafe where the robot waiters are remote-piloted by paralyzed people

Dawn ver.β is a cafe in Tokyo's Akasaka district where all the table service is performed by 120 cm tall OriHime-D robots piloted by paralyzed people working from home; it was inspired by a fictional cafe in the 2008 anime Time of Eve. Read the rest

Impressive robot praised by Russia state television revealed to be a man in a costume

State-owned TV network Russia-24 ran a story about an impressive humanoid robot named Boris that wowed attendees at a youth technology conference. Turns out, Boris the Robot was actually a man inside a commercially-available, high-end robot costume. From The Guardian:

A photograph published by MBKh Media, the news agency founded by the Vladimir Putin opponent Mikhail Khodorkovsky, appeared to show the actor in the robot suit ahead of the forum on Tuesday in Yaroslavl, a city about 150 miles north-east of Moscow.

The organisers of the Proyektoria technology forum, held each year for the “future intellectual leaders of Russia”, did not try to pass off the robot as real, the website reported.

But whether by mistake or design, the state television footage did just that. “It’s entirely possible one of these [students] could dedicate himself to robotics,” an anchor reported. “Especially as at the forum they have the opportunity to look at the most modern robots.”

Read the rest

This plant drives its own robot

Elowan is a "plant-robot hybrid" that uses its own bio-electromechanical signaling to drive itself around toward light sources. From an explanation by researcher Harpreet Sareen and his colleagues at the MIT Media Lab:

In this experimental setup, electrodes are inserted into the regions of interest (stems and ground, leaf and ground). The weak signals are then amplified and sent to the robot to trigger movements to respective directions.

Such symbiotic interplay with the artificial could be extended further with exogenous extensions that provide nutrition, growth frameworks, and new defense mechanisms.
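To make that signal-to-motion loop concrete, here is a minimal, purely illustrative Python sketch: read two amplified electrode channels and steer a differential-drive base toward the stronger signal. This is not the Media Lab's code; the electrode placements come from the quote above, but the deadband, speeds, turn mapping, and drive() API are all hypothetical.

```python
import random  # stands in for an ADC sampling the amplified plant electrodes

DEADBAND_MV = 0.5  # ignore channel differences smaller than this (hypothetical)


def read_electrode_mv(channel: str) -> float:
    """Placeholder for sampling one amplified electrode pair, in millivolts."""
    return random.uniform(0.0, 5.0)


def drive(left: float, right: float) -> None:
    """Placeholder for a differential-drive command (wheel speeds, 0 to 1)."""
    print(f"drive(left={left:.1f}, right={right:.1f})")


def steer_once() -> None:
    """One control step: compare the two channels and turn toward the stronger one."""
    leaf = read_electrode_mv("leaf-ground")
    stem = read_electrode_mv("stem-ground")
    diff = leaf - stem
    if abs(diff) < DEADBAND_MV:
        drive(0.3, 0.3)   # signals balanced: keep going straight
    elif diff > 0:
        drive(0.2, 0.4)   # stronger "leaf" channel: slow the left wheel, turn left
    else:
        drive(0.4, 0.2)   # stronger "stem" channel: turn right


steer_once()
```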

Elowan: A plant-robot hybrid Read the rest

Space station astronauts have a new floating AI robot companion

Astronauts on board the International Space Station have switched on CIMON (Crew Interactive Mobile CompanioN), a new AI companion robot built by German space agency DLR, Airbus, and IBM. CIMON is an interface for IBM's Watson AI system. From Space.com:

Marco Trovatello, a spokesman of the European Space Agency's Astronaut Centre in Cologne, Germany, told Space.com that CIMON could respond within a couple of seconds after a question was asked, no slower than in ground-based tests.

A data link connects CIMON with the Columbus control center in Germany; from there, the signal travels first to the Biotechnology Space Support Center at the Lucerne University in Switzerland, where CIMON's control team is based. Then, the connection is made over the internet to the IBM Cloud in Frankfurt, Germany, Bernd Rattenbacher, the team leader at the ground control centre at Lucerne University, said in the statement...

"CIMON is a technology demonstration of what a future AI-based assistant on the International Space Station or on a future, longer-term exploration mission would look like," Trovatello said. "In the future, an astronaut could ask CIMON to show a procedure for a certain experiment, and CIMON would do that."

Read the rest

NASA InSight robot lander's amazing first images from Mars

After sticking a perfect landing on the Martian surface this afternoon, NASA's InSight robot lander has successfully deployed its solar panels. Tomorrow, InSight begins surface operations and the months-long process of deploying its scientific instruments to collect data about the planet's interior. From NASA/JPL-Caltech:

NASA's InSight has sent signals to Earth indicating that its solar panels are open and collecting sunlight on the Martian surface. NASA's Mars Odyssey orbiter relayed the signals, which were received on Earth at about 5:30 p.m. PST (8:30 p.m. EST). Solar array deployment ensures the spacecraft can recharge its batteries each day. Odyssey also relayed a pair of images showing InSight's landing site.

"The InSight team can rest a little easier tonight now that we know the spacecraft solar arrays are deployed and recharging the batteries," said Tom Hoffman, InSight's project manager at NASA's Jet Propulsion Laboratory in Pasadena, California, which leads the mission. "It's been a long day for the team. But tomorrow begins an exciting new chapter for InSight: surface operations and the beginning of the instrument deployment phase..."

In the coming days, the mission team will unstow InSight's robotic arm and use the attached camera to snap photos of the ground so that engineers can decide where to place the spacecraft's scientific instruments. It will take two to three months before those instruments are fully deployed and sending back data.

In the meantime, InSight will use its weather sensors and magnetometer to take readings from its landing site at Elysium Planitia — its new home on Mars.

Read the rest

Diners use chest-mounted robot arms to feed each other in unusual social experiment

Researchers at Melbourne, Australia's RMIT University devised these bizarre chest-mounted "third arm" robots to experiment with what they call "playful eating." For science. Video below. From RMIT University's Exertion Games Lab:

In this experience, all three arms (the person’s own two arms and the “third” arm, the robotic arm) are used for feeding oneself and the other person. The robotic arm (third arm) is attached to the body via a vest. We playfully subverted the functioning of the robotic arm so that its final movements (once it has picked up the food), i.e. whether to feed the wearer or the partner, are guided by the facial expressions of the dining partner...

Mapping of the partner’s “more positive” facial expression to the feeding of food to the partner (via the wearer’s third arm) we hoped would elicit joy, laughter, and a sense of sharing based on the knowledge of feeding one another that is associated with positive emotions, however, this could also result in the perception of a loss of agency over what one eats. Through to-and-fro ambiguous movements of the third arm in the air (when sensing a “neutral” facial expression of the dining partner), it gave an opportunity to the diners to express their reactions more vividly, as we know that facial expressions become a key element to engage with a partner while eating.
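A toy Python sketch of that mapping may help: the partner's classified expression picks the third arm's next move. The labels, the negative-expression branch, and the function name are my own illustrative guesses, not the Exertion Games Lab's implementation.

```python
def third_arm_action(partner_expression: str) -> str:
    """Map the dining partner's classified expression to the third arm's next move."""
    if partner_expression == "positive":
        return "feed-partner"        # a smile sends the picked-up food to the partner
    if partner_expression == "neutral":
        return "hover-ambiguously"   # wave the food to-and-fro and wait for a reaction
    return "feed-wearer"             # one plausible reading of the remaining case


for expression in ("positive", "neutral", "negative"):
    print(expression, "->", third_arm_action(expression))
```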

"Arm-A-Dine: Towards Understanding the Design of Playful Embodied Eating Experiences" (PDF)

More at IEEE Spectrum: "Feed Your Friends With Autonomous Chest-Mounted Robot Arms"

Read the rest

Direct from the Uncanny Valley: Affetto, the freaky child android head

This is the new version of Affetto, the robot child head that's a testbed for synthetic facial expressions. According to the Osaka University researchers who birthed Affetto, their goal is to "offer a path for androids to express greater ranges of emotion, and ultimately have deeper interaction with humans." From Osaka University:

The researchers investigated 116 different facial points on Affetto to measure its three-dimensional movement. Facial points were underpinned by so-called deformation units. Each unit comprises a set of mechanisms that create a distinctive facial contortion, such as lowering or raising of part of a lip or eyelid. Measurements from these were then subjected to a mathematical model to quantify their surface motion patterns.

While the researchers encountered challenges in balancing the applied force and in adjusting the synthetic skin, they were able to employ their system to adjust the deformation units for precise control of Affetto’s facial surface motions.
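One way to picture the deformation-unit idea is as a blendshape-style linear model, where each unit's activation contributes a fixed displacement pattern to the measured facial points. The sketch below is a generic illustration, not the Osaka team's actual model; the 116 points and 3D measurements come from the article, while the unit count and the linear form are assumptions.

```python
import numpy as np

n_points = 116   # facial measurement points on Affetto (from the article)
n_units = 20     # hypothetical number of deformation units

rng = np.random.default_rng(0)

# Each unit contributes a fixed 3D displacement pattern over all facial
# points (a 116 x 3 basis per unit); random values stand in for calibration.
basis = rng.normal(scale=0.1, size=(n_units, n_points, 3))


def face_displacement(activations: np.ndarray) -> np.ndarray:
    """Predict the 3D displacement of every facial point from unit activations."""
    # Linear superposition over units: result has shape (116, 3).
    return np.tensordot(activations, basis, axes=1)


a = np.zeros(n_units)
a[3] = 0.8                      # partially drive one hypothetical lip/eyelid unit
disp = face_displacement(a)
print(disp.shape)               # (116, 3)
print(np.abs(disp).max())       # largest predicted point displacement
```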

“Android robot faces have persisted in being a black box problem: they have been implemented but have only been judged in vague and general terms,” study first author Hisashi Ishihara says. “Our precise findings will let us effectively control android facial movements to introduce more nuanced expressions, such as smiling and frowning.”

Read the rest

Kickstarting a Da Vinci-inspired, programmable, mechanical drawing robot-arm

https://drawmaton.com/wp-content/uploads/2018/10/45118351_358662618238424_3396031426926215168_n.mp4

Robert Sabuda (previously) writes, "It has long been a dream of the Leonardo da Vinci Robot Society to bring one of the Renaissance Master’s creations back to life. 2019 is the 500th anniversary of da Vinci’s death and the Society has chosen to honor his memory and celebrate his life through one of his best known inventions - the Robot Knight. This robot is an early proto-computer android whose read-only programmable memory allowed it to perform many actions. And it was also rumored that the robot’s arm could also perform an extremely complex task…draw a picture!" Read the rest

Watch The Grind Master copy a neoclassical statue

Set aside your feeble 3D printers, meatbags. The Grind Master demonstrates the supremacy of fully automated robot milling of stone, wood, and other sculptural materials. Here, a Buddha's head is sculpted from foam:

Previously: Epic glove ad explains benefits of gloves Read the rest

Robots! A fantastic catalog of new species

IEEE Spectrum editor Erico Guizzo and colleagues have blown out their original Robots app into a fantastic catalog of 200 of today's robot species. They're cleverly organized into fun categories like "Robots You Can Hug," "Robots That Can Dance," "Space Robots," and "Factory Workers." If they keep it updated, it'll be very helpful for the robot uprising. After all, you can't tell the players without a program!

Robots: Your Guide to the World of Robotics (IEEE Spectrum)

Read the rest

Dancing robot dog

They're figuring out what we really want from a 21st Century deathbot: moves.

When was the last time a human was seen in one of these videos? Perhaps in the next one we'll see a human crawling on all fours over ice, making loud engine noises between terrified whimpers, only for a perfectly stable bipedal robot to lunge in from off-screen and kick it. Then we'll know what has become of the fleshbags at Boston Dynamics. Read the rest
