FDA warns against robotic surgery for breast cancer, cervical cancer, & other women's cancers

The United States Food and Drug Administration issued a warning Thursday about the use of surgical robots in breast cancer surgery. The FDA cautions against use of the robotic medical devices in mastectomy, lumpectomy, and related surgeries because of "preliminary" evidence that they may be linked to lower long-term survival. Read the rest

Mother discovers, then destroys Chinese high school student's handwriting robot

A Chinese high schooler purchased a handwriting robot to draw Chinese characters for her homework. When her mother discovered the machine, she destroyed it. Read the rest

Japan's robot deity delivers Buddha's teaching

This is the Android Kannon, a robotic manifestation of the Buddhist bodhisattva associated with mercy. Read the rest

This robot plays Jenga to demonstrate the future of manufacturing

MIT researchers developed a robot that can play Jenga based on a novel approach to machine learning that synthesizes sight and touch. From MIT News:

Alberto Rodriguez, the Walter Henry Gale Career Development Assistant Professor in the Department of Mechanical Engineering at MIT, says the robot demonstrates something that’s been tricky to attain in previous systems: the ability to quickly learn the best way to carry out a task, not just from visual cues, as it is commonly studied today, but also from tactile, physical interactions.

“Unlike in more purely cognitive tasks or games such as chess or Go, playing the game of Jenga also requires mastery of physical skills such as probing, pushing, pulling, placing, and aligning pieces. It requires interactive perception and manipulation, where you have to go and touch the tower to learn how and when to move blocks,” Rodriguez says. “This is very difficult to simulate, so the robot has to learn in the real world, by interacting with the real Jenga tower. The key challenge is to learn from a relatively small number of experiments by exploiting common sense about objects and physics.”

He says the tactile learning system the researchers have developed can be used in applications beyond Jenga, especially in tasks that need careful physical interaction, including separating recyclable objects from landfill trash and assembling consumer products.

“In a cellphone assembly line, in almost every single step, the feeling of a snap-fit, or a threaded screw, is coming from force and touch rather than vision,” Rodriguez says.
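Rodriguez's point about learning from a small number of physical experiments can be illustrated with a toy sketch (the data, labels, and function here are hypothetical, not the MIT system): classify a probed Jenga block as stuck or movable by finding the most similar past probe in terms of resisting force and how far the block slid.

```python
# Toy sketch of tactile learning (hypothetical data, not the MIT system):
# judge a probed block from the handful of experiments seen so far by
# finding the nearest past probe in (force, displacement) space.

def nearest_label(probes, force, displacement):
    """Return the label of the past probe closest in (force, displacement)."""
    def dist(p):
        return (p[0] - force) ** 2 + (p[1] - displacement) ** 2
    return min(probes, key=dist)[2]

# A handful of past experiments: (resisting force in N, slide in mm, label)
past = [
    (0.2, 4.0, "movable"),
    (0.3, 3.5, "movable"),
    (1.5, 0.2, "stuck"),
    (1.8, 0.1, "stuck"),
]

print(nearest_label(past, 0.25, 3.8))  # a gentle push that slides far
print(nearest_label(past, 1.6, 0.15))  # strong resistance, barely moves
```

The point of the sketch is the sample efficiency Rodriguez describes: four touches are already enough to sort new blocks into the two physical categories.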

Read the rest

This walking microrobot is smaller than an ant's head

The 3D-printed robot above weighs just one milligram and is only 2.5mm at its longest point. Designed by University of Maryland mechanical engineer Ryan St. Pierre and his colleagues, it is likely the smallest walking robot in the world. Video of the microbot scurrying along is below. From IEEE Spectrum:

Like its predecessors, this robot is far too small for traditional motors or electronics. Its legs are controlled by external magnetic fields acting on tiny cubic magnets embedded in the robot’s hips. Rotating magnetic fields cause the magnets to rotate, driving the legs at speeds of up to 150 Hz. With all of the magnets installed into the hips in the same orientation, you get a pronking gait, but other gaits are possible by shifting the magnets around a bit. Top speed is an impressive 37.3 mm/s, or 14.9 body lengths per second, and somewhat surprisingly, the robot seems to be quite durable—it was tested for 1,000,000 actuation cycles “with no signs of visible wear or decreased performance.”
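The quoted figures are internally consistent, which is easy to check: a robot 2.5 mm long moving at 37.3 mm/s covers its own length about 14.9 times per second.

```python
# Sanity-check the article's numbers: top speed divided by body length
# should give the quoted body lengths per second.
body_length_mm = 2.5
speed_mm_s = 37.3

print(round(speed_mm_s / body_length_mm, 1))  # 14.9 body lengths per second
```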

Read the rest

Shapeshifting microrobots to travel through your bloodstream

Continuing the quest to design robots that could travel through our bodies to deliver drugs and cure disease, researchers at EPFL and ETH Zurich demonstrated tiny shape-shifting microrobots that swim through blood vessels. Made from hydrogel nanocomposites, the microbots can fold into various shapes to travel easily through tight spaces and to move with dense, viscous, or fast-moving liquids. The microbots are peppered with magnetic nanoparticles so that they can be "steered" with an external magnetic field. From EPFL:

“Our robots have a special composition and structure that allow them to adapt to the characteristics of the fluid they are moving through. For instance, if they encounter a change in viscosity or osmotic concentration, they modify their shape to maintain their speed and maneuverability without losing control of the direction of motion,” says (EPFL researcher Selman) Sakar.

These deformations can be “programmed” in advance so as to maximize performance without the use of cumbersome sensors or actuators. The robots can be either controlled using an electromagnetic field or left to navigate on their own through cavities by utilizing fluid flow. Either way, they will automatically morph into the most efficient shape.

"Smart microrobots that can adapt to their surroundings" (EPFL)

Read the rest

The Furby was "coded for cuteness"

Released in 1998 by Tiger Electronics, the Furby sold more than 40 million units in its first three years of life. What made this bizarre animatronic toy so damn popular? Read the rest

Impressive robot praised by Russian state television revealed to be a man in a costume

State-owned TV network Russia-24 ran a story about an impressive humanoid robot named Boris that wowed attendees at a youth technology conference. Turns out, Boris the Robot was actually a man inside a commercially-available, high-end robot costume. From The Guardian:

A photograph published by MBKh Media, the news agency founded by the Vladimir Putin opponent Mikhail Khodorkovsky, appeared to show the actor in the robot suit ahead of the forum on Tuesday in Yaroslavl, a city about 150 miles north-east of Moscow.

The organisers of the Proyektoria technology forum, held each year for the “future intellectual leaders of Russia”, did not try to pass off the robot as real, the website reported.

But whether by mistake or design, the state television footage did just that. “It’s entirely possible one of these [students] could dedicate himself to robotics,” an anchor reported. “Especially as at the forum they have the opportunity to look at the most modern robots.”

Read the rest

This plant drives its own robot

Elowan is a "plant-robot hybrid" that uses its own bio-electromechanical signaling to drive itself around toward light sources. From an explanation by researcher Harpreet Sareen and his colleagues at the MIT Media Lab:

In this experimental setup, electrodes are inserted into the regions of interest (stems and ground, leaf and ground). The weak signals are then amplified and sent to the robot to trigger movements to respective directions.

Such symbiotic interplay with the artificial could be extended further with exogenous extensions that provide nutrition, growth frameworks, and new defense mechanisms.
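The signal-to-motion loop described above can be sketched in a few lines (the gain, threshold, and function names are hypothetical illustrations, not the Media Lab's actual pipeline): amplify the weak electrode readings, then map whichever side signals more strongly to a steering command toward that side.

```python
# Rough sketch of a plant-signal-to-robot loop. All gains, thresholds,
# and names are hypothetical; only the amplify-then-trigger idea comes
# from the researchers' description.

GAIN = 1000.0       # amplification applied to the weak electrode signals
THRESHOLD = 0.5     # amplified signal level needed to trigger any movement

def drive_command(left_electrode_v, right_electrode_v):
    """Map amplified electrode readings to a simple steering command."""
    left = left_electrode_v * GAIN
    right = right_electrode_v * GAIN
    if max(left, right) < THRESHOLD:
        return "stop"
    return "turn_left" if left > right else "turn_right"

print(drive_command(0.0021, 0.0006))  # stronger signal on the left electrode
```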

Elowan: A plant-robot hybrid Read the rest

Space station astronauts have a new floating AI robot companion

Astronauts on board the International Space Station have switched on CIMON (Crew Interactive Mobile CompanioN), a new AI companion robot built by German space agency DLR, Airbus, and IBM. CIMON is an interface for IBM's Watson AI system. From Space.com:

Marco Trovatello, a spokesman of the European Space Agency's Astronaut Centre in Cologne, Germany, told Space.com that CIMON could respond within a couple of seconds after a question was asked, no slower than in ground-based tests.

A data link connects CIMON with the Columbus control center in Germany; from there, the signal travels first to the Biotechnology Space Support Center at the Lucerne University in Switzerland, where CIMON's control team is based. Then, the connection is made over the internet to the IBM Cloud in Frankfurt, Germany, Bernd Rattenbacher, the team leader at the ground control centre at Lucerne University, said in the statement...

"CIMON is a technology demonstration of what a future AI-based assistant on the International Space Station or on a future, longer-term exploration mission would look like," Trovatello said. "In the future, an astronaut could ask CIMON to show a procedure for a certain experiment, and CIMON would do that."

Read the rest

Diners use chest-mounted robot arms to feed each other in unusual social experiment

Researchers at Melbourne, Australia's RMIT University devised these bizarre chest-mounted "third arm" robots to experiment with what they call "playful eating." For science. Video below. From RMIT University's Exertion Games Lab:

In this experience, all three arms (the person’s own two arms and the “third” arm, the robotic arm) are used for feeding oneself and the other person. The robotic arm (third arm) is attached to the body via a vest. We playfully subverted the functioning of the robotic arm so that its final movements (once it has picked up the food), i.e. whether to feed the wearer or the partner, are guided by the facial expressions of the dining partner...

Mapping of the partner’s “more positive” facial expression to the feeding of food to the partner (via the wearer’s third arm) we hoped would elicit joy, laughter, and a sense of sharing based on the knowledge of feeding one another that is associated with positive emotions, however, this could also result in the perception of a loss of agency over what one eats. Through to-and-fro ambiguous movements of the third arm in the air (when sensing a “neutral” facial expression of the dining partner), it gave an opportunity to the diners to express their reactions more vividly, as we know that facial expressions become a key element to engage with a partner while eating.

"Arm-A-Dine: Towards Understanding the Design of Playful Embodied Eating Experiences" (PDF)

More at IEEE Spectrum: "Feed Your Friends With Autonomous Chest-Mounted Robot Arms"

Read the rest

Direct from the Uncanny Valley: Affetto, the freaky child android head

This is the new version of Affetto, the robot child head that's a testbed for synthetic facial expressions. According to the Osaka University researchers who birthed Affetto, their goal is to "offer a path for androids to express greater ranges of emotion, and ultimately have deeper interaction with humans." From Osaka University:

The researchers investigated 116 different facial points on Affetto to measure its three-dimensional movement. Facial points were underpinned by so-called deformation units. Each unit comprises a set of mechanisms that create a distinctive facial contortion, such as lowering or raising of part of a lip or eyelid. Measurements from these were then subjected to a mathematical model to quantify their surface motion patterns.

While the researchers encountered challenges in balancing the applied force and in adjusting the synthetic skin, they were able to employ their system to adjust the deformation units for precise control of Affetto’s facial surface motions.

“Android robot faces have persisted in being a black box problem: they have been implemented but have only been judged in vague and general terms,” study first author Hisashi Ishihara says. “Our precise findings will let us effectively control android facial movements to introduce more nuanced expressions, such as smiling and frowning.”
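One plausible reading of the mathematical model described above (this is a hypothetical sketch, not the Osaka team's published method) is to treat each facial point's displacement as a weighted sum of deformation-unit activations, then recover those weights from measurements by least squares.

```python
# Hypothetical sketch: model facial-point displacements as a linear mix
# of deformation-unit activations and recover the mixing weights from
# measured frames with least squares. Sizes echo the article's 116 points.
import numpy as np

rng = np.random.default_rng(0)
n_points, n_units, n_frames = 116, 3, 50

true_w = rng.normal(size=(n_units, n_points))        # unknown unit-to-point map
acts = rng.uniform(0, 1, size=(n_frames, n_units))   # commanded activations
measured = acts @ true_w                             # observed displacements

# With noiseless data and more frames than units, the fit is exact.
w_hat, *_ = np.linalg.lstsq(acts, measured, rcond=None)
print(np.allclose(w_hat, true_w))
```

With noiseless synthetic data the recovered weights match exactly; on a real face, the fit quality would instead measure how well the deformation units explain the skin's motion.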

Read the rest

Prototyping the betentacled, inflatable soft robots of zero gee

The MIT Media Lab's Spatial Flux Project was created by Carson Smuts and Chrisoula Kapelonis to imagine and prototype soft inflatable robots that would be designed to operate in zero-gee, where there is no up or down and "we do not have to contend with architecture's greatest arch-nemesis, gravity." Read the rest

Robots! A fantastic catalog of new species

IEEE Spectrum editor Erico Guizzo and colleagues have blown out their original Robots app into a fantastic catalog of 200 of today's robot species. They're cleverly organized into fun categories like "Robots You Can Hug," "Robots That Can Dance," "Space Robots," and "Factory Workers." If they keep it updated, it'll be very helpful for the robot uprising. After all, you can't tell the players without a program!

Robots: Your Guide to the World of Robotics (IEEE Spectrum)

Read the rest

Incredible video of Boston Dynamics' Atlas robot doing parkour

Boston Dynamics has just released this astounding video of their Atlas humanoid robot doing parkour:

The control software uses the whole body including legs, arms and torso, to marshal the energy and strength for jumping over the log and leaping up the steps without breaking its pace. (Step height 40 cm.) Atlas uses computer vision to locate itself with respect to visible markers on the approach to hit the terrain accurately.

Unfortunately the engineers failed to outfit Atlas with a speech synthesizer to yell "Parkour! Parkour! Parkour!" like so.

Read the rest

Scientists print robotic flippers based on sea lions

Most aquatic animals propel themselves with a tail or fluke; sea lions instead use their front flippers, and roboticists have long been interested in the remarkable speeds made possible by mimicking that kind of propulsion. Read the rest

Electromechanical "skin" turns everyday objects into robots

Yale engineers developed "robotic skins" from elastic sheets integrating sensors and electromechanical actuators. The idea is that most any flexible object could be transformed into a robot. Professor Rebecca Kramer-Bottiglio and her colleagues reported on their project, called OmniSkins, in the journal Science Robotics. From YaleNews:

Placed on a deformable object — a stuffed animal or a foam tube, for instance — the skins animate these objects from their surfaces. The makeshift robots can perform different tasks depending on the properties of the soft objects and how the skins are applied.

“We can take the skins and wrap them around one object to perform a task — locomotion, for example — and then take them off and put them on a different object to perform a different task, such as grasping and moving an object,” she said. “We can then take those same skins off that object and put them on a shirt to make an active wearable device.”

Read the rest

More posts