Robot bird with real pigeon feathers to improve agility

PigeonBot is a robotic bird outfitted with real pigeon feathers that move to reshape its wings like an actual bird's. Developed by researchers in Stanford's Lentink Lab, the remote-controlled PigeonBot demonstrates how morphing wings improve flight agility. (Video below.) The resulting technical paper is the cover story in the current issue of the journal Science Robotics. From Science News:

Birds can modify the shape of their wings by fanning out their feathers or shuffling them closer together. Those adjustments allow birds to cut through the sky more nimbly than rigid drones....

Researchers bent and extended the wings of dead pigeons to investigate how the birds control their wing shape. Those experiments revealed that the angles of two wing joints, the wrist and the finger, most affect the alignment of a wing’s flight feathers. The orientations of those long, stiff feathers, which support the bird in flight, help determine the wing’s shape. Based on those findings, the team built a robot with real pigeon feathers, whose faux wrists and fingers can morph its wing shape as seen in the pigeon cadavers.


Very weird faceless robot baby for elderly people

Hiro-chan is a very simple, inexpensive, and, er, faceless robotic baby doll designed to comfort elderly people. (Video below.) Unlike the similar-looking Amish dolls that lack faces for religious reasons, Hiro-chan's developer Vstone says that leaving the features up to the individual's imagination is an effective way to increase the emotional bond. From Evan Ackerman's article at IEEE Spectrum:

Hiro-chan’s entire existence seems to be based around transitioning from sad to happy in response to hugs. If left alone, Hiro-chan’s mood will gradually worsen and it’ll start crying. If you pick it up and hug it, an accelerometer will sense the motion, and Hiro-chan’s mood will improve until it starts to laugh. This is the extent of the interaction, but you’ll be glad to know that the robot has access to over 100 utterance variations collected from an actual baby (or babies) to make sure that mood changes are fluid and seamless.

...Since the functionality of the robot depends on you getting it to go from sad to happy, Vstone says that giving the robot a face (and a fixed expression) would make that much less convincing and emotionally fulfilling—the robot would have the “wrong” expression half the time. Instead, the user can listen to Hiro-chan’s audio cues and imagine a face. Or not. Either way, the Uncanny Valley effect is avoided (as long as you can get over the complete lack of face, which I personally couldn’t), and the cost of the robot is kept low since there’s no need for actuators or a display.
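The hug-driven mood loop described above amounts to a tiny state machine. Here is a minimal sketch of that idea; Vstone has not published Hiro-chan's internals, so the class, thresholds, and the boolean stand-in for the accelerometer are all hypothetical:

```python
# Illustrative sketch only: Hiro-chan's actual firmware is not public.
# Mood drifts down when the doll is left alone and rises when a hug is
# sensed (the real device infers hugs from an accelerometer).

class MoodDoll:
    def __init__(self):
        self.mood = 0.0  # -1.0 (crying) .. +1.0 (laughing)

    def tick(self, hug_detected: bool) -> str:
        # One time step: hugs raise the mood, neglect lowers it slowly.
        delta = 0.2 if hug_detected else -0.05
        self.mood = max(-1.0, min(1.0, self.mood + delta))
        return self.utterance()

    def utterance(self) -> str:
        # The real robot picks from 100+ recorded baby sounds; we use labels.
        if self.mood <= -0.5:
            return "cry"
        if self.mood >= 0.5:
            return "laugh"
        return "coo"
```

Because the mood moves gradually rather than flipping, the transitions between crying and laughing stay smooth, which is the effect the recorded utterance variations are meant to support.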


Pizza-making robot startup lays off 80% of staff

SoftBank-funded unicorn Zume ran out of dough

Students build pyramid from 27,434 toilet paper rolls (VIDEO)

A group of high school students in Michigan made good use of their holiday break: they built a toilet paper pyramid.

Remembering Laundroid and other robotics companies that died in 2019

Robotics is a tough business. “If you think 2018 was a tough year for robotics companies, 2019 wasn’t any better,” writes Peter Singer.

North Carolina couple call cops on midnight intruder that turns out to be their robot vacuum

[The moral of this story: buy a Roomba; they last longer and have better software.]

A man and a woman in Forsyth County, North Carolina, called for help just after midnight when they awoke to loud noises and crashing downstairs, and hid in their closet to dial 911.

Marvel at this autonomous, drifting DeLorean

MARTY is the name of this 1981 DeLorean that researchers from Stanford’s Dynamic Design Lab customized into a self-driving electric car. Now, Jon Goh and Tushar Goel have augmented MARTY so it's capable of drifting through a complicated driving course with incredible precision. From Stanford:

Conducting research in high-speed, complicated driving conditions like this is a bread-and-butter approach of the Dynamic Design Lab, where mechanical engineer Chris Gerdes and his students steer autonomous cars into challenging driving situations that only the top human drivers can reliably handle. On-board computers measure the car’s response over dozens of runs, and the engineers translate those vehicle dynamics into software that could one day help your car quickly dodge a pedestrian that darts into the road.

Most automated vehicles on the road have been designed to handle simpler cases of driving, such as staying in a lane or maintaining the right distance from other cars.

“We’re trying to develop automated vehicles that can handle emergency maneuvers or slippery surfaces like ice or snow,” Gerdes said. “We’d like to develop automated vehicles that can use all of the friction between the tire and the road to get the car out of harm’s way. We want the car to be able to avoid any accident that’s avoidable within the laws of physics.”


Robot Reindeer Happy Holidays

A classic robotics video from Boston Dynamics in 2015.

Think of how much more developed the robots are now!

Yikes.

[YouTube]

Robots that can repair themselves and self-augment

University of Tokyo engineers have taught a robot how to repair itself. Well, they taught it to tighten its own screws. And with that skill, it also was able to self-install a hook for hanging a tote bag from its shoulder. From IEEE Spectrum:

At the moment, the robot can’t directly detect on its own whether a particular screw needs tightening, although it can tell if its physical pose doesn’t match its digital model, which suggests that something has gone wonky. It can also check its screws autonomously from time to time, or rely on a human physically pointing out that it has a screw loose, using the human’s finger location to identify which screw it is. Another challenge is that most robots, like most humans, are limited in the areas on themselves that they can comfortably reach. So to tighten up everything, they might have to find themselves a robot friend to help, just like humans help each other put on sunblock.
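The "physical pose doesn't match its digital model" check described above can be sketched simply: compare where each screw-bearing link should be according to the CAD data with where it is measured to be, and flag any link that has drifted. The link names, coordinates, and tolerance below are invented for illustration; the actual system works with full screw poses from CAD data, not bare points:

```python
# Hypothetical sketch of a pose-mismatch check for loose screws.
import math

CAD_MODEL = {  # expected (x, y, z) of each screw-bearing link, in meters
    "shoulder": (0.00, 0.10, 0.50),
    "elbow":    (0.00, 0.10, 0.25),
}

def find_suspect_links(measured: dict, tolerance: float = 0.01) -> list:
    """Return links whose measured pose deviates from the CAD model.

    A deviation beyond the tolerance suggests a screw may have loosened,
    making that link a candidate for autonomous re-tightening.
    """
    suspects = []
    for link, expected in CAD_MODEL.items():
        if math.dist(expected, measured[link]) > tolerance:
            suspects.append(link)
    return suspects
```

Note that, per the article, the real robot cannot sense a loose screw directly; a check like this only narrows down *where* something has gone wonky, after which a human finger-point (or a periodic self-check) identifies the specific screw.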

And here is their technical paper: "Self-Repair and Self-Extension by Tightening Screws based on Precise Calculation of Screw Pose of Self-Body with CAD Data and Graph Search with Regrasping a Driver"

The Life Cycle podcast talks transhumanism with Kernel CEO Bryan Johnson and "To Be a Machine" author Mark O'Connell

Is your brain a machine? Are your thoughts and feelings just malware of the mind? (And what "really" is a machine, anyway?) John and Eva referee the transhumanist fight of the century. In the blue corner, we have Eva meeting Kernel founder and CEO Bryan Johnson, straight from his office in LA. And in the red corner, John meets with To Be a Machine author Mark O'Connell in a cafe in Dublin. Time to get out the popcorn! Round One, ding-ding...

The Life Cycle is a production of Klang Games, creator of Seed, the planet colonization MMO -- watch the new trailer here. Subscribe to The Life Cycle on Apple Podcasts, Google Podcasts, and Spotify. Follow The Life Cycle on Twitter and Instagram.

Robot millipede printed by 3D printer that can spew out multiple materials

It's historically been tough and slow-going to 3D print objects made from multiple materials. Now, Harvard researchers have developed an ingenious nozzle that enables a 3D printer to spew out eight different materials at the resolution of a human hair. To demonstrate the system, they printed fantastic flexible origami structures and even a "soft" robotic millipede from a variety of epoxy and silicone elastomer inks. Mark A. Skylar-Scott, Jochen Mueller, and their colleagues from Harvard's Wyss Institute for Biologically Inspired Engineering presented their work in the scientific journal Nature: "Voxelated soft matter via multimaterial multinozzle 3D printing"


The Life Cycle podcast meets neurologist Dr. Phil Kennedy, who had a brain-computer interface implanted in his head

In Episode 5 of this podcast on the future of humanity, co-host Eva Kelley travels to meet transhumanist pioneer and neurologist Dr. Phil Kennedy, who recently had a brain-computer interface installed in his own head. Dr. Kennedy tells Eva all about that experience (including gory footage from the operation), compares his approach to brain-computer interfaces with those being developed by people like Elon Musk ("they forgot the brain doesn't like electrodes"), and discusses the implications of this technology on human evolution. Eva and co-host John Holten close by reading an excerpt from Dr. Kennedy's self-published novel, which features a sex scene between a life support robot and his longtime wife.

The Life Cycle is a production of Klang Games, creator of Seed, the planet colonization MMO -- watch the new trailer here. Subscribe to The Life Cycle on Apple Podcasts, Google Podcasts, and Spotify. Follow The Life Cycle on Twitter and Instagram.

Robot appendage "grows" like a plant

Inspired by the way plants grow, MIT researchers designed a flexible robot appendage that can work in tight spaces but is rigid enough to support heavy parts or twist tight screws. From MIT News:

The appendage design is inspired by the way plants grow, which involves the transport of nutrients, in a fluidized form, up to the plant’s tip. There, they are converted into solid material to produce, bit by bit, a supportive stem.

Likewise, the robot consists of a “growing point,” or gearbox, that pulls a loose chain of interlocking blocks into the box. Gears in the box then lock the chain units together and feed the chain out, unit by unit, as a rigid appendage...

“The realization of the robot is totally different from a real plant, but it exhibits the same kind of functionality, at a certain abstract level,” (mechanical engineer Harry) Asada says.


Adorable Mini Cheetah robots tested

MIT's Biomimetics Lab tested nine Mini Cheetah robots in the institute's Killian Court. Here's the adorable yet uncanny footage.

Robot assemblers build structures out of identical modular pieces

Pushing forward on the vision of "programmable matter," MIT researchers demonstrated a new kind of assembly system based on robots that can collaboratively build complicated structures from small identical pieces. Professor Neil Gershenfeld, graduate student Benjamin Jenett, and their colleagues present their research in a scientific paper titled "Material–Robot System for Assembly of Discrete Cellular Structures." From MIT News:

“What’s at the heart of this is a new kind of robotics, that we call relative robots,” Gershenfeld says. Historically, he explains, there have been two broad categories of robotics — ones made out of expensive custom components that are carefully optimized for particular applications such as factory assembly, and ones made from inexpensive mass-produced modules with much lower performance. The new robots, however, are an alternative to both. They’re much simpler than the former, while much more capable than the latter, and they have the potential to revolutionize the production of large-scale systems, from airplanes to bridges to entire buildings.

According to Gershenfeld, the key difference lies in the relationship between the robotic device and the materials that it is handling and manipulating. With these new kinds of robots, “you can’t separate the robot from the structure — they work together as a system,” he says. For example, while most mobile robots require highly precise navigation systems to keep track of their position, the new assembler robots only need to keep track of where they are in relation to the small subunits, called voxels, that they are currently working on.


Watch: OpenAI enabled a one-handed robot to solve a Rubik's Cube

OpenAI Inc. demonstrated a one-handed robot solving a Rubik's Cube. Apparently the real breakthrough in this milestone was teaching the system to do the task in simulation. “While the video makes it easy to focus on the physical robot, the magic is mostly happening in simulation, and transferring things learned in simulation to the real world," writes Evan Ackerman in IEEE Spectrum:

The researchers point out that the method they’ve developed here is general purpose, and you can train a real-world robot to do pretty much any task that you can adequately simulate. You don’t need any real-world training at all, as long as your simulations are diverse enough, which is where the automatic domain randomization comes in. The long-term goal is to reduce the task specialization that’s inherent to most robots, which will help them be more useful and adaptable in real-world applications.
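The domain randomization idea above can be sketched in a few lines: each simulated training episode samples physics parameters (friction, masses, and so on) from ranges, and in the "automatic" variant those ranges widen as the policy succeeds, forcing it to become robust enough to handle the unmodeled quirks of the real robot. The parameter names, ranges, and widening rule below are illustrative, not OpenAI's actual configuration:

```python
# A minimal sketch of automatic-style domain randomization for sim-to-real
# transfer. Each episode gets its own random physics; ranges grow over time.
import random

class DomainRandomizer:
    def __init__(self):
        # (low, high) sampling ranges for simulated physics parameters.
        self.ranges = {"friction": [0.9, 1.1], "cube_mass": [0.09, 0.11]}

    def sample_episode(self, rng=random) -> dict:
        """Draw one set of physics parameters for a training episode."""
        return {k: rng.uniform(lo, hi) for k, (lo, hi) in self.ranges.items()}

    def expand(self, factor: float = 0.1) -> None:
        """Widen every range, e.g. after the policy passes an evaluation."""
        for lo_hi in self.ranges.values():
            span = lo_hi[1] - lo_hi[0]
            lo_hi[0] -= span * factor
            lo_hi[1] += span * factor
```

Because the policy never sees the same simulated world twice, the real world ends up looking like just one more sample from the training distribution, which is the crux of transferring without any real-world training.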


This soft inchworm robot could lead to new smart clothing and morphing airplane wings

This soft inchworm robot changes shape in response to tiny electrical or temperature changes. The power-efficient robot is made from a specialized "programmable" polymer that, according to the University of Toronto researchers, could someday not only lead to lighter, safer robots but also enable other kinds of smart materials. From EurekAlert!:

"In situations where humans could be in danger -- a gas leak or a fire -- we could outfit a crawling robot with a sensor to measure the harmful environment," explains Naguib. "In aerospace, we could see smart materials being the key to next-generation aircrafts with wings that morph."

Though he points out that it will be some time before the world sees morphing-wing aircraft, the most immediate impact will be in wearable technology.

"We're working to apply this material to garments. These garments would compress or release based on body temperature, which could be therapeutic to athletes," says Naguib. The team is also studying whether smart garments could be beneficial for spinal cord injuries.

"In this case, we've trained it to move like a worm," he says. "But our innovative approach means we could train robots to mimic many movements -- like the wings of a butterfly."

