Disney Research Zurich and ETH Zurich (Swiss Federal Institute of Technology) developed VertiGo, a mobile robot that can roll up walls. It uses two tiltable propellers that keep it rolling and also provide the thrust that keeps it against the wall when moving vertically.
“About why Disney is interested in this area, I am not able to say specifics as you can understand," Disney Research scientist Paul Beardsley told IEEE Spectrum. "But just speaking in general, one can imagine that robots with lighting effects could be useful for entertainment effects or for wall games. This also relates to the question of why the ground-wall transition is useful. If you have to manually place a robot on a wall at the start of a deployment, and manually remove it at the end, then that's taking manpower and it's not flexible. If the robot can make those transitions automatically, then you are a step in the direction of autonomous deployment, and that makes the technology more powerful. We are motivated by making a practical device, so it is real-world feedback and challenges that drive our work.”
Harry Houdini thought he was a superstar, but there was one medium he was unable to conquer: movies.
Tomorrow evening (11/20), San Francisco's de Young Museum will celebrate "100 Years of Robot Art and Science in the Bay Area" with an event organized by UC Berkeley professor Ken Goldberg and Alexander Rose, executive director of the Long Now Foundation. The program includes a "Long Conversation," a sort of relay-race discussion that I'll be participating in along with ten interesting people whose work is at the intersection of art and technology! Bonus: My friend Kal Spelletich will also bring two of his "praying robots," seen above! Best of all, it's free and starts early (6:30pm)!
Josette Melchor (Grey Area Foundation for the Arts)
Dorothy R. Santos (writer, curator)
Tim Roseborough (artist, musician, former Kimball Artist-in-Residence)
John Markoff (author of Machines of Loving Grace)
Karen Marcelo (dorkbotSF)
David Pescovitz (Boing Boing and Institute for the Future)
Catharine Clark (Catharine Clark Gallery)
Alexander Rose (director, Long Now Foundation)
Pieter Abbeel (professor, Computer Sciences, UC Berkeley)
Terry Winograd (Computer Science department, Stanford University)
Kal Spelletich (Seemen)
With special VJ Jenny Odell
Complimentary tickets for the long conversation are distributed beginning at 5:30 pm at the Koret Auditorium entrance. Seating is limited. Tickets are first come, first served.
Programming and general admission to the permanent collection galleries are free of charge during Friday Nights at the de Young. A discounted $15 ticket is required to visit the special exhibition galleries.
Long Conversation (de Young)
“100 Years of Robot Art and Science in the Bay Area” Long Conversation November 20th 02015 (The Long Now Foundation) Read the rest
Robots have a hard time making their way across uneven, unstable terrain. Read the rest
MIT researchers developed this "Soft Cube Capable of Controllable Continuous Jumping." From IEEE Spectrum:
Inside of the robot there are two motorized rotors, each connected to one end of four flattened loops of spring steel. Activating the rotors causes the spring steel loops that I’m just going to go ahead and call tongues to get pulled through rectangular openings (mouths) into a round cavity inside the body of the robot, compressing them. As the rotors continue to turn, eventually the compressed tongues get pulled all the way around back to the mouths, at which point they spring out, releasing that elastic energy all at once and causing the robot to jump.
"With some light-weight payloads, such as miniature cameras, the robot can be used for exploration tasks," write the researchers. "Moreover, a wireless sensor network can be automatically deployed and reconfigured for outdoor surveillance by using a group of our jumping robots."
Next, they hope to increase the robots' power so the cubes can jump higher and cover more ground.
"MIT's Cube Robot Uses Springy Metal Tongues to Jump" (IEEE Spectrum)
The US military's Defense Advanced Research Projects Agency is funding a new project to develop musical robots that can improvise a solo when playing with human jazz musicians. A collaboration between new media researchers at the University of Illinois at Urbana-Champaign and musicians at the University of Arizona, the goal of the MUSICA (Musical Improvising Collaborative Agent) project is to explore non-traditional "languages" for people and computers to interact. From Scientific American:
Read the rest
"There is definitely a desire for more natural kinds of communications with computational systems as they grow in their ability to be intelligent," Ben Grosser, an assistant professor of new media at the University of Illinois at Urbana-Champaign, told Live Science. "A lot of us are familiar with various methods of interacting with computers, such as text-based and touch-based interfaces, but language-based interfaces such as Siri or Google Now are extremely limited in their capabilities...."
To develop a machine capable of playing improvisational jazz, the researchers will create a database of jazz solos from a variety of musicians and have computers analyze the recordings to figure out the various processes that come into play when a musician improvises. The researchers will then develop a performance system to analyze the components of human jazz performances, including the beat, pitch, harmony and rhythm. The system will also consider what it has learned about jazz solos to communicate and respond musically in real time....
"Let's face it—trying to develop a system that can play jazz is a crazy idea," Grosser said.
Michael Froomkin writes, "We Robot is a cool conference that brings together lawyers, engineers, philosophers, robot builders, ethicists, and regulators who are on the front lines of robot theory, design, or development. The 2016 edition will be in Coral Gables, Florida on April 1-2, 2016 at the University of Miami School of Law. The main conference will be preceded by a day of special workshops on March 31. Full details at Read the rest
Researchers from Japan's Chiba Institute of Technology demonstrated this spherical robot that rolls around until its four legs pop out for scurrying. Like a quadruped robot disguised as a Sphero! From the scientific paper (PDF):
We have proposed and developed a new quadruped walking robot with a spherical shell, called "QRoSS". QRoSS is a transformable robot that can store its legs in the spherical shell. The shell not only absorbs external forces from all directions, but also improves mobile performance because of its round shape. In rescue operations at a disaster site, carrying robots into a site is dangerous for operators because doing so may result in a second accident. If QRoSS is used, instead of carrying robots in, they are thrown in, making the operation safe and easy. We developed QRoSS-I and conducted basic experiments to verify its performance, including landing, rising, and walking, through a series of movements. (via IEEE Spectrum)
"I believe this is the first juggling robot to juggle more than 5 balls," Peterson says. "Yeah it's not toss juggling (into the air), but that would be my next project."
Build notes and images on imgur here.
Here's his full project page with previous designs.
The third incarnation of the University of Tokyo's Janken (Rock-Paper-Scissors) robot never loses. Ever. From the Ishikawa Watanabe Laboratory:
Read the rest
In this research we develop a janken (rock-paper-scissors) robot with a 100% winning rate as one example of a human-machine cooperation system. The human plays one of rock, paper, or scissors on the count of one, two, three. Based on that timing, the robot hand plays the move that beats the human's.
Recognition of the human hand is performed in 1 ms with a high-speed vision system, which determines the position and shape of the hand. The wrist joint angle of the robot hand is controlled based on the position of the human hand, and the vision system classifies the hand's shape as rock, paper, or scissors. The robot hand then plays the winning move within 1 ms.
This technology is one example that shows the possibility of cooperative control within a few milliseconds, and it can be applied to motion support for humans and to cooperative work between humans and robots without time delay.
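Once the high-speed vision has classified the human's gesture, the robot's decision step is trivial: it is a fixed lookup of the gesture that beats the one detected. A minimal sketch of that lookup (not the lab's code; the gesture names and function name are assumptions for illustration):

```python
# Which gesture beats which: the robot always answers with the winner.
BEATS = {
    "rock": "paper",      # paper covers rock
    "paper": "scissors",  # scissors cut paper
    "scissors": "rock",   # rock blunts scissors
}

def counter_move(detected: str) -> str:
    """Return the gesture that beats the human's detected gesture."""
    return BEATS[detected]

print(counter_move("rock"))  # -> paper
```

The hard part, of course, is not this table but performing the detection and actuation within a millisecond, fast enough that the human cannot perceive the robot reacting.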
From another point of view, locating factories overseas has been advantageous for labor-intensive processes that require human eyes and hands, because such processes are difficult or too costly to automate. However, by achieving processing faster than human working speed, productivity can be improved relative to cost.
Literary podcaster Rick Kleffel writes, "I must admit that it was too much fun to sit down with John Markoff and talk (MP3) about his book Machines of Loving Grace. Long ago, I booted up a creaking, mothballed version of one of the first Xerox minicomputers equipped with a mouse to extract legacy software for E-mu. Fifteen years later I was at the first Singularity Summit; the book was a trip down many revisions of memory road."
Read the rest
John Markoff’s ‘Machines of Loving Grace: The Quest for Common Ground Between Humans and Robots’ is a fascinating, character-driven vision of how the recent past created the present and is shaping the near future. The strong and easily understood conflict at the heart of this work gives readers an easy means of grasping the increasingly complicated reality around us. If we do not understand this history, the chances are that we will not have the opportunity to be doomed to repeat it.
Our technological ecology began in two computer labs at Stanford in the early sixties. In one lab, John McCarthy coined the term “artificial intelligence” with the intention of creating a robot that could think like, move like, and replace a human within ten years. On the opposite side of the campus, Douglas Engelbart wanted to make it easier for scholars to collaborate using an increasingly vast amount of information. He called it IA, intelligence augmentation, as a direct response to AI. Thus were born two very different design philosophies that still shape our technology today – and will continue to do so in the future.
A robotic Shanah Tovah (Happy New Year!) from the Technion – Israel Institute of Technology! Read the rest