This TEDx Jaffa talk discusses how animation, acting lessons, and jazz led to a new approach to artificial intelligence for robots that “think with their bodies”: they plan less, improvise more, and take chances, even at the risk of making mistakes. It also explains why this makes for robots that better fit our homes, offices, schools, and hospitals.
Robotic Speaker Dock and Listening Companion
First demo of “Travis”, an expressive robotic platform. In this demo, Travis listens to a beat, finds the song that best matches it, and responds to the music in real time.
Human-Robot Jazz Improvisation
The stage debut of Shimon, the robotic marimba player, and the world’s first real-time human-robot improvisation system. The performance is a rendition of Duke Jordan’s “Jordu” for human piano and robot marimba. Watch the full performance here >.
The Confessor: Human-Robot Theater
The robotic desk lamp AUR took part in the first human-robot theater play written for and performed by human and robot actors. The robot was controlled through a novel actor point-of-view hybrid control system structured around scenes, beats, and actions, enabling the robot to react in time and make eye contact with the actors during rehearsals and performances. This system is a first step in an ongoing effort toward a fully autonomous robotic actor.
AUR Robotic Desk Lamp
First concept video featuring AUR, a robotic desk lamp developed at the Media Lab. In the video, the lamp tracks the human user, illuminating the right area at the right time. It also helps the user find objects in the workspace by switching to a different light color and narrowing its beam with an iris to point them out.