Watch a human musician and his robots improvise together


This performance by Shimon and the Shimi Robots showcases the PhD research of Georgia Tech doctoral student Mason Bretan on machine improvisation, path planning and embodied cognition. (Mason Bretan/YouTube)

This is a performance showcasing part of my PhD research in robotic musicianship at Georgia Tech including 

  • machine improvisation, 
  • path planning, and 
  • embodied cognition. 

The smaller Shimi robots figure out how to move based on an analysis of the music, while Shimon improvises over a precomposed chord progression using a generative algorithm that jointly optimizes for higher-level musical parameters and its own physical constraints.
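As a loose illustration of that joint-optimization idea (a toy sketch with invented scoring functions and weights, not Shimon's actual algorithm), an improviser could choose each note by trading off how well it fits the current chord against the physical cost of moving a mallet arm to reach it:

```python
# Toy sketch of jointly optimizing musical fit and physical cost.
# The chord table, scoring rules, and weights are all invented for
# illustration; they are not Shimon's real generative algorithm.

CHORD_TONES = {"Cmaj7": {0, 4, 7, 11}}  # pitch classes of each chord


def musical_fit(pitch, chord):
    """Reward chord tones; penalize notes outside the harmony."""
    return 1.0 if pitch % 12 in CHORD_TONES[chord] else 0.2


def movement_cost(arm_position, pitch, weight=0.05):
    """Penalize large mallet movements (distance in semitones)."""
    return weight * abs(pitch - arm_position)


def improvise(chord, candidates, arm_position, length=8):
    """Greedily pick notes that maximize fit minus movement cost."""
    line = []
    for _ in range(length):
        options = [p for p in candidates if p != arm_position]  # avoid repeats
        best = max(options, key=lambda p: musical_fit(p, chord)
                   - movement_cost(arm_position, p))
        line.append(best)
        arm_position = best  # the arm now rests at the note it just struck
    return line


line = improvise("Cmaj7", candidates=range(48, 72), arm_position=60)
```

Because the movement penalty competes with the harmonic reward, the same scoring function produces different lines for arms starting in different positions — a crude stand-in for "embodied" generation.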

The piece is called “What You Say” and is inspired by the high-energy funk piece “What I Say” from Miles Davis’ Live-Evil album. The incredible brilliance of the musicians on that album (as well as the numerous other great musicians around the world) is not only an inspiration to me and my own musical and instrumental aspirations, but also sets the standard for the level of musicianship that I hope machines will one day achieve. And through the power of artificial intelligence, signal processing, and engineering, I firmly believe it is possible for machines to be artistic, creative, and inspirational.

I hope you enjoy!

To use this video in a commercial player or in broadcasts, please email [email protected]

Rest assured that when our future robotic overlords come on the scene, they’ll have a sweet sense of rhythm.

The Robotic Musicianship Group at Georgia Tech has been working on Shimon, a musical robot that can improvise melodic accompaniment, for about six years now. For the past three years, they’ve also added Shimi — a small, smartphone-connected bot that can respond to music with dance and sound — to the mix.

Shimi shimmies. (Mason Bretan via The Washington Post)

“I’m always trying something new with the robots, and sometimes they surprise me with something that’s sort of out there or pretty cool,” he added.

His dissertation, which he hopes to turn in by the end of 2015, centers on teaching the robots to understand their physical constraints and abilities.

“So the goal is that if you gave the same input to a robot with 20 arms, it would perform differently than an eight-armed robot because it would be optimizing its performance,” he said. “Combined with the new algorithm we have for jazz music improvisation, these skills really allow them to more optimally achieve musical goals.”
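One way to picture why arm count changes the optimal performance — a deliberately simple toy model, not the group’s actual planner — is to give the same melody to planners with different numbers of arms and compare the total mallet travel each one needs. With more arms, an idle arm tends to be closer to each incoming note:

```python
# Toy model (invented for illustration, not the real path planner):
# assign each incoming note to the nearest arm and total up the
# mallet travel. It only shows why the number of arms changes the
# physically optimal way to play the same input.

def total_travel(notes, num_arms, low=48, high=84):
    """Total semitone distance the arms move to play `notes`."""
    # Spread the arms' starting positions evenly across the pitch range.
    step = (high - low) / (num_arms + 1)
    arms = [low + step * (i + 1) for i in range(num_arms)]
    travel = 0.0
    for pitch in notes:
        i = min(range(num_arms), key=lambda k: abs(arms[k] - pitch))
        travel += abs(arms[i] - pitch)
        arms[i] = pitch  # that arm is now parked at the note it struck
    return travel


melody = [60, 72, 55, 67, 50, 79, 62, 74]
few = total_travel(melody, num_arms=4)
many = total_travel(melody, num_arms=8)
```

The eight-armed planner covers the same melody with far less movement, so a cost-aware generator would naturally produce different (and more ambitious) lines for it — the intuition behind the quote above.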

And while he certainly doesn’t want to replace human musicians like himself with robots, he’s excited about the mechanical abilities they have that we don’t.

“I mean, Shimon already has four arms and can hold eight mallets,” he said, “so it can already do something a person can’t.”

ORIGINAL: Washington Post
By Rachel Feltman
Jan 14, 2015
