Robots have already mastered Rubik's cubes, have been controlled by the neural activity of a worm, and have even managed to uncover the secrets of the Great Pyramid. And now, apparently, they've got improvisation sorted too.
In the video above, PhD student Mason Bretan from the Robot Musicianship Group at Georgia Tech in the US jams with four of the robots he's helped create. And as you can see, they've definitely got rhythm.
Shimon, the robot that wields the mallets, and the three smartphone-connected, bopping Shimis were created by the Georgia Tech group to improvise musical accompaniment and dance.
And although Bretan gave the robots the arrangement he'd be playing, as well as a few recordings and cues they could draw on during the jam session, they also added plenty of their own material to the song in response to the notes he was playing, as Rachel Feltman writes for The Washington Post.
For example, that amazing marimba solo played by Shimon in the middle of the track? Entirely improvised by the robot, as are the Shimis' bopping dance moves.
The lab has been working on Shimon for six years now, with the end goal of teaching the robot to compose jazz music by itself. And since Bretan joined, they've also added the Shimis to the band; the hope is that these will act as musical accompanists to human performers, adding dance moves and electronic sounds.
To do this, the team found a way to break down music theory and turn it into a programming language that the robots can recognise, understand and even create.
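To get a rough feel for what that might look like, here is a minimal, hypothetical sketch in Python. It is not the Georgia Tech system and none of the names come from it; it simply shows how a music-theory concept like a scale can be encoded as data that a program can query to "improvise" notes within the rules.

```python
# Hypothetical illustration only: music-theory knowledge (scales) as data,
# plus a tiny routine that picks notes allowed by that knowledge.

import random

# Semitone offsets for a few common scales.
SCALES = {
    "major":      [0, 2, 4, 5, 7, 9, 11],
    "dorian":     [0, 2, 3, 5, 7, 9, 10],
    "mixolydian": [0, 2, 4, 5, 7, 9, 10],
}

def scale_notes(root_midi, scale_name, octaves=2):
    """Return the MIDI note numbers of a scale starting at root_midi."""
    offsets = SCALES[scale_name]
    return [root_midi + 12 * o + s for o in range(octaves) for s in offsets]

def improvise(root_midi=60, scale_name="dorian", length=8):
    """Build a short phrase by walking randomly within the chosen scale."""
    pool = scale_notes(root_midi, scale_name)
    note = random.choice(pool)
    phrase = [note]
    for _ in range(length - 1):
        # Favour small melodic steps over large leaps.
        idx = pool.index(note)
        idx = max(0, min(len(pool) - 1, idx + random.choice([-2, -1, -1, 1, 1, 2])))
        note = pool[idx]
        phrase.append(note)
    return phrase

print(improvise())  # e.g. a short MIDI phrase such as [62, 65, 63, 65, 67, 65, 63, 62]
```

A real robot musician would of course layer far more on top of this, from rhythm and harmony to listening to a human player, but the underlying idea is the same: theory becomes data structures and rules the machine can manipulate.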
"I'm always trying something new with the robots, and sometimes they surprise me with something that's sort of out there or pretty cool," Bretan told Feltman.
With his doctoral project, he hopes to make the robots aware of the physical limitations imposed by their structure.
"Because their note generating algorithms rely on their physical constraints, if you give the same algorithm to a different robot (like a marimba playing robot with one arm or 10 arms), you will get different musical behaviours and outcomes," Bretan told Mashable via email.
A musician since the age of four, Bretan says his aim definitely isn't to replace human musicians, but simply to explore the creativity that might be hidden within artificial intelligence, and the ways robots could complement human artists with their own unique traits, such as powerful processing and long-term memory.
As Bretan writes in his YouTube video description:
"The piece is called "What You Say" and is inspired by the high energy funk piece, "What I Say", from Miles Davis' Live-Evil album. The incredible brilliance of the musicians on that album (as well as the numerous other great musicians around the world) are not only an inspiration to me and my own musical and instrumental aspirations, but also set the standard for the level of musicianship that I hope machines will one day achieve. And through the power of artificial intelligence, signal processing, and engineering I firmly believe it is possible for machines to be artistic, creative, and inspirational."
Sources: The Washington Post, Mashable