
  • MALE SPEAKER: Welcome.

  • It's my pleasure to introduce Gil Weinberg.

  • He's a professor at Georgia Tech in the Center

  • for Music Technology.

  • And I went to Georgia Tech a long, long time ago

  • and we didn't have a music department back then.

  • So when I visited recently, the folks I was meeting with in ECE

  • sent me over to meet Gil and he's

  • doing some really cool stuff with interactive computing

  • and musicianship.

  • So building machines that you can interact with,

  • not just from a dialogue perspective,

  • like the kind of thing I'm thinking about,

  • but more from a music perspective

  • where you want to jam and do some jazz

  • improvisation with a robot.

  • And he's going to talk about that and many other things

  • today.

  • So welcome.

  • GIL WEINBERG: Thank you.

  • [APPLAUSE]

  • Thank you.

  • Thank you for coming.

  • So I'm going to talk about three or four main projects.

  • Only three of them are shown in these slides.

  • The last one is a surprise.

  • It's a project we just finished, and I have some very,

  • very fresh new slides.

  • The reason I started to be interested

  • in robotic musicianship is that I'm a musician.

  • Before I became interested in computation or in robotics,

  • I was a musician, and I still am.

  • And I was always fascinated by playing in a group.

  • By being constantly ready to change what I'm doing.

  • By trying to build on what other people

  • are playing in real time.

  • Improvising in a group involves visual cues and auditory cues, of course.

  • So when I started to be interested in robotics,

  • I wanted to capture this experience.

  • Before I show you some of the efforts that I made,

  • maybe I'll show a short clip of me playing with a trumpet

  • player, so you can see the kinds of experiences

  • that I was trying to recreate.

  • [VIDEO PLAYBACK]

  • [JAZZ MUSIC]

  • [END PLAYBACK]

  • So I think you've seen a lot of eye contact,

  • trying to build a motif when I'm not sure what it's going to be,

  • and trying to create something interesting with him,

  • back and forth.

  • And the first robots that I tried to develop

  • were built on these ideas.

  • But what I wanted to do is to have the robot understand music

  • like humans do.

  • The big idea that I started with

  • was to create robots that listen like humans,

  • but improvise like machines.

  • Because I felt that if I want to push what music

  • is about through new, novel improvisation

  • algorithms and new acoustic playing,

  • I first have to have a connection between humans and robots,

  • and that's why there is the listening-like-humans part.

  • Only then would I be able to start to make the robot

  • play like a machine in order to create this kind of connection

  • and relationship.

  • I'll start with something very simple

  • that probably many of you are familiar with.

  • If I want a robot to understand me,

  • maybe the first simple thing that I can make it do

  • is understand the beat of the music that I play.

  • And here we use auto-correlation and self-similarity algorithms.

  • This is a piece by Bach.

  • You see that time is on both the x and the y axes.

  • And you see that it's symmetric, and by comparing

  • the [INAUDIBLE] to the algorithm, you

  • can try to find sections that are similar

  • and detect the beat from that.
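
Here is a minimal Python sketch of that idea. It assumes an onset-strength envelope has already been extracted from the audio; the function names, frame rate, and tempo range are illustrative assumptions rather than details from the talk:

```python
import numpy as np

def self_similarity_matrix(features):
    """Pairwise cosine similarity between feature frames. Time runs along
    both axes, so the matrix is symmetric, like the Bach plot in the talk."""
    f = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-9)
    return f @ f.T

def estimate_beat_period(onset_env, frame_rate, min_bpm=60, max_bpm=180):
    """Estimate tempo by autocorrelating an onset-strength envelope:
    the lag at which the signal best matches itself gives the beat period."""
    x = onset_env - onset_env.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]  # non-negative lags only
    min_lag = int(frame_rate * 60.0 / max_bpm)
    max_lag = int(frame_rate * 60.0 / min_bpm)
    lag = min_lag + int(np.argmax(ac[min_lag:max_lag]))
    return 60.0 * frame_rate / lag  # tempo in beats per minute
```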

  • But what we try to do is to actually have it in real time.

  • And you see here my student Scott Driscoll

  • used this algorithm based on Davies and Plumbley

  • from Queen Mary.

  • And you see that it becomes much more

  • sophisticated, because it's not just analyzing the beat in Bach

  • or in the Beatles-- Scott is playing in real time, so Haile

  • is trying to get the beat, and then Scott starts to play with it.

  • Scott is trying to fit what he is

  • doing to what Haile-- Haile is the robot-- is doing, and back and forth.

  • So it's a little more complicated, you see.

  • Sometimes they escape from the beat and get back to the beat.

  • And here's a short example.

  • Got the beat.

  • Now Scott will start playing faster and slower.

  • Faster-- got it.

  • So as you can see, it loses it, it gets it back.

  • I think in a second, it will play slower, which

  • shows how the system fails.
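
The Davies and Plumbley tracker itself is more involved; what follows is only a toy sketch of the real-time idea, not their algorithm: keep a running beat-period estimate, nudge it with each incoming inter-onset interval, and ignore intervals that are too far off. That gating is exactly how a tracker like this can lose the beat and then re-acquire it:

```python
class OnlineBeatTracker:
    """Toy real-time tempo follower: a much-simplified stand-in for the
    Davies and Plumbley tracker mentioned in the talk."""

    def __init__(self, initial_period=0.5, adapt=0.2, tolerance=0.25):
        self.period = initial_period  # seconds per beat
        self.adapt = adapt            # how quickly we follow tempo changes
        self.tolerance = tolerance    # fraction of period accepted as "on beat"
        self.last_onset = None

    def on_onset(self, t):
        """Call with the timestamp of each detected onset; returns current BPM."""
        if self.last_onset is not None:
            ioi = t - self.last_onset
            # Only adapt when the interval is close to the current beat period;
            # wildly off intervals (syncopation, dropouts) are the moments
            # where the tracker loses the beat until the player settles again.
            if abs(ioi - self.period) < self.tolerance * self.period:
                self.period += self.adapt * (ioi - self.period)
        self.last_onset = t
        return 60.0 / self.period
```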

  • The next thing that I was trying to do

  • is to have the robot understand other things at a high level

  • musically.

  • Not just the beat but concepts that we humans

  • understand such as stability and similarity.

  • And basically we had a huge database

  • of rhythms generated almost randomly-- with some rules,

  • some stochastic rules.

  • And then we had coefficients for stability and similarity.

  • Whenever Haile listened to a particular beat,

  • there were settings and coefficients

  • for stability and similarity.

  • At some point the robot actually decided by itself.

  • At first, a human on the side could change the similarity

  • and stability and create some output to bring a rhythm back.

  • And the similarity measure is based on Tanguiane from 1993,

  • basically looking at the overlapping

  • onsets between beats.

  • I can get more into this if you want, maybe later.
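
A guess at the flavor of that overlap measure, sketched in Python. Representing each rhythm as a set of onset positions quantized to a grid, and the Jaccard-style normalization, are my assumptions, not Tanguiane's exact formulation:

```python
def rhythm_similarity(onsets_a, onsets_b, steps=16):
    """Overlap-based similarity in the spirit of the Tanguiane (1993) idea
    described in the talk: how many onset positions do two rhythms share?

    onsets_a, onsets_b : iterables of onset positions quantized to a grid
                         of `steps` slots per cycle (e.g. sixteenth notes).
    Returns a value in [0, 1]; the normalization is an assumption.
    """
    a, b = set(onsets_a), set(onsets_b)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)  # Jaccard overlap of the onset sets
```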

  • And this is a stability algorithm

  • based on Desain and Honing.

  • The [INAUDIBLE] between adjacent intervals

  • is what sets how stable the rhythm is.

  • This is based on music perception studies

  • that I've been doing.

  • And basically, there is a mathematical procedure

  • where you compare each one of the notes

  • to the note that comes after it.

  • And at some point, after giving preference to ratios of one and two,

  • which are stable, you can get for every particular rhythm--

  • for example, this one, the quarter, quarter,

  • two eighths, quarter-- a particular onset stability

  • by combining all of the ratios and getting an overall stability value.
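
Here is a sketch along those lines: compare each inter-onset interval to the one after it and reward simple ratios. The exact weighting is an illustrative assumption, not the published Desain and Honing model:

```python
def interval_ratio_stability(onset_times):
    """Stability score in the spirit of the Desain and Honing idea described
    in the talk: compare each inter-onset interval to the next one and
    treat simple ratios (1:1 and 2:1) as the most stable.
    Assumes strictly increasing onset times."""
    intervals = [t2 - t1 for t1, t2 in zip(onset_times, onset_times[1:])]
    if len(intervals) < 2:
        return 1.0
    scores = []
    for a, b in zip(intervals, intervals[1:]):
        ratio = max(a, b) / min(a, b)
        # Distance to the nearest "simple" ratio; 1:1 and 2:1 score highest.
        dist = min(abs(ratio - r) for r in (1.0, 2.0, 3.0))
        scores.append(1.0 / (1.0 + dist))
    return sum(scores) / len(scores)

# Example: quarter, quarter, two eighths, quarter (the rhythm from the talk)
print(interval_ratio_stability([0.0, 1.0, 2.0, 2.5, 3.0, 4.0]))  # -> 1.0
```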

  • And here's a short example of Scott

  • playing with Haile, and Haile trying

  • to understand the stability of Scott's rhythms,

  • and then responding based on a curve of similarity,

  • starting most similar, then going less similar.

  • And basically, a human set the curve.

  • But I can see a scenario where Haile could come up

  • with a curve by itself, trying to start with something

  • that Scott understands--

  • Scott is playing seven quarters-- and then slowly

  • introducing new ideas.
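
A sketch of how responses could be picked along such a curve, reusing the `rhythm_similarity` sketch from above; the linear curve and the database lookup are my assumptions about the mechanism, not the system's actual logic:

```python
def respond_along_curve(human_rhythm, database, n_responses=8):
    """Pick a sequence of response rhythms whose similarity to the human's
    input follows a descending curve: start with something the player
    recognizes, then drift toward new material."""
    responses = []
    for i in range(n_responses):
        # Target similarity falls linearly from 1.0 to 0.0 over the phrase.
        target = 1.0 - i / max(n_responses - 1, 1)
        # Choose the database rhythm whose similarity is closest to the target.
        best = min(database,
                   key=lambda r: abs(rhythm_similarity(human_rhythm, r) - target))
        responses.append(best)
    return responses
```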

  • And you can see how Scott actually listens to the robot,

  • and at some point, builds on what the robot is doing.

  • So Haile is building on what Scott is doing, obviously,

  • by looking at the stability and similarity.

  • But at some point, Scott is almost inspired.

  • Maybe inspired is too big of a word,

  • but the goal is that Scott will come up with an idea

  • that he would come up with if he played with humans.

  • [VIDEO PLAYBACK]

  • [DRUMMING]

  • That's almost a new idea.

  • Scott is building on it.

  • [END PLAYBACK]

  • And this is from a concert with a darbuka drum player.

  • You will see how the professional darbuka

  • player, actually from Israel, is kind of surprised.

  • But I think his facial expressions were interesting to me

  • because I think he was surprised for the better.

  • And at some point you'll see how all of us are playing

  • and Haile tries to get the beat.

  • So we combined the stability, similarity,

  • and beat detection into a drum circle.

  • [VIDEO PLAYBACK]

  • [DRUMMING]

  • This is call and response.

  • Later it will be simultaneous.

  • [DRUMMING]

  • And now, what it does is listen to these two drummers,

  • and take the pitch from one drummer

  • and the rhythm from the other.

  • And the other arm morphs the pitch

  • and the timbre of the two players.

  • [END PLAYBACK]