
  • MALE SPEAKER: Welcome.

  • It's my pleasure to introduce Gil Weinberg.

  • He's a professor at Georgia Tech, at the Center

  • for Music Technology.

  • And I went to Georgia Tech a long, long time ago

  • and we didn't have a music department back then.

  • So when I visited recently, the folks I was meeting with in ECE

  • sent me over to meet Gil and he's

  • doing some really cool stuff with interactive computing

  • and musicianship.

  • So building machines that you can interact with,

  • not just from a dialogue perspective,

  • like the kind of thing I'm thinking about,

  • but more from a music perspective

  • where you want to jam and do some jazz

  • improvisation with a robot.

  • And he's going to talk about that and many other things

  • today.

  • So welcome.

  • GIL WEINBERG: Thank you.

  • [APPLAUSE]

  • Thank you.

  • Thank you for coming.

  • So I'm going to talk about three or four main projects.

  • Only three of them are shown in these slides.

  • The last one is a surprise.

  • It's a project we just finished, and I have some very,

  • very fresh new slides.

  • The reason I started to be interested

  • in robotic musicianship is that I'm a musician.

  • Before I became interested in computation

  • or in robotics, I was a musician, and I still am.

  • And I was always fascinated by playing in a group.

  • By being constantly ready to change what I'm doing.

  • By trying to build on what other people

  • are playing in real time.

  • Improvising in a group, with visual cues and auditory cues, of course.

  • So when I started to be interested in robotics,

  • I wanted to capture this experience.

  • Before I show you some of the efforts that I made,

  • maybe I'll show a short clip of me playing with a trumpet

  • player, so you can see the kind of experiences

  • that I was trying to recreate.

  • [VIDEO PLAYBACK]

  • [JAZZ MUSIC]

  • [END PLAYBACK]

  • So I think you've seen a lot of eye contact,

  • trying to build a motif without knowing where it's going,

  • and trying to create something interesting with him

  • back and forth.

  • And the first robots that I tried to develop

  • built on these ideas.

  • But what I wanted to do is to have the robot understand music

  • like humans do.

  • The big idea that I started with,

  • was to create robots that listen like humans,

  • but improvise like machines.

  • Because I felt that if I want to push what music

  • is about through new, novel improvisation

  • algorithms and new acoustic playing,

  • I first have to have a connection between humans and robots,

  • and that's why there is the listening like human part.

  • Only then would I be able to start to make the robot

  • play like a machine in order to create this kind of connection

  • and relationship.

  • I'll start with something very simple

  • that probably many of you are familiar with.

  • If I want a robot to understand me,

  • maybe the first simple thing that I can make it do

  • is understand the beat of the music that I play.

  • And here we use auto-correlation and self-similarity algorithms.

  • This is a piece from Bach.

  • You see the time is both on the x and on the y.

  • And you see that it's symmetric and by comparing

  • the [INAUDIBLE] to the algorithm you

  • can try to find sections that are similar

  • and detect the beat from that.
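
To make this concrete, here is a minimal sketch of the two ingredients just described: a self-similarity matrix over feature frames (time on both axes, symmetric, as in the Bach plot) and an autocorrelation-based estimate of the beat period. It is an illustration in Python with assumed inputs (an onset-strength curve and its frame rate), not the code behind the actual system.

```python
# Minimal sketch, not the system from the talk. Assumes an
# onset-strength curve sampled at `frame_rate` frames per second.
import numpy as np

def self_similarity(features):
    """Cosine similarity between every pair of feature frames.

    Time runs along both axes, so the matrix is symmetric, and
    repeated sections show up as off-diagonal stripes.
    """
    f = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-9)
    return f @ f.T

def beat_period(onset_strength, frame_rate, min_bpm=60, max_bpm=180):
    """Estimate the beat period in seconds via autocorrelation."""
    x = onset_strength - onset_strength.mean()
    # Autocorrelation: how well the signal matches shifted copies of itself.
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    lo = int(frame_rate * 60.0 / max_bpm)  # shortest plausible period
    hi = int(frame_rate * 60.0 / min_bpm)  # longest plausible period
    lag = lo + int(np.argmax(acf[lo:hi]))
    return lag / frame_rate
```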

  • But what we try to do is to actually have it in real time.

  • And you see here my student Scott Driscoll

  • used this algorithm based on Davies and Plumbley

  • from Queen Mary.

  • And you see that it becomes much more

  • sophisticated, because it's not just analyzing the beat in Bach

  • or in the Beatles-- Scott is playing in real time, so the system

  • is trying to get the beat, but then Haile

  • starts to play with it.

  • Scott is trying to fit what he is

  • doing to what Haile is doing-- Haile is a robot-- and back and forth.

  • So it's a little more complicated, you see.

  • Sometimes they escape from the beat and get back to the beat.

  • And here's a short example.

  • Got the beat.

  • Now Scott will start faster and slower.

  • Faster, got it.

  • So as you can see, it loses it, it gets it back.

  • I think in a second, Scott will play slower, which

  • shows how the system fails.

  • The next thing that I was trying to get

  • is to have the robot understand other things at a high level

  • musically.

  • Not just the beat but concepts that we humans

  • understand such as stability and similarity.

  • And basically we had a huge database

  • of rhythms generated almost randomly-- with some rules,

  • some stochastic rules.

  • And then we had coefficients for stability and similarity.

  • Whenever Haile listened to a particular beat,

  • there were settings and coefficients

  • for stability and similarity.

  • At some point the robot actually decided by itself.

  • At first, a human on the side could change the similarity

  • and stability and create some output, bringing rhythms back

  • from the database.

  • And this similarity is based on Tanguiane from 1993,

  • basically looking at the overlapping

  • onsets between beats.

  • I can get more into this if you want, maybe later.

  • And this is a stability algorithm

  • based on Desain and Honing.

  • The [INAUDIBLE] between adjacent intervals

  • is what sets how stable the rhythm is.

  • This is based on music perception studies

  • that I've been doing.

  • And basically, there is a mathematical procedure

  • where you compare each one of the notes

  • to the note that comes after it.

  • And at some point, after giving preference to ratios of one and two,

  • which are stable, you can get for every particular rhythm--

  • for example, this one, the quarter, quarter,

  • two eighths, quarter-- a particular onset stability

  • by combining all of the ratios.
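
A rough sketch of both measures, in the spirit of what's described here: similarity as the overlap between two rhythms' onsets (after Tanguiane) and stability scored from the ratios between adjacent inter-onset intervals, with ratios of one and two preferred (after Desain and Honing). The exact published formulations differ; the scoring below is an illustrative simplification.

```python
def onset_similarity(rhythm_a, rhythm_b):
    """Fraction of shared onsets between two rhythms (grid positions)."""
    a, b = set(rhythm_a), set(rhythm_b)
    return len(a & b) / len(a | b)

def stability(onsets):
    """Score a rhythm by how close each adjacent interval ratio
    is to the 'preferred' ratios of 1 and 2."""
    intervals = [b - a for a, b in zip(onsets, onsets[1:])]
    scores = []
    for i1, i2 in zip(intervals, intervals[1:]):
        ratio = max(i1, i2) / min(i1, i2)
        scores.append(1.0 / (1.0 + min(abs(ratio - 1), abs(ratio - 2))))
    return sum(scores) / len(scores)

# The example rhythm from the talk -- quarter, quarter, two eighths,
# quarter -- as onsets on a sixteenth-note grid.
print(stability([0, 4, 8, 10, 12]))  # 1.0: every adjacent ratio is 1 or 2
```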

  • And here's a short example of Scott

  • playing with Haile and Haile trying

  • to understand the stability of Scott's rhythms.

  • And then, based on a curve of similarity,

  • it starts most similar and then goes less similar.

  • And basically, a human put in the curve.

  • But I can see a scenario where Haile could come up

  • with a curve by itself, trying to start with something

  • that Scott understands.

  • Scott is playing seven quarters, and then slowly

  • introduce new ideas.

  • And you can see how Scott actually listens to the robot

  • and at some point builds on what the robot is doing.

  • So Haile is building on what Scott is doing, obviously,

  • by looking at the stability and similarity.

  • But at some point, Scott is almost inspired.

  • Maybe inspired is too big of a word,

  • but the goal is that Scott will come up with an idea

  • that he wouldn't have come up with if he played with humans.

  • [VIDEO PLAYBACK]

  • [DRUMMING]

  • That's almost a new idea.

  • Scott is building on it.

  • [END PLAYBACK]

  • And this is a concert with a darbuka drum player.

  • You will see how the professional darbuka

  • player, actually from Israel, is kind of surprised.

  • But his facial gestures were interesting to me

  • because I think he was surprised for the better.

  • And at some point you'll see how all of us are playing

  • and Haile tries to get the beat.

  • So we combined the stability, similarity,

  • and beat detection into a drum circle.

  • [VIDEO PLAYBACK]

  • [DRUMMING]

  • This is call and response.

  • Later it will be simultaneous.

  • [DRUMMING]

  • And now, what it does is listen to these two drummers,

  • and take the pitch from one drummer

  • and the rhythm from the other.

  • And the other arm morphs the pitch

  • and the timbre of the two players.

  • [END PLAYBACK]

  • So we played the rhythm that one player played,

  • and the pitch-- well, it's not really pitch, since it's a drum,

  • but this is lower and this is higher next to the rim--

  • and tried to create something that is really

  • morphing between these two.

  • Again, things that humans cannot do and maybe shouldn't do,

  • but here something interesting can come up.
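
The recombination itself can be pictured with a tiny, hypothetical sketch: take the onset times from one player's part and the strike "pitches" (lower on the head versus higher near the rim) from the other's, and merge them into one part. The function and data layout are assumptions for illustration only.

```python
def morph_parts(rhythm_part, pitch_part):
    """Pair each onset time from one part with a pitch from the other.

    Each part is a list of (time_seconds, pitch) strike events;
    pitches are cycled if the two parts differ in length.
    """
    onsets = [t for t, _ in rhythm_part]
    pitches = [p for _, p in pitch_part]
    return [(t, pitches[i % len(pitches)]) for i, t in enumerate(onsets)]

# Player A's rhythm combined with player B's strike positions.
a = [(0.0, "low"), (0.5, "low"), (0.75, "low"), (1.5, "high")]
b = [(0.0, "high"), (1.0, "rim"), (2.0, "low")]
print(morph_parts(a, b))
```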

  • Another thing that I was interested in is polyrhythm.

  • It's very easy for a robot to do things that humans cannot.

  • Sometimes I'll ask my student to clap.

  • I will not ask you to clap.

  • I'll give you an example.

  • I think there are two main rhythms here.

  • [SPEAKING RHYTHM]

  • This is nine.

  • [SPEAKING RHYTHM]

  • I don't ask you to clap, but sometimes I would.

  • It was [SPEAKING RHYTHM] seven.

  • And then I asked my students to do

  • the nine in one hand and the seven

  • in the other hand, which I will definitely not ask you to do.

  • But see how Haile here captured--

  • decided to record the rhythm.

  • So it records the nine.

  • It chose the nine and the seven, and at some point

  • it introduced them as interesting polyrhythms.
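
Scheduling a nine-against-seven polyrhythm is trivial for a machine: space nine onsets and seven onsets evenly over the same cycle and merge the two voices into one timeline. A minimal sketch (the two-second cycle length is an arbitrary assumption):

```python
def polyrhythm(beats_a, beats_b, cycle_seconds=2.0):
    """Onset times for two evenly spaced rhythms sharing one cycle."""
    times_a = [i * cycle_seconds / beats_a for i in range(beats_a)]
    times_b = [i * cycle_seconds / beats_b for i in range(beats_b)]
    # Merge into one timeline, tagging each onset with its voice.
    return sorted([(t, "A") for t in times_a] + [(t, "B") for t in times_b])

for t, voice in polyrhythm(9, 7):
    print(f"{t:5.3f}s  {voice}")
```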

  • [VIDEO PLAYBACK]

  • [DRUMMING]

  • And we add more and more rhythms over it.

  • [END PLAYBACK]

  • And I don't know if you know, but Pat Metheny had a project

  • that he used robots in.

  • He came to our lab and I explained it to him,

  • I showed it to him.

  • I said, this is something no humans can do.

  • And he said, sure, my drummer can do it.

  • And he can also do four with one leg, and another three

  • with the other leg.

  • So I think no one can do it except maybe Pat Metheny's

  • drummer.

  • And here is something from the end of a concert--

  • it's obviously, you know, fishing for cheers.

  • We just have a little short MIDI file that

  • we play along with together.

  • Unison always works, so I'll just play this.

  • We grabbed the nine and seven.

  • [VIDEO PLAYBACK]

  • [DRUMMING]

  • And that's from a performance in Odense in Denmark.

  • They had a robot festival.

  • [END PLAYBACK]

  • So the next project was Shimon.

  • And actually I have a story about this

  • because I put a video of Haile and Scott playing,

  • the first one, on my website.

  • It was before YouTube, or before I knew about YouTube.

  • And someone grabbed it and put it on YouTube,

  • and then CNN saw it and they asked to come and do a piece.

  • And when they put out the piece, the next day

  • I got an email from the NSF, from the NSF director

  • who said we saw your piece, please submit a proposal

  • and continue to develop that.

  • Rarely happens.

  • Never happened since.

  • I tried.

  • I put so many videos--

  • And this is the next robot that we came up with,

  • which adds multiple things.

  • The main thing that it adds is the concept of pitch.

  • It plays a marimba.

  • And the second aspect it adds-- we're

  • talking about the personal connection, gestures,

  • visual cues-- is the head.

  • And many people ask me, why the head?