JUSTIN UBERTI: So what did you all think of the Google Glass Hangout this morning? [AUDIENCE CHEERING]

JUSTIN UBERTI: What an amazing way to show off the power of real-time communication. When I first heard that they were going to jump out of an airship and have it all in a Hangout, I thought, that's pretty incredible.

But anyway, I'm Justin Uberti. And today, I'd like to tell you all about WebRTC, a project to bring real-time communication, RTC, to the open web platform. I currently work on the WebRTC team here at Google, which is part of the Chrome project. Before joining WebRTC, I was the lead on the Google Talk team, where we built some really exciting applications, and also learned a lot of really hard lessons about what it takes to do these kinds of apps in a browser. I also kind of learned that some people on the internet think I have really small hands. I don't get it.

So enough about me. Let me see what kind of developers we have in the audience today. Raise your hand if you're really familiar with the following technologies. WebRTC? OK. HTML5? Good. WebSockets? App Engine? SIP? OK. H.323? OK. You folks who raised your hands, you have my condolences.

So what exactly is WebRTC? Well, let me tell you a little story. A couple of years ago, when we were working on what would become Hangouts, the Chrome team approached us. And they said, the open web platform has some amazing technology, and people are building incredible apps with it. But there's one big gap remaining: to build apps like Google Video Chat, you need plugins. And these plugins have all sorts of problems: security, everything. What would it take to take the stuff you built for Google Video Chat and Hangouts, and make it part of the open web platform?

Well, we said, it would be really complicated. We'd have all these technology issues to figure out. We'd have to deal with things like codec licensing and open sourcing. And most of all, we'd have to work with other browser manufacturers and other industry players to make an actual standard, something that could be implemented across all browsers and across the entire web. And so, in typical Google fashion, the response we got back was: sounds like a plan. When can we have it?

So we got to work. Where we didn't have the right technology pieces in house, we went out and acquired the very best, state-of-the-art technology out there, and we assembled those pieces into a system. And we started talking to other browser manufacturers: Firefox, Opera, Microsoft. We talked to people who would build apps on this platform, people like Skype and Cisco. And we talked to the typical telecom providers, folks like Ericsson and AT&T. They were all super excited about this, because of its potential not just for the web, but for the entire communications industry.

So that's the premise of WebRTC: to build real-time communication into the fabric of the web, where every browser has a built-in, state-of-the-art communication stack, and to create a new open platform that any application and any device can use to communicate. Think about that. This is the ability to get the communications industry, a $2 trillion industry, moving at web speed. And not only will developers be able to build and deploy voice and video apps just like any other web app, but we'll also start to see communication being built in as a feature to all sorts of apps.
In a game, the ability to see your opponent's face right as you checkmate them. Or on a shopping website, the ability to talk to a customer service rep, live, in person, with a single click. As WebRTC takes hold across computers and all sorts of devices, we have the real ability to create the next-generation phone network, where every WebRTC-enabled device can communicate with amazing audio and video quality.

So take this quote from NoJitter, a communications industry blog: "WebRTC and HTML5 could enable the same transformation for real time that the original browser did for information." That's a pretty lofty comparison. So how do we get there?

Well first, we need to get WebRTC into the hands of you, the developers. And here's where we're at with that. The first WebRTC support is now shipping in Chrome 21, the current Chrome Dev channel release, and also in Opera 12. We're expecting Firefox to join us before the end of the year. We've also brought WebRTC support to Internet Explorer via Chrome Frame. So we'll soon have support across almost all desktop browsers. As this technology stabilizes, we're also going to see WebRTC start to appear in the various mobile browsers. And for those building native applications, either on desktop or mobile, we have native versions of the WebRTC stack that are fully compatible with their web counterparts.

The functionality that WebRTC offers falls into three categories. The first, MediaStreams, also known as getUserMedia, is about getting access to the user's camera and mic. There are a lot of cool apps that can be built with just this. Next is a technology called PeerConnection. This is the engine behind making high-quality, peer-to-peer voice and video calls on the web. Last is a new bit of functionality called DataChannels. It's so new that the spec for it hasn't fully stabilized yet, but it has incredible potential: the ability for any web app to be a P2P app, to exchange application data peer to peer. Now let's take a look at each one of these.

Now, if you're following along at home and want to try out the things we're about to show, and you're running Chrome, you'll want to turn on the flags that enable MediaStreams and PeerConnection. If you go to about:flags in your Chrome build, you'll see these options in a list, and you can turn on MediaStream and PeerConnection. On the Dev channel, in Chrome 21, you won't see a MediaStream option, because it's now on by default. And if you don't want to turn this on for your existing version of Chrome, you can download Google Chrome Canary and run it side by side with your existing version of Chrome.

So first up, MediaStreams. A MediaStream represents a media source, and can contain multiple media tracks of various types. For example, if we get a MediaStream for the user's webcam and mic, we'll have a single stream, with a track for video and a track for audio, as shown in the diagram here. In a video conference, we could have multiple MediaStreams, one for each participant, each with an audio and a video track. Now, once we have a MediaStream, we need a way to actually play it out. Fortunately, we have an easy way to play audio and video in HTML, via the aptly named audio and video elements. In order to plug a MediaStream into these elements, we first need a way to get a URL that references the MediaStream. Fortunately, there's a method called createObjectURL that takes care of exactly this, giving us a URL we can assign to the element's src attribute.
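To make that concrete, here's a minimal sketch of the whole flow, getting a camera-and-mic MediaStream and plugging it into a video element. It assumes a video element already exists in the page, and it uses the vendor-prefixed names Chrome shipped in this era (webkitGetUserMedia, webkitURL); other browsers used their own prefixes, and today's unprefixed API is navigator.mediaDevices.getUserMedia together with the element's srcObject property.

    // Sketch: capture the user's camera and mic, then play the resulting
    // MediaStream back in a <video> element (Chrome-prefixed APIs, circa 2012).
    navigator.webkitGetUserMedia(
      { audio: true, video: true },
      function (stream) {
        // One MediaStream, carrying separate audio and video tracks.
        var video = document.querySelector('video');
        // createObjectURL turns the MediaStream into a URL the <video>
        // element can use as its src.
        video.src = webkitURL.createObjectURL(stream);
        video.play();
      },
      function (error) {
        // Called if the user denies access or no device is available.
        console.log('getUserMedia failed:', error);
      }
    );

With an autoplay attribute on the video element, you can typically drop the explicit play() call; the stream starts rendering as soon as the user grants camera and mic permission.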