
  • ♪ (electronic pop) ♪

  • (applause)

  • Good morning.

  • (cheering)

  • Welcome to Google I/O.

  • It's a beautiful day; I think warmer than last year.

  • I hope you're all enjoying it. Thank you for joining us.

  • I think we have over 7,000 people here today.

  • As well as many, many people--

  • we're live streaming this to many locations around the world.

  • So, thank you all for joining us today. We have a lot to cover.

  • But, before we get started,

  • I had one important piece of business which I wanted to get out of the way.

  • Towards the end of last year, it came to my attention

  • that we had a major bug in one of our core products.

  • - It turns out... - (laughter)

  • ...we got the cheese wrong in our burger emoji.

  • Anyway, we got right to work.

  • I never knew so many people cared about where the cheese is.

  • - (laughter) - We fixed it.

  • You know, the irony of the whole thing is I'm a vegetarian in the first place.

  • (laughter and applause)

  • So, we fixed it-- hopefully we got the cheese right,

  • but as we were working on this, this came to my attention.

  • (laughter)

  • I don't even want to tell you the explanation the team gave me

  • as to why the foam is floating above the beer.

  • (laughter)

  • ...but we restored the natural laws of physics.

  • (laughter)

  • (cheering)

  • So, all is well. We can get back to business.

  • We can talk about all the progress since last year's I/O.

  • I'm sure all of you would agree

  • it's been an extraordinary year on many fronts.

  • I'm sure you've all felt it.

  • We're at an important inflection point in computing.

  • And it's exciting to be driving technology forward.

  • And it's made us even more reflective about our responsibilities.

  • Expectations for technology vary greatly,

  • depending on where you are in the world,

  • or what opportunities are available to you.

  • For someone like me, who grew up without a phone,

  • I can distinctly remember

  • how gaining access to technology can make a difference in your life.

  • And we see this in the work we do around the world.

  • You see it when someone gets access to a smartphone for the first time.

  • And you can feel it in the huge demand for digital skills we see.

  • That's why we've been so focused on bringing digital skills

  • to communities around the world.

  • So far, we have trained over 25 million people

  • and we expect that number to rise to over 60 million

  • in the next five years.

  • It's clear technology can be a positive force.

  • But it's equally clear that we can't just be wide-eyed

  • about the innovations technology creates.

  • There are very real and important questions being raised

  • about the impact of these advances

  • and the role they'll play in our lives.

  • So, we know the path ahead needs to be navigated carefully

  • and deliberately.

  • And we feel a deep sense of responsibility to get this right.

  • That's the spirit with which we're approaching our core mission--

  • to make information more useful,

  • accessible, and beneficial to society.

  • I've always felt that we were fortunate as a company

  • to have a timeless mission

  • that feels as relevant today as when we started.

  • We're excited about how we're going to approach our mission

  • with renewed vigor,

  • thanks to the progress we see in AI.

  • AI is enabling us to do this in new ways,

  • solving problems for our users around the world.

  • Last year, at Google I/O, we announced Google AI.

  • It's a collection of our teams and efforts

  • to bring the benefits of AI to everyone.

  • And we want this to work globally,

  • so we are opening AI centers around the world.

  • AI is going to impact many, many fields.

  • I want to give you a couple of examples today.

  • Healthcare is one of the most important fields AI is going to transform.

  • Last year we announced our work on diabetic retinopathy.

  • This is a leading cause of blindness,

  • and we used deep learning to help doctors diagnose it earlier.

  • And we've been running field trials since then

  • at Aravind and Sankara hospitals in India,

  • and the field trials are going really well.

  • We are bringing expert diagnosis to places

  • where trained doctors are scarce.

  • It turned out, using the same retinal scans,

  • there were things which humans didn't quite know to look for,

  • but our AI systems offered more insights.

  • Your same eye scan,

  • it turns out, holds information

  • with which we can predict the five-year risk

  • of you having an adverse cardiovascular event--

  • a heart attack or stroke.

  • So, to me, the interesting thing is that,

  • beyond what doctors could find in these eye scans,

  • the machine learning systems offered new insights.

  • This could be the basis for a new, non-invasive way

  • to detect cardiovascular risk.

  • And we're working-- we just published the research--

  • and we're going to be working to bring this to field trials

  • with our partners.

  • Another area where AI can help

  • is predicting medical events.

  • It turns out, doctors have a lot of difficult decisions to make,

  • and for them, getting advanced notice--

  • say, 24-48 hours before a patient is likely to get very sick--

  • makes a tremendous difference in the outcome.

  • And so, we put our machine learning systems to work.

  • We've been working with our partners

  • using de-identified medical records.

  • And it turns out if you go and analyze over 100,000 data points per patient--

  • more than any single doctor could analyze--

  • we can actually quantitatively predict

  • the chance of readmission,

  • 24-48 hours earlier than traditional methods.

  • It gives doctors more time to act.

  • We are publishing our paper on this later today

  • and we're looking forward to partnering with hospitals and medical institutions.

  • Another area where AI can help is accessibility.

  • You know, we can make day-to-day use cases much easier for people.

  • Let's take a common use case.

  • You come back home at night and you turn your TV on.

  • It's not that uncommon to see two or more people

  • passionately talking over each other.

  • Imagine if you're hearing impaired

  • and you're relying on closed captioning to understand what's going on.

  • This is how it looks to you.

  • (two men talking over each other)

  • As you can see, it's gibberish-- you can't make sense of what's going on.

  • So, we have machine learning technology called Looking to Listen.

  • It not only listens for audio cues,

  • but combines them with visual cues

  • to clearly disambiguate the two voices.

  • Let's see how that can work, maybe, in YouTube.

  • (man on right) He's not on a Danny Ainge level.

  • But, he's above a Colangelo level.

  • In other words, he understands enough to...

  • (man on left) You said it was alright to lose on purpose.

  • You said it's alright to lose on purpose,

  • and advertise that to the fans.

  • It's perfectly okay. You said it's okay!

  • We have nothing else to talk about!

  • (Sundar) We have a lot to talk about. (chuckles)

  • (laughter)

  • (cheering)

  • But you can see how we can put technology to work

  • to make an important day-to-day use case profoundly better.

  • The great thing about technology is it's constantly evolving.

  • In fact, we can even apply machine learning

  • to a 200-year-old technology-- Morse code--

  • and make an impact on someone's quality of life.

  • Let's take a look.

  • ♪ (music) ♪ (beeping)

  • (computer's voice) Hi, I am Tania.

  • This is my voice.

  • I use Morse code by inputting dots and dashes

  • with switches mounted near my head.

  • As a very young child,

  • I used a communication word board.

  • I used a head stick to point to the words.

  • It was very attractive, to say the least.

  • Once Morse code was incorporated into my life,

  • it was a feeling of pure liberation and freedom.

  • (boy) See you later. Love you.

  • I think that is why I like skydiving so much.

  • It is the same kind of feeling.

  • Through skydiving, I met Ken, the love of my life,

  • and partner in crime.

  • It's always been very, very difficult

  • just to find Morse code devices,

  • to try Morse code.

  • (Tania) This is why I had to create my own.

  • With help from Ken, I have a voice,

  • and more independence in my daily life.

  • But most people don't have Ken.

  • It is our hope that we can collaborate with the Gboard team

  • to help people who want to tap into the freedom of using Morse code.

  • (woman) Gboard is the Google keyboard.

  • What we have discovered, working on Gboard,

  • is that there are entire pockets of populations in the world--

  • and when I say "pockets" it's like tens of millions of people--

  • who have never had access to a keyboard that works in their own language.

  • With Tania, we've built support in Gboard for Morse code.

  • So, it's an input modality

  • that allows you to type in Morse code and get text out

  • with predictions, suggestions.

  • I think it's a beautiful example of where machine learning

  • can really assist someone in a way that a normal keyboard,

  • without artificial intelligence,

  • wouldn't be able to.

  • (Tania) I am very excited to continue on this journey.

  • Many, many people will benefit from this

  • and that thrills me to no end.

  • ♪ (music) ♪

  • (applause)

  • It's a very inspiring story.

  • We're very, very excited to have Tania and Ken join us today.

  • (cheering)

  • Tania and Ken are actually developers.

  • They really worked with our team

  • to harness the power of predictive suggestions

  • in Gboard, in the context of Morse code.

  • I'm really excited that Gboard with Morse code

  • is available in beta later today.

  • It's great to reinvent products with AI.

  • Gboard is actually a great example of it.

  • Every single day,

  • we offer users-- and users choose-- over 8 billion autocorrections.

  • Another example of one of our core products

  • which we are redesigning with AI

  • is Gmail.

  • We just rolled out a new, fresher look for Gmail--

  • a recent redesign.

  • I hope you're all enjoying using it.

  • We're bringing another feature to Gmail.

  • We call it Smart Compose.

  • So, as the name suggests,

  • we use machine learning to start suggesting phrases for you

  • as you type.

  • All you need to do is hit Tab and keep auto-completing.

  • (applause)

  • In this case, it understands the subject is "Taco Tuesday."

  • It suggests chips, salsa, guacamole.

  • It takes care of mundane things like addresses

  • so you don't need to worry about it--

  • you can actually focus on what you want to type.

  • I've been loving using it.

  • I've been sending a lot more emails to the company...

  • - ...not sure what the company thinks of it. - (laughter)

  • But it's been great.

  • We are rolling out Smart Compose to all our users this month

  • and hope you enjoy using it as well.

  • (applause)

  • Another product, which we built from the ground up using AI

  • is Google Photos.

  • Works amazingly well,

  • and it scales.

  • If you click on one of these photos,

  • you get what we call the "photo viewer experience,"

  • where you're looking at one photo at a time.

  • Just so you understand the scale:

  • Every single day, there are over 5 billion photos viewed by our users.

  • So, we want to use AI to help in those moments.

  • We are bringing a new feature called Suggested Actions--

  • essentially suggesting small actions

  • right in context for you to act on.

  • Say, for example, you went to a wedding

  • and you're looking through those pictures.

  • We understand your friend, Lisa, is in the picture,

  • and we offer to share the three photos with Lisa,

  • and with one click those photos can be sent to her.

  • So, the anxiety where everyone is trying to get the picture on their phone,