[MUSIC PLAYING]

SAM BEDER: Hi, everyone. My name is Sam Beder, and I'm a product manager on Android Things. Today, I'm going to talk to you about Google services on Android Things, and how adding these services to your device can unlock your device's potential. What I really want to convince you of today is that integrating Google services on Android Things is not only really easy and seamless, but can make a huge difference in the use cases you can put on your device, as well as for your end users.

I know this year we have many sessions on Android Things, as well as demos in the sandbox area and code labs, to learn more about what's possible on Android Things. I also know that many of you are coming to this session already with ideas for devices that you want to make on Android Things, or for IoT devices in general. And I want to show you today all the compelling use cases that you can get when you integrate some of these Google services.

So I'm going to go through a number of services today. First, I'm going to talk about Google Play services, which includes a whole suite of tools such as the Mobile Vision APIs, location services, and Firebase. After that, I'm going to dive into Firebase in a little more detail to show you how the Realtime Database that Firebase provides can allow you to publish and persist data and events in interesting ways. Then I'm going to go into TensorFlow, and how TensorFlow, we think, is the perfect application of the powerful on-device processing of your Android Things device to really add intelligence to that device. Next, I'm going to talk about Google Cloud Platform, and how, using Google Cloud Platform, you can train, visualize, and take action on data from your devices in the field. Finally, I'm going to touch on the Google Assistant and all the amazing use cases you can get when you integrate the Google Assistant on Android Things.

Before I dive into these services, I want to quickly go over Android Things. Android Things is based on a system-on-module design. This means that we work really closely with our silicon partners to bring you modules that you can place directly into your IoT devices. These modules are economical to put in devices whether you're making millions of devices, doing a very small run, or just prototyping. Earlier today, we actually had a session specifically on going from prototype to production on Android Things, which can give you more detail about how it's feasible to do all of this: the hardware design, and bringing your device to production on Android Things.

The Android Things operating system is then placed on top of these modules. Android Things is a new vertical of Android built for IoT devices. Since we work so closely with our silicon partners, we're able to maintain these modules in new ways, which allows these devices to be more secure and updatable. Also, since it's an Android vertical, you get all the Android APIs you're used to from Android development, as well as the developer tools and the Android ecosystem. In addition, on Android Things we've added some new APIs, such as Peripheral I/O and user drivers, that allow you to control the hardware on your device in new ways. We've also added support for a zero-display build, for IoT devices without a screen.
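As a rough sketch of what the Peripheral I/O APIs look like, here's how you might watch a GPIO button on an Android Things board. This is a non-runnable, Android-only sketch: the pin name "BCM21" is an assumption (pin names vary by board), and the `PeripheralManagerService` class is from the preview-era API.

```
import com.google.android.things.pio.Gpio;
import com.google.android.things.pio.GpioCallback;
import com.google.android.things.pio.PeripheralManagerService;
import java.io.IOException;

// Sketch: open a GPIO pin as an input and listen for button presses.
void listenForButton() throws IOException {
    PeripheralManagerService manager = new PeripheralManagerService();
    Gpio button = manager.openGpio("BCM21");      // pin name is board-specific
    button.setDirection(Gpio.DIRECTION_IN);
    button.setEdgeTriggerType(Gpio.EDGE_FALLING); // fire when the button is pressed
    button.registerGpioCallback(new GpioCallback() {
        @Override
        public boolean onGpioEdge(Gpio gpio) {
            // React to the press here, then return true to keep listening.
            return true;
        }
    });
}
```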

But really, the key piece of Android Things, I believe, is the services on top. Because of the API surface that Android Things provides, it is much easier for Google to put our services on top of Android Things. I say endless possibilities here because not only does Google already support all the services I'm going to walk you through today, but any services that Google makes in the future will be much more portable to Android Things because of this API surface.

So now, let's start diving into some of these services. Let's talk about Google Play services and all the useful tools it provides. Google Play services gives you access to a suite of tools, some of which you see here. You get things like the Mobile Vision APIs, which let you leverage the intelligence in your Android device's camera to identify people in an image, as well as faces and their expressions. You also get the Nearby APIs, which, when you have two devices near each other, allow those devices to interact with each other in interesting ways. You get all the Cast APIs, which let you cast from your Android device to a Cast-enabled device somewhere else. Next, you get all the location services, which let you query things like: what are the cafes near me, and what are their hours? You also get the Google Fit APIs, which allow you to attach sensors and accelerometers to your device and then visualize this data as steps or other activities in interesting ways. Finally, you get Firebase, which we'll talk about more in a minute.
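As one concrete example, the face detection from the Mobile Vision APIs mentioned above can be sketched roughly like this. This is Android-only code and assumes you already have a captured `Bitmap`; it is a sketch of the API shape, not a complete app.

```
import android.content.Context;
import android.graphics.Bitmap;
import android.util.SparseArray;
import com.google.android.gms.vision.Frame;
import com.google.android.gms.vision.face.Face;
import com.google.android.gms.vision.face.FaceDetector;

// Sketch: find faces in an image and read out smile classification.
void detectFaces(Context context, Bitmap bitmap) {
    FaceDetector detector = new FaceDetector.Builder(context)
            .setClassificationType(FaceDetector.ALL_CLASSIFICATIONS)
            .build();
    Frame frame = new Frame.Builder().setBitmap(bitmap).build();
    SparseArray<Face> faces = detector.detect(frame);
    for (int i = 0; i < faces.size(); i++) {
        Face face = faces.valueAt(i);
        float smiling = face.getIsSmilingProbability(); // -1 if not computed
        // Use the face position and expression here.
    }
    detector.release();
}
```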

Some of you might know about CTS certification, and how CTS certification is a necessary step in order to get these Google Play services. With Android Things, because of the hardware model that I just talked about, these modules actually come pre-certified. They're all pre-CTS certified, meaning Google Play services will work right out of the box. You have to do absolutely no work to get these Google Play services on your Android Things device.

We also have, for Android Things, a custom IoT variant of Google Play services. I actually think this is a pretty big deal. It allows us to make Google Play services more lightweight by taking out things like phone-specific UI elements and game libraries that we don't think are relevant for IoT devices. We also give you a signed-out experience of Google Play services.

So, no authenticated APIs, because these just aren't relevant for many IoT devices.

So now, let's dive into Firebase in a little more detail. I'm going to walk you through one of our code samples: a smart doorbell using Firebase. It involves one of our supported boards, as well as a button and a camera. I'm going to walk you through this diagram. On the left, you see a user interacting with the smart doorbell. They press the button on the smart doorbell, and the camera takes a picture of them. On the right, there's another user who, on their Android phone, can use an app to connect to a Firebase database and retrieve that image in real time.

So how does this work? When you press the button on the smart doorbell, the camera takes a picture of you. Then, using the Android Firebase SDK, which uses the Google Play services APIs on the device, it sends this image to the Firebase database in the cloud. The user on the other end can then use the exact same Google Play services and Android Firebase SDK on their phone to connect to this Firebase database and retrieve that image.
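Since the Realtime Database is a JSON database, storing a captured JPEG means encoding the bytes as a string first. Here's a minimal, hypothetical sketch in plain Java of preparing such a payload; the class and field names are my own, and the actual push would use the Firebase Android SDK (roughly `databaseRef.push().setValue(entry)`), which isn't shown as runnable code here.

```java
import java.util.Base64;
import java.util.HashMap;
import java.util.Map;

public class DoorbellPayload {
    // Sketch: package a captured JPEG and timestamp for a JSON database.
    // The database stores strings, so the image bytes are Base64-encoded.
    public static Map<String, Object> buildEntry(byte[] jpegBytes, long timestampMillis) {
        Map<String, Object> entry = new HashMap<>();
        entry.put("timestamp", timestampMillis);
        entry.put("image", Base64.getEncoder().encodeToString(jpegBytes));
        // With the Firebase Android SDK you would then do, roughly:
        //   databaseRef.child("logs").push().setValue(entry);
        return entry;
    }
}
```

The phone-side app would read the same path, decode the Base64 string back into bytes, and display the image.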

In our code sample, we also send this image to the Cloud Vision APIs to get additional annotations about what's in the image. These annotations could be something like: in this image, there is a person holding a package. That can give you additional context about what's going on.

It's pretty cool. If you actually go and build this demo, you can see it: when you press the button and it takes a picture, the picture will appear in less than a second. Then, a few seconds later, after the image has propagated through the Cloud Vision APIs, the annotations will appear as well.