
  • BRIJESH KRISHNASWAMI: Hello, everyone.

  • Again, thank you for your patience.

  • We are super excited to be here and talking

  • to you about TensorFlow.js and how

  • to bring machine learning to JavaScript applications.

  • My name is Brijesh.

  • I am a technical program manager on the Google TensorFlow team.

  • And here is my colleague, Kangyi,

  • who is a software engineer on the TensorFlow team.

  • So here is an outline of what we are

  • going to cover in this talk.

  • We will start with the what and why of TensorFlow.js.

  • We will see some of the ready-to-use models

  • that TF.js supports.

  • We will do a code walk-through of the development workflow

  • both to use one of these pre-trained models,

  • as well as training a custom model.

  • We will delve a little deeper into the tech

  • stack and the roadmap.

  • We will show you what the community

  • is building with TensorFlow.js.

  • There are some very impactful applications

  • that are being built, and we'd love to show you some examples.

  • And finally, we will point to some resources

  • that you can start exploring.

  • All right, so, we have a lot of exciting content to cover,

  • so let's get started.

  • You may have heard an overview of the technology

  • at the TensorFlow.js keynote this morning.

  • We are going to build on that here.

  • But first, a quick recap of the basics of TensorFlow.js.

  • So in a nutshell, TensorFlow.js is

  • a library for machine learning in JavaScript.

  • It is built for JavaScript developers

  • to create and run machine learning models

  • with an intuitive JavaScript-friendly API.

  • This means you can use it to perform training and inference

  • in the browser, on browser-based platforms, and in Node.js.

  • ML operations are GPU accelerated,

  • and the library is fully open source and anyone is

  • welcome to contribute.

  • OK, so TensorFlow.js provides multiple starting points

  • for your needs.

  • So first, you can directly

  • use off-the-shelf models that the library provides,

  • and we will see a lot of these in a bit.

  • You could also use your existing Python TensorFlow models

  • with or without conversion, depending on the platform

  • that you're running on.

  • Or you can retrain an existing model with transfer learning

  • and then customize it to your data sets.

  • That's the second starting point.

  • Transfer learning typically needs a smaller data

  • set for retraining, so that might fit your needs better.

  • And the third starting point is you

  • can build your model entirely from scratch with a Keras-like

  • Layers API and train it.

  • You can train it either in the browser

  • or on server with Node.js.
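To make that third starting point concrete, here is a minimal sketch (not from the talk; the layer sizes, data, and hyperparameters are placeholders) of defining and training a small model from scratch with the Layers API:

```js
// Minimal sketch: build and train a small model with the Keras-like Layers API.
// Runs in the browser, or in Node.js with @tensorflow/tfjs-node.
import * as tf from '@tensorflow/tfjs';

async function trainFromScratch() {
  const model = tf.sequential();
  model.add(tf.layers.dense({units: 32, activation: 'relu', inputShape: [10]}));
  model.add(tf.layers.dense({units: 1}));
  model.compile({optimizer: 'sgd', loss: 'meanSquaredError'});

  // Placeholder data purely for illustration.
  const xs = tf.randomNormal([100, 10]);
  const ys = tf.randomNormal([100, 1]);
  await model.fit(xs, ys, {epochs: 5});
}

trainFromScratch();
```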

  • So, we are going to delve much deeper into some

  • of these workflows today.

  • JavaScript, of course, is a ubiquitous language.

  • So by virtue of that, TensorFlow.js

  • works on a variety of platforms.

  • It lets you write ML code once and run it

  • on multiple surfaces.

  • As you can see, the library runs on any standard browser, so

  • regular web apps and progressive web apps are covered.

  • On mobile, TF.js is integrated with mini-app platforms

  • like WeChat.

  • We have just added first-class support

  • for the React Native framework, so apps can seamlessly

  • integrate with TensorFlow.js.

  • On server, TF.js runs on Node.

  • In addition, it can run on desktop applications

  • using the Electron framework.

  • All right, so, why TensorFlow.js?

  • Why run ML in the browser?

  • So we believe there are compelling reasons

  • to run ML on a browser client, especially

  • for model inferencing.

  • Let's look at some of these reasons.

  • Firstly, there are no drivers and nothing to install.

  • You include the TensorFlow.js library,

  • either at page load time by script sourcing it

  • into your HTML page, or by bundling with a package manager

  • into your client app, and you're good to go.

  • That's it.
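As a rough sketch of those two options, assuming the standard jsDelivr-hosted bundle and the @tensorflow/tfjs npm package:

```js
// Option 1: script-source the hosted bundle into your HTML page, e.g.
//   <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs"></script>
// which exposes a global `tf` object.

// Option 2: bundle it with a package manager:
//   npm install @tensorflow/tfjs
import * as tf from '@tensorflow/tfjs';

console.log(tf.version.tfjs);  // confirm the library is loaded
```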

  • The second advantage is you can utilize a variety

  • of device inputs and sensors, such as the camera,

  • microphone, and GPS, through the standard web and HTML APIs

  • and through a simplified set of TF Data APIs,

  • and we are going to see some examples today.
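For instance, a minimal sketch of reading camera frames through the tf.data API, assuming a page with a video element and camera permission granted:

```js
// Minimal sketch: capture webcam frames as tensors via the tf.data API.
import * as tf from '@tensorflow/tfjs';

async function captureFrame() {
  const videoElement = document.getElementById('webcam');  // an existing <video> tag
  const webcam = await tf.data.webcam(videoElement);
  const frame = await webcam.capture();  // a tensor holding the current frame
  console.log(frame.shape);              // e.g. [height, width, 3]
  frame.dispose();                       // free memory when done
}

captureFrame();
```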

  • TF.js lets you process data entirely

  • on the client, which means it's a great choice

  • for privacy-sensitive applications.

  • It avoids round trip latency to the server.

  • It is also WebGL accelerated.

  • So, these factors combine to make

  • for a more fluid and interactive user experience.

  • Also, running ML on the client helps reduce server-side costs

  • and simplify your serving infrastructure.

  • For example, you don't need online ML serving

  • that has to scale with increasing traffic

  • and so forth, because you're offloading

  • all your compute to the client.

  • You just host an ML model from a static file location,

  • and that's it.

  • On the server, there are also benefits

  • to integrating TensorFlow.js into your Node.js environment.

  • If you are using a Node.js serving stack,

  • it lets you bring ML into the stack as opposed

  • to calling out to a Python-based stack.

  • So it lets you unify your serving stack entirely in Node.js.

  • You can also bring your existing core TensorFlow models

  • into Node.js, not just the pre-built,

  • off-the-shelf models: your custom models that were

  • built with Python TensorFlow can be converted.

  • And in an upcoming release, you don't even

  • need the conversion process.

  • You can just use them directly in Node.
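As a rough sketch of that conversion flow (paths and the input shape below are placeholders): convert a Python SavedModel with the tensorflowjs_converter tool, then load the result from Node.

```js
// First, convert a TensorFlow SavedModel with the converter CLI (run in a shell;
// paths are placeholders):
//   tensorflowjs_converter --input_format=tf_saved_model \
//       /path/to/saved_model /path/to/web_model
//
// Then load and run the converted model in Node.js:
const tf = require('@tensorflow/tfjs-node');

async function runConvertedModel() {
  const model = await tf.loadGraphModel('file://./web_model/model.json');
  const output = model.predict(tf.zeros([1, 224, 224, 3]));  // dummy input for illustration
  output.print();
}

runConvertedModel();
```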

  • And finally, you can do all of this

  • without sacrificing performance.

  • You get CPU and GPU acceleration with the underlying TensorFlow

  • C library because that's what Node uses.

  • And we are also working on GPU acceleration via OpenGL,

  • so that removes the dependency on CUDA drivers

  • as well.

  • So effectively, you get performance that's

  • similar to the Python library.

  • So these attributes of the library

  • enable a variety of use cases across

  • the client-server spectrum.

  • So let's take a look at some of those.

  • So, on the client side, it enables

  • you to build features that need high interactivity,

  • like augmented reality applications,

  • gesture-based interaction, speech recognition,

  • accessibility, and so forth.

  • On the server side of the spectrum,

  • it lets you have your more traditional ML

  • pipelines that solve enterprise-like use cases.

  • And in the middle, that can live either on the server

  • or on the client, are applications

  • that do sentiment analysis, toxicity and abuse reduction,

  • conversational AI, ML-assisted content authoring, and so

  • forth.

  • So you get the flexibility of choosing

  • where you want your ML to run--

  • on the client or on the server.

  • So whatever your use case is, TensorFlow.js

  • is production-ready-- ready to be leveraged.

  • So with that intro, I'd like to delve deeper

  • into the ready-to-use models available in TensorFlow.js.

  • Our collection of models has grown

  • and is growing to address the use cases that we just

  • mentioned, namely image classification for classifying

  • whole images, detecting and segmenting objects and object

  • boundaries, detecting the human body and estimating pose,

  • recognizing speech commands and common words from audio data,

  • and text models for text classification, toxicity

  • and abuse reduction.

  • You can explore all of these models today on GitHub.

  • You can use them by installing them with npm

  • or by directly including them from our hosted scripts.
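For example, here is a minimal sketch using the off-the-shelf MobileNet image classifier, assuming the @tensorflow-models/mobilenet npm package and an image element on the page:

```js
// Minimal sketch: classify an image with the pre-trained MobileNet model.
// npm install @tensorflow/tfjs @tensorflow-models/mobilenet
import * as mobilenet from '@tensorflow-models/mobilenet';

async function classifyImage() {
  const img = document.getElementById('my-image');  // an <img> element to classify
  const model = await mobilenet.load();
  const predictions = await model.classify(img);
  console.log(predictions);  // [{className, probability}, ...]
}

classifyImage();
```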

  • So let's dive into a few of these models

  • and look at them in more detail.

  • So this is the PoseNet model.

  • It performs pose estimation by detecting 17 landmark points

  • on the human body.
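A minimal sketch of using PoseNet, assuming the @tensorflow-models/posenet npm package and an image or video element on the page:

```js
// Minimal sketch: estimate a single pose with the pre-trained PoseNet model.
// npm install @tensorflow/tfjs @tensorflow-models/posenet
import * as posenet from '@tensorflow-models/posenet';

async function estimatePose() {
  const imageElement = document.getElementById('pose-image');  // an <img> or <video> element
  const net = await posenet.load();
  const pose = await net.estimateSinglePose(imageElement, {flipHorizontal: false});
  console.log(pose.keypoints.length);  // 17 landmark points, each with a position and score
}

estimatePose();
```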