
  • SPEAKER: It's a pleasure and an honor

  • to welcome our next speakers, Paige Bailey and Brennan Saeta,

  • who are going to talk to us about Swift for TensorFlow,

  • a next generation machine learning platform

  • that leverages innovation.

  • So please help me in welcoming Paige and Brennan.

  • [APPLAUSE]

  • PAIGE BAILEY: Good morning, everyone.

  • So thank you for coming to join us during the lunchtime.

  • We very much appreciate you-- you

  • taking the time to come to TensorFlow world

  • and also to learn a little bit more about Swift.

  • I'm Paige.

  • I'm the product manager for Swift for TF and--

  • BRENNAN SAETA: I'm Brennan.

  • I'm tech lead manager for Swift for TensorFlow.

  • PAIGE BAILEY: Excellent.

  • And so, first, we want to get a feel for the room.

  • How many of you have used Swift before?

  • Excellent.

  • BRENNAN SAETA: Awesome.

  • PAIGE BAILEY: And how many of you

  • attended Paris and Tim's session, their workshop

  • earlier this week?

  • Few hands, cool.

  • So this is a great place to get started if you've never

  • used Swift before.

  • And it's also a wonderful place to learn

  • how Swift is getting more and more functionality

  • added specifically for machine learning and for deep learning.

  • So it's an exciting time.

  • Do you want to get started, Brennan?

  • BRENNAN SAETA: Yeah, let's do it.

  • PAIGE BAILEY: Perfect.

  • BRENNAN SAETA: So today we want to bring you a little bit

  • different kind of a presentation since a lot of you

  • actually know Swift, at least to some degree.

  • We wanted to really dig one level deeper and really

  • explore Swift for TensorFlow from its foundations.

  • So before we get into the meat of the talk,

  • I wanted to take a moment to acknowledge

  • a few very important folks.

  • So number one is--

  • this work would not be possible without the rest

  • of the Swift for TensorFlow team that really

  • does a huge amount of work to make this platform awesome,

  • so huge thank you to them.

  • But also not just the core team itself,

  • but there are a number of other teams

  • that we collaborate with at Google--

  • from the JAX team to the rest of TensorFlow and beyond.

  • And so we really are collaborating together

  • and building on each other's work to make something awesome.

  • So thank you to everyone, both at Google

  • and then, of course, our awesome open source contributors

  • beyond.

  • We have a lot of contributors that

  • are really making a huge difference to the project.

  • And, finally, for those of you who are familiar with Fast.ai,

  • some of the slides may be familiar.

  • So Jeremy Howard and Chris Lattner

  • co-presented a bit about Swift for TensorFlow

  • earlier this year.

  • And I encourage you all to check that out.

  • So, to begin with, I like to start with a question--

  • why Swift for TensorFlow?

  • PAIGE BAILEY: And I could give my answer.

  • BRENNAN SAETA: Let's hear it.

  • PAIGE BAILEY: Awesome.

  • So I've been using Python for a long time.

  • I really love the flexibility and often the user experience

  • that you get using the language.

  • It feels very understandable.

  • You read the code on the page and it reads almost

  • like a plain sentence and it also

  • has a vibrant ecosystem of data science tooling.

  • But something that I was always really frustrated

  • about whenever I was trying to deploy

  • Python models into production is that often you

  • have to do a lot of rewrites.

  • Part of the-- part of Python's benefits

  • don't necessarily translate so well to deployment scenarios.

  • So Swift is, to me, at least, a happy way

  • to get a lot of the great benefits of Python,

  • but also the benefits of a typed language.

  • BRENNAN SAETA: That's right.

  • And so what we are going for with Swift for TensorFlow

  • is an infinitely hackable platform

  • that gets you all the way from research, into production,

  • and, most importantly, that everything is customizable.

  • Because although there's a lot of flexibility

  • that you need to do research, you

  • need just the same amount of flexibility and capability,

  • often sometimes slightly different,

  • to really make a high quality production application.

  • PAIGE BAILEY: Yeah.

  • BRENNAN SAETA: And that's what we're

  • really excited about with Swift for TensorFlow,

  • and especially with the Swift platform,

  • how we can do interesting things to optimize our productivity.

  • So let's take a moment to ask the question what is Swift?

  • PAIGE BAILEY: And Swift is a programming language

  • that was created by Chris Lattner pretty recently.

  • BRENNAN SAETA: Yeah.

  • PAIGE BAILEY: When he was spending some time

  • over at Apple.

  • Since then, Chris has come to join us and worked at Google.

  • You'll be hearing a lot about one of Chris's other projects

  • later this week.

  • MLIR, I believe, is in the keynote tomorrow.

  • But Swift is a programming language

  • that's focused on that extensibility component

  • that Brennan just mentioned as well as readability, usability,

  • and production readiness.

  • BRENNAN SAETA: That's right.

  • So this is Swift for TensorFlow from foundation.

  • So let's really dig in and let's really go from the foundations.

  • So I think an important question to understand

  • what is Swift at a deeper level is to understand

  • what is a compiler.

  • And so as you program computers, there are

  • two fundamental things that are immovable

  • and that like to think about the world

  • in their own particular ways that

  • are sometimes incompatible.

  • You've got humans, on one hand, that

  • want to think about the world in a particular way

  • and you've got hardware which also

  • thinks about the world in its own particular way.

  • And a programming language is basically an intermediate point

  • in between these two immovable, fundamental truths

  • of technology.

  • And so programming languages are designed for humans

  • to express ideas, but they're also designed

  • to be executed on hardware.

  • And they're different points in that space that you can pick

  • and Swift is just one of such points.

  • But we think it's a really interesting one

  • and we'll talk about why in a couple of minutes.

  • PAIGE BAILEY: Right, so one of the benefits of having

  • a typed language is that a lot of the responsibility that

  • might be pushed on you if you're using a language like Python

  • is suddenly abstracted away.

  • So I don't have to care so much about baking performance

  • optimizations into my code immediately,

  • the compiler can just do it for me.

  • Another great thing is that if you look at the screen

  • right now, you can see an example

  • of a few commands in Swift.

  • It looks very, very similar to Python.

  • There are some lets sprinkled in, there

  • are some-- a few indications of types, but other than that,

  • pretty straightforward to understand.

  • If your only experience with typed languages is something

  • like C++, this probably looks a great deal friendlier.

  • BRENNAN SAETA: That's right.

  • And Swift is a very modern and powerful programming language.

  • So even though Swift is statically typed,

  • you don't actually see very many types in advanced Swift code.

  • Here we're defining a dictionary numbers

  • from strings to lists of integers,

  • the compiler can just figure this out for you.

  • One of the principles of Swift is that you

  • have a very helpful compiler.

  • And you can define higher order operations,

  • reduce is a common pattern from functional

  • programming across a sequence of elements.

  • And you can then call this as x.sum,

  • mapped over all the values in the dictionary.

  • And voila, you've now used a number of higher order

  • functions together without having

  • to write types in a very seamless and nice manner.
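The dictionary-and-reduce example described above can be sketched in a few lines of plain Swift. (The exact slide code isn't shown in the transcript, so the names `numbers` and `sum` here just follow what's spoken; this is an illustrative sketch.)

```swift
// The compiler infers [String: [Int]] -- no type annotations needed.
let numbers = ["odds": [1, 3, 5], "evens": [2, 4, 6]]

// A higher-order helper built on reduce, the functional-programming
// pattern mentioned in the talk.
extension Sequence where Element == Int {
    var sum: Int { reduce(0, +) }
}

// Map the helper over every value in the dictionary -- no types written.
let sums = numbers.mapValues { $0.sum }
```

Note how `mapValues`, the `sum` extension, and the inferred dictionary type compose without a single written type annotation.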

  • So what we're trying to get across

  • is that Swift is almost as high level as Python.

  • So this is actually a slide that I am borrowing from Chris

  • and Jeremy's Fast AI lectures.

  • And at the top, you have a Swift for TensorFlow model.

  • And at the bottom, you actually have a Python model.

  • And if you look at the code, if you squint a little bit,

  • they're almost exactly the same thing, right?

  • You've got a few more self dots on the bottom.

  • And you've got some more curly braces on top.

  • But they're basically the same thing.

  • So Swift is just as high level, almost, as Python.

  • PAIGE BAILEY: We also have fewer lines of code in that example,

  • is that right?

  • BRENNAN SAETA: Well, it depends on how

  • you count the curly braces.

  • So little bit of here, little bit of there.

  • But what's really important about Swift is that it's

  • designed to be fast, right?

  • It's practically in the name.

  • So who here has actually looked at Assembly in the last six

  • months?

  • Ah, fewer hands this time.

  • Well, bet you didn't expect to be looking at Assembly today.

  • So for those of you who don't know,

  • the Godbolt Compiler Explorer is really neat.

  • You can plug in some source code on the left-hand side.

  • And it will give you the assembly

  • on the right-hand side.

  • And here we have Swift.

  • We're calling a higher order function reduce on a generic

  • type-- so it's actually an array, in this case,

  • an array of integers--

  • passing in the plus function, reducing over everything.

  • And if you look at the loop that is going on here,

  • so you've got your function prelude.

  • You do a couple checks to make sure

  • that everything is reasonable.

  • You actually just have these five instructions

  • that form the main body of your loop.

  • You do the add.

  • You check to make sure that you don't have any overflows.

  • You then increment one to get to the next value in the array.

  • You compare to make sure you've not

  • reached the end of the array.

  • And if you haven't, then you do your loop again.

  • This is like the most efficient set of instructions

  • you could possibly do to sum across an array.

  • And so even though you have all these higher order things,

  • you're not even writing types, the Swift compiler

  • can helpfully optimize it all the way down for you.

  • Now actually, for those of you who

  • really know about Performance and Assembly,

  • a really helpful compiler will actually

  • vectorize this operation for you.

  • And so Swift, by default, is safe.

  • So when you use Plus, you're getting safe integer

  • add that checks for overflows.

  • But if you do Ampersand plus-- the Ampersand plus function,

  • this says just give me--

  • two's complement, don't bother checking for overflows--

  • and the Swift compiler will helpfully

  • vectorize your operation for you to make

  • it run extremely quickly.

  • So Swift is designed to be safe, to be high level,

  • but of course, also fast.
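The two operators compare like this in plain Swift (`UInt8` is used here just to make the wraparound easy to see; this is an illustrative sketch, not the slide code):

```swift
let x: UInt8 = 250

// `+` is the checked add: 250 + 10 overflows UInt8 and would trap at runtime.
// `&+` is the two's-complement wrapping add the talk mentions: no overflow
// check, which leaves the compiler free to vectorize loops over it.
let wrapped = x &+ 10   // wraps around past 255
```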

  • So programming languages are all about atoms and composition.

  • And so the atoms in Python are C code

  • that implements Python objects.

  • And the composition is an interpreter

  • that lets you combine these calls in very interesting ways,

  • right, like Python array, Python dictionary.

  • They're very, very carefully implemented in Python

  • to be as quick as possible in C. And then

  • you get to build your whole universe on top of it.

  • C++, however, takes a different tack.

  • There is no C underneath C++, right?

  • It's just C++.

  • So the atoms in C++ are int, float, C arrays, and pointers.

  • You got a few other atoms as well.

  • And you could develop your own structures in C++,

  • like structs, classes, std::complex, std::string,

  • std::vector.

  • So that's a C++, and C++'s view of the world.

  • But as you think about C++, what's the real difference

  • between string and array?

  • They both are sequences of things, right?

  • And so programming languages have actually evolved since C++

  • was initially created.

  • And we actually can do more interesting things today.

  • And so Swift builds on top of a lot of this programming

  • language evolution and design.

  • So Swift is really syntactic sugar for LLVM.

  • Remember getting back to that picture?

  • You've got humans.

  • And you've got hardware.

  • Programming languages are in the middle.

  • You've got text editors to have humans interface

  • with a programming language.

  • You've got compilers to interact with-- from program [INAUDIBLE]

  • to the hardware.

  • And so why don't you define your language

  • in terms of the compiler?

  • You can just get rid of a lot of extra intermediate things.

  • And by doing so you can make a programming language

  • that is more powerful.

  • And so the primitives in Swift are LLVM instructions.

  • And the composition are structs and classes just like C++.

  • But in Swift, instead of float and int

  • being defined by the language, they're

  • defined in the standard library.

  • So that you can define your own int and float.

  • Now this may seem like an esoteric question

  • that isn't terribly important,

  • but, well, as it turns out, the definition of float

  • is something that's in question these days.

  • For those of you who are not familiar with TPUs, TPUs

  • define their own version of floating point format.

  • They define bfloat16, which is short for the brain

  • floating point format that was developed as part of the Google

  • Brain research efforts.

  • And we found that the bfloat16 format

  • is much better for both efficiency in the hardware

  • and for training neural networks.

  • And so Swift as a language is especially interesting for us

  • because it lets us define a bfloat16 type that

  • works exactly like int and float in the rest of the language.

  • So what does that actually look like?

  • How can that be? Is it super weird?

  • What's this black magic?

  • Well, again, this is Swift from the foundation.

  • So let's understand, what is the implementation of float?

  • You can totally check it out on GitHub.

  • But here's sort of what it looks like, right.

  • You've got a struct float because your composition is

  • structs.

  • And you define the value type, which

  • is defined in terms of the LLVM primitives,

  • this Builtin.FPIEEE32.

  • Plus equals is just a function, just a normal function

  • that's defined in Swift.

  • And it's implemented using the LLVM primitives.
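A toy re-creation of that pattern in plain Swift. This is not the real standard-library source (which wraps the LLVM-level Builtin.FPIEEE32); `MyFloat` wrapping a `Double` is purely our illustration of "a struct plus ordinary operator functions":

```swift
// Float is "just a struct" whose operators are ordinary Swift functions.
// The real implementation wraps Builtin.FPIEEE32; we wrap a Double here
// purely for illustration.
struct MyFloat {
    var value: Double

    // Plus-equals is just a normal function defined on the type.
    static func += (lhs: inout MyFloat, rhs: MyFloat) {
        lhs.value += rhs.value
    }
}

var a = MyFloat(value: 1.5)
a += MyFloat(value: 2.0)
```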

  • PAIGE BAILEY: And bfloat16, as Brennan mentioned,

  • is incredibly important to us specifically for TPUs.

  • But we're seeing even more kind of exotic data formats

  • and sort of requirements for floats

  • for other specialized hardware.

  • So we anticipate this will become even more important

  • in the future.

  • And it's a place that we're really

  • excited to explore more with Swift.

  • BRENNAN SAETA: That's right.

  • And so even though int and float

  • are elements in the standard library,

  • they're just as fast, as we saw in the assembly

  • instructions before.

  • But that also means that Swift as a language

  • is super flexible, right.

  • So you can define extensions on int.

  • And you can define the isOdd property of int

  • right in a Jupyter notebook.

  • You can define an extension on Bool, a symbol property, which

  • returns a Unicode character string of thumbs

  • up or thumbs down.

  • And you can then compose all of these

  • exactly as you'd expect to print out these silly true/falses,

  • and emojis, and whatnot.
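A sketch of those notebook cells in plain Swift (property names follow what's spoken in the talk; the exact cells aren't shown in the transcript):

```swift
// Extend Int with an isOdd computed property, right in a notebook cell.
extension Int {
    var isOdd: Bool { self % 2 != 0 }
}

// Extend Bool with a symbol property returning thumbs up or thumbs down.
extension Bool {
    var symbol: String { self ? "👍" : "👎" }
}

// The two extensions compose exactly as you'd expect.
let verdicts = [1, 2, 3].map { $0.isOdd.symbol }
```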

  • But Swift, again, even though you're

  • using all these high level operations,

  • is just as fast as C. And so these notebooks, which

  • I encourage you to check out, from Jeremy Howard and Chris

  • Lattner's [INAUDIBLE] course, actually show you

  • how you can build a matmul in Swift that

  • is just as fast as C, perfect performance parity.

  • So that's a little bit about Swift.

  • Let's talk about deep learning.

  • PAIGE BAILEY: Yes.

  • BRENNAN SAETA: OK, no, let's actually

  • have a brief detour before we get there.

  • PAIGE BAILEY: Well, it's also very important

  • to understand the mathematics involved

  • in implementing a deep learning project.

  • You have to understand a little bit about linear algebra,

  • and a little bit about math in order

  • to effectively architecture experiments.

  • So for this next slide, we have a quiz.

  • And I promise it isn't too scary.

  • But somebody in the room, shout out

  • what you think B will print based on this slide.

  • AUDIENCE: [INAUDIBLE].

  • PAIGE BAILEY: I think I heard a 3.

  • So that is correct.

  • BRENNAN SAETA: 3, very good.

  • PAIGE BAILEY: Excellent.

  • BRENNAN SAETA: This is value semantics.

  • This is how integers work in Python.

  • It is exactly what you'd expect.

  • If a is 3, b is equal to a.

  • That means b is 3.

  • No matter what you do with a, 3 is still 3.

  • b is still 3.

  • We're all good.

  • PAIGE BAILEY: Yeah, so let's take a look at this next slide.

  • We have a Python list.

  • And this, crazily enough, when you append 4 to a,

  • would print 3 and 4 for the values of b.

  • So this is not value semantics.

  • And this offers a great many frustrations for developers,

  • as they're attempting to architect their experiments.

  • BRENNAN SAETA: That's right.

  • Math operates on value semantics.

  • And Swift is designed to be a very, very fast programming

  • language, but at the same time really push forward

  • value semantics.

  • And so in Swift, arrays actually do have value semantics,

  • and behave exactly as you'd expect.

  • So here is a copy/paste from a terminal,

  • where b stays the same.

  • And just to show you that I don't have anything

  • up my sleeve, right, a is actually 3 and 4.

  • But in Swift, not just arrays, but dictionaries too

  • have value semantics, and many high level types that you build

  • can have value semantics as well.
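The terminal session described above boils down to this in plain Swift:

```swift
// Swift arrays have value semantics: `b` gets its own logical copy,
// so mutating `a` afterwards never changes `b`.
var a = [3]
let b = a
a.append(4)
// a is now [3, 4]; b is still [3].
```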

  • And this has important ramifications

  • for how to write neural networks,

  • and do machine learning, and automatic differentiation,

  • which leads us to our next topic.

  • PAIGE BAILEY: Excellent, automatic differentiation

  • is one of the sort of product-defining features

  • of Swift.

  • And it's one of the things that we're

  • most excited about upstreaming to the Swift programming

  • language.

  • So we showed in one of our slides previously, I believe,

  • a function that was prefaced with @differentiable.

  • And this is sort of functionality

  • that allows you to differentiate not just sort of aspects

  • of machine learning models, but literally any function in Swift

  • you can take the gradient of.

  • So you want to show an example, Brennan?

  • BRENNAN SAETA: You know what, it's better to show, not tell.

  • So let's see.

  • Here we are in a Google Colab.

  • So this is a hosted Jupyter Notebook.

  • And here we define my function.

  • That just takes two doubles, does some math operations,

  • and returns on the double.

  • And so to make this function differentiable,

  • you just annotate it with @differentiable.

  • And once you've done that and run the cell,

  • the compiler has now constructed the forward pass, but also

  • the backwards pass for this function automatically.

  • And so to then figure out what the derivative of that function

  • is, you just call the gradient function.

  • So you do the gradient at 0.5 of my function.

  • And this gets you the partial derivative with respect to a.

  • Of course, you may want to take the partial derivative

  • with respect to both a and b.

  • And so that is exactly as you'd expect.

  • So you have now both points.

  • When you're training neural networks,

  • you sometimes also want like the value

  • of the function in addition to the partial derivatives

  • at a particular point.

  • And so to do that, instead of calling gradient,

  • just call value with gradient.

  • And this will return a result that is just a tuple.

  • So in Swift, you've got named tuples, optionally.

  • So you've got the value in the gradient.

  • And you can manipulate them exactly as you'd expect.
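On a Swift for TensorFlow toolchain you would literally write `@differentiable` and call `gradient(at:)` or `valueWithGradient(at:)`. What the compiler synthesizes behind the scenes can be pictured by hand in plain Swift — a forward value plus a pullback closure. (`myFunctionVJP` is our own hypothetical name for this sketch; it is not the demo's code or the real API.)

```swift
// A hand-written picture of what @differentiable generates for
// f(a, b) = a * a + b: the forward value plus a pullback closure
// mapping an upstream derivative to the partial derivatives.
func myFunctionVJP(_ a: Double, _ b: Double)
    -> (value: Double, pullback: (Double) -> (da: Double, db: Double)) {
    let value = a * a + b
    // d/da (a² + b) = 2a;  d/db (a² + b) = 1
    return (value, { v in (da: v * 2 * a, db: v) })
}

// Seeding the pullback with 1 yields the gradient -- which is what
// gradient(at:) and valueWithGradient(at:) hand back on a real toolchain.
let (value, pullback) = myFunctionVJP(0.5, 1.0)
let grads = pullback(1.0)
```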

  • PAIGE BAILEY: Excellent, and if you're coming from Python land,

  • or you stuck around for the presentation

  • a little bit earlier, this would be

  • a sort of taking the equivalent place of gradient tape

  • in TensorFlow.

  • So instead of having to create a gradient tape

  • and collecting your variables on that,

  • you can just go ahead and do the annotations for your functions.

  • BRENNAN SAETA: That's right.

  • And the thing that we're really excited about

  • is, again, this is language integrated.

  • So these aren't just on Tensor of double

  • that are 1D or 0D Tensors.

  • These are just regular, built-in int and float

  • and double and beyond.

  • So that's how you write a function, a custom function

  • and take the derivative of it.

  • But we really want to build this next generation platform that

  • allows for maximum flexibility.

  • So you need to be able to write your own types that

  • can also be differentiable.

  • So let's see how to do that.

  • So here we define a point in 2D space.

  • You've got x and y.

  • And we just mark it as differentiable.

  • We can then use, we defined some properties on it, right.

  • So the function dot and the [INAUDIBLE] helper function.

  • And you can then compute the gradients of this dot product,

  • this point dotted with itself.

  • And so there you go.

  • You then get the tangent of this point at that space.
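A sketch of that Point type as it would look on the Swift for TensorFlow toolchain of this era (field and method names follow the talk; newer plain-Swift toolchains spell the attribute `@differentiable(reverse)` behind `import _Differentiation`, so treat this as toolchain-specific):

```swift
import TensorFlow

// Mark the struct Differentiable; the compiler synthesizes its
// TangentVector for us.
struct Point: Differentiable {
    var x: Float
    var y: Float

    @differentiable
    func dot(_ other: Point) -> Float {
        x * other.x + y * other.y
    }
}

let p = Point(x: 1, y: 2)
// Gradient of p.dot(p) = x² + y² with respect to p is (2x, 2y).
let tangent = gradient(at: p) { $0.dot($0) }
```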

  • Now often it's a good idea to mark your functions

  • as @differentiable because this helps the compiler catch

  • errors for you.

  • And it's also a great form of documentation.

  • And here we see an example of the compiler catching

  • an error for us.

  • So Paige, this says, error, can only differentiate functions

  • with results that conform to differentiable.

  • But int does not conform to differentiable.

  • What does that mean?

  • PAIGE BAILEY: Because all of the values are discrete.

  • Like, you can't add a little incremental step change.

  • BRENNAN SAETA: That's exactly right.

  • So integers, right, the derivatives

  • are about infinitesimally small steps.

  • And integers don't allow you to take

  • infinitesimally small steps, right?

  • You've got 0 and 1.

  • There's nothing in between.

  • And so this function is not differentiable.

  • And the compiler is very helpful in telling you, hey,

  • you can't do this.

  • And this is why, instead of silently giving you

  • a wrong answer.

  • So if we remove the annotation from that function,

  • compiler caught an error for us.

  • Great, we can keep going on.

  • But let's say we want to take things to the next level.

  • What if we want to define an additional property, say

  • like the magnitude property, which

  • is defined by a vector from the origin

  • to that particular point?

  • PAIGE BAILEY: And it looks like you're

  • importing Glibc here, which means,

  • do you get all of the mathematics operations

  • that are already within the C--

  • OK cool.

  • BRENNAN SAETA: That's right.

  • So this is a great excuse for us to show the Swift C interop.

  • So in Swift you can just import any arbitrary C header

  • and use the symbols to find in it,

  • no boilerplate, no wrappers, no nothing,

  • just go straight for it.

  • So here we're going to import Glibc.

  • And this is the square root function

  • defined by C standard library.

  • So we can then use that to define point,

  • the magnitude, sorry, extension on point,

  • which is the square root of x-squared plus y-squared.

  • So we run this.

  • And we get magnitude.

  • And dang it.

  • We've got another compiler error.

  • What's this saying?

  • PAIGE BAILEY: So read through, cannot differentiate functions

  • that have not been marked differentiable and that are

  • defined in other files.

  • So it looks like you have to go and change

  • this in another file.

  • BRENNAN SAETA: That's right.

  • So square root is defined by the C compiler.

  • And the C compiler hasn't been taught how to take derivatives.

  • So this is a great excuse to show you

  • how you can write your own custom derivatives

  • for arbitrary functions.

  • So if you recall that the derivative of the square root

  • of x is 1 over 2 times the square root of x,

  • we can define my square root-- which underneath the hood,

  • just calls C. And we can define the derivative right here

  • with this function closure, in line closure here.

  • So we run that.

  • And that compiles successfully.

  • Go here, change the square root to my square root.

  • That runs.

  • And voila, we're off to the races.

  • We've now defined a differentiable magnitude

  • function on point.
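The registration itself uses the toolchain's custom-derivative machinery (`@derivative(of:)` on recent toolchains), but the math being registered is easy to check in plain Swift. Pairing the value and derivative in a tuple below is purely our illustration, not the real API; `mySqrt` is the name used in the talk:

```swift
import Foundation  // brings in sqrt from the C standard library

// d/dx sqrt(x) = 1 / (2 * sqrt(x)) -- the formula recalled in the talk.
// We return (value, derivative) as a plain tuple just to illustrate the
// math; on a real toolchain you'd register the derivative with
// @derivative(of: mySqrt) instead.
func mySqrt(_ x: Double) -> (value: Double, derivative: Double) {
    let value = sqrt(x)
    return (value, 1 / (2 * value))
}
```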

  • We can then use this function inside other differentiable

  • functions exactly the way you'd expect,

  • everything fully composes.

  • So here we define this silly function, taking points

  • and doubles, right.

  • You can mix arbitrary differentiable types.

  • And we're going to define our point, take the gradient of it.

  • We can get our tangent vector, which

  • has the partial derivatives with respect to both x and y

  • at that particular point.

  • PAIGE BAILEY: Excellent.

  • And so I love that you're able to use

  • all of the great code that's been

  • written to support mathematics.

  • But I also really love that with Swift you get C and C++

  • interop.

  • And as Brennan mentioned, you can import any header,

  • use it as is or extend it to meet your use case.

  • So if you're working in an enterprise and you have

  • a massive existing C++ code base, like, well, Google,

  • then it's really helpful to be

  • able to not have to reinvent the wheel.

  • Like you can just import things that people

  • have created over time, use it.

  • And then everything as part of a Swift program would compile

  • down to .so files.

  • So you can deploy it on any platform, Windows, Linux, Mac,

  • iOS, Android, embedded devices, even.

  • BRENNAN SAETA: That's right.

  • PAIGE BAILEY: It really is sort

  • of the most multi-platform approach

  • that I've seen for machine learning.

  • BRENNAN SAETA: That's right.

  • So I think this is enough.

  • There's actually more power in this Swift

  • for TensorFlow automatic differentiation system.

  • And I encourage you to check out more details later.

  • But let's move on to actually talking about neural networks.

  • PAIGE BAILEY: Cool.

  • BRENNAN SAETA: And here we go.

  • So, tell us about writing neural networks in Swift.

  • PAIGE BAILEY: Writing neural networks in Swift,

  • we just saw an example of it.

  • It's pretty straightforward.

  • You only have about five lines of code.

  • It looks very similar to Keras.

  • And if you open up a Colab notebook--

  • BRENNAN SAETA: There you go.

  • PAIGE BAILEY: --you can see an example of it here.

  • BRENNAN SAETA: Great, so why don't you walk us

  • through this right here.

  • PAIGE BAILEY: Right, so you create a model

  • by adding some layers.

  • Here we see a convolutional layer, Conv2D, a pooling layer.

  • We flatten it.

  • And then we define the callAsFunction method,

  • prefaced with @differentiable.

  • And that gets us our gradients.

  • BRENNAN SAETA: Yep.

  • So here we define a sequential model just in Swift.
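On a Swift for TensorFlow toolchain, the model on screen looks roughly like this. The layer kinds (Conv2D, pooling, flatten, dense) are the ones named in the talk, but the filter shapes and sizes here are our assumptions; the transcript doesn't show the exact cell:

```swift
import TensorFlow

// A small convolutional model: conv -> pool -> flatten -> dense,
// with the forward pass marked @differentiable.
struct Model: Layer {
    var conv = Conv2D<Float>(filterShape: (5, 5, 1, 8), activation: relu)
    var pool = MaxPool2D<Float>(poolSize: (2, 2), strides: (2, 2))
    var flatten = Flatten<Float>()
    // 28x28x1 input -> 24x24x8 after conv -> 12x12x8 after pool = 1152.
    var dense = Dense<Float>(inputSize: 1152, outputSize: 10)

    @differentiable
    func callAsFunction(_ input: Tensor<Float>) -> Tensor<Float> {
        input.sequenced(through: conv, pool, flatten, dense)
    }
}
```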

  • So now let's actually use this as we would to actually train

  • this model on some data.

  • So here we define some random training data,

  • just for convenience purposes.

  • We instantiate the model and an optimizer for it.

  • And then we're going to run this for 10 training steps.

  • So we do this in just a few lines of code.

  • We're actually handwriting out the training loop up here.

  • But here we get the value with gradient,

  • which gets us our loss and our gradient.

  • And we then print out the loss and use the optimizer

  • to update our model along the gradient.

  • And so just like that, we're training a model.

  • And you can see the loss value is decreasing.
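A sketch of that handwritten training loop, again assuming the Swift for TensorFlow toolchain. To keep the block self-contained we use a tiny stand-in `TinyModel` and made-up data; the demo used the convolutional model defined earlier:

```swift
import TensorFlow

// A tiny stand-in model so the loop is self-contained.
struct TinyModel: Layer {
    var dense = Dense<Float>(inputSize: 4, outputSize: 3)
    @differentiable
    func callAsFunction(_ input: Tensor<Float>) -> Tensor<Float> {
        dense(input)
    }
}

let features = Tensor<Float>(randomNormal: [16, 4])  // random training data
let labels = Tensor<Int32>(zeros: [16])              // dummy labels

var model = TinyModel()
let optimizer = SGD(for: model, learningRate: 0.01)

var lossHistory: [Float] = []
for step in 1...10 {
    // valueWithGradient returns the loss and the gradient in one pass.
    let (loss, grads) = valueWithGradient(at: model) { model -> Tensor<Float> in
        softmaxCrossEntropy(logits: model(features), labels: labels)
    }
    lossHistory.append(loss.scalarized())
    print("step \(step): loss \(loss)")
    // Use the optimizer to update the model along the gradient.
    optimizer.update(&model, along: grads)
}
```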

  • PAIGE BAILEY: Yep, and if you're not

  • a fan of writing out as many custom steps

  • as we have expressed here, Jeremy Howard

  • did a great collaboration with Chris and with the Swift

  • for TensorFlow team, where he re-implemented

  • Fast AI on top of Swift.

  • So if you're looking for higher level constructs,

  • you just want things that are out of the box,

  • I highly recommend checking out those notebooks as well.

  • BRENNAN SAETA: That's right.

  • But one of the common workflows for anyone

  • who's doing machine learning is that they

  • have their first version of the model.

  • They train it.

  • And they figure how to make it better.

  • Right, that's one of the great promises of machine learning,

  • is you can keep incrementing and optimizing.

  • So let's see how we can optimize this model

  • to make it a little bit better.

  • So you may be following a lot of research.

  • And you may hear that skip connections,

  • or residual networks are a really good idea.

  • And so in the next 30 seconds, we're

  • going to convert this network to be a residual network

  • with a skip connection.

  • You ready?

  • All right, here we go.

  • So what we need to do is we need to define

  • our second dense layer that we're going to use

  • as part of the skip connection.

  • So we're going to dense float.

  • And I can't type.

  • And here you can see the autocomplete, right.

  • We care a lot about developer productivity.

  • Autocomplete, that's helping to fill in the initialize

  • or parameters for us.

  • Fix up a little bit of the shapes,

  • great, now we're all good.

  • So we've now added the extra layer.

  • Now we need to add in that skip connection.

  • So let tmp is equal to input.

  • And we need to capture the value after the flatten step.

  • Let tmp2 is equal to dense of tmp.

  • So this is going to be the first part.

  • And now we do our skip connection,

  • dense of tmp plus tmp2.

  • Voila, we have now added an additional layer

  • and added a skip connection to our model right

  • here in a Jupyter notebook.

  • So we now recompile that to make sure that that's working well.

  • Looks like it is.

  • And so we're going to reinstantiate our model,

  • because it's now a different model.

  • And we're going to retrain it for 10 steps.

  • And looks like we actually got slightly lower, or similar loss

  • values.

  • So this is good.

  • But we can, of course, do better.

  • But this is sort of an example workflow

  • how easy it is to customize models

  • using Swift for TensorFlow.

  • But for those of you

  • who are just applying models to standard data sets,

  • customizing the [INAUDIBLE] architecture

  • is the bread and butter workflow.

  • But we're finding more and more that

  • in both research and production, you sometimes

  • need to be a little bit more advanced.

  • And so Swift for TensorFlow, the platform

  • offers infinite hackability, infinite customizability.

  • Let's actually see how you can [INAUDIBLE] your own custom

  • layer also in a Jupyter Notebook.

  • All right, so let's say we're a researcher.

  • And we wanted to define a custom layer

  • that we add two bias terms to our dense layer instead of just

  • one.

  • Well, in about 10 seconds we're going to write this.

  • All right, ready?

  • Here we go.

  • Boom, all right.

  • Great, it worked this time.

  • All right, so here we define our weights, our weight tensor,

  • and our first bias term, and our second bias term.

  • We have our constructor.

  • And then finally, the actual forward function,

  • or callAsFunction, where you take the matrix multiplication

  • of the input in the weights and you add in the two bias terms.

  • So there we go.
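A hedged sketch of what that double-bias layer might look like, assuming the swift-apis `Layer` protocol; the type name `DoubleBiasDense` and the initializer details are illustrative, not the demo's exact code.

```swift
import TensorFlow

// A hypothetical dense layer with two bias terms instead of one.
struct DoubleBiasDense: Layer {
    var weight: Tensor<Float>
    var bias1: Tensor<Float>
    var bias2: Tensor<Float>

    init(inputSize: Int, outputSize: Int) {
        weight = Tensor(randomNormal: [inputSize, outputSize])
        bias1 = Tensor(zeros: [outputSize])
        bias2 = Tensor(zeros: [outputSize])
    }

    @differentiable
    func callAsFunction(_ input: Tensor<Float>) -> Tensor<Float> {
        // Matrix-multiply the input by the weights and add both bias terms.
        return matmul(input, weight) + bias1 + bias2
    }
}
```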

  • This should compile.

  • Ha ha ha, and once it does--

  • there we go-- we can then use it in our new model.

  • So instead of a dense layer, we use a double bias dense layer.

  • We can then compile this, instantiate the model,

  • and whoops, this doesn't quite work.

  • Again, here you see the Swift compiler

  • being extremely helpful.

  • You have a quick typo here.

  • Instead of label, it should be labels.

  • And it just tells you this is what you should be doing.

  • And you're off to the races.

  • So we fix that up.

  • And we can run the training steps.

  • And--

  • PAIGE BAILEY: The loss is worse.

  • BRENNAN SAETA: Dang it.

  • We have a little bit of bug in our implementation.

  • And so exercise for the audience to figure out

  • where that bug was, but we'll let you think

  • about that on your own time.

  • Anyway, that's a brief tour about how

  • to use Swift for TensorFlow as an ML practitioner,

  • applying and fitting against data sets and models.

  • Excellent, so we'll head back to the slides.

  • Oh actually, I forgot.

  • So one other thing, right, so we showed

  • how to write a custom model.

  • We showed how to write a custom layer.

  • But if you're doing research into different optimizers,

  • you're also going to be able to customize that.

  • And so optimizers in Swift for TensorFlow are just pure Swift code.

  • And so here we have the salient bits of the stochastic gradient

  • descent optimizer that we were using just now.

  • And so it's just 10 lines of code

  • to implement momentum, Nesterov, and other advanced features

  • for a stochastic gradient descent optimizer.
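The update rule those lines implement can be sketched in plain Swift over `Float` arrays. This is a simplification of the library's generic optimizer, which works over a model's tangent vector; `SimpleSGD` is an illustrative name, and the Nesterov variant is omitted for brevity.

```swift
// A simplified sketch of the SGD-with-momentum update rule, written
// over plain Float arrays rather than the library's generic protocol.
struct SimpleSGD {
    var learningRate: Float
    var momentum: Float
    var velocity: [Float]

    init(parameterCount: Int, learningRate: Float = 0.01, momentum: Float = 0.9) {
        self.learningRate = learningRate
        self.momentum = momentum
        self.velocity = Array(repeating: 0, count: parameterCount)
    }

    mutating func update(_ parameters: inout [Float], along gradients: [Float]) {
        for i in parameters.indices {
            // Accumulate an exponentially decaying history of gradients...
            velocity[i] = momentum * velocity[i] - learningRate * gradients[i]
            // ...and step the parameters along it.
            parameters[i] += velocity[i]
        }
    }
}
```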

  • So we've been spending a lot of time really thinking

  • about what are the right APIs, and what

  • are the right ways to factor machine learning

  • to make it easy to use, powerful, and flexible.

  • And we're pretty excited by where we are today.

  • But we're always looking, of course, to improve.

  • So if you're interested in pushing the limits

  • of the state-of-the-art, here are a bunch of other things

  • that we'd love to have you try and give us feedback,

  • and really to help us improve our APIs.

  • PAIGE BAILEY: So we mentioned before

  • that one of the biggest sort of wins for Python

  • is that an entire ecosystem of data science products

  • has been built around it.

  • And the good news is with Swift, you

  • don't have to give any of that up.

  • So I think Brennan has a demo coming up

  • showing that you can use tooling like Matplotlib, or NumPy,

  • or SciPy, or any of your favorite Python packages

  • directly from Swift, almost identical to the same way

  • that you would run it in a Python Jupyter notebook.

  • Is that correct?

  • BRENNAN SAETA: That's right.

  • PAIGE BAILEY: Yeah.

  • BRENNAN SAETA: So let's take a look.

  • Ah, my secret notes.

  • All right, there we go.

  • So here is a Jupyter Notebook, or actually Google Colab,

  • where we import TensorFlow.

  • And we also import Python.

  • And once we've imported Python, we

  • can use this Python object to import

  • arbitrary Python libraries with no wrappers at all.

  • So here we're going to import Matplotlib-- actually, the pyplot

  • type from it-- and NumPy.

  • And we're gonna assign these to plot and np.

  • And after that, you can write Swift code-- now again,

  • this is Swift code that looks almost exactly like Python.

  • So here we're gonna call np.linspace, assign that to x.

  • And we're going to plot x, and sine of x, and x and cosine

  • of x.

  • And just to show you that this really does work, we'll run it.

  • And voila, you're using pyplot from that Matplotlib and NumPy

  • right from Swift.

  • In short, you don't have to lose all the great Python ecosystem

  • while taking advantage of all the capabilities

  • that Swift offers.
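Roughly, the notebook cells shown amount to the following. This is a sketch assuming the Python interop module and an environment where matplotlib and NumPy are installed.

```swift
import Python

// Import arbitrary Python libraries with no wrappers at all.
// Assumes matplotlib and numpy are installed in the Python
// environment that the interop layer picks up.
let plt = Python.import("matplotlib.pyplot")
let np = Python.import("numpy")

// Swift code that looks almost exactly like the equivalent Python.
let x = np.linspace(0, 10, 100)
plt.plot(x, np.sin(x))
plt.plot(x, np.cos(x))
plt.show()
```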

  • Now sure, what about a wackier library?

  • I mean, these are kind of wacky.

  • But really one of the important parts of the Python data

  • science ecosystem are all the machine learning libraries.

  • And one of my favorites is actually OpenAI Gym.

  • So this is a Python library that has a bunch of reinforcement

  • learning environments that you can use to train and evaluate

  • neural networks as you're doing research in reinforcement

  • learning.

  • And so it's a Python library.

  • But as it turns out, with Swift for TensorFlow,

  • you can just use it.

  • So here we can use Python to import Gym.

  • We're going to define some hyper-parameters.

  • Here we're going to define our neural network

  • using Swift for TensorFlow. So just like you saw before, we've

  • got two dense layers.

  • We have some code to interact with the Python environment,

  • including defining the next batch.

  • But here we use that Gym library to make

  • the cartpole environment.

  • We instantiate our neural network

  • and our optimizer for it.

  • And then we're going to train it for as many steps as we need.

  • Again, we see that the Swift compiler is actually

  • really helpful.

  • It's telling us that on line 24, instead of one hot labels,

  • we probably mean probabilities.

  • And if I type that right, we should now see this training.

  • So here we are using a neural network,

  • defined in Swift for TensorFlow, training in the Python OpenAI

  • Gym library, totally seamlessly, right in a Jupyter notebook.

  • In fact, it solves it.

  • And of course, you can take the intermediate values

  • that you've computed while you've been

  • training this model, and plot them

  • using Matplotlib right in the notebook.

  • So it all totally works, all back and forth.
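A minimal sketch of driving Gym from Swift through the same interop bridge. A random-action loop stands in for the demo's full CartPole training code, and it assumes the `gym` package is installed.

```swift
import Python

// Drive the Python OpenAI Gym library straight from Swift.
// Assumes the `gym` package is installed in the Python environment.
let gym = Python.import("gym")
let env = gym.make("CartPole-v0")

var observation = env.reset()
for _ in 0..<100 {
    let action = env.action_space.sample()   // sample an action via the Python API
    let result = env.step(action)            // a Python tuple: (obs, reward, done, info)
    observation = result[0]
    if Bool(result[2]) == true {             // episode finished
        observation = env.reset()
    }
}
```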

  • PAIGE BAILEY: And as a machine learning engineer, I

  • think an important thing to note with this

  • is that your workflow doesn't change, right?

  • The only sort of incremental step that you need to add

  • is importing Python and assigning [INAUDIBLE].

  • And that's about it.

  • BRENNAN SAETA: Yeah, excellent.

  • So that's a number of demos and notebooks

  • that you can see, sort of the state of the world today.

  • But let's talk a little bit more about what's

  • coming in the future, the directions that we're heading.

  • So we have C interop, and we've been working on C++ interop

  • as well.

  • So here's an example of what we have.

  • In the left column, you define a C++ type

  • in a header file, et cetera.

  • And on the right-hand side, if you define this in example.h,

  • you can then import it in Swift as import example.

  • And you can then call these methods directly on it.
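Illustratively, the flow looks something like this. It is a sketch of the in-progress interop; the `Counter` type and the `example` module name are hypothetical, and the exact import mechanics were still evolving at the time.

```swift
// Hypothetical sketch of the interop flow described above. Given a
// C++ type declared in example.h:
//
//   // example.h
//   struct Counter {
//     int value = 0;
//     void increment() { value += 1; }
//   };
//
// ...the Swift side can import the header's module and call its
// methods directly (build and module-map setup details omitted):
import example

var counter = Counter()
counter.increment()
```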

  • We have a number of features already supported, including

  • instantiated templates.

  • Because as it turns out, C++ is not quite the simplest

  • language.

  • But we're seeing that it's already useful

  • for a number of things today.

  • One of the mantras is infinite flexibility

  • and infinite hackability.

  • And so we've talked about writing custom networks, custom

  • layers, custom optimizers, but what

  • about custom kernels as well?

  • As it turns out, we've been working on that.

  • And so this is some preliminary

  • work from the team.

  • But here you can define 1D average pooling in pure Swift.

  • And this actually runs on both CPUs and GPUs in the prototype

  • that we put together.
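The idea can be illustrated with a plain-Swift version of 1D average pooling over an array. This is CPU-only and far simpler than the tensor-based prototype; the function name and signature are assumptions.

```swift
// A plain-Swift sketch of 1D average pooling: slide a window across
// the input and emit the mean of each slice.
func avgPool1D(_ input: [Float], windowSize: Int, stride: Int) -> [Float] {
    var output: [Float] = []
    var start = 0
    while start + windowSize <= input.count {
        let window = input[start..<(start + windowSize)]
        output.append(window.reduce(0, +) / Float(windowSize))
        start += stride
    }
    return output
}

// avgPool1D([1, 2, 3, 4], windowSize: 2, stride: 1) averages each
// adjacent pair, producing [1.5, 2.5, 3.5].
```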

  • Now we're still iterating and getting

  • the fundamentals of the design right.

  • But this is sort of an example of the direction

  • that we're trying to go with Swift for TensorFlow

  • with infinite hackability, so that you're not limited

  • to what's in the box for you.

  • The whole world's your oyster.

  • You get to program at any level of the stack doing

  • whatever you need to do.

  • You're never limited when you're building

  • on the Swift for TensorFlow platform.

  • We've had a lot of amazing things

  • happen as part of the community that's been growing

  • around Swift for TensorFlow.

  • So we have in development a model garden

  • with a variety of supervised learning models,

  • and some unsupervised learning models.

  • But one of the collaborations that I've

  • been super excited about has been with the DeepMind team.

  • That's part of Alphabet.

  • They recently released OpenSpiel,

  • which is a collection of reinforcement learning environments

  • and algorithms.

  • And a subset of those have been ported to Swift.

  • And so that's a great place if you want to check it out,

  • if you're really interested in reinforcement learning.

  • So if you've been interested, how do you get started, Paige?

  • PAIGE BAILEY: Well, you can get started

  • right in Google Colab, with the examples that Brennan showed.

  • We also have a variety of tutorials available

  • at tensorflow.org/swift.

  • And I also encourage you to look at Jeremy Howard's SwiftAI

  • implementation of fast.ai that's available on GitHub.

  • BRENNAN SAETA: I did want to take a moment

  • to really highlight some of the community projects

  • that have been going on.

  • So we talked about OpenSpiel.

  • But Anthony Platanios has done amazing work

  • with Swift RL, another reinforcement learning library,

  • and Swift Atari Learning Environment, which I think

  • are really great.

  • There are SwiftPlot and swiftML, which came

  • out of the Google Summer of Code

  • projects this summer

  • that Paige organized and that I think are really fantastic.

  • SwiftAI came out of the Fast AI collaboration.

  • We have Tensors Fitting Perfectly,

  • where Adam [INAUDIBLE] joined the Swift for TensorFlow

  • team this past summer and did some really great research.

  • Everything is open source.

  • We'd love to have you join our community.

  • We have open design meetings every Friday at 9 AM Pacific,

  • or 1600 UTC.

  • And we'd love to have you all join in

  • and really help shape this platform together with us.

  • We recently released 0.5 with a number of improvements.

  • And stay tuned for our next release, which is coming up.

  • And with that, I'd like to thank you

  • all for listening to us today.

  • PAIGE BAILEY: Excellent.

  • [APPLAUSE]

  • BRENNAN SAETA: We will be around here and in the hallways

  • for the rest of the day.

  • We have a-- for those of you here

  • in person at the conference, we will

  • be hanging out at the Ask a TensorFlower Cafe

  • in the expo hall.

  • Come with your questions, feedback.

  • We'd love to hear what you're thinking

  • and see you at our open design meetings.

  • Thanks.
