
  • ♪ (intro music) ♪

  • Shall we take a look at the first one that came in? It was from @alpharthur:

  • "Can I ask about any prebuilt binary for the RTX 2080 GPU on Ubuntu 16?"

  • That is very specific.

  • - That is very specific. - (laughter)

  • So, I like specific questions, even if I can't answer them.

  • (laughter)

  • So, in this case,

  • the prebuilt binaries for TensorFlow

  • tend to be associated with a specific driver from Nvidia.

  • So, the version of CUDA that we support

  • or the version of cuDNN that we support.

  • So, my recommendation would be

  • if you're taking a look at any of the prebuilt binaries,

  • take a look at what driver or what version of the driver

  • you have supported on that specific card.

  • I'm not an expert on Nvidia cards, although I love them,

  • so I don't really know what's supported by that card, Arthur.

  • But if you go over here, like on my laptop,

  • I've called up what Nvidia lists as their TensorFlow system requirements

  • and the specific versions of the drivers that they support.

  • And the one gotcha--

  • and we had this in the last segment as well--

  • that I find when working with GPUs

  • is that it's easy for you to go to the driver vendor

  • and download the latest version.

  • But that may not be the one that TensorFlow is built for

  • or the one that it supports.

  • So, just make sure that they actually match each other.

  • And you should be good to go, even with that particular card.
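
A rough sketch of the check being described, not something shown in the episode: the snippet below assumes a recent TF 2.x build, since `tf.config.list_physical_devices()` and `tf.sysconfig.get_build_info()` only appeared in later 2.x releases.

```python
# Hedged sketch: confirm that the installed TensorFlow binary actually sees the
# GPU, and report the CUDA/cuDNN versions it was built against, so you can
# match them to the Nvidia driver installed for your card.
import tensorflow as tf

print("TensorFlow version:", tf.__version__)

# An empty list here usually points to a driver/CUDA mismatch.
print("GPUs visible:", tf.config.list_physical_devices("GPU"))

# Build info includes the CUDA and cuDNN versions the binary was compiled for
# (available in newer TF 2.x releases; older builds may not have it).
build_info = tf.sysconfig.get_build_info()
print("Built for CUDA:", build_info.get("cuda_version"))
print("Built for cuDNN:", build_info.get("cudnn_version"))
```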

  • Yes. And if you have warm feelings and excitement

  • about builds in general for TensorFlow,

  • we have a great special interest group

  • specifically focused on that called SIG-Build.

  • (Laurence) SIG-Build.

  • So, I strongly suggest going to the community section of our GitHub

  • and checking out the SIG-Build listserv

  • and joining it and joining our weekly stand-ups.

  • Right. So, thanks, Arthur, for that question.

  • And the next question is a really funny one, I think.

  • How many times have you been asked this today?

  • Oh, my God, at least 12.

  • - At least! - (laughter)

  • And then the other flavor of it is,

  • well, this is a particular symbol that I use all the time--

  • Is this going to also be supported in TensorFlow 2.0?

  • And if not, what has changed?

  • (giggling)

  • People have invested so much time building stuff in TensorFlow 1.x,

  • they don't want it to be deprecated,

  • - they don't want it to go away... - (Paige) Understandable.

  • So, how do we answer it?

  • "Do my TensorFlow scripts work with TensorFlow 2.0"?

  • The sad fact is that probably not.

  • They would not work with TensorFlow 2.0 out of the box.

  • But we have created an upgrade utility for you to use.

  • It's automatically downloaded with TensorFlow 2.0

  • whenever you download it.

  • For more information on it and what in particular it's doing,

  • you can check out this Medium blog post that I and my colleague Anna created,

  • as well as this Upgrade to TensorFlow 2.0 video.

  • It goes through, and with GIFs,

  • which is the best communication medium possible.

  • It shows you how you can use the upgrade script on any file.

  • So, any sort of arbitrary Python file or even Jupyter Notebooks--

  • one of our machine-learning GDEs created an extension

  • that will allow you to do that as well.

  • And it'll give you a report.txt file that shows you all of the symbol renames,

  • the added keywords,

  • and then also some manual changes if you have to make manual changes.

  • - (Laurence) Cool. - (Paige) Usually, you do not.

  • So, to see this in action, we can go and take a look

  • at this particular text generation example that we have running on Shakespeare--

  • well, it takes all of the corpus of Shakespeare text,

  • trains against the Shakespeare text and generates something

  • that the bard could have potentially written

  • should he have had access to deep learning resources.

  • (Laurence) "I know you all and will uphold

  • the wildest unyoked humour of your [idleness.]"

  • (Paige) I did not know you knew Shakespeare.

  • (Laurence chuckles)

  • I actually played Henry IV in high school.

  • That's amazing.

  • That's why I love this notebook.

  • I was Beatrice in Much Ado About Nothing.

  • (Laurence) Oh, cool.

  • While we're on Much Ado About Nothing, maybe we should go back to the notebook.

  • Yes, so here's what it looks like in Colab form,

  • text generation using an RNN with eager execution.

  • You could export the Python file,

  • and then to upgrade it--

  • (Laurence) You've got to reconnect the runtime first.

  • (Paige) This is true.

  • So... starting it.

  • It looks like the requirements--

  • we can check to see that we're using TensorFlow Alpha.
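
For reference, a Colab cell along these lines is roughly what that check looked like at the time; the `tensorflow==2.0.0-alpha0` package pin is the alpha build being referred to, so substitute whichever release you are actually targeting.

```python
# Hedged sketch of a Colab cell: install the 2.0 alpha and confirm the version.
# The pinned package name is the alpha build referenced here; in a current
# environment you would simply import tensorflow and check __version__.
!pip install -q tensorflow==2.0.0-alpha0

import tensorflow as tf
print(tf.__version__)   # expect something like '2.0.0-alpha0'
```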

  • And then, like I mentioned before,

  • all you would have to do is preface this with !tf_upgrade_v2,

  • the name of the Python file, text_generation,

  • and the name of the upgraded file I want to create.

  • Shift+Enter.

  • It does its upgrading magic,

  • and very quickly,

  • tells me all of the things that would need to be changed

  • to make it 2.0 compatible

  • and creates that file for me off to the side.
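
As a sketch, the upgrade step in a Colab cell looks roughly like this; the `--infile`, `--outfile`, and `--reportfile` flags are the documented `tf_upgrade_v2` options, while the exact file names are assumptions matching the exported notebook.

```python
# Hedged sketch of the upgrade step as Colab cells. The flags are the
# documented tf_upgrade_v2 options; the file names are assumptions matching
# the exported notebook in this example.
!tf_upgrade_v2 \
    --infile text_generation.py \
    --outfile text_generation_upgraded.py \
    --reportfile report.txt

# The report lists symbol renames, added keyword arguments, and any manual
# changes the script could not make automatically.
!cat report.txt
```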

  • So, now, if I wanted to run this model,

  • it should be able to train as it would.

  • So, let's just check to make sure that would be the case.

  • (Laurence) I think a lot of the errors that you're seeing here--

  • it's more just renamed APIs

  • rather than breaking changes within the API--

  • (Paige) This is true.

  • So, you can see that you have some renames and some additional keywords.

  • Sounds good. And I saw you have

  • some handy-dandy GIFs in there?

  • Yes! Absolutely.

  • Are there any GIFs for those who don't say "JIF"?

  • (laughter)

  • Sorry, I had to work that joke in.

  • Well, I'm PB,

  • so peanut butter automatically works.

  • Exactly. Sounds good.

  • So, when it comes to upgrade, there are a few little gotchas in summary,

  • but hopefully this blog post and your video

  • and all of the stuff that we're doing

  • will help you get around those gotchas.

  • And even more amazingly,

  • the community that you were mentioning before--

  • we've had such an interest in testing TensorFlow 2.0

  • and trying it out against historic models

  • that we've formed a weekly testing stand-up,

  • and also we have a migration support hour

  • that's being run alongside the internal support hour.

  • So, if you're part of a group external to Google

  • that's interested in upgrading your models,

  • please join the testing group, and we can get you situated.

  • And a lot of stuff that we've seen, like in Keras models, for example--

  • Karmel had that great slide where she was training Fashion MNIST.

  • - The code is the exact same. - It's exactly the same.

  • So, while there might be stuff changing under the hood,

  • a lot of the surface-level code

  • that you're going to be writing in Keras, at least, isn't changing.

  • If you've used Keras,

  • you're probably not going to have any problems.
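
For a sense of what "the code is the exact same" means in practice, here is a minimal tf.keras Fashion MNIST sketch, not the slide from the talk, whose surface-level code runs unchanged on 1.x and 2.0.

```python
# Minimal tf.keras Fashion MNIST sketch: the surface-level code is the same
# whether the backing version is TensorFlow 1.x or 2.0.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.fashion_mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=5)
model.evaluate(x_test, y_test)
```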

  • So... good stuff.

  • So, shall we move on to the next question?

  • - Yes! - I know we could talk about 2.0 all day.

  • Okay, we just mentioned Keras, and it appears.

  • So, I guess I could ask you this question.

  • Hopefully, you know the answer.

  • "What is the purpose of keeping Estimators and Keras as separate APIs?

  • Is there going to be something native to Keras models

  • that allows for distributed training à la train_and_evaluate?"

  • Okay, so the purpose of keeping them, I think,

  • there are many purposes, right?

  • So, I think for me, the main purpose that I would like to think of

  • is simply that a lot of people are using them.

  • And that includes internal Google teams

  • that would tar and feather us if we removed them.

  • (laughter)

  • So, when it comes to Estimators,

  • Estimators are really great for large-scale training.

  • Yes!

  • A lot of the time, if you're doing a lot of large-scale training,

  • keep going with Estimators because they're great!

  • Because when I started with TensorFlow, I started with Estimators,

  • because I couldn't figure out what a node was in a neural network,

  • and there were all these concepts that I had to learn,

  • while I had this simple Estimator that I could use

  • to do a DNN or something like that.

  • So, they're there for a reason, and they're staying for that reason.
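
As a hedged illustration of the kind of premade Estimator being described, "a DNN or something like that", the sketch below uses a hypothetical feature name and toy in-memory data in place of a real large-scale input pipeline.

```python
# Hedged sketch of a premade Estimator; the feature name "x" and the random
# toy data are hypothetical stand-ins for a real large-scale input pipeline.
import tensorflow as tf

feature_columns = [tf.feature_column.numeric_column("x", shape=[4])]

estimator = tf.estimator.DNNClassifier(
    feature_columns=feature_columns,
    hidden_units=[32, 16],
    n_classes=3)

def input_fn():
    features = {"x": tf.random.uniform([100, 4])}
    labels = tf.random.uniform([100], maxval=3, dtype=tf.int32)
    return tf.data.Dataset.from_tensor_slices((features, labels)).batch(10)

estimator.train(input_fn=input_fn, steps=100)
```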

  • Keras is one of the things

  • that, from the point of view of making life easier for developers,

  • we've really been doubling down on in TensorFlow 2.0.

  • And things like we just spoke about, the code is the same between 1 and 2,

  • and it's the layers API, I think, that makes it super simple

  • for you to design a neural network,

  • and then the fact that you can go low level beyond that--

  • like define your own layers.

  • It really allows you to drive stick instead of driving automatic.

  • Absolutely. One of the beauties of Keras and 2.0 is that you have Keras

  • the way that you're probably familiar with using it,

  • and then, if you need to do additional customizations,

  • there's a subclassing component.

  • And then, if you need to go even lower, then we have something called tf.Module,

  • and we even expose some of the basic, most core ops of TensorFlow as well.

  • So, at any sort of level

  • you want to interact with the API, you can.
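
To make those levels concrete, here is a minimal sketch, assuming TF 2.x, of a custom layer built by subclassing tf.keras.layers.Layer next to a lower-level tf.Module built from core ops; both class names are made up for illustration.

```python
# Hedged sketch of two of the levels mentioned: a custom Keras layer via
# subclassing, and a lower-level tf.Module using core ops directly.
import tensorflow as tf

class MyDense(tf.keras.layers.Layer):
    """Custom layer defined by subclassing tf.keras.layers.Layer."""
    def __init__(self, units):
        super().__init__()
        self.units = units

    def build(self, input_shape):
        self.w = self.add_weight(shape=(input_shape[-1], self.units),
                                 initializer="glorot_uniform")
        self.b = self.add_weight(shape=(self.units,), initializer="zeros")

    def call(self, inputs):
        return tf.matmul(inputs, self.w) + self.b

class Linear(tf.Module):
    """Lower level still: tf.Module holding raw variables and core ops."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.w = tf.Variable(tf.random.normal([in_dim, out_dim]))
        self.b = tf.Variable(tf.zeros([out_dim]))

    def __call__(self, x):
        return tf.matmul(x, self.w) + self.b

x = tf.random.normal([2, 3])
print(MyDense(4)(x).shape)    # (2, 4)
print(Linear(3, 4)(x).shape)  # (2, 4)
```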

  • I think there was another part of the question

  • that was around distributed training.

  • Sorry, it scrolled off, so I can't see it now.

  • But there's something called distribution strategy

  • with Keras and TensorFlow 2,

  • and the whole idea behind that

  • is to allow you to be able to distribute your training,

  • maybe across multiple GPUs on the same machine,

  • maybe across multiple GPUs on different machines,

  • maybe across TPUs spread all over the place,

  • that kind of thing.

  • So, distribution strategy is really all about that--

  • to help you with that.
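
A minimal sketch of that with Keras, assuming a single machine with one or more GPUs and tf.distribute.MirroredStrategy; other strategies cover multi-worker and TPU setups, and the toy data here is just for illustration.

```python
# Hedged sketch: MirroredStrategy replicates Keras training across the GPUs on
# one machine; other tf.distribute strategies cover multiple machines and TPUs.
import tensorflow as tf

strategy = tf.distribute.MirroredStrategy()
print("Number of replicas:", strategy.num_replicas_in_sync)

# Build and compile the model inside the strategy's scope.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu", input_shape=(10,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")

# The training call itself is unchanged; the strategy handles the replication.
x = tf.random.normal([256, 10])
y = tf.random.normal([256, 1])
model.fit(x, y, epochs=2, batch_size=32)
```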

  • So, Estimators and Keras, we love them both,

  • they're both still there.

  • Hopefully, this is something that will help you with that question.

  • I think we've got time for just one more.

  • - Absolutely. - Oh, this is a Paige question!

  • This is totally a me question.

  • I am the Python person.

  • So, "Ask TensorFlow, when will TensorFlow be supported

  • in Python 3.7 and hence be accessible in Anaconda 3?"

  • So, I can certainly answer the Python 3.7,

  • and also, I would love to speak a little bit more about support

  • for Python going forward.

  • So, to answer the 3.7 question,

  • I'm going to bounce over to our TensorFlow 2.0 project tracker.

  • These are all of the standing issues

  • that we have when doing development for TensorFlow 2.0.

  • - It's transparent-- - (Laurence) I see your avatar.

  • (Paige) Yes, I have filed many issues.

  • And all of them are transparent to the public.

  • So, if you ever want to have context on where we stand currently,

  • and what we have yet to do,

  • this project tracker is a great way to understand that.

  • But let's take a look at 3.7.

  • And there we go.

  • So, we're in the process of releasing binaries for Python 3.5 and 3.7.

  • That's issue 25420--

  • and it's going a little bit off the screen--

  • 429.

  • But you can take a look at that issue and see that it's currently in progress.

  • There's not really an ETA,

  • but it's something that we want to have complete

  • by the time that the alpha or [RC] is released.

  • So, that is wonderful to see.

  • There's also a website called Python 3 Statement.

  • I think it's python3statement.com.

  • Maybe it's .org.

  • There we go, cool!

  • So, TensorFlow has made the commitment that as of January 1, 2020,

  • we no longer support Python 2.

  • And we have done that along with a large part of the Python community.

  • So, TensorFlow, pandas, scikit-learn, etc.

  • We are firmly committed to Python 3 and Python 3 support.

  • So, you will be getting your Python 3 support,

  • and we're firmly committed to having that.

  • The nice thing about the issue tracker is it's not going to be a big--

  • "Hey, we have it!"--

  • coming at some random point in the future.

  • It'll be a case of totally transparent,

  • and you can keep an eye on what we're doing.

  • And you can see people commenting and our engineers commenting back.

  • Like, "Yeah, man, I totally ran the thing last night,

  • and it's almost there, one more test."

  • (chuckles) Sounds good.

  • Okay, I think that's all we have time for.

  • So, whatever you do, don't forget to hit that subscribe button.

  • Alright, thank you so much, and thanks for being engaged.

  • Thank you.

  • ♪ (music) ♪
