
  • What's going on, everybody, and welcome to a very exciting announcement.

  • And that is: the Neural Networks from Scratch series is finally upon us.

  • It has been probably the most requested series since I did the Practical Machine Learning series, where we covered all of those typical classical machine learning algorithms like k-nearest neighbors, support vector machines, clustering algorithms, and so on.

  • We did those from scratch as well, and people really wanted to have neural networks from scratch.

  • But that is, of course, a huge, huge series, as it's way more complicated than any of the other machine learning algorithms.

  • So it's just kind of been one of these things that has been like, Yeah, I really want to do that.

  • But can I even get that done?

  • And so for the last few months, Daniel and I have been working on doing just that, trying to figure out: can we even get this done?

  • And if so, how do we do that?

  • And by neural networks from scratch, I mean that if we did it, it would have to be truly from scratch.

  • So no third party libraries at all.

  • So what I'm here to tell you today is we have done it.

  • It is possible, and now it's kind of being molded into an actual course.

  • Now, two things here.

  • First off, we do everything from scratch in Python, but this gets extremely laborious when it comes to finally putting everything together.

  • So it's kind of silly to ignore NumPy, since NumPy is the most important library for Python, period.

  • So we're going to show everything truly from scratch in Python, and then we do it in NumPy, just because it would honestly be a disservice not to use NumPy.

  • Plus, you get to learn a ton about NumPy.

  • So again, everything is truly from scratch in Python, and then we show you how to do it in NumPy.

  • And that's it as far as libraries go; NumPy is really the only library we're using for this.
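
To give a flavor of the pure-Python-then-NumPy approach described above, here is a minimal sketch of a single neuron's forward pass, first with plain lists and a loop, then with a NumPy dot product. This is only an illustration of the idea; the variable names and values are made up, not the series' actual code.

```python
import numpy as np

# A single neuron: a weighted sum of its inputs plus a bias.
inputs = [1.0, 2.0, 3.0]
weights = [0.2, 0.8, -0.5]
bias = 2.0

# Pure Python: loop over inputs and weights explicitly.
output = bias
for x, w in zip(inputs, weights):
    output += x * w
print(output)  # 2.3

# The same neuron with NumPy: a dot product plus the bias.
output_np = np.dot(weights, inputs) + bias
print(output_np)  # 2.3
```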

  • So anyway, it's super, super interesting.

  • There's truly no stone left unturned.

  • So if you wanna learn how neural networks actually work, there's really no better way than doing it this way.

  • So what I'm talking about is truly coding every neuron; you're coding the activation functions.

  • So we're talking sigmoid, softmax, obviously rectified linear, and just a plain linear activation.

  • So this means we can do both classification and regression, and then coding in things like calculating loss with cross-entropy, and then there are the optimizers.

  • We've got obviously stochastic gradient descent, and also obviously Adam, as well as AdaGrad and RMSProp.
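
To make the optimizer idea concrete: a vanilla stochastic gradient descent step is just "parameter minus learning rate times gradient," and the other optimizers mentioned (Adam, AdaGrad, RMSProp) layer extra bookkeeping on top of that same update. A minimal sketch with made-up numbers, not the series' actual implementation:

```python
import numpy as np

learning_rate = 0.01

# Pretend these values came from a layer and from backpropagation.
weights = np.array([0.2, 0.8, -0.5])
gradients = np.array([1.5, -2.0, 0.3])

# Vanilla SGD: step each parameter against its gradient.
weights -= learning_rate * gradients
print(weights)  # [ 0.185  0.82  -0.503]
```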

  • So: coding your neurons, creating layers, activation functions, calculating loss, optimizing, doing backpropagation.

  • All of this is covered.
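
As a rough illustration of a couple of the building blocks just listed, here is what the rectified linear and softmax activations and a categorical cross-entropy loss can look like in NumPy. This is a simplified sketch of the general technique, with hypothetical values, not the code from the series.

```python
import numpy as np

def relu(x):
    # Rectified linear: zero out negative values.
    return np.maximum(0, x)

def softmax(x):
    # Subtract the max for numerical stability, then normalize to probabilities.
    exp_values = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return exp_values / np.sum(exp_values, axis=-1, keepdims=True)

def categorical_cross_entropy(probabilities, target_index):
    # Negative log of the predicted probability for the correct class.
    clipped = np.clip(probabilities, 1e-7, 1 - 1e-7)
    return -np.log(clipped[target_index])

logits = np.array([2.0, 1.0, 0.1])
probs = softmax(logits)
print(relu(np.array([-1.0, 0.5])))  # [0.  0.5]
print(probs, categorical_cross_entropy(probs, target_index=0))
```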

  • It's a ton of material.

  • And, uh, just in notes, it is about 200 pages.

  • So that's just the notes; as a write-up, that's really long, otherwise known as a book.

  • So along with the free videos and sample code, as always, that's not changing.

  • Along with that, we're going to release this in book form as well.

  • So the topic and material, honestly, is something that, first, you're going to need multiple sittings to digest; it's too much material.

  • You couldn't even physically do it in a day or a few days.

  • It's going to be multiple sittings, and then also having multiple mediums just makes sense: some sort of text-based version, something you can take away and go look at somewhere else, in a different room.

  • Um, and then obviously in video form, there are things that we can do in video that we just can't do in text form, like showing animations and diagrams and stuff like that.

  • Well, diagrams we can obviously do in a book, but animation-wise, especially for concepts like the dot product or the transpose, it might be a little confusing if I just tell you in words how it's done.

  • So having stuff like animations just makes sense.

  • Also, what we're going to do is put QR codes in the book, so you can just scan the QR code and see the animation, and we'll try to include diagrams that are as good as possible.

  • But like I said, for things like a transpose, it really is useful to just see an animation of it happening.

  • Anyway, we're going to put it out in book form as well, and there are a few benefits to that.
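
For what it's worth, the dot product and transpose mentioned above are one-liners in NumPy; the part the animations are meant to help with is keeping the shapes straight. A small illustrative example (the numbers here are arbitrary):

```python
import numpy as np

# A batch of 2 samples with 3 features each, and a layer of 4 neurons.
inputs = np.array([[1.0, 2.0, 3.0],
                   [2.0, 5.0, -1.0]])     # shape (2, 3)
weights = np.array([[0.2, 0.8, -0.5],
                    [0.5, -0.91, 0.26],
                    [-0.26, -0.27, 0.17],
                    [0.87, 0.1, -0.5]])   # shape (4, 3): one row per neuron
biases = np.array([2.0, 3.0, 0.5, 1.0])

# Transposing the weights makes the shapes line up: (2, 3) @ (3, 4) -> (2, 4).
outputs = np.dot(inputs, weights.T) + biases
print(outputs.shape)  # (2, 4)
```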

  • So what we're going to do is just crowdfund the book now.

  • That's because this course is still somewhere between six and eight months out.

  • There's so much information here, it's just it's going to take a very long time to cover everything and make sure everything connects correctly.

  • This isn't really like even the Practical Machine Learning series; that kind of stuff, and basically almost all the tutorials I release, I release in what I would call draft form.

  • It's a one-pass write-up.

  • I push it out, because I can always add comments to the video.

  • I can always edit the tutorial.

  • It's really not that big of a deal.

  • In the case of a book, obviously, I can't do that.

  • But also, this topic in general is very complicated, and I feel like if it's not done correctly right out of the gate, it's going to be very cumbersome for anyone who's trying to learn all this stuff.

  • If later we've got to release some update to the book and say, "Oh, by the way, here's what you've got to change," it just shouldn't work like that.

  • So this is something that's going to take a lot more polishing than what I typically do.

  • So because it's going to take that long, what we can do is share the draft version of the book.

  • So if you become a backer, there's going to be a link to the Kickstarter page in the description.

  • What we can do for you is offer access to the actual draft of the book.

  • Now, what that also allows us to do is let anyone who's reading it highlight and post comments.

  • So if you're confused, you've got a question.

  • You think something's wrong?

  • Whatever it is, you can highlight that and post a comment.

  • There's a draft out right now, as I'm recording this, covering everything from creating neurons to activation functions and calculating loss, and that's going to be continually updated.

  • So really, it kind of lets you treat this more like a course or something.

  • So as you go through it, if you've got any sort of pain point or whatever, you can post that comment.

  • We can actually help you, and for us, as we're developing this series, it helps us to see:

  • Okay, maybe we should change this.

  • Or we should add in some more information here.

  • And we can kind of see where people are having a hard time that way; before the course goes public, before I film all these videos, before we actually publish the book, we can get all those things right.

  • So that's the plan.

  • A lot of information, but honestly, I think this is by far the coolest course ever to come out here to date.

  • So I'm super excited for it.

  • It is a lot of material.

  • My expectation is the book will be somewhere between 300 and 500 pages.

  • It's just so much material, but it's super useful.

  • So, you know, why might you want to do a neural network from scratch?

  • You know, I think for some people it's just obvious, or they're just generally interested.

  • But for the rest of us, you know, like, why?

  • Why this?

  • Why would you do this?

  • So if you're like me, the path I took to learning deep learning started with TensorFlow; boom, I jumped into TensorFlow, where the optimizer is coded for me.

  • The loss is coded for me; I just import things for all of this.

  • For the layers, I just import the layer, and that's super.

  • That's totally fine if you're trying to solve an already solved problem.

  • And I think for many people, when they're going through tutorials (and this was me, too), you're learning about things and you're solving things, but it's a pre-solved problem.

  • It's a pre-solved dataset.

  • It's a pre-solved neural network being given to me, one that I already know is going to work.

  • And when I've done it, it's the same thing, right?

  • Because that's kind of the job of the educator: we simplify the problem so you can learn this pretty complex topic.

  • But when you go out into the real world and start trying to apply neural networks to problems that have not yet been solved, it becomes very hard, and you run into things like even just a custom loss function: what is loss?

  • How do I do that?

  • How could I code that into my neural network?
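
As one hypothetical example of a custom loss, here is a mean absolute error written directly in NumPy; the function name and the choice of loss are just for illustration, not something prescribed by the series.

```python
import numpy as np

def mean_absolute_error(y_pred, y_true):
    # A simple custom loss: the average absolute difference
    # between predictions and targets.
    return np.mean(np.abs(y_pred - y_true))

y_pred = np.array([2.5, 0.0, 2.1])
y_true = np.array([3.0, -0.5, 2.0])
print(mean_absolute_error(y_pred, y_true))  # ~0.3667
```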

  • Which optimizer should you use?

  • Which activation function should you use? Different problems need different activation functions.

  • And then you start wondering; you know, you could memorize some cases, like: okay, I'm going to use tanh here.

  • I'm going to use rectified linear for everything else.

  • I'm going to use maybe a sigmoid here, a softmax there; that kind of thing you can sort of memorize.

  • Like, in this case I could do this; but you don't really fully understand it, and then you have problems like exploding gradients or other things that maybe you can diagnose but have no idea how to fix.

  • And if you understand how this stuff actually works, you can quite literally pop the hood, fix it, and figure out how to go about doing it.

  • Also, there are things like reinforcement learning and some of these other, less classic tasks; if you're just classifying images, you really probably don't need this.

  • But if you're doing other things: for example, people have wanted me to do a tutorial on audio, or on text to speech.

  • There's so much custom stuff going on there that, without a solid baseline of information first, I can't even begin a tutorial series on that.

  • It's the same thing with a lot of these reinforcement learning topics: without a solid base of what we're doing and why we're doing it...

  • Um, I can't do it.

  • The chatbots, for example.

  • That's another one.

  • People want a much more in-depth series.

  • Well, I can't really do that, because a custom loss function is exactly what we'd want to use there, right?

  • So all of these things kind of culminate into this: I really want this series to happen, but it's such a huge series that, um, it's just going to take a lot of time, and I think this is the best way for me to do it.

  • Like I said, everything will be free, as always.

  • But we're also going to release it in book form.

  • And if you do crowdfund that book, you can get access to the drafts, make comments, and follow along.

  • If you really want to learn this stuff, I just can't imagine a better way than doing it this way, especially with the ability to just post comments, right?

  • You just highlight the spot right where you're confused.

  • Um, that's got to be the way.

  • So, anyways, I'm super excited about this.

  • I hope you guys are too.

  • I think this is, um, possibly a new way of doing way more in-depth stuff on my channel.

  • I feel like I've always kind of been, like, one step beyond basics, but I could never get really deep into things.

  • And I'm hoping this could be a way that I can make that happen.

  • So anyway, you know, basically after this book, there could be, uh, things like chatbots or text to speech: that kind of way more interesting, but much larger, concept in general.

  • So anyways, I'm really excited to see how this pans out again.

  • There's a link in the description for the Kickstarter and, um, yeah, questions, comments, whatever.

  • Feel free to leave them below. Otherwise, I'm really looking forward to this.
