>> [MUSIC PLAYING]

>> This is CS50-- Harvard University's introduction to the intellectual enterprises of computer science and the art of programming. My name is David Malan, and I was just thinking this morning that it's been, amazingly, 20 years today since I last sat where you guys do now.

>> It was 1996. I was a sophomore, and I was taking CS50 for the very first time. I hadn't even gotten up the nerve to take it freshman year, partly because, at the time, computer science to me was kind of like, meh. I was a bit of a geek growing up, but I didn't really have any intellectual interest in what appeared to be just a whole bunch of people programming all the time.

>> And I was scared, to be honest. The course, and computer science more generally, had-- and to some extent still has-- this reputation of a field to be wary of, if only because so many of us are unfamiliar with it and unsure of it. It really wasn't until I shopped this class that sophomore fall-- and even then, I only enrolled because the professor, one of my first mentors, Brian Kernighan, now at Princeton, allowed me to take the class pass/fail. And indeed, that's why today we allow and encourage students to take this class SAT/UNSAT.

>> Only then, by the end of the semester, did I realize: wow, this wasn't such an unfamiliar field. Indeed, this was a very empowering field. And more excitingly, especially later on, as I took courses in Dramatic Arts 101 and Latin A, and then eventually grad school archeology, did I really start to see the intersections of this field, computer science, with the humanities, natural sciences, the arts, medicine, and the like. And that's what's just so neat about computer science ultimately, as we hope you'll see-- its applicability to these other fields, and how you can take some of today's and the semester's ideas and practical skills back to your own domain, and actually explore this intersection of the liberal arts and the sciences.

>> So 73% of you, if last year is any indication, have never taken a CS course before. So if, like me, you're feeling a little bit scared, or frankly you're not really sure why you're even here-- perhaps you just followed some friends over to Sanders-- that's totally fine. The goal here is to hook you and to reassure you that if you do look to the left and to the right, you're going to see classmates with as little or as much experience as you yourself might have. And indeed, we'll share some statistics later today as to what the demographics of the class typically look like.

>> And as added reassurance-- and this we have meant ever since I took over the course some years ago-- the course's syllabus says this: that what ultimately matters in this course is not so much where you end up relative to your classmates, but where you end up in week 11, the end of the semester, relative to yourself in week 0, which is where we are today. This is what I realized all those years ago. And I know a lot of classes say this, but it's especially true in computer science: this field, as unfamiliar as it was to me and might be to you, is really just about problem solving. And as such, it has this applicability to other fields. In fact, if we tried to distill what this means, this is problem solving in its essence, I daresay. There's input-- whatever it is you're trying to solve. There's output, which is hopefully the solution to that problem. And then, as we would say in computer science, there's this black box in the middle, and you don't necessarily have to care how it works. You yourself eventually might implement what's inside that box. But for today's purposes, and more generally in life, all you care about is that these problems get solved.
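To make that black-box idea concrete, here is a minimal sketch in C (my own illustration, not code from the lecture): we hand a library function an input and use its output, without ever looking at how the answer is computed inside.

```c
// sqrt() as a black box: input goes in, output comes out, and we
// never need to know how the square root is actually computed.

#include <math.h>   // declares sqrt()
#include <stdio.h>

int main(void)
{
    double input = 2.0;
    double output = sqrt(input);  // the "black box"
    printf("sqrt(%f) = %f\n", input, output);
}
```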
>> And what this course is ultimately about is exploring the intersection of these inputs and outputs, and these so-called algorithms, as we'll soon see, that implement what's underneath-- under the hood. But these inputs and these outputs-- what does that actually mean? Well, at the end of the day, we need some way of representing information. This is especially true in a computer, which, as fancy and complex as it might seem, is a pretty dumb device. It takes electricity as input-- whether from a cable or a battery-- and then it produces some preprogrammed responses on the screen.

>> But how do we get from start to finish there? Well, what's a problem to be solved? Maybe we might, at the start of any semester, try to take attendance in a room like this. So I might count: one, two, three. Or, if I tried to keep track on my own hands, I could quickly run out of fingers. So I might instead make hash marks-- one person, two, three, four, five, six, seven, eight. And all of us have probably done this, whether on our hands or on a piece of paper. This is actually just something called unary notation-- where, if you only have one letter in your alphabet, a 1 or a hash mark in this case, then for every input you want to count, you need to put down one of those letters-- one of those marks.

>> All right. That's all fine and good and not all that complicated. But computers aren't all that much more complicated. Indeed, most of you probably know, even if you've not really considered what it means, that computers only understand zeros and ones-- the so-called binary system. We humans, by contrast, are so much more sophisticated insofar as we understand the digits zero through nine.

>> But even if binary is, at first glance, not all that familiar, it turns out it's just like the systems and the ideas that we already know. So for instance, consider this. This is just a sequence of symbols. And all of you, when glancing at it, probably think 123-- nothing really interesting there. But why is it this number, 123? These are just glyphs on the screen-- just patterns that someone might have drawn or typed. But if you're like me, you probably remember from grade school that there are columns, or places, here. There's the ones place and the tens place and the hundreds place. And the reason that this is 123, and not just a pattern of three symbols, is because, of course, there's a 1 in the hundreds place, so you do the math of 100 times 1; a 2 in the tens place, so that's 10 times 2; and a 3 in the ones place, so that's 1 times 3. And when you add all of those up, of course, you get 100 plus 20 plus 3-- 123.

>> So we started with just a pattern of symbols-- an alphabet-- but then we mapped meaning onto it by way of these columns. Well, it turns out that computers are really not all that different from you and me. But instead of using powers of 10, so to speak-- the 1, 10, 100, 1,000, 10,000 places and so forth-- they actually just use powers of 2: 1, 2, 4, and then, if we add more digits, 8, 16, 32, 64, 128, and so forth.
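To make that column arithmetic concrete, here is a short sketch in C (my own example, not code from the lecture) that evaluates a string of digits under any base by summing digit times place value-- the same process as 123 = 100x1 + 10x2 + 1x3 in base 10, or 1, 1, 1 = 4 + 2 + 1 in base 2.

```c
// Evaluate a string of digits in a given base by summing
// digit * place, where place runs 1, base, base*base, ...
// from the rightmost column leftward.

#include <stdio.h>
#include <string.h>

int value(const char *digits, int base)
{
    int total = 0;
    int place = 1;  // ones place first
    for (int i = (int) strlen(digits) - 1; i >= 0; i--)
    {
        total += (digits[i] - '0') * place;
        place *= base;
    }
    return total;
}

int main(void)
{
    printf("%d\n", value("123", 10));  // 123
    printf("%d\n", value("111", 2));   // 7 (4 + 2 + 1)
}
```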
>> And so this is how a computer would represent the number 0, just like we humans: 0, 0, 0. And if a computer can only speak in 0's and 1's, you can probably guess what pattern of zeros and ones is going to represent the number we humans know as 1. Yeah-- 0, 0, 1. All right. So 0, 0, 1 is how we represent 1. To represent the number 2, then, now that you have a fours place, a twos place, and a ones place, you might be inclined to just put a 1 in the twos place as well, leaving the 1 that's already in the ones place-- 0, 1, 1. But of course, that's not how the decimal system works either. If you put a digit in both of those columns, you've got to do the arithmetic. So what number did I accidentally just represent?

>> It's 3, because 2 times 1 plus 1 times 1, of course, gives us 3. So 0, 1, 0 would be 2. The bit sort of flips, so to speak, as the 0 becomes a 1, much like a 9 rolls over and becomes a 0 when you carry the 1. And 0, 1, 1, then, would of course be 3. With 4, another interesting thing happens: the 1's roll over and you carry the 1, so to speak. So 1, 0, 0, of course, is 4.

>> But if you fast-forward now, what's the biggest number a computer can represent with just these three digits? It's just 7 in this case, right? Because you have a 1 in the fours place, a 1 in the twos place, and a 1 in the ones place. So that's 4 plus 2 plus 1, which gives you 7. And indeed, it would seem at first glance that computers can count no higher than this.

>> But this, of course, is not true. What do we humans do when we want to count higher than, like, 999? We just carry the 1 and add a fourth digit to the left. And so indeed a computer can too. We could have an eights place and a sixteens place and a thirty-twos place, 64, 128-- and you can just keep going on up, effectively to infinity. So these zeros and ones-- the so-called binary system--
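As a rough sketch of the counting idea just described (again my own C example, not the course's): counting from 0 up through the biggest n-bit value, 2^n - 1, and printing each value's bits shows both the rollover-and-carry behavior and how each extra bit doubles the range.

```c
// Count from 0 up to the largest 3-bit value (2^3 - 1 = 7),
// printing each number's bits so the carries are visible:
// 011 + 1 rolls over to 100, just as 099 + 1 becomes 100.

#include <stdio.h>

int main(void)
{
    int bits = 3;               // three columns: 4s, 2s, 1s
    int max = (1 << bits) - 1;  // 7, the biggest 3-bit number
    for (int n = 0; n <= max; n++)
    {
        for (int i = bits - 1; i >= 0; i--)
        {
            printf("%d", (n >> i) & 1);  // the 2^i column
        }
        printf("  = %d\n", n);
    }
}
```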