
  • This is CS50.

  • DAVID MALAN: Hello, world.

  • This is the CS50 podcast, and my name is David Malan.

  • COLTON OGDEN: And my name is Colton Ogden.

  • DAVID MALAN: And so glad to have everyone back with us today

  • for our second ever episode of the CS50 podcast.

  • COLTON OGDEN: Yeah, super excited.

  • So curious, before we start.

  • I walked in to your office just before this podcast started,

  • and you were on the phone.

  • Who were you on the phone with?

  • DAVID MALAN: You couldn't have asked that before we started rolling?

  • COLTON OGDEN: You seemed a little bit disgruntled.

  • DAVID MALAN: No, if you can believe it.

  • It was a robocall.

  • And in fact, ever since our discussion thereof,

  • and since Last Week Tonight with John Oliver started

  • focusing on this topic, I have legitimately

  • started getting more and more of these calls.

  • Where they're just spam calls.

  • And then you pick up and there's a very cheery computer

  • voice on the other end of the line.

  • COLTON OGDEN: You know, actually, I had to block a number,

  • because I was actually getting called consistently.

  • I got called--

  • DAVID MALAN: I'm sorry I'll call you less.

  • COLTON OGDEN: I got called everywhere every five to 10 seconds.

  • The same phone number was calling me after I called.

  • It must've been on a loop or something.

  • DAVID MALAN: That's awful.

  • Well, I mean the nice thing about iPhone is that you can actually

  • block calls pretty easily.

  • And I'm guessing you can do the same on Android.

  • But landlines from yesteryear, you're pretty much out of luck

  • unless you punch in some code with your phone provider to do it as well.

  • COLTON OGDEN: Yeah you can bet that actor got blocked.

  • DAVID MALAN: You know it's really obnoxious too.

  • And they think they're being clever.

  • Because most of the spam calls I get nowadays, like in the past week

  • are 617-555-something, something, something, something.

  • Where 617-555 matches my own phone number's prefix.

  • And I think the presumption is, and I think

  • John Oliver might have pointed this out, that they're

  • trying to trick you sort of social engineering-wise into thinking like,

  • oh, this must be a neighbor down the road

  • because their phone number is so similar.

  • And it's really frustrating now.

  • This really has peaked.

  • COLTON OGDEN: I don't think I ever got a call from anybody

  • in my life who was a legitimate actor with the same prefix as my phone number.

  • DAVID MALAN: Yeah that's a good point actually.

  • I don't even notice, frankly, because it comes up sometimes

  • with my contact information.

  • And, anyhow.

  • Thanks for that.

  • Well, sort of a tie back into actually the last podcast episode,

  • where we talked about Facebook.

  • And they're sort of storing unhashed passwords out in the clear.

  • It looks like recently they committed another sort of offense

  • where they were actually asking people for their email passwords.

  • Not a Facebook password, but their actual external email passwords

  • through Facebook.

  • COLTON OGDEN: Yeah.

  • I read this, and I think they were trying to do this for well-intentioned,

  • at least we can perhaps give them the benefit of the doubt, in that they

  • wanted people to be able to confirm that some email address was

  • in fact their own.

  • And I presume some developer thought, well,

  • it'll be easy if we just ask them for their username, their password,

  • pretend to log into that actual system on the user's behalf,

  • and if they get in successfully, hopefully just

  • disconnect without poking around.

  • And just assume that the address is indeed theirs.

  • But this is just so unnecessary and so wrong on multiple levels.

  • I mean this is why companies actually instead send

  • you an email, usually with a special number, or word in it,

  • or URL that you can then click.

  • Because the presumption there is that, well, if we send you an email,

  • and you are able to click on the link in that email within 15 minutes,

  • presumably you do indeed know the username and password to that email account.

  • And so therefore you are indeed who you say you are.

  • That's sort of the right, or at least the industry standard way

  • of doing this.
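As a rough illustration, here is a minimal sketch, in Python, of what that kind of email-verification flow can look like on the server side; the function names, the in-memory table, and the 15-minute window are illustrative assumptions rather than any particular company's implementation.

```python
# Hypothetical sketch of email-address verification via a one-time link.
# Names (send_email, PENDING, the example.com URL) are placeholders, not a real API.
import secrets
import time

TOKEN_TTL_SECONDS = 15 * 60  # e.g., the 15-minute window mentioned above
PENDING = {}                 # token -> (email, issued_at); a real app would use a database

def start_verification(email, send_email):
    token = secrets.token_urlsafe(32)          # unguessable, single-use token
    PENDING[token] = (email, time.time())
    link = f"https://example.com/verify?token={token}"
    send_email(to=email, subject="Confirm your address", body=link)

def finish_verification(token):
    record = PENDING.pop(token, None)          # single use: removed on first click
    if record is None:
        return None
    email, issued_at = record
    if time.time() - issued_at > TOKEN_TTL_SECONDS:
        return None                            # expired
    return email                               # caller marks this address as confirmed
```

The key properties are that the token is unguessable, single-use, and short-lived, so clicking the link proves control of the inbox without the user ever divulging a password.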

  • Even though it does add some friction.

  • You have to go check your mail.

  • You might have to hit refresh a few times.

  • So there's some UX downsides.

  • User experience downsides.

  • But that's secure, because you're not asking the user

  • to divulge private information.

  • This is just reckless, especially for a company

  • as big as Facebook to be conditioning people into thinking this is OK.

  • DAVID MALAN: I think letting big entities

  • like this act on our behalf in the security realm is worrisome.

  • I mean, especially Facebook given that they've recently been

  • caught storing plain text passwords.

  • Putting my email password in Facebook's hands,

  • I don't know where that's going to end up at the end of the day.

  • COLTON OGDEN: No.

  • And honestly, even if it's not malicious,

  • and it is just foolish or accidental,

  • the reality is that servers log input or log transactions in their databases.

  • And so the data may end up just sticking around, unintentionally so.

  • So it doesn't even matter that the intentions are good.

  • This is just bad practice.

  • And again to my point earlier, if you see this behavior being normalized

  • on very popular websites like Facebook,

  • well, what's to stop a user, especially a less technically proficient user,

  • from thinking, oh, I guess that's OK.

  • That's the norm.

  • That's how this is done.

  • If they see it on some random adversary's

  • website that they get socially engineered into clicking on.

  • DAVID MALAN: It was kind of entertaining when

  • it was sort of brought to Facebook's attention that this was a bad idea.

  • In the Daily Beast article that I actually read about this,

  • someone actually brought this to Facebook's attention.

  • And Facebook came out, and to the world said, you know,

  • this probably wasn't the best way to approach solving this problem.

  • Because they had been caught doing [INAUDIBLE].

  • COLTON OGDEN: To say the least.

  • You know, and it's interesting because big companies, Facebook among them,

  • presumably do have code review processes in place.

  • Involving multiple humans and design reviews.

  • And so what's especially worrisome here, or certainly surprising,

  • is how did this even ship?

  • Right.

  • At no point did some human presumably object to doing this.

  • And so that's I think the sort of fundamental flaw

  • or the fundamental concern is that how did something like this even happen?

  • Because students coming out of CS50 might certainly

  • be inclined to implement things in this way

  • and frankly, if you don't really think about it adversarially,

  • or if you haven't been taught to think about things defensively,

  • you might make this mistake too.

  • But that's what mentorship is there for, more experienced personnel

  • or older folks are there for.

  • To actually catch these kinds of things.

  • And so that's the sort of process flaw that's of concern too.

  • DAVID MALAN: I'm certainly grateful that we have so many folks online who

  • are getting more technically literate in this domain

  • and are bringing this to everyone's attention.

  • Looking for these kinds of patterns and catching Facebook red-handed

  • when they do these types of things.

  • Not necessarily just Facebook, I'm sure this

  • happens at scale with many companies.

  • But it's nice to know that people are actually on the lookout for this.

  • COLTON OGDEN: Yeah.

  • And you know I should disclaim too, because I'm

  • sure we have students out there who will remember this.

  • Some 10 or so years ago, even CS50 actually

  • did foolishly use this technique on one or more of our web apps.

  • Because at the time there actually was no,

  • I believe, sort of standard that we could have used

  • to authenticate users in a better way.

  • OAuth, for instance has come onto the scene since.

  • And maybe even if it existed then, it wasn't nearly as omnipresent

  • as it is now.

  • So in short, there are technical solutions to this problem.

  • Whereby, the right way to do this is you don't ask the user

  • for their username and password.

  • You redirect them to Yahoo or Gmail or whatever

  • the account owner's website is.

  • Have them log in and then be redirected back to you.

  • Essentially with some kind of cryptographic token.

  • Something that's mathematically significant and very hard to forge that

  • proves, yes.

  • Colton did in fact just log into his actual Yahoo email account.

  • You can trust that he is who he says he is.
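As a rough sketch of that redirect-and-token dance, here is a heavily simplified, hypothetical authorization-code flow in Python; the endpoint URLs, client credentials, and redirect URI are placeholders, and a real provider's documentation would dictate the exact parameters.

```python
# Hypothetical OAuth 2.0 authorization-code flow, heavily simplified.
# AUTH_URL, TOKEN_URL, CLIENT_ID, CLIENT_SECRET, REDIRECT_URI are placeholders.
from urllib.parse import urlencode
import secrets
import requests

AUTH_URL = "https://provider.example/oauth/authorize"
TOKEN_URL = "https://provider.example/oauth/token"
CLIENT_ID = "my-app"
CLIENT_SECRET = "placeholder-secret"
REDIRECT_URI = "https://myapp.example/callback"

def login_redirect_url():
    # Step 1: bounce the user to the provider; they log in there, never on our site.
    state = secrets.token_urlsafe(16)  # anti-CSRF value, verified on the way back
    params = {
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "state": state,
    }
    return f"{AUTH_URL}?{urlencode(params)}", state

def handle_callback(code):
    # Step 2: the provider redirects back with a short-lived code, which we
    # exchange server-to-server for a token proving who the user is.
    resp = requests.post(TOKEN_URL, data={
        "grant_type": "authorization_code",
        "code": code,
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "redirect_uri": REDIRECT_URI,
    })
    return resp.json()  # contains the token; the user's password never touched our app
```

The point is that the user only ever types their password on the provider's own site; the application just receives a code it can exchange for proof of identity.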

  • That mechanism either didn't exist, or wasn't familiar

  • even to me back in the day.

  • And so we would just have a web form on the CS50 site

  • for users to log in with their Harvard email address and their password.

  • And again, we were not intending this to be malicious.

  • We certainly didn't log anything deliberately, or thankfully,

  • accidentally.

  • But we could have.

  • And I think the fact that even we, as a course,

  • conveyed the message that, oh, this is OK, was a very bad message to send.

  • And so thankfully, some years ago we actually transitioned

  • to using more industry standard approaches like OAuth,

  • again this mechanism where you bounce the user to their Harvard login

  • then back to CS50 as just a sample client, or an application.

  • That's a much better way of doing this.

  • DAVID MALAN: Yeah.

  • Because in that scenario, you're actually allowing a third party

  • to let you perform this handshake I think

  • more securely than just having one entity perform

  • all the security for you.

  • COLTON OGDEN: Yeah.

  • No, and if you look closely, there might actually be examples of this elsewhere.

  • For instance, and it's been a few months since I looked.

  • In Gmail, I believe under your settings you can actually

  • add accounts to your account so that you can retrieve mail

  • from another account via POP.

  • Post Office Protocol.

  • Which is a way of downloading email.

  • There too, you're doing exactly the same thing.

  • You are trusting Google with your username and password

  • to some other email account.

  • The design there, though, is to enable Google to import that email for you

  • into this account.

  • And so as such, there's really no other way

  • to do that unless there is some other process involved where, via OAuth, they

  • can do that.

  • But that's actually not how POP works.

  • And so there, too, it's a technical constraint.

  • And that's kind of an artifact of yesteryear's designs

  • with a lot of these systems.

  • But it's worth keeping in mind that these things still happen.

  • And I think even when you log into sites for the first time

  • you're sometimes prompted for a username and password.

  • Maybe it's LinkedIn I'm thinking of, or Yahoo.

  • Because they want to make it easy to import your contacts.

  • So what better way than to just outright access your account.

  • But there, too.

  • You're trusting someone.

  • You are normalizing a behavior that's probably not best.

  • And so I think we as a society should really

  • start to resist this and distrust this.

  • Just don't do that.

  • DAVID MALAN: I feel like distrust is a very common theme

  • in the world of higher CS.

  • What was the article?

  • The very famous article on trust?

  • COLTON OGDEN: Oh, Trusting Trust.

  • DAVID MALAN: Yeah.

  • Yeah, indeed.

  • It was actually a Turing Award acceptance speech

  • that was then put into paper form.

  • If you really get into the weeds here, nothing is really trustable.

  • Right.

  • In CS50, we talk about compilers, which are programs that, of course, convert

  • one language into another.

  • Usually source code into machine code, at least in our case of C.

  • And who's to say that Clang, the compiler we happen to use in CS50,

  • doesn't have some malicious lines of code in there?

  • Such that if you're implementing any program that does use usernames

  • and passwords, what if the author of the compiler

  • is always automatically inserting his or her own username into your code?

  • Even unbeknownst to you.

  • So there too, unless you actually built the hardware yourself

  • and wrote the software that's running it.

  • At some point, you either need to just curl up into a ball,

  • terrified that you can't trust anyone, or you

  • have to trust some of those lower lying building blocks.

  • COLTON OGDEN: It's kind of a testament to just how pivotal trust is

  • to where we are with technology.

  • Where we are with computers today.

  • I don't think any of this would be possible

  • if we were on the far end of the paranoid spectrum.

  • I think there is definitely pragmatically an inflection

  • point at which we do need to actually trust people.

  • Most definitely.

  • DAVID MALAN: No, I'm guessing that this is why some people, not that many,

  • live off the grid, so to speak.

  • Or in a cabin somewhere, disconnected from all of this,

  • because they don't trust.

  • And honestly, we've seen enough articles, and revelations, and news

  • lately that, they're kind of right.

  • All of these big companies, too, that you would have thought were adhering

  • to best practices aren't.

  • So there's something to that.

  • COLTON OGDEN: There has to be a certain, I

  • guess, maybe baseline comfort level folks have to have with at least some

  • of their information being publicly accessible.

  • DAVID MALAN: Yeah.

  • No, and I think you have to make an individual decision as to whether,

  • does the convenience you derive from some tool, or the pleasure

  • you derive from some game, or whatever the application is,

  • outweigh the price that you're paying?

  • And that certainly is a theme in computer science,

  • in CS50 specifically, making a reasoned choice based on the pluses and minuses.

  • But I think the concern here, as with sort of liberties more generally

  • in a republic or in a government, is that it's very easy incrementally

  • to say, oh, I'll give up a little bit of my privacy

  • for this additional convenience or this feature.

  • OK, I'll give you a little bit more.

  • OK, I'll give you a little bit more.

  • And then when you actually turn around and look

  • at the trail of things you've given up, it can actually

  • start to add up quite a bit.

  • And then some other party or company or government

  • has much more control, or access, than you

  • might have originally agreed to.

  • COLTON OGDEN: Sure.

  • All makes sense.

  • I guess maybe to sort of pivot away from the trust discussion.

  • Back into maybe something a little more technical.

  • It looks like this last week Apache actually patched

  • a bug that granted folks root access on shared hosting environments.

  • The Apache web server, which is such a ubiquitous web server,

  • there were malicious CGI scripts that were capable of actually running

  • on a shared hosting environment.

  • And I think CS50 has even used Apache for shared hosting.

  • DAVID MALAN: Yeah many years.

  • COLTON OGDEN: At least in the vhost.

  • Is a vhost technically shared hosting?

  • DAVID MALAN: Vhost is a technical term meaning virtual hosting.

  • Which generally means hosting multiple domains on the same physical server.

  • And Apache makes that very easy.

  • Yeah.

  • No, we used Apache for years.

  • It's free, open source software, it's very high-performing,

  • can handle lots and lots of requests.

  • It's a competitor, essentially, to nginx, which then swept onto the scene

  • and took sort of a different technical approach

  • to the same problem of scaling web services.

  • And Yeah.

  • This was an example of a bug whereby if you have an account on a server

  • that's running the Apache web server.

  • As you would if you were running yourself,

  • or if you're paying someone a few dollars a month for shared web hosting.

  • Which is still very common, especially for languages like PHP.

  • You have typically a shell account.

  • A username and password and therefore home directory.

  • And the ability, sometimes, to run programs.

  • Otherwise known in the web as CGI scripts.

  • Common Gateway Interface.

  • Which is a way of running languages like Python.

  • You can do it with Python, but more commonly PHP, or Perl,

  • or other languages as well.
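For the curious, a CGI script is really just a program the web server executes per request, relaying whatever it prints (headers, a blank line, then a body) back to the client; here is a minimal, hypothetical example in Python.

```python
#!/usr/bin/env python3
# Minimal CGI script: the web server runs this program for each request
# and sends whatever it prints (headers, blank line, then body) to the client.
print("Content-Type: text/html")
print()
print("<html><body><h1>Hello from CGI</h1></body></html>")
```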

  • And in short, if you have the ability to install these CGI scripts on a server,

  • you can write a program in such a way that it actually

  • gives you, as you know, root access, or administrator access

  • to the whole darn server.

  • Which is horrible if you're not on your own server,

  • but you are on someone else's shared host.

  • Because now you have access to all the other customers' or users'

  • accounts, potentially.

  • COLTON OGDEN: Yeah.

  • Nick and I, on the stream, this was part of one of the CTF, Capture the Flag,

  • challenges we did.

  • Where we had to sort of finagle our way into getting

  • privilege escalation from several user groups up until a root

  • access by exploiting these kinds of vulnerabilities.

  • DAVID MALAN: Yeah.

  • No, the threat, of course, is that if somehow you

  • have a bad actor on your staff, or in your course,

  • or really just on your server, he or she can, of course, install something

  • like this.

  • And then gain or grant root access to someone else too.

  • So even if it's your own server, you certainly

  • don't want your own code to accidentally be able to slip into root mode.

  • Because that means any commands that are executed thereafter

  • could damage anything on the system.

  • You can add files, remove files, send files elsewhere.

  • Once you have root, the front door is wide open.

  • COLTON OGDEN: Yeah.

  • Delete databases.

  • Delete users.

  • DAVID MALAN: Yeah everything.

  • So this is a very serious threat.

  • And it's a simple fix to just run the update and actually

  • patch the software, so to speak.

  • But these are the kinds of things that you want to be cognizant of.

  • And frankly, I think far too many system administrators and people

  • running web servers don't necessarily pay attention to these kinds of alerts.

  • And so, making sure you're keeping an eye on Apache's own mailing list

  • or Twitter account these days, or TechCrunch,

  • or other such sites that tend to propagate

  • announcements of security flaws.

  • You really do want to keep an eye out.

  • Because you're going to be regretting it,

  • I think, otherwise, if the fix were available

  • and you just didn't realize you need to apply it to defend yourself.

  • COLTON OGDEN: Sure.

  • So unrelated to that.

  • An interesting thing that we saw in the last week

  • was that Office Depot recently was accused

  • of forging computer scan results.

  • Folks would bring their computers in, and Office Depot would just flat out

  • lie to those folks about the safety of their computers.

  • What do you have to say about that?

  • What do you think about that?

  • DAVID MALAN: Well, today's podcast is brought to you by Office Depot.

  • [LAUGHTER]

  • DAVID MALAN: No.

  • No one, actually.

  • No, this is a horrible thing.

  • This isn't even necessarily related to technology.

  • This seems to be, and I presume this is true.

  • I'm reading the same thing you are off the FTC website in the US here.

  • That it was just outright deception.

  • And the software was configured, or designed,

  • or the humans chose to give misleading information.

  • Incorrect information to people just to trick them, presumably,

  • into upselling them to have their computer disinfected from some virus.

  • Or some malware when it wasn't actually there.

  • COLTON OGDEN: And this is horrible.

  • But how do we fix this problem?

  • How do we protect the folks that don't necessarily

  • know any better, that the computer is infected?

  • DAVID MALAN: Yeah I mean you'd like to think

  • that these are anomalous situations.

  • Where, at least if you're going to brand name places,

  • you would like to think that you can trust them

  • with higher probability than say some random person

  • you find on Craigslist to disinfect your computer for you.

  • But case in point, even a big fish company like Office Depot.

  • For those unfamiliar, it's a pretty big company in the US

  • that sells office furniture and apparently will steal money from you

  • and pretend your hard drive is infected with malware when it's not.

  • So, as we've seen with Facebook and other companies, these mea culpas,

  • big companies are doing this too.

  • And maybe it's not systematically across the company,

  • maybe it's some bad actor, or management, or one or few stores.

  • But I think the nice thing about the software world,

  • and the nice thing about the open source world

  • is, that there's a lot of free products and tools

  • that you can actually download at home.

  • And while you might need a bit more technical savvy,

  • it's definitely more convenient to be able to do it yourself.

  • You can perhaps trust the process a bit more.

  • At least if you have identified a good, compelling product or open source tool.

  • Not some random thing that you were tricked into downloading,

  • and that's a whole other can of worms there.

  • But there tend to be popular programs that, frankly, I used

  • to use when I used PCs more frequently.

  • I would run them myself.

  • And then when they did detect something, I would have it clean my own computer.

  • It's definitely not something you have to pay someone else for.

  • But even for those least comfortable, honestly,

  • invite someone over that you trust.

  • Whether it's a colleague, a friend, or a family member,

  • have him or her run such software for you.

  • And then trust their judgment, not necessarily a random third party.

  • COLTON OGDEN: Back to the theme, of course, of trust.

  • DAVID MALAN: Yeah.

  • You have to trust, too, that your niece or nephew isn't

  • coming over just trying to cheat you out of 20 bucks to scan your computer.

  • But you can also just pay them to show you how to run the software,

  • and then do this perhaps yourself.

  • COLTON OGDEN: It always, always gets into somewhat of a dark realm

  • when we talk about trust.

  • In the context of, I think, general trust.

  • But in the context of CS, especially.

  • I think going down that rabbit hole can often be somewhat depressing.

  • DAVID MALAN: Yeah.

  • But I think this is true if we really want to depress ourselves

  • in the real world too.

  • Like driving a car, you generally need to trust

  • that the other humans are going to obey the traffic laws, the traffic lights,

  • so that you can behave in a logical way without actually hitting or being

  • hit from someone else.

  • So I think that's kind of omnipresent.

  • And when you go out to a restaurant, you'd

  • like to assume that everything is sanitary.

  • And unfortunately I've watched far too many Gordon Ramsay shows

  • and Kitchen Nightmares to know that I shouldn't be trusting all restaurants,

  • actually.

  • So I think this is not necessarily unique to technology,

  • but I think it's all the more present lately.

  • This concern.

  • Or these threats.

  • COLTON OGDEN: Yeah, certainly.

  • Back more to the technical side of our discussion, and sort of

  • related to the Apache thing.

  • The other actor that you mentioned, nginx.

  • There was a vulnerability with some Cisco routers recently.

  • The RV320, and I think another series.

  • And there was the RedTeam pen testing group that, I guess,

  • ended up doing some tests on those routers.

  • And found a config file in which it specified

  • that one of the fixes for a vulnerability that they found

  • was actually just a ban on curl.

  • The program Curl.

  • What are your thoughts on that as an approach?

  • DAVID MALAN: Yeah, I think we have such a knack in this podcast already.

  • Anything we have to say is not going to be positive

  • when we point out that something was in the news, it seems.

  • Technologically so--

  • COLTON OGDEN: I need to do some more research on positive, friendly topics.

  • DAVID MALAN: We do.

  • Well, what are some websites where you can see some puppies?

  • Wholesome--

  • COLTON OGDEN: Well, this is the podcast, so people can't see anything.

  • DAVID MALAN: That's true, so we need some--

  • COLTON OGDEN: We need audio clips of puppies.

  • DAVID MALAN: There we go.

  • Next time, next time in episode two.

  • So Curl, for those unfamiliar, allows you

  • to connect to a URL generally with a command line client.

  • Or there's actually a library version where you

  • can write code that connects to a URL.

  • And you can use Curl therefore to download content or download HTTP

  • headers.

  • It essentially pretends to be a browser in a headless way.

  • Without a GUI.

  • It just does everything textually.

  • And so in the case of this Cisco router, it

  • seems as though there was indeed a vulnerability in their code,

  • on the routers themselves.

  • Such that you could trick

  • these devices into executing code that they were not supposed to execute.

  • And that, in general, is a bad thing.

  • You don't want a piece of hardware able to execute code

  • that you did not intend.

  • Because, of course, it can maybe do things malicious.

  • It can steal data, write data, read data, delete data, any number of things

  • could be possible.

  • And it indeed seems that this penetration testing

  • team noticed that, well, gee.

  • It seems that Cisco's fix for this problem

  • is just to blacklist a certain user agent, so to speak.

  • And a user agent is a term of art, in HTTP, that refers to a string.

  • A unique string that's passed from client to server that says I

  • am Chrome.

  • Or I am Safari.

  • Or I am Firefox.

  • Or in this case, I am Curl.

  • And this is useful just statistically so that servers can actually

  • keep track of who's using which operating system, which browser,

  • which piece of software, and so forth.

  • But this is entirely the honor system, right.

  • Every HTTP header in a packet from client to server could be forged.

  • You can write code to do this, or you can even

  • run Curl to do this using a command line argument.

  • And so with dash capital A, -A, you can change your user agent string.

  • And so what the RedTeam actually did, with the proof of concept here,

  • is, if you read the advisory, you'll see that they just

  • changed it from curl with a C to kurl with a K. Just to be cute.

  • [LAUGHTER]

  • That demonstrated that, essentially, the regular expression that Cisco had built

  • into the server software, nginx, was just checking for C-U-R-L.

  • So if you literally pass in anything other than C-U-R-L,

  • for instance K-U-R-L, that request actually gets through.

  • They didn't actually fix the underlying bug.
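To make that concrete, here is a toy reconstruction in Python of that kind of user-agent blacklist and why it fails; the pattern below is purely illustrative, not Cisco's actual rule.

```python
# Toy reconstruction of a user-agent blacklist and why it is not a real fix.
import re

BLOCKED = re.compile(r"curl", re.IGNORECASE)  # illustrative pattern, not Cisco's

def allowed(user_agent):
    # Requests are let through whenever the blocked pattern does not appear.
    return BLOCKED.search(user_agent) is None

print(allowed("curl/7.64.1"))   # False: the "official" curl string is blocked
print(allowed("Kurl/7.64.1"))   # True: one changed letter sails right through

# The header is entirely client-controlled, e.g. with curl itself:
#   curl -A "Kurl/7.64.1" http://router.example/vulnerable-endpoint
```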

  • COLTON OGDEN: Yeah.

  • Such a heavy-handed, but also such a simple, naive approach too.

  • DAVID MALAN: It really is.

  • In here too.

  • Yeah, I don't necessarily fault the developer,

  • because this is a mistake I might have made.

  • I might still make perhaps.

  • A CS50 student might make, certainly shortly

  • after graduating from the course.

  • You need to be taught these things.

  • You need to realize these things from news articles, or discussions thereof,

  • but someone should have caught this.

  • This also, not only being a technical mistake, is a procedural mistake.

  • How did this slip through?

  • Someone hopefully, and yet tragically, reviewed this code and said, yes.

  • Ship it.

  • This is OK.

  • And that seems to be where the threat really is.

  • COLTON OGDEN: Yeah it's almost operating under the assumption

  • that user agents are baked in permanently.

  • They're immutable by default. Which clearly is not the case.

  • DAVID MALAN: Well, to be honest.

  • Not to get all lofty, but I'd like to think that in CS50, this

  • is one of the things we do try to do.

  • Not just with this topic, but many others.

  • Where we really try to introduce students to low-level primitives.

  • Case in point, we use C. Which is about as close to the hardware

  • as you can get before you drop down into assembly code and actual machine

  • instructions.

  • And I think via that bottom-up exploration,

  • you begin, with higher probability, hopefully, than otherwise,

  • to think about what the threats might be, right.

  • Even if you don't necessarily know that much about HTTP.

  • You just know that there are these text based messages going

  • back and forth from client to server.

  • At some point, it probably starts to dawn on you.

  • Well, wait a minute.

  • If I can write software that generates requests,

  • maybe I can just forge these requests.

  • And indeed, all I have to do is print out

  • this string instead of this other one.

  • So we can't possibly in CS50, or any course,

  • teach everyone something about everything.

  • So if you instead focus more on the primitives,

  • the underlying building blocks:

  • what is HTTP, what is a header, what is a TCP client,

  • then they can begin to assemble for themselves,

  • critically, what is actually possible and what those threats actually are.
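In that spirit, here is a short, hypothetical Python sketch that speaks HTTP "by hand" over a TCP socket, showing that every header, User-Agent included, is just text the client chooses to send; example.com is used purely for illustration.

```python
# Speaking HTTP "by hand": a TCP client sending a hand-written request.
import socket

request = (
    "GET / HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "User-Agent: AnythingICareToClaim/1.0\r\n"  # headers are just text we choose
    "Connection: close\r\n"
    "\r\n"
)

with socket.create_connection(("example.com", 80)) as sock:
    sock.sendall(request.encode("ascii"))
    response = b""
    while chunk := sock.recv(4096):
        response += chunk

# Print just the start of the reply: the status line and response headers.
print(response.decode("ascii", errors="replace")[:500])
```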

  • COLTON OGDEN: Yeah.

  • Pretty amusing.

  • Pretty depressing altogether.

  • Seeing all of these things.

  • DAVID MALAN: And these are the things we're seeing.

  • Right.

  • This is thanks to companies and people, like RedTeam, which

  • actually noticed something like this.

  • Can you even imagine how omnipresent these mistakes are

  • that we just haven't discovered yet.

  • COLTON OGDEN: Yeah.

  • Thank goodness for folks like RedTeam and folks

  • that are paying attention to the validity of things like the Office

  • Depot scans.

  • And question why Facebook is asking for their email password,

  • and bring it to the public consciousness.

  • Because otherwise this would be a little bit trickier.

  • A lot of bad things might be happening underneath us,

  • and we would be none the wiser about it.

  • DAVID MALAN: Yeah.

  • Well, and there's a term of art in tech.

  • White hats, or ethical hackers, so to speak.

  • People whose job it is, or mission in life,

  • is to actually think like an adversary.

  • Or sort of pretend to be the bad guy, at least in your mind,

  • but to use those powers for good.

  • And to actually build a business or reputation around

  • discovering these kinds of things.

  • And honestly, it's taken the industry some time

  • to get comfortable with this idea.

  • Especially with outsiders.

  • There's another term, bounties, for instance.

  • And some companies, not all, will actually

  • offer you a few hundred dollars, few thousand dollars,

  • if you identify in a responsible way some security hole in their software.

  • Report it via the appropriate channels.

  • Not Twitter, but via email or some web form.

  • And allow them a reasonable amount of time to fix the problem.

  • And I think a lot of companies might be scared to invite that kind of attention

  • on their code.

  • But it probably is a net positive, and you

  • get a lot of smart people trying to help you help yourself.

  • The worrisome part is that if you just leave it to the bad guys,

  • they're not going to be telling you when they find these mistakes.

  • They're just going to be attacking your systems and your product.

  • COLTON OGDEN: And it's going to be hard and this is something

  • that I know you've mentioned many times.

  • But it is practically infinitely easier being an attacker

  • than it is being a defender in the computer science realm.

  • DAVID MALAN: Yeah.

  • We are on the losing end of this against the adversaries.

  • We, if I may be so bold and to call us the good guys, we have to be perfect.

  • We have to find and fix every possible mistake in our code.

  • Every possible exploit.

  • Fix every possible bug.

  • But all the adversaries need to find is just one oversight, one mistake.

  • It's like your house: if you've

  • got all the doors locked and deadbolted,

  • and you've got the alarm system on,

  • but you have got one window open that the person can slip through,

  • none of the other stuff matters.

  • It all reduces to the weakest link, so to speak.

  • COLTON OGDEN: It's so brutally unfair.

  • DAVID MALAN: Yeah.

  • But I think that's why talking about this

  • and emphasizing themes in computer science classes,

  • like that of trade offs and that of security itself,

  • just gets people thinking more consciously.

  • Because at the end of the day, it's just a cost, right.

  • You could put bars on your windows, which

  • would partly mitigate that threat, but there's a physical cost there.

  • There's an aesthetic cost, and so at some point

  • you just have to draw the line.

  • But security really is all about raising the cost to the adversary.

  • Either financially or time-wise, resource-wise.

  • And just making it worth their while no longer to attack you.

  • I think there's an expression along the lines of,

  • security is all about getting the adversary to attack someone else.

  • Right.

  • Because if the price they must pay to attack you is too high,

  • they're indeed going to turn their attention elsewhere.

  • And so that's perhaps a bit of a perverse way of thinking about it,

  • but that's how a logical adversary would presumably think about it.

  • COLTON OGDEN: Yeah, even Nick and I were talking when

  • we did a stream on Kali Linux recently.

  • Kali Linux is a version of Linux that

  • has some tools built into it to help folks get into penetration testing.

  • And he was saying that one of the biggest ways, easiest ways,

  • to get adversaries to stop messing with you is just

  • choose extremely secure passwords.

  • And along those lines, generally speaking, just adopt things that are as secure

  • as you can.

  • Don't do a lot of things that are very easy to guess, basically.
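Along those lines, here is a small, hedged example of generating a hard-to-guess password in Python; the length and character set are just reasonable illustrative choices, not a universal recommendation.

```python
# Generating a long, random password instead of something guessable.
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def random_password(length=24):  # length of 24 is an illustrative choice
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

print(random_password())
```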

  • DAVID MALAN: Yeah.

  • No, absolutely.

  • Right.

  • Because if you're running an attack script on your server and my server,

  • and I have the longer more secure password.

  • The adversary is going to get into your server and not mine,

  • and then start focusing on you.

  • So, woof!

  • I escaped detection there.

  • COLTON OGDEN: You've deflected the burden.

  • And then ideally, in a world where your neighbor also does

  • the same thing and so on and so forth.

  • In a theoretical model, you don't have attackers doing nearly as much damage

  • as they are now.

  • Because they just can't find anybody to attack.

  • DAVID MALAN: Right.

  • No.

  • So I think, ideally, you want societally to sort of raise the cost all around.

  • And help each other patch these holes, because it

  • does no one any good if attacks are being

  • waged from other people's servers.

  • Case in point, worse than a denial of service attack, or DOS,

  • is typically a DDOS, a distributed denial of service attack,

  • which is the act of an adversary taking over somehow multiple machines

  • and using those multiple machines to attack

  • one or some number of other parties.

  • So it does not behoove me to allow your house to be broken into, so to speak,

  • or your server to be compromised, because I could then

  • be the next victim.

  • Because your machine is now part of the threat.

  • COLTON OGDEN: And you did a lot of this in your PhD, right, with botnets.

  • Right?

  • DAVID MALAN: Yeah.

  • A botnet is a collection of servers that has somehow

  • been taken over by an adversary.

  • By some virus or worm running on those systems.

  • And a botnet is really just a silly term describing a whole collection

  • of servers that have been commandeered by some adversary via some software.

  • And it's among the scarier threats because via software commands,

  • that botnet can do anything that a piece of software can do,

  • including attack other systems.

  • COLTON OGDEN: Yeah.

  • It's pretty frightening.

  • We will try and maybe segue into something slightly less frightening.

  • DAVID MALAN: How can we play the puppy sound now?

  • COLTON OGDEN: This could still have a slightly negative connotation

  • depending on how you look at it.

  • But I was doing a little bit of research within the last couple of weeks.

  • Google has launched a sort of announcement, or preview of,

  • a feature called AMP,

  • Accelerated Mobile Pages, for Gmail.

  • Whereby, within an email, you can sort of

  • embed these cool, web-page-looking, functionality-driven mini

  • pages, I guess.

  • DAVID MALAN: Yeah.

  • Much more interactive snippets of HTML-like code

  • in emails, especially, that make them, indeed, more interactive, clickable,

  • more visually interesting.

  • You know, I kind of have mixed feelings on this.

  • Because on the one hand, you're just describing the web.

  • And we can certainly just have users click on a link in an email

  • and open up a full fledged web browser.

  • And with it all of the protections that are in place from the browsers

  • as to what JavaScript, for instance, can and cannot do.

  • What HTML and CSS can and cannot do.

  • And AMP is proposing to add some additional features, essentially

  • by way of additional HTML attributes, and properties, and so forth.

  • Other features into products like Gmail.

  • And so, it's still kind of appealing to me.

  • You even use the word cool, because it does

  • make a static tool, that's been static for a very

  • long time, a little more interactive.

  • And I kind of am willing to accept that, because of the coolness, as you say.

  • But really the additional features you can get.

  • You can get carousels of images within an email

  • so that you can stay in the email.

  • See a bunch of images.

  • Maybe it's an album that someone posted.

  • Maybe worse.

  • It's a series of ads or products that you want to flip through,

  • but it makes them more interactive.

  • Which, keeping me in situ, in the same place, is probably a compelling thing.

  • We use Slack, for instance, within CS50 to communicate.

  • Which is a chat based mechanism.

  • And Slack has done an amazing job at adding

  • integrations, or supporting integrations, via their API.

  • Where you can have other products tap into Slack.

  • So that you can stay in the Slack environment

  • and execute commands that somehow influence those other products.

  • And it's really convenient, honestly, to for instance, get a Slack message

  • and be able to respond in that chat window,

  • but have it posted to some other web server or some other tool.

  • And so even I appreciate it.

  • It's such a marginal difference.

  • I can absolutely just click a link, go to a web page, and do the same thing.

  • But I'm doing a lot per day.

  • We're not getting any younger.

  • There's only a finite number of seconds in the day.

  • Every few seconds I can avoid spending on tedious work

  • is probably some compelling time saved.

  • COLTON OGDEN: Yeah.

  • I think altogether I do agree that it is a cool, dynamic addition to Gmail.

  • The one thing that I was thinking about as I looked at it

  • was, would this make it easier for phishing attacks?

  • DAVID MALAN: Yeah.

  • Probably.

  • Let's just assume, yes.

  • With every good thing comes some bad.

  • And, yeah.

  • Because one of the features, too, besides carousels of images,

  • you can actually embed forms, for instance, in the email.

  • And allow the user to submit those forms within the email itself.

  • And, yeah.

  • I'm guessing that's going to be the first threat we

  • see is someone tricking users into actually typing information

  • into those websites.

  • COLTON OGDEN: Yeah.

  • I was just visualizing in my head an email from some malicious actor.

  • But it's the Twitter login page.

  • Oh, log into your Twitter to verify your account or whatnot.

  • And that leading to some bad guy dot com or whatever.

  • It seems like now it's all the easier for this because of that embedded form

  • functionality.

  • DAVID MALAN: Absolutely.

  • No, I agree.

  • No, and I think they could easily pretend

  • to be some bank that they aren't, and actually then trick you

  • into typing in your credentials, your account number, or some information

  • like that.

  • COLTON OGDEN: Sure.

  • One of the last things that I think we might want to talk about

  • before we wrap up is within the last few years it's been common, I think,

  • to start seeing the big fish open source a lot of their technologies.

  • Facebook open sourced React.

  • Microsoft has open sourced VS Code, which is now arguably, I think,

  • per number of stars, the top text editor on GitHub.

  • And now Uber recently open sourced their resource scheduler, Peloton.

  • And we won't go necessarily into the specifics on Peloton.

  • But I wanted to get your thoughts on, are these big companies

  • necessarily obligated to do this?

  • Is this a good move on their part?

  • How do you feel about this?

  • DAVID MALAN: Open sourcing their software?

  • COLTON OGDEN: Open sourcing some of their tools at least.

  • DAVID MALAN: I am a fan of open source software.

  • It is a wonderful entry point for aspiring programmers

  • to cut their teeth on a product to which they have access.

  • They can contribute back and kind of put a toe in the water.

  • And learn more about real world software,

  • especially if they're not quite ready or don't yet have access

  • to like a full fledged developer job.

  • Free, I think, is a very compelling thing

  • and allows so many more people to solve problems

  • using some common functionality.

  • Libraries, frameworks, and so forth.

  • You mentioned React, for instance.

  • It's a wonderful set of shoulders you can stand on

  • to do something even cooler and more impactful yourself.

  • And I think too it's a shame that we have so many companies out there,

  • in general, writing software that isn't necessarily

  • juicy intellectual property.

  • It's not the core of their business, but it's a commodity type problem

  • that others might benefit from.

  • For instance, mapping tools from someone like Google.

  • And some of the work, certainly, that Uber

  • is now doing when it comes to their services.

  • And so there's sort of a social good to open sourcing that.

  • Because we humans have a finite amount of time on this earth.

  • We might as well stand on each other's shoulders as best we can.

  • And move ourselves forward, and hope that via karma, and sort of collaboration,

  • that will benefit us in turn too,

  • by our having initiated the same.

  • So I think in principle it's a good thing.

  • With that said, I think there are some costs.

  • Even you and I, when we've written code, have been embarrassed by it

  • sometimes, if I might say.

  • And I've written things that I don't really

  • want open sourced, because it's going to take me a non-trivial amount of time

  • to go clean it up, and comment it, and really feel proud of it.

  • Such that I'd be comfy saying, hello, world, I wrote this code.

  • So there's that price.

  • And a lot of companies might think,

  • that's not our business.

  • That doesn't generate revenue.

  • That's not a good use of our limited human time.

  • So I can appreciate the tension, but I think finding

  • that balance is pretty compelling.

  • COLTON OGDEN: Yeah, I agree.

  • I think so.

  • And given that so many large actors have been open sourcing.

  • I guess maybe companies may start to get a little bit of pressure to think,

  • oh all these other companies have all these awesome projects out there

  • that people are using, and seeing, and contributing to,

  • but we don't have anything.

  • Do you think that's something that companies

  • should, and will, worry about?

  • DAVID MALAN: I don't know, to be honest.

  • It's a worthy experiment to see.

  • I think that it's a potential recruiting tool, right.

  • If you gain exposure to some company because you

  • are using, or looking at, or contributing to their software.

  • It feels like a very natural next step to aspire

  • to have a part time or full time job with them.

  • And so honestly, strategically, it might help you identify amazing developers,

  • because you have these volunteers essentially initially contributing

  • freely to your product via open source.

  • Whether it's on GitHub or somewhere else.

  • Submitting pull requests, and participating in issues,

  • and reporting bugs.

  • And you kind of get to know someone and then can very comfortably say,

  • you know what.

  • Why don't you come onto our side of the fence and do this full time?

  • So I don't know if that's a theoretical upside, or an actual one,

  • but it certainly feels worth trying on some scale.

  • COLTON OGDEN: And that's been actually achieved at CS50 too, right.

  • Folks like Kareem, and Chad, and other folks that have worked with us.

  • DAVID MALAN: Yeah.

  • To be honest, and that was actually very organic.

  • It wasn't part of some overall clever strategy, I'll admit.

  • We discovered Chad Sharp because he was submitting pull requests and opening

  • issues on some of our open source libraries.

  • And Kareem, of course, was contributing so actively in CS50's Facebook group

  • and later to software development.

  • And it's a great way to get to know someone

  • in a way that's not a more traditional 30 minute

  • interview where everyone's trying to impress the other person.

  • You don't really have a sense of what's it

  • going to be like to work with this person on a project.

  • Open source software makes it very easy to get to know, and become friendly

  • with people, and technically collaborate with people in a way

  • that a whiteboard and a conference room don't really allow.

  • COLTON OGDEN: Yeah.

  • It's much more organic.

  • DAVID MALAN: Yeah.

  • Hey, case in point, Colton and I met primarily

  • by playing Scramble with Friends as I recall to--

  • COLTON OGDEN: That's a very professional way to get acquainted.

  • DAVID MALAN: I noticed you're very good with words, and here we are talking.

  • COLTON OGDEN: I don't even know if that's true.

  • DAVID MALAN: OK.

  • Well, I'm trying to say something nice.

  • COLTON OGDEN: I appreciate it.

  • I have fond memories of losing almost every single match of that actually.

  • DAVID MALAN: But you kept trying.

  • And that was what we were looking for.

  • COLTON OGDEN: I think it was the banter.

  • It was the friendly banter.

  • DAVID MALAN: Well, and to be fair too, we

  • got to chatting early on with CS50 because you offered to get

  • involved with the transcriptions.

  • And helping us actually caption things for folks

  • for whom English is a second language, or who

  • would need it for accessibility's sake.

  • So that was a very worthy contribution as well early on.

  • COLTON OGDEN: It's been fun.

  • It's been good.

  • And we've only done more and more in that domain.

  • Which is great.

  • DAVID MALAN: Case in point.

  • I have not played Scramble with Friends for years.

  • COLTON OGDEN: We haven't had time.

  • Well, I think that's a good amount of topics to cover, actually.

  • Are there any takeaways that you'd like to say for the folks listening in?

  • DAVID MALAN: Be afraid.

  • Be very afraid.

  • COLTON OGDEN: People are going to stop tuning in.

  • They're going to get too depressed.

  • DAVID MALAN: That is true.

  • Go Google some puppies right now if you could.

  • But no, I think the real takeaway is to just be more thoughtful and deliberate

  • about choices you make.

  • And yes, there is the theoretical risk that data

  • you're inputting into a website like Facebook might be misused.

  • But does the value of your gaining by using that tool perhaps outweigh that?

  • And as we discussed earlier, you want to make

  • sure that you're not making all of those locally optimal decisions

  • again, and again, and again.

  • Such that weeks or months later, you look back and realize, wow,

  • I globally made a very poor decision, because now this website

  • knows everything about me.

  • So I think just making very conscious choices along the way

  • and realizing what prices you're paying, and what benefits you're getting,

  • is the real takeaway.

  • Because these threats have been omnipresent.

  • Not just in tech, but in the physical real world as well.

  • And so I think being sensitized to them is the real takeaway.

  • COLTON OGDEN: Getting sensitized and getting educated too, probably.

  • DAVID MALAN: Absolutely.

  • COLTON OGDEN: Tune into places like Hacker News.

  • Tune into the CS50 podcast.

  • This was episode 1.

  • It was a pleasure.

  • And I look forward to doing the next episode with you.

  • DAVID MALAN: Chat with you all soon.

