# Law of Large Numbers

• Let's learn a little bit about the law of large numbers, which

• is on many levels, one of the most intuitive laws in

• mathematics and in probability theory.

• But because it's so applicable to so many things, it's often a

• misused law or sometimes, slightly misunderstood.

• So just to be a little bit formal in our mathematics, let

• me define it for you first, and then we'll talk a little bit about the intuition.

• So let's say I have a random variable, X.

• And we know its expected value or its population mean.

• The law of large numbers just says that if we take a sample

• of n observations of our random variable, and if we were

• to average all of those observations-- and let me

• define another variable.

• Let's call that x sub n with a line on top of it.

• This is the mean of n observations of our

• random variable.

• So literally, this is my first observation.

• So you can kind of say I run the experiment once and I get

• this observation and I run it again, I get that observation.

• And I keep running it n times and then I divide by my

• number of observations.

• So this is my sample mean.

• This is the mean of all the observations I've made.

• The law of large numbers just tells us that my sample mean

• will approach my expected value of the random variable.

• Or I could also write it as my sample mean will approach my

• population mean for n approaching infinity.

• And I'll be a little informal about what "approach" or "convergence" means here.

• But I think you have the general intuitive sense that if

• I take a large enough sample here that I'm going to end up

• getting the expected value of the population as a whole.
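That convergence is easy to see in a short simulation. Here's a minimal sketch (my own example, not from the video) using a fair six-sided die, whose expected value is 3.5, showing the sample mean of n observations settling toward that value as n grows:

```python
# Law of large numbers sketch: the sample mean of n die rolls
# approaches the expected value of 3.5 as n gets large.
import random

random.seed(42)  # fixed seed so the run is reproducible

def sample_mean(n):
    """Average of n observations of the random variable (one die roll each)."""
    return sum(random.randint(1, 6) for _ in range(n)) / n

expected_value = 3.5  # E[X] = (1+2+3+4+5+6)/6

for n in (10, 1_000, 100_000):
    print(n, sample_mean(n))  # drifts toward 3.5 as n grows
```

With a small n the sample mean can be well off; by n = 100,000 it sits very close to 3.5.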

• And I think to a lot of us that's kind of intuitive.

• That if I do enough trials, then over large samples, the trials

• would kind of give me the numbers that I would expect

• given the expected value and the probability and all that.

• But I think it's often a little bit misunderstood in terms

• of why that happens.

• And before I go into that let me give you

• a particular example.

• The law of large numbers will just tell us that-- let's say I

• have a random variable-- X is equal to the number of heads

• after 100 tosses, or flips, of a fair coin.

• First of all, we know what the expected value of

• this random variable is.

• It's the number of trials times the probability of

• success on any trial: 100 times 0.5.

• So that's equal to 50.
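The n-times-p shortcut can be checked against the full binomial distribution. A small sketch (my own, using exact rational arithmetic) sums k times P(X = k) over all outcomes and gets the same 50:

```python
# Check: for X = number of heads in 100 tosses of a fair coin,
# E[X] = sum over k of k * P(X = k) equals n * p.
from fractions import Fraction
from math import comb

n, total = 100, 2 ** 100  # 2^100 equally likely toss sequences

# P(X = k) = C(n, k) / 2^n; Fraction keeps the sum exact
expected = sum(Fraction(k * comb(n, k), total) for k in range(n + 1))

print(expected)            # 50
print(n * Fraction(1, 2))  # n * p, also 50
```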

• So the law of large numbers just says if I were to take a

• sample or if I were to average the sample of a bunch of these

• trials, so you know, I get-- my first time I run this trial I

• flip 100 coins or have 100 coins in a shoe box and I shake

• the shoe box and I count the number of heads, and I get 55.

• So that would be X1.

• Then I shake the box again and I get 65.

• Then I shake the box again and I get 45.

• And I do this n times and then I divide it by the number

• of times I did it.

• The law of large numbers just tells us that this average--

• the average of all of my observations-- is going

• to converge to 50 as n approaches infinity.

• Or for n approaching infinity.
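The shoe-box experiment is simple to simulate. A sketch (my own, assuming a fair coin): each trial counts heads in 100 flips, and the average of many trial results lands very near 50:

```python
# Shoe-box sketch: each trial = count of heads in 100 fair-coin flips.
# The average of many trials converges toward the expected value of 50.
import random

random.seed(7)  # reproducible

def shake_box():
    """One trial: flip 100 fair coins, count heads."""
    return sum(random.random() < 0.5 for _ in range(100))

trials = [shake_box() for _ in range(10_000)]
running_avg = sum(trials) / len(trials)
print(running_avg)  # close to 50
```

Individual trials still come out 45, 55, 65 and so on; it's only the average across trials that settles down.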

• And I want to talk a little bit about why this happens

• or intuitively why this is.

• A lot of people kind of feel that, oh, this means that if,

• after 100 trials, I'm above the average, then somehow

• the laws of probability are going to give me more heads

• or fewer heads to kind of make up the difference.

• That's not quite what's going to happen.

• That's often called the gambler's fallacy.

• Let me differentiate.

• And I'll use this example.

• So let's say-- let me make a graph.

• And I'll switch colors.

• This is n, my x-axis is n.

• This is the number of trials I take.

• And my y-axis, let me make that the sample mean.

• And we know what the expected value is, we know the expected

• value of this random variable is 50.

• Let me draw that here.

• This is 50.

• So just going to the example I did.

• So when n is equal to 1-- let me just plot it here.

• So my first trial I got 55 and so that was my average.

• I only had one data point.

• Then after two trials, let's see, then I have 65.

• And so my average is going to be 65 plus 55 divided by 2.

• which is 60.

• So then my average went up a little bit.

• Then I had a 45, which will bring my average

• down a little bit.

• I won't plot a 45 here.

• Now I have to average all of these out.

• Let me actually just compute the number

• so you get the point.

• So it's 55 plus 65, which is 120, plus 45, which is 165.

• Divided by 3, that's 55.

• So the average goes back down to 55.
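The running averages traced on the graph can be written as a tiny sketch: after each new trial result, the sample mean is just the total so far divided by the count.

```python
# Running sample mean after each observation: total so far / count so far.
def running_means(observations):
    """Sample mean after each observation."""
    means, total = [], 0
    for i, x in enumerate(observations, start=1):
        total += x
        means.append(total / i)
    return means

print(running_means([55, 65, 45]))  # [55.0, 60.0, 55.0]
```

This reproduces the three points from the example: 55 after one trial, 60 after two, back down to 55 after three.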

• And we could keep doing these trials.

• So you might look at this and say, OK, after we've done

• 3 trials, our average is there.

• So a lot of people think that somehow the gods of probability

• are going to make it more likely that we get fewer heads.

• That somehow the next couple of trials are going to have to

• be down here in order to bring our average down.

• And that's not necessarily the case.

• Going forward the probabilities are always the same.

• The probabilities are always 50% that I'm going to get heads.

• It's not like if I had a bunch of heads to start off with or

• more than I would have expected to start off with, that all of

• a sudden things would be made up and I would get more tails.

• That would be the gambler's fallacy.

• That if you have a long streak of heads or you have a

• disproportionate number of heads, that at some point

• you're going to have a higher likelihood of having a

• disproportionate number of tails.

• And that's not quite true.
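The independence claim can be tested directly. A sketch (my own) that simulates a long run of fair-coin flips and looks only at flips immediately following five heads in a row: the next flip is still heads about half the time.

```python
# Gambler's fallacy check: after a streak of five heads, the next flip
# of a fair coin is still heads about half the time -- past flips
# don't shift future probabilities.
import random

random.seed(1)  # reproducible
flips = [random.random() < 0.5 for _ in range(1_000_000)]

# Every flip that immediately follows five heads in a row
after_streak = [flips[i] for i in range(5, len(flips)) if all(flips[i - 5:i])]

freq = sum(after_streak) / len(after_streak)
print(round(freq, 3))  # close to 0.5, not "due" for tails
```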

• What the law of large numbers tells us is that it doesn't

• care-- let's say after some finite number of trials your

• average actually-- it's a low probability of this happening,

• but let's say your average is actually up here, at 70.

• You're like, wow, we really diverged a good bit from

• the expected value.

• But what the law of large numbers says, well, I don't

• care how many trials this is.

• We have an infinite number of trials left.

• And the expected value for that infinite number of trials,

• especially in this type of situation is going to be this.

• So when you average a finite number of trials that came out to

• some high number with an infinite number of trials that's going to

• converge to this, then over time you're going to converge back

• to the expected value.
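This dilution argument is just a weighted average. A sketch (my own numbers, matching the example above): suppose the first 100 trials average 70, while every later trial has expected value 50. The overall mean drifts back toward 50 as more trials come in, with no below-average trials needed to "make up" the difference.

```python
# Dilution sketch: a finite stretch averaging 70 gets swamped by
# an ever-growing number of later trials averaging 50.
def overall_mean(m, early_trials=100, early_avg=70, later_avg=50):
    """Mean of early_trials results averaging early_avg plus m results averaging later_avg."""
    return (early_trials * early_avg + m * later_avg) / (early_trials + m)

for m in (0, 1_000, 1_000_000):
    print(m, overall_mean(m))  # 70 at first, then closer and closer to 50
```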

• And that was a very informal way of describing it, but

• that's what the law of large numbers tells you.

• And it's an important thing.

• It's not telling you that if you get a bunch of heads that

• somehow the probability of getting tails is going

• to increase to kind of make up for the heads.

• What it's telling you is that no matter what happened

• over a finite number of trials, no matter what the average is

• over a finite number of trials, you have an infinite

• number of trials left.

• And if you do enough of them it's going to converge back to the expected value.

• And this is an important thing to think about.

• And this is used in practice every day with the lottery and

• with casinos because they know that if you do large enough

• samples-- and we could even calculate-- if you do large

• enough samples, what's the probability that things

• deviate significantly?

• But casinos and the lottery every day operate on this

• principle that if you take enough people-- sure, in the

• short-term or with a few samples, a couple people

• might beat the house.

• But over the long-term the house is always going to win

• because of the parameters of the games that they're

• making you play.
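The house-edge idea can be sketched with a hypothetical game (my own example, not from the video): an even-money bet that wins with probability 18/38, roughly the red/black bet in American roulette. Its expected value is -2/38 per unit bet, and over many bets the average player outcome converges to that small loss.

```python
# House-edge sketch: an even-money bet winning with probability 18/38
# has expected value -2/38 per bet; many bets converge to that loss.
import random

random.seed(3)  # reproducible
P_WIN = 18 / 38

def play(n_bets):
    """Net result of n one-unit even-money bets."""
    return sum(1 if random.random() < P_WIN else -1 for _ in range(n_bets))

n = 1_000_000
avg = play(n) / n
print(avg)  # near the house edge of -2/38, about -0.0526
```

A few players can come out ahead over a short run, but by the law of large numbers the average across many bets reliably lands near the negative expected value.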

• Anyway, this is an important thing in probability and I

• think it's fairly intuitive.

• Although, sometimes when you see it formally explained like

• this, with the random variables and all that, it's a little

• bit confusing.

• All it's saying is that as you take more and more samples, the

• average of that sample is going to approximate the

• true average.

• Or I should be a little bit more particular.

• The mean of your sample is going to converge to the true

• mean of the population or to the expected value of

• the random variable.

• Anyway, see you in the next video.
