Variance of sum and difference of random variables | Random variables | AP Statistics | Khan Academy
So we've defined two random variables here. The first random variable, X, is the weight of the cereal in a random box of our favorite cereal, Matthews. We know a few other things about it. We know the expected value of X; it is equal to 16 ounces. In fact, they tell it to us on the box: net weight, 16 ounces.
Now, when you see that on a cereal box, it doesn't mean that every box is going to be exactly 16 ounces. Remember, you have a discrete number of these flakes in here. They might have slightly different densities, slightly different shapes, depending on how they get packed into this volume. So there is some variation, which we can measure with standard deviation.
So the standard deviation, let's just say for the sake of argument, for the random variable X is 0.8 ounces. Just to build our intuition a little bit later in this video, let's say that this random variable X always stays constrained within a range. If it goes above a certain weight or below a certain weight, then the company that produces it just throws out that box. So let's say that our random variable X is always greater than or equal to 15 ounces and it is always less than or equal to 17 ounces. Just for argument, this will help us build our intuition later on.
Now separately, let's consider a bowl. We're always going to consider the same size bowl; let's call it a four-ounce bowl, because the expected value of Y, the weight of the Matthews when someone fills that bowl, is going to be four ounces.
But once again, there's going to be some variation. It depends on who filled the bowl and how the cereal packed in. Did they shake it while they were filling it? There could be all sorts of things that introduce some variation here. So for the sake of argument, let's say that variation can be measured by a standard deviation of 0.6 ounces.
Let's say whoever the bowl fillers are, they also don't like bowls that are too heavy or too light, so they'll throw those bowls out too. So we can say that the maximum value Y will ever take on is 5 ounces, and the minimum value it could ever take on, let's say, is 3 ounces.
So given all of this information, what I want to do is, let's just say I take a random box of Matthews and I take a random filled bowl, and I want to think about the combined weight in the closed box and the filled bowl. So what I want to think about is really X plus Y. I want to think about the sum of the random variables.
So in previous videos, we already know that the expected value of this is just going to be the sum of the expected values of each of the random variables. So it would be the expected value of X plus the expected value of Y.
So it would be 16 plus 4 ounces. In this case, this would be equal to 20 ounces. But what about the variation? Can we just add up the standard deviations? If I want to figure out the standard deviation of X plus Y, how can I do this?
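Before tackling the variance question, we can sanity-check that expected values add with a quick simulation. This is just a sketch: the normal distributions below are hypothetical, chosen only to match the stated means and standard deviations (the video never specifies the actual shapes of X and Y).

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Hypothetical normal distributions matching the stated parameters
x = rng.normal(16, 0.8, n)  # box weight X: mean 16 oz, sd 0.8 oz
y = rng.normal(4, 0.6, n)   # bowl weight Y: mean 4 oz, sd 0.6 oz

# E[X + Y] = E[X] + E[Y] = 16 + 4 = 20
print((x + y).mean())  # close to 20
```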
Well, it turns out that you can't just add up the standard deviations, but you can add up the variances. So it is the case that the variance of X plus Y is equal to the variance of X plus the variance of Y.
And so this is going to have an X right over here, and then we have plus Y. And actually, this assumes independent random variables. So it assumes X and Y are independent; I'm going to write that in caps. In a future video, I'll give you a hopefully better intuition for why they must be independent in order to make this claim right over here.
I'm not going to prove it in this video, but we can build a little bit of intuition here. Each of these random variables can vary over a range of 2 ounces, and that's true for both of them.
But what about this sum? Well, let me write it this way. So X plus Y: what's the maximum value it could take on? Well, if you get a heavy version of each of these, it's going to be 17 plus 5. So this has to be less than or equal to 22 ounces.
And it's going to be greater than or equal to, well, what's the lightest possible scenario? You could get a 15-ouncer here and a 3-ouncer here, and that is 18 ounces. And so notice, the variation for the sum is larger: this thing can now take on values over a range of 4, while the range for each of these was just 2.
Or another way you could think about it is these upper and lower ends of the range are further from the mean than these upper and lower ends of the range were from their respective means. So hopefully, this gives you an intuition for why this makes sense.
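We can also check the claim that variances add numerically. Again, this is only a sketch: the normal distributions are hypothetical stand-ins matching the stated means and standard deviations, not the actual distributions of box and bowl weights.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Hypothetical distributions matching the stated parameters
x = rng.normal(16, 0.8, n)  # sd 0.8, so Var(X) ≈ 0.64
y = rng.normal(4, 0.6, n)   # sd 0.6, so Var(Y) ≈ 0.36

# For INDEPENDENT X and Y: Var(X + Y) = Var(X) + Var(Y)
print(x.var() + y.var())  # close to 1.0
print((x + y).var())      # also close to 1.0
```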
But let me ask you another question. What if I were to say, what about the variance? What about the variance of X minus Y? What would this be? Would you subtract the variances of each of the random variables here?
Well, let's just do the exact same exercise. Let's take X minus Y, X minus Y, and think about it. What would be the lowest value that X minus Y could take on? Well, the lowest value is if you have a low X and you have a high Y. So it would be 15 minus 5. So this would be 10 right over here; that would be the lowest value that you could take on.
And what would be the highest value? Well, the highest value is if you have a high X and a low Y. So 17 minus 3 is 14. So notice, just as we saw in the case of the sum, even for the difference your variability seems to have increased.
The extremes are still further from the mean of the difference. The mean of the difference would be 16 minus 4, which is 12, and these extreme values are each 2 away from 12. This is just to give us an intuition.
Once again, it's not a rigorous proof. So it actually turns out that in either case, when you're taking the variance of X plus Y or X minus Y, you would sum the variances, assuming X and Y are independent variables.
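A simulation makes the point for the difference as well. As before, the normal distributions here are hypothetical, chosen only to match the stated means and standard deviations.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x = rng.normal(16, 0.8, n)  # hypothetical: mean 16, sd 0.8
y = rng.normal(4, 0.6, n)   # hypothetical: mean 4, sd 0.6

# Even for the difference, the variances ADD (given independence):
print((x - y).var())      # close to 0.64 + 0.36 = 1.0
# Subtracting the variances gives the WRONG answer:
print(x.var() - y.var())  # close to 0.28, not the variance of X - Y
```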
Now, with that out of the way, let's just calculate the standard deviation of X plus Y. Well, we know this. Let me just write it using this sigma notation. So another way of writing the variance of X plus Y is to write the standard deviation of X plus Y squared; that's going to be equal to the variance of X plus the variance of Y.
Now, what is the variance of X? It's the standard deviation of X squared: 0.8 squared, which is 0.64. The standard deviation of Y is 0.6; you square it to get the variance, which is 0.36. You add these two up, and you get 1.
So the variance of the sum is 1, and if you take the square root of both sides, the standard deviation of the sum is also going to be 1. That just happened to work out so neatly here because the square root of 1 is, well, 1.
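The arithmetic from this step, spelled out:

```python
# Variances come from squaring the standard deviations
var_x = 0.8 ** 2          # variance of X = 0.64
var_y = 0.6 ** 2          # variance of Y = 0.36
var_sum = var_x + var_y   # variance of X + Y = 1.0 (independence assumed)
sd_sum = var_sum ** 0.5   # standard deviation of X + Y = 1.0
print(var_sum, sd_sum)
```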
So this hopefully builds your intuition: whether we are adding or subtracting two independent random variables, the variance of the sum or of the difference increases; the variability goes up. In the next video, we'll go into some depth to build an intuition for why independence is an important condition for making this claim.