Mean of sum and difference of random variables | Random variables | AP Statistics | Khan Academy
Let's say that I have a random variable X, which is equal to the number of dogs that I see in a day. Random variable Y is equal to the number of cats that I see in a day. Let's say I also know what the mean of each of these random variables is, the expected value.
So, the expected value of X, which I could also denote as the mean of our random variable X, is three: let's say I expect to see three dogs a day. Similarly, for the cats, the expected value of Y, which I could also denote as the mean of Y, is going to be four: just for the sake of argument, let's say I expect to see four cats a day.
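Writing those two given means in symbols:

$$\mu_X = E[X] = 3, \qquad \mu_Y = E[Y] = 4$$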
In previous videos, we defined how to take the mean of a random variable, or the expected value of a random variable. What we're going to think about now is what would be the expected value of X plus Y, or another way of saying that, the mean of the sum of these two random variables.
Well, it turns out—and I'm not proving it just yet—that the mean of the sum of random variables is equal to the sum of the means. So, this is going to be equal to the mean of random variable X plus the mean of random variable Y.
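In symbols, that claim about the sum is:

$$E[X + Y] = \mu_{X+Y} = \mu_X + \mu_Y$$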
In this particular case, if I were to ask, well, what's the expected number of dogs and cats that I would see in a given day? I would add these two means: 3 + 4, which is equal to 7.
Similarly, what about the difference? If I were to ask, how many more cats than dogs would I expect to see in a given day? That is, what would the expected value of Y minus X be? Intuitively, you might say, well, if the expected value of a sum is the sum of the expected values, then the expected value, or the mean, of a difference should be the difference of the means, and that is absolutely true.
So, this is the same thing as the mean of Y minus X, which is going to be equal to the mean of Y minus the mean of X. In this particular case, it would be equal to 4 - 3, which is equal to 1.
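In symbols, the difference rule for this example reads:

$$E[Y - X] = \mu_Y - \mu_X = 4 - 3 = 1$$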
So, another way of thinking about this intuitively is that I would expect to see, on a given day, one more cat than dog. Now, in the example that I've just used, these are discrete random variables: on a given day, I wouldn't see 2.2 dogs or pi dogs. Still, the expected value itself does not have to be a whole number, because it is, of course, an average over many days.
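To see this numerically, here is a minimal simulation sketch of many such days. It assumes, purely for illustration, that the daily counts are Poisson distributed with means 3 and 4; the video doesn't specify any particular distribution, and the rule holds regardless of which one you pick.

```python
# A minimal sketch: simulate many days of dog and cat sightings and check
# that the sample means of the sum and difference come out near
# mu_X + mu_Y and mu_Y - mu_X. Poisson counts are an assumption made
# for illustration only.
import numpy as np

rng = np.random.default_rng(seed=42)
n_days = 100_000

dogs = rng.poisson(lam=3, size=n_days)  # X: dogs seen each day, mean 3
cats = rng.poisson(lam=4, size=n_days)  # Y: cats seen each day, mean 4

print("mean of X:    ", dogs.mean())           # close to 3
print("mean of Y:    ", cats.mean())           # close to 4
print("mean of X + Y:", (dogs + cats).mean())  # close to 3 + 4 = 7
print("mean of Y - X:", (cats - dogs).mean())  # close to 4 - 3 = 1
```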
But this same idea, that the mean of a sum is the sum of the means, and that the mean of a difference of random variables is the difference of the means, holds in general. In a future video, I'll do a proof of this.