In our previous posts, we've talked about the basic concepts of probability and some fundamental facts about probabilities. Here, we'll show how to calculate the probability of a single event when all the outcomes are equally likely. This is, in a sense, the simplest case that we will cover, and it is crucial for everything we'll do later (e.g. in finding the probability of two events occurring).
Suppose we flip a fair coin. What is the probability we get heads? Intuitively, the answer should be 1/2. And that's exactly what the following rule would say:
Probability of Equally Likely Outcomes:
If you have n possible outcomes, all of which are equally likely, then the probability of any particular outcome occurring is 1/n.
So when we flip a fair coin, there are 2 possible outcomes (heads and tails). So n = 2 and the probability of one outcome (e.g. heads) occurring is 1/n = 1/2. And if we roll a six-sided die, there are 6 possible outcomes. So the probability of any particular outcome (e.g. rolling a 4) is 1/6. And if we held a raffle where there were 109 different entrants, the probability of any one of them winning would be 1/109.
Note that this rule only applies when all the outcomes are equally likely. In most GRE problems, the outcomes will be equally likely, and the question will signal that by saying that the outcome is "random" or that the outcomes are "equally likely." So, the question might say things like: "a name is chosen at random" or that "each outcome is equally likely." When the outcomes are not equally likely, all bets are off, and you will have to be more careful in how you approach the problem.
Now, we want to find the probability of some event occurring. Suppose I am going to roll a six-sided die, numbered 1 through 6. What is the probability that I get an even number? To calculate this, we use the following rule:
Probability of Single Events (for equally likely outcomes)
Suppose you have n equally likely outcomes. Then, the probability of some event E occurring is:
P(E) = (# of outcomes where E occurs)/(# of total outcomes)
where the # of total outcomes = n.
So to find the probability of rolling an even number, we need to find the number of outcomes where we roll an even number. If we roll an even number, then we must have rolled a 2, 4, or 6. Then, we divide by the number of total outcomes, in our case 6. So, P(Roll an even number) = 3/6 = 1/2.
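The computation above can be sketched in a few lines of Python (a sketch, not part of the original post; the event names are just labels):

```python
from fractions import Fraction

# All equally likely outcomes of rolling a six-sided die
outcomes = [1, 2, 3, 4, 5, 6]

# Outcomes where the event "roll an even number" occurs
favorable = [n for n in outcomes if n % 2 == 0]

# P(E) = (# of outcomes where E occurs) / (# of total outcomes)
p_even = Fraction(len(favorable), len(outcomes))
print(p_even)  # 1/2
```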
Here’s another example in a similar spirit:
Suppose you randomly choose a number from 1 to 50. What is the probability that you chose a prime number?
There are 50 possible outcomes to your random choice. Now, we need to know: in how many of those outcomes do you choose a prime number? In other words, how many of the numbers 1 to 50 are prime? Here, we just have to go through the list: 2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47. So there are fifteen such prime numbers. Thus: P(Pick a prime number) = 15/50 = 3/10
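If you'd rather not trust a hand count, here is a quick check of the prime tally (a sketch; `is_prime` is a helper written for this example, not something from the post):

```python
from fractions import Fraction

def is_prime(n: int) -> bool:
    # A number is prime if it is at least 2 and has no divisor up to its square root
    if n < 2:
        return False
    return all(n % d != 0 for d in range(2, int(n ** 0.5) + 1))

primes = [n for n in range(1, 51) if is_prime(n)]
print(len(primes))                 # 15
print(Fraction(len(primes), 50))   # 3/10
```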
Now we know how to find the probability of a single event when the possible outcomes are equally likely. Our next step is to learn how to combine these probabilities in order to get the probabilities of more complex events.
130 people line up to buy raffle tickets. Every 10th person who buys a ticket gets a teddy bear as a promotional item. What is the probability that a randomly chosen person from the line will receive a teddy bear?
There are 130 people in line. Since every 10th person gets a teddy bear, we know 13 people got teddy bears. To find the probability that a randomly selected person gets a teddy bear, we just need to calculate: # of people who get teddy bears/# of people total. Thus, we get 13/130 = 1/10.
You have 50 friends. 12 of them have blue hair. You randomly pick one of your friends to invite to dinner tomorrow. What is the probability that you invite a person with blue hair?
We are looking for P(I invite a person with blue hair). Now, if I randomly pick a friend, that means there are 50 possible outcomes: I get dinner with friend 1, with friend 2, ..., with friend 50. Since the question tells us that the outcome is randomly selected, we know that they are all equally likely. So we can apply our rule. In how many of the outcomes will you get dinner with a blue-haired friend? In 12 of them. And we already know how many total outcomes there are. Thus, P(I invite a person with blue hair) = 12/50 = 6/25.
You still have 50 friends. 12 of them still have blue hair. What is the probability that you do not invite a person with blue hair?
This is just like our previous question, except now we want to find P(I do not invite a person with blue hair). If there are 12 people with blue hair, then there are 50 - 12 = 38 people without blue hair. So there are 38 outcomes where I do not invite a person with blue hair. Again, since the outcomes are chosen randomly, we can apply our principle to get P(I do not invite a person with blue hair) = 38/50 = 19/25.
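The two blue-hair questions fit together neatly: the event and its opposite cover every outcome, so their probabilities sum to 1. A quick sketch, using the numbers from the example:

```python
from fractions import Fraction

friends = 50
blue_haired = 12

p_blue = Fraction(blue_haired, friends)                 # 12/50 = 6/25
p_not_blue = Fraction(friends - blue_haired, friends)   # 38/50 = 19/25

# The two events cover every outcome exactly once, so the probabilities sum to 1
print(p_blue, p_not_blue)  # 6/25 19/25
print(p_blue + p_not_blue == 1)  # True
```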
In our previous posts, we talked about the notion of probability, some of its basic features, and how to find the probability of a single event. Here, we will find the probability of a compound event, namely an event where multiple events occur.
For example, we now know that the probability of a fair die landing on 6 is ⅙, while the probability of a fair coin landing heads is ½. But what is the probability that, if I flip a fair coin and roll a fair die, the coin lands heads and the die lands on 6? What is the probability that the coin lands heads or the die lands on 6?
To answer that question, we need to introduce a new idea: independence. In ordinary language, talk of “independence” suggests a rebellious child or ideas of liberty and freedom. But here, we use a different notion of independence: two events are independent if neither event affects the likelihood of the other.
An example will help to illustrate the concept. You roll a fair die and then flip a fair coin. We know that, ordinarily, a fair coin has a ½ chance of coming up heads. But now, suppose you knew that the die came up 6. Now, what is the probability that the coin came up heads? Clearly, the answer should remain: ½. The fact that the die came up 6 has nothing to do with the coin! We say that the two events are independent of one another. More precisely, we say:
Definition: Two events, A and B, are independent of one another if:
(i) A occurring does not affect the likelihood of B occurring, and
(ii) B occurring does not affect the likelihood of A occurring
For the purposes of the GRE, it will generally be clear when two events are independent. Here are some standard examples of independence:
- You flip a coin and roll a die. Whether you get heads on the coin ( = Event A) and whether you get 6 on the die ( = Event B) are independent.
- You draw a marble from a bag, replace that marble, and then draw a second marble. Whether the first marble is green ( = Event A) is independent of whether the second marble is purple ( = Event B). Similarly, whether the first marble is green is independent of whether the second marble is green.
Now consider the following case:
A bag contains 10 purple marbles and 7 green marbles. You will randomly draw one marble from the bag and then, without returning the first marble to the bag, you draw a second marble from the bag. Is the event of getting a green marble first independent of the event of getting a purple marble second?
No. This is because whether you get a green marble first or not can actually affect your chances of getting a purple marble second, since you do not return the first marble to the bag. Suppose you draw a green marble first. Then, the chance you draw a purple marble second is: (number of purple marbles remaining)/(total number of marbles in bag) = 10/16. But if you did not draw a green marble first, then you must have drawn a purple marble. So then, the probability that you draw a purple marble second is: 9/16.
Here is a more intuitive way to put the same point: suppose your first marble is not green. Then it must be purple (since the bag just has purple and green marbles). That’s one fewer purple marble for you to draw next turn!
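We can verify those two conditional chances by enumerating every ordered pair of draws (a sketch; the `P`/`G` labels are just for this example):

```python
from fractions import Fraction
from itertools import permutations

# 10 purple (P) and 7 green (G) marbles; draw two without replacement
bag = ['P'] * 10 + ['G'] * 7
draws = list(permutations(range(len(bag)), 2))  # all ordered pairs of distinct marbles

green_first = [d for d in draws if bag[d[0]] == 'G']
purple_first = [d for d in draws if bag[d[0]] == 'P']

# Chance the second marble is purple, depending on what came first
p_purple_after_green = Fraction(sum(bag[d[1]] == 'P' for d in green_first), len(green_first))
p_purple_after_purple = Fraction(sum(bag[d[1]] == 'P' for d in purple_first), len(purple_first))

print(p_purple_after_green)   # 5/8, i.e. 10/16
print(p_purple_after_purple)  # 9/16
```

The two values differ, which is exactly what "not independent" means.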
The reason why we care about independence is because independent events allow us to easily calculate the probabilities of compound events. What is a compound event?
Definition: Let A and B represent two events. A compound event E is the event where both A and B occur.
For example, let A = "It rains this afternoon." and B = "The Broncos win their game today." Then, the compound event A and B is the event where "It rains and the Broncos win." It will generally be obvious when two events are independent, and sometimes the question will state that fact explicitly. Other examples of independent events include:
- There is a bag with 10 green marbles and 14 yellow marbles. I pick a marble, look at its color, return it to the bag and pick another marble. The color of the first marble is independent of the color of the second.
- I roll a fair die, record its outcome, and then roll it again. The outcome of the first roll is independent of the outcome of the second roll.
- I randomly draw a person's name for a raffle. Then, I flip a coin. The name I draw is independent of my coin flip.
When events are independent, we can calculate the probability of both events occurring via the following rule:
Probabilities of Compound Events
Let A and B be independent of one another. Then, P(A and B) = P(A)P(B)
Let's see this rule in action:
Suppose I roll a fair six-sided die and flip a fair coin. What is the probability that the coin lands heads and the die lands on six?
Following our above rule, we say A = coin lands heads, and B = die lands on six. Then, we can calculate P(A and B) = P(A)P(B). We already know that P(A) = ½ and P(B) = ⅙. So, P(A and B) = 1/12.
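One way to see that the product rule gives the right answer here is to list the whole sample space of (coin, die) pairs, each equally likely, and count directly (a sketch, not part of the original post):

```python
from fractions import Fraction
from itertools import product

# Sample space: every (coin, die) pair is equally likely
outcomes = list(product(['H', 'T'], [1, 2, 3, 4, 5, 6]))  # 12 outcomes in all

favorable = [(c, d) for c, d in outcomes if c == 'H' and d == 6]
p = Fraction(len(favorable), len(outcomes))
print(p)  # 1/12

# Matches the product rule: P(A and B) = P(A) * P(B)
print(p == Fraction(1, 2) * Fraction(1, 6))  # True
```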
Now, this rule may seem a little odd. Why does it work? If you're the kind of person who needs to get a sense of why something works in order to learn it, see here for an illustration that helps make the rule more intuitive.
Now, here's an application of our rule that uses larger numbers:
You are wondering how poorly your day could go. You know that, at work, one employee (out of 400) will be randomly selected for additional performance reviews. And you know that there is a ⅓ chance that it rains furiously during your commute home. (Of course, whether you are selected or not for the performance review will not affect the weather.) What is the probability that you are picked for the additional performance reviews AND it rains furiously during your commute?
Let A = you are selected for additional performance reviews. Then, P(A) = 1/400 (since one person is randomly selected from a group of 400). Let B = it rains furiously on your way home. The question gives us that P(B) = ⅓. By our rule, P(A and B) = P(A)P(B) = 1/1200. So it is pretty unlikely that your day will be the worst possible.
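The arithmetic with larger numbers is the same multiplication, which exact fractions make painless (a quick sketch using the numbers from the example):

```python
from fractions import Fraction

p_review = Fraction(1, 400)  # one of 400 employees picked at random
p_rain = Fraction(1, 3)      # chance of furious rain on the commute

p_both = p_review * p_rain   # independent events, so multiply
print(p_both)  # 1/1200
```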
In our next post, instead of looking at the probability of A and B occurring, we will look at the probability of A or B occurring.
You roll a fair six-sided die. Is the event of you rolling a multiple of 5 independent of you rolling an even number?
No. Suppose you roll a multiple of 5. Then, you must have rolled 5, and 5 is not an even number! So if one event occurs, the other event is impossible. Now suppose you do not roll a multiple of 5. Then, you must have rolled a 1, 2, 3, 4, or 6. So it is still possible that you rolled an even number, and thus the probability of the second event remains greater than 0. Thus, whether the first event occurs or not affects the probability of the second event. So they are not independent.
You roll a fair six-sided die. Is the event of you rolling a prime number independent of you rolling an even number?
No. Here, the violation is a little trickier. Suppose you roll a prime number. Then, you must have rolled a 2, 3, or 5. Then, the probability of you rolling an even number is 1/3 since, of those three numbers, only 2 is even. Now suppose you did not roll a prime number. Then, you must have rolled a 1, 4, or 6. Then, the probability of you rolling an even number is 2/3 since, of those three numbers, 4 and 6 are even. So whether the first event occurs or not can affect the probability of the second event. So they are not independent.
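The trickier violation can also be checked by computing the two conditional probabilities directly (a sketch; the set names are just labels for this example):

```python
from fractions import Fraction

die = [1, 2, 3, 4, 5, 6]
primes = {2, 3, 5}
evens = {2, 4, 6}

# P(even | rolled a prime): of {2, 3, 5}, only 2 is even
p_even_given_prime = Fraction(len(primes & evens), len(primes))

# P(even | did not roll a prime): of {1, 4, 6}, both 4 and 6 are even
not_primes = [n for n in die if n not in primes]
p_even_given_not_prime = Fraction(sum(n in evens for n in not_primes), len(not_primes))

print(p_even_given_prime)      # 1/3
print(p_even_given_not_prime)  # 2/3
```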
There are two events, A and B, which are independent of each other. P(A) = .2 and P(B) = .5. What is P(A and B)?
We can calculate this via our rule: P(A and B) = P(A)P(B) = .2 * .5 = .1.
You flip a fair coin 7 times in a row. What is the probability that all 7 flips come up heads?
Let H1 = the first coin comes up heads, H2 = the second coin comes up heads, ..., H7 = the seventh coin comes up heads. Now, at first blush, it may seem like our rule does not cover this case. But, note that we can define a new event: D = the first two coins come up heads. And D is independent of H3. (Whether the first two coins come up heads or not just doesn't affect whether the third will). So, P(H1 and H2 and H3) = P(D and H3). And we know P(D) = P(H1)P(H2) = 1/2 * 1/2 = 1/4, since the first two coins coming up heads are independent events. So P(D and H3) = P(D)P(H3) = 1/4 * 1/2 = 1/8, since they too are independent. Following this line of reasoning, we get that P(H1 and H2 and ... and H6 and H7) = (1/2)^7 = 1/128.
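That repeated-multiplication argument can be confirmed by brute force: list all 2^7 equally likely sequences of flips and count the all-heads ones (a sketch, not part of the original post):

```python
from fractions import Fraction
from itertools import product

# All 2^7 = 128 equally likely sequences of 7 coin flips
flips = list(product('HT', repeat=7))
all_heads = [f for f in flips if all(c == 'H' for c in f)]

p = Fraction(len(all_heads), len(flips))
print(p)  # 1/128
print(p == Fraction(1, 2) ** 7)  # True
```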
There are two independent events, A and B. Neither A nor B is guaranteed to happen. P(A) = .7. There are two values:
X = P(A and B)
Y = .7
Which of the following is true?
A. X is greater than Y
B. Y is greater than X
C. X and Y are equal
D. There is not enough information to tell.
B is the correct answer. Since neither event is certain, we know that P(A) < 1 and P(B) < 1. Since A and B are independent, we know that P(A and B) = P(A)P(B) = .7*P(B), since we know the value of P(A). Because P(B) < 1, we know that .7*P(B) < .7. So P(A and B) = .7*P(B) < .7. Thus, X is less than Y.
You're wondering whether you should go see a new action movie, Muscle Man: How One Man's Muscles Save the World (again). Now, you're not the biggest fan of action movies, but you do enjoy one from time to time. So you figure it'll be worth it if you can get a good seat or if the action sequences are amazing. If neither of those things happens, then it's not worth going for you. Now, you're wondering, should I go see the movie?
Well, something that matters to you is the likelihood of: (i) I get a good seat, or (ii) the action sequences are amazing. Let A = "I get a good seat" and B = "The action sequences are amazing." You really want to know the value of P(A or B). If it's really high, then the movie is probably worth it. If it's really low, then the movie probably isn't (and buying the ticket and so on just isn't worth it).
In this section, we'll talk about calculating P(A or B), which we read as "The probability that A or B occurs." But before we do so, we need to issue an important clarification: A or B means that A occurs or that B occurs or that both occur. In other words, if A and B both happen, then 'A or B' happens as well. This is somewhat at odds with how we often use the word "or," as in sentences like, "You can study hard or you can fail the test," with the implication being that you cannot do both. Excise that meaning from your mind; in probability, we say that "A or B" occurs if A occurs, or if B occurs, or if A and B occur.
Now, in order to calculate P(A or B), it will help to introduce the idea of mutually exclusive events:
Definition: Two events, A and B, are mutually exclusive if it is impossible for them to both occur.
Another way to put this (symbolically): P(A and B) = 0.
Here are some examples of mutually exclusive events:
- When flipping a coin, getting heads and getting tails are mutually exclusive events. It is impossible to get heads and tails from the same flip of a coin.
- Suppose you and your friend have both entered into a raffle that only picks one winner. Then, the event of you winning is mutually exclusive with your friend winning.
- Suppose you are taking a class in college and you need an A or B to graduate with honors. The event where you get an A is mutually exclusive with the event where you get a B.
The point of talking about mutually exclusive events is to make it easier to calculate probabilities of one event OR another event occurring. We can do such calculations via the following rule:
Mutually Exclusive Rule for P(A or B)
Let A, B be mutually exclusive events. Then, P(A or B) = P(A) + P(B).
In other words, the probability that A or B occurs is equal to the probability that A occurs plus the probability that B occurs.
And if you like to get a sense for why such rules work (rather than simply memorize the formula), see here for an illustration that helps make the rule more intuitive. Now, let's see this rule in action:
In rolling a fair, six-sided die, what is the probability that you will get a 1 or a 4?
We know that rolling a 1 and rolling a 4 are mutually exclusive events, since it is impossible for them both to occur. We know that P(Rolling a 1) = ⅙ and that P(Rolling a 4) = ⅙. Thus, by our above rule, P(Rolling a 1 or Rolling a 4) = P(Rolling a 1) + P(Rolling a 4) = ⅙ + ⅙ = ⅓.
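Counting outcomes directly confirms that adding the two probabilities is legitimate here (a quick sketch):

```python
from fractions import Fraction

die = [1, 2, 3, 4, 5, 6]

# Outcomes where the event "roll a 1 or a 4" occurs
hits = [n for n in die if n in (1, 4)]
p = Fraction(len(hits), len(die))
print(p)  # 1/3

# Matches the mutually exclusive rule: P(A or B) = P(A) + P(B)
print(p == Fraction(1, 6) + Fraction(1, 6))  # True
```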
Now, this rule is only a special case of a more general principle. In general, for all events, and not just mutually exclusive ones, the following is true:
General Rule for P(A or B)
Let A, B be two events. Then, P(A or B) = P(A) + P(B) - P(A and B).
I.e. the probability that A or B occurs is equal to the probability that A occurs plus the probability that B occurs minus the probability that A and B occur.
And again, if you like to see why such rules are true, click here. Here is an example of using this rule:
You are wondering whether to go to the cafe. You would go if you knew that Bertrand or Simone was going. There is a 45% chance that Bertrand will go. There is a 20% chance Simone will go, and there is a 15% chance that both Bertrand and Simone go. What is the likelihood that Bertrand or Simone will go to the cafe?
Let B = "Bertrand goes to the cafe" and let S = "Simone goes to the cafe." Then, the question gives us that P(B) = .45, P(S) = .2, and P(B and S) = .15. Following our rule, P(B or S) = P(B) + P(S) - P(B and S) = .45 + .2 - .15 = .5. Thus, there is a 1/2 chance that Bertrand or Simone will go.
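The inclusion-exclusion arithmetic, done with exact fractions to avoid decimal slips (a sketch using the numbers from the example):

```python
from fractions import Fraction

p_b = Fraction(45, 100)     # Bertrand goes to the cafe
p_s = Fraction(20, 100)     # Simone goes to the cafe
p_both = Fraction(15, 100)  # both go

# General rule: P(B or S) = P(B) + P(S) - P(B and S)
p_either = p_b + p_s - p_both
print(p_either)  # 1/2
```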
In our next post, we will look at a strategy that can help us solve some tricky questions: instead of finding the probability of some event, try finding the probability that it does not occur.
In a bag, there are 5 red marbles, 2 blue marbles, and 1 pink marble. I will pick one marble from the bag, set it aside, and then pick another marble from the bag. Is the event of my drawing a blue marble on the first draw mutually exclusive with drawing a pink marble second? Is drawing a pink marble first mutually exclusive with drawing a pink marble second?
Drawing a blue marble first and a pink marble second are not mutually exclusive; it is possible to do both. But, drawing a pink marble first is mutually exclusive with drawing a pink marble second. After all, since there is only one pink marble and we do not replace the marbles after we draw them, once you draw the first pink marble, you've drawn the only one there is! You can't draw a second pink marble.
Jane is worried that her new neighbor both (i) likes bad music, and (ii) is willing to blare his preferred kind of music at all hours. She estimates the probability that her neighbor likes bad music at .4, and the probability of his constantly blaring music at .3. And she estimates the probability that (i) or (ii) is true at .6. What probability should she assign to the worst possible outcome: her neighbor both likes bad music and is willing to blare music constantly?
This is a different way in which we can apply the rule we just learned. We have been thinking of our rule as a way to calculate P(A or B). But, if you are given P(A or B), P(A), and P(B), we can also use it as a way of calculating P(A and B). Let's plug in the numbers our question gives: .4 + .3 - P(A and B) = .6. Rearranging to solve for the unknown, we get: P(A and B) = .4 + .3 - .6 = .1.
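The same rearrangement in code, again with exact fractions (a sketch using Jane's estimates):

```python
from fractions import Fraction

p_a = Fraction(4, 10)       # neighbor likes bad music
p_b = Fraction(3, 10)       # neighbor blares music constantly
p_a_or_b = Fraction(6, 10)  # at least one of the two is true

# Rearranging P(A or B) = P(A) + P(B) - P(A and B)
p_a_and_b = p_a + p_b - p_a_or_b
print(p_a_and_b)  # 1/10
```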
Intuitively, this may seem a little odd, but the formula is an equation like any other: we are free to rearrange it algebraically to solve for whichever quantity we need.
There are two events, A and B. The probability of just A occurring is r. The probability of just B occurring is s. The probability of neither A nor B occurring is t. What is the probability that both A and B occur?
Now, there are four possibilities in total: (1) A and B, (2) A and not-B, (3) not-A and B, and (4) not-A and not-B. We know that one of these four must happen since they capture all the logical possibilities. Furthermore, the possibilities are all mutually exclusive since it is impossible for any two of them to happen. For example, suppose (2) and (4) were both true. Then A and not-A would be true, which would be impossible - either A occurs or it does not! Thus, we can apply our above rule: P(Possibility 1 or 2 or 3 or 4 occurs) = P(Possibility 1) + P(Possibility 2) + P(Possibility 3) + P(Possibility 4) = 1. The question gives us the probabilities of (2), (3), and (4). It asks us about the probability of (1). Plugging in the values we get from the question, we get: P(Possibility 1) + r + s + t = 1, and thus, P(Possibility 1) = 1 - r - s - t.
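As a sanity check of the algebra, here is the partition idea with made-up numbers (the values of r, s, and t below are hypothetical, chosen only for illustration):

```python
from fractions import Fraction

# Hypothetical values for the three given probabilities
r = Fraction(3, 10)  # P(A and not-B)
s = Fraction(2, 10)  # P(not-A and B)
t = Fraction(1, 10)  # P(not-A and not-B)

# The four mutually exclusive possibilities must sum to 1
p_both = 1 - r - s - t
print(p_both)  # 2/5

# All four probabilities together cover the whole sample space
print(p_both + r + s + t == 1)  # True
```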