The Brief: A Blog about the LSAT, Law School and Beyond
In our last post, we talked about the idea of an experiment, outcome, and event. If you're not familiar with those concepts, it may be a good idea to look at that post. Here, we will talk about some of the basic features of probability. First, a definition:
Definition: The probability of an event is a number that measures the likelihood of the event occurring.
And because it is tedious to always write out things like "the probability that a fair coin lands heads is ½", we will adopt an abbreviation. We use letters to represent events:
E = A fair coin lands heads
And then, we just write:
P(E) = ½
which we read as:
The probability that “A fair coin lands heads” is ½.
And in general, for any event E, we use P(E) to denote the probability that event E occurs. This shorthand will save us much space in the rest of the series.
Now, a probability measures the likelihood of an event. This brings us to:
5 Basic Facts About Probability
1. A probability of 0 means that an event is impossible.
So if you find that P(E) = 0, that means that E will not occur. As an example, when rolling a six-sided die, the event that we roll a 7 is impossible -- it does not occur in any of our outcomes. Thus, P(Roll a 7) = 0.
2. A probability of 1 means that an event is certain.
So, when rolling a six-sided die, the event that we roll some number is a certainty -- it occurs in all of our outcomes. Thus, P(Roll a number) = 1.
3. An event with a higher probability is more likely to occur.
So, if the probability that it snows is 20% while the probability that it rains is 80%, then it is more likely to rain than it is to snow. And, on the flip side, events with a lower probability are less likely to occur.
4. Probabilities are always between 0 and 1.
This makes sense, since if an event had a probability greater than 1, then it would be more likely to occur. But events with a probability of 1 are already certain to occur! How could any event be more likely than a certainty? Similarly, if an event had a probability less than 0, then it would be less likely to occur, but events with a probability of 0 are already impossible! How could an event be less likely than an impossibility?
This also gives us a helpful way to check our answers: if we get a probability greater than 1 or less than 0, we have made a mistake somewhere.
5. The probabilities of our different outcomes must sum to 1.
E.g. if we have 4 different outcomes, then
P(Outcome 1) + P(Outcome 2) + P(Outcome 3) + P(Outcome 4) = 1.
This is because, when we do an experiment, something is bound to happen. So the probabilities of our outcomes must sum to 1.
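The five facts above lend themselves to a quick computational check. Here is a minimal Python sketch (the function name is my own, not from the post) that tests whether a proposed list of outcome probabilities is legitimate:

```python
from math import isclose

def is_valid_distribution(probs):
    """A list of outcome probabilities is valid only if every
    probability lies between 0 and 1 (facts 1-4) and the
    probabilities sum to 1 (fact 5)."""
    return all(0 <= p <= 1 for p in probs) and isclose(sum(probs), 1.0)

print(is_valid_distribution([1/6] * 6))   # fair die → True
print(is_valid_distribution([0.6, 0.6]))  # sums to 1.2 → False
```

A check like this mirrors the answer-checking tip under fact 4: any proposed set of probabilities that fails it contains a mistake somewhere.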
Now, for the GRE, there are three main types of probability problems:
- The probability of a single event occurring: P(A)
- The probability that two events both occur: P(A and B)
- The probability that one or another event occurs: P(A or B)
You are about to do an experiment with four possible outcomes: A, B, C, and D. The stated probabilities are as follows:
P(A) = .5
P(B) = .3
P(C) = .38
P(D) = .1
Is such an experiment possible? What if P(D) = -.18?
No, such an experiment is not possible, since P(A) + P(B) + P(C) + P(D) = 1.28, which is not equal to 1. And changing P(D) to -.18 would make P(A) + P(B) + P(C) + P(D) = 1, but then P(D) = -.18 would be a negative number. Since probabilities cannot be negative, this experiment is again impossible.
Give an example of an experiment not discussed above, and give an example of an event with a probability of 0 for that experiment, and another event with a probability of 1 for that experiment.
There are many possible examples; here is one: investing in the stock market. It is a certainty that "The value of my portfolio will either increase, decrease, or stay the same." It is an impossibility that "The value of my portfolio will both increase and stay the same." It may do one, or the other, but to do both is impossible.
Translate P(A) + P(B) = ½ * P(C) into a natural language (like English, French, Chinese, etc.).
Again, there are many possible solutions. Not confident in my French, I'll stick with giving the English translation: "The probability of A plus the probability of B is equal to half of the probability of C."
Some GRE questions ask about the likelihoods of different events. For example:
You are about to roll two fair six-sided dice. What is the probability that they sum to 7?
In this series, we cover the strategies you need for probability questions on the GRE.
Before continuing, you should know that probability questions make up only about 5% of all math questions on the GRE. That totals to about 2 questions per test. So if you are weak in other areas that appear more frequently (e.g. algebra, fractions/ratios, reading graphs), it might be wiser to look at those topics first and return here later.
What Is Probability?
Sometimes, we are not sure what will happen. It may rain tomorrow or not. I may win the lottery or lose. Federer may win Wimbledon again, or not. Probability is a way to handle this uncertainty. Even if we cannot know exactly what will happen, we can at least determine how likely the different possibilities are. So, if I roll a fair die, I can't know if it will land on 6 or not. But I know that it is more likely to land on an even number than to land on 6 specifically.
Talk about probability is commonplace. We might say it's "pretty likely" to rain later today, or that some team has "no chance" of making it to the playoffs. The mathematical theory of probability gives us a way to make that kind of talk precise, by turning it into formulas and numbers.
Probability begins with the idea of an experiment:
Definition: An experiment is an action that, when performed, leads to exactly one of many possible outcomes.
All sorts of things can be experiments. If I don't study for the final, that is an experiment; it could lead to my acing the class (unlikely), barely passing (more likely), or failing (very likely). In the land of GRE problems, standard experiments include things like flipping a coin (which either leads to the outcome of heads or the outcome of tails), rolling a die (which leads to some number 1 through 6), or picking a winner for a raffle (where the "outcome," or in other words the winner, could be anyone who bought a ticket).
Now, let's consider rolling a fair six-sided die. That experiment has 6 possible outcomes: the die lands on 1, the die lands on 2, the die lands on 3, and so on, up through the die lands on 6.
And there are lots of questions we could ask about rolling a die. For example: am I more likely to get an even number or an odd number? What are the odds of getting a prime number? To answer these questions, we need the idea of an event:
Definition: An event is a set of some (or none, or all) of our experiment's outcomes.
Here are some examples of events:
- My die lands on an even number
- My die lands on a number which, in English, is made of 4 letters
- My die lands on 6
Note that events do not need to be possible; we can also consider events that simply will not occur:
- My die lands on 7
- My die lands on a number that is neither even nor odd
- My die lands on a number which, in English, is made of 47 letters
Now we are ready to get at the heart of probability: the whole point of probability is to figure out what the likelihoods of different events are. These GRE questions will give you some setup and some event, and ask you to find the probability of that event. And the rest of this series will be devoted to doing exactly that -- figuring out those likelihoods -- across a series of increasingly complicated contexts.
And here are some practice questions to test your understanding of the above.
I roll a 21-sided die. How many outcomes are there?
There are 21 outcomes; each of the 21 sides of the die corresponds to a different outcome. (Within the land of GRE problems, we assume things like the die never lands on a corner, teetering between two numbers, that no one catches our die midair, and so on.)
Give an example of an experiment that was not discussed above.
Many examples would work; here is one: eating days-old leftover food. The possible outcomes are: you get sick, or you do not.
In rolling a 6-sided die, I list seven outcomes: I roll a 1, I roll a 2, I roll a 3, I roll a 4, I roll a 5, I roll a 6, and I roll an even number. Are there actually seven outcomes?
No. Recall that an experiment is an action which leads to exactly one of several possible outcomes. But it is possible to roll a die and get a 6 (one of the outcomes I list) and get an even number (another outcome I list). After all, 6 is even! So this is an improper listing of outcomes for our experiment.
In our previous posts, we've talked about the basic concepts of probability and some fundamental facts about probabilities. Here, we'll show how to calculate the probability of a single event when all the outcomes are equally likely. This is, in a sense, the simplest case that we will cover, and it is crucial for everything we'll do later (e.g. in finding the probability of two events occurring).
Suppose we flip a fair coin. What is the probability we get heads? Intuitively, the answer should be 1/2. And that's exactly what the following rule would say:
Probability of Equally Likely Outcomes:
If you have n possible outcomes, all of which are equally likely, then the probability of any particular outcome occurring is 1/n.
So when we flip a fair coin, there are 2 possible outcomes (heads and tails). So n = 2 and the probability of one outcome (e.g. heads) occurring is 1/n = 1/2. And if we roll a six-sided die, there are 6 possible outcomes. So the probability of any particular outcome (e.g. rolling a 4) is 1/6. And if we held a raffle where there were 109 different entrants, the probability of any one of them winning would be 1/109.
Note that this rule only applies when all the outcomes are equally likely. In most GRE problems, the outcomes will be equally likely, and the question will signal that by saying that the outcome is "random" or that the outcomes are "equally likely." So, the question might say things like: "a name is chosen at random" or that "each outcome is equally likely." When the outcomes are not equally likely, all bets are off, and you will have to be more careful in how you approach the problem.
Now, we want to find the probability of some event occurring. Suppose I am going to roll a six-sided die, numbered 1 through 6. What is the probability that I get an even number? To calculate this, we use the following rule:
Probability of Single Events (for equally likely outcomes)
Suppose you have n equally likely outcomes. Then, the probability of some event E occurring is:

P(E) = (# of outcomes where E occurs) / (# of total outcomes)

where the # of total outcomes = n.
So to find the probability of rolling an even number, we need to find the number of outcomes where we roll an even number. If we roll an even number, then we must have rolled a 2, 4, or 6. Then, we divide by the number of total outcomes, in our case 6. So, P(Roll an even number) = 3/6 = 1/2.
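The rule translates directly into a count-and-divide computation. Here is a short Python sketch (the function and variable names are my own) that applies it to the even-number example:

```python
from fractions import Fraction

def event_probability(outcomes, event):
    """For equally likely outcomes:
    P(E) = (# of outcomes where E occurs) / (# of total outcomes)."""
    favorable = sum(1 for o in outcomes if event(o))
    return Fraction(favorable, len(outcomes))

die = range(1, 7)  # the outcomes 1 through 6
print(event_probability(die, lambda n: n % 2 == 0))  # → 1/2
```

Using exact fractions sidesteps floating-point rounding, which keeps answers in the same reduced form the GRE expects.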
Here’s another example in a similar spirit:
Suppose you randomly choose a number from 1 to 50. What is the probability that you chose a prime number?
There are 50 possible outcomes to your random choice. Now, we need to know: in how many of those outcomes do you choose a prime number? In other words, how many of the numbers 1 to 50 are prime? Here, we just have to go through the list: 2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47. So there are fifteen primes in that range. Thus: P(Pick a prime number) = 15/50 = 3/10.
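A count of fifteen primes is easy to get wrong by hand; a few lines of Python (a sketch using a naive trial-division primality test) can double-check it:

```python
from fractions import Fraction

def is_prime(n):
    """Naive trial division; fine for small n."""
    return n >= 2 and all(n % d != 0 for d in range(2, int(n**0.5) + 1))

primes = [n for n in range(1, 51) if is_prime(n)]
print(len(primes))                # → 15
print(Fraction(len(primes), 50))  # → 3/10
```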
Now we know how to find the probability of a single event when the possible outcomes are equally likely. Our next step is to learn how to combine these probabilities in order to get the probabilities of more complex events.
130 people line up to buy raffle tickets. Every 10th person who buys a ticket gets a teddy bear as a promotional item. What is the probability that a randomly chosen person from the line will receive a teddy bear?
There are 130 people in line. Since every 10th person gets a teddy bear, we know 13 people got teddy bears. To find the probability that a randomly selected person gets a teddy bear, we just need to calculate: # of people who get teddy bears/# of people total. Thus, we get 13/130 = 1/10.
You have 50 friends. 12 of them have blue hair. You randomly pick one of your friends to invite to dinner tomorrow. What is the probability that you invite a person with blue hair?
We are looking for P(I invite a person with blue hair). Now, if I randomly pick a friend, that means there are 50 possible outcomes: I get dinner with friend 1, with friend 2, ..., with friend 50. Since the question tells us that the outcome is randomly selected, we know that they are all equally likely. So we can apply our rule. In how many of the outcomes will you get dinner with a blue-haired friend? In 12 of them. And we already know how many total outcomes there are. Thus, P(I invite a person with blue hair) = 12/50 = 6/25.
You still have 50 friends. 12 of them still have blue hair. What is the probability that you do not invite a person with blue hair?
This is just like our previous question, except now we want to find P(I do not invite a person with blue hair). If there are 12 people with blue hair, then there are 50 - 12 = 38 people without blue hair. So there are 38 outcomes where I do not invite a person with blue hair. Again, since the outcomes are chosen randomly, we can apply our principle to get P(I do not invite a person with blue hair) = 38/50 = 19/25.
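Notice that the two answers sum to 1: 12/50 + 38/50 = 1. That is no accident, since inviting a blue-haired friend and inviting a non-blue-haired friend are the only possibilities. A quick Python check (a sketch using the question's numbers):

```python
from fractions import Fraction

total_friends, blue_haired = 50, 12

p_blue = Fraction(blue_haired, total_friends)
p_not_blue = Fraction(total_friends - blue_haired, total_friends)

print(p_blue)               # → 6/25
print(p_not_blue)           # → 19/25
print(p_blue + p_not_blue)  # → 1
```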
In our previous posts, we talked about the notion of probability, some of its basic features, and how to find the probability of a single event. Here, we will find the probability of a compound event, namely an event where multiple events occur.
For example, we now know that the probability of a fair die landing on 6 is ⅙, while the probability of a fair coin landing heads is ½. But what is the probability that, if I flip a fair coin and roll a fair die, the coin lands heads and the die lands on 6? What is the probability that the coin lands heads or the die lands on 6?
To answer that question, we need to introduce a new idea: independence. In ordinary language, talk of “independence” suggests a rebellious child or ideas of liberty and freedom. But here, we use a different notion of independence: two events are independent if neither event affects the likelihood of the other.
An example will help to illustrate the concept. You roll a fair die and then flip a fair coin. We know that, ordinarily, a fair coin has a ½ chance of coming up heads. But now, suppose you knew that the die came up 6. Now, what is the probability that the coin came up heads? Clearly, the answer should remain: ½. The fact that the die came up 6 has nothing to do with the coin! We say that the two events are independent of one another. More precisely, we say:
Definition: Two events, A and B, are independent of one another if:
(i) A occurring does not affect the likelihood of B occurring, and
(ii) B occurring does not affect the likelihood of A occurring
For the purposes of the GRE, it will generally be clear when two events are independent. Here are some standard examples of independence:
- You flip a coin and roll a die. Whether you get heads on the coin ( = Event A) and whether you get 6 on the die ( = Event B) are independent.
- You draw a marble from a bag, replace that marble, and then draw a second marble. Whether the first marble is green ( = Event A) is independent of whether the second marble is purple ( = Event B). Similarly, whether the first marble is green is independent of whether the second marble is green.
Now consider the following case:
A bag contains 10 purple marbles and 7 green marbles. You will randomly draw one marble from the bag and then, without returning the first marble to the bag, you will draw a second marble. Is the event of getting a green marble first independent of the event of getting a purple marble second?
No. This is because whether you get a green marble first or not can actually affect your chances of getting a purple marble second since you do not return the first marble to the bag. Suppose you draw a green marble first. Then, the chance you draw a purple marble second is: (number of purple marbles remaining)/(total number of marbles in bag) = 10/16. But if you did not draw a green marble first, then you must have drawn a purple marble. So then, the odds that you draw a purple marble second is: 9/16.
Here is a more intuitive way to put the same point: suppose your first marble is not green. Then it must be purple (since the bag just has purple and green marbles). That’s one fewer purple marble for you to draw next turn!
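Using exact fractions, we can compute both conditional chances from the answer above and confirm that they differ; here is a short Python sketch (names are my own):

```python
from fractions import Fraction

purple, green = 10, 7
remaining = purple + green - 1  # 16 marbles left after the first draw

# If the first marble was green, all 10 purple marbles remain.
p_purple_after_green = Fraction(purple, remaining)
# If the first marble was purple, only 9 purple marbles remain.
p_purple_after_purple = Fraction(purple - 1, remaining)

print(p_purple_after_green)   # → 5/8 (i.e., 10/16)
print(p_purple_after_purple)  # → 9/16
```

Because the two values disagree, the first draw changes the odds for the second, which is exactly what it means for the events to be dependent.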
The reason why we care about independence is because independent events allow us to easily calculate the probabilities of compound events. What is a compound event?
Definition: Let A and B represent two events. A compound event E is the event where both A and B occur.
For example, suppose I am wondering about my afternoon. Let A = “It rains.” and B = “The Broncos win their game today.” Then, the compound event A and B will be the event where “It rains and the Broncos win.” It will generally be obvious when two events are independent, and sometimes the question will explicitly state that fact. Other examples of independent events include:
- There is a bag with 10 green marbles and 14 yellow marbles. I pick a marble, look at its color, return it to the bag and pick another marble. The color of the first marble is independent of the color of the second.
- I roll a fair die, record its outcome, and then roll it again. The outcome of the first roll is independent of the outcome of the second roll.
- I randomly draw a person's name for a raffle. Then, I flip a coin. The name I draw is independent of my coin flip.
When events are independent, we can calculate the probability of both events occurring via the following rule:
Probabilities of Compound Events
Let A and B be independent of one another. Then, P(A and B) = P(A)P(B)
Let's see this rule in action:
Suppose I roll a fair six-sided die and flip a fair coin. What is the probability that the coin lands heads and the die lands on six?
Following our above rule, we say A = coin lands heads, and B = die lands on six. Then, we can calculate P(A and B) = P(A)P(B). We already know that P(A) = ½ and P(B) = ⅙. So, P(A and B) = 1/12.
Now, this rule may seem a little odd. Why does it work? If you're the kind of person who needs to get a sense of why something works in order to learn it, see here for an illustration that helps make the rule more intuitive.
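Another way to build that intuition is to estimate the probability empirically. The following simulation is my own sketch, not part of the original post; it repeats the coin-and-die experiment many times and counts how often both events occur together:

```python
import random

def estimate_heads_and_six(trials, seed=0):
    """Estimate P(coin lands heads AND die lands on 6) by repeating
    the experiment `trials` times and counting successes."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        heads = rng.random() < 0.5    # fair coin flip
        six = rng.randint(1, 6) == 6  # fair die roll
        if heads and six:
            hits += 1
    return hits / trials

# The exact answer is 1/2 * 1/6 = 1/12 ≈ 0.0833; with many trials,
# the estimate should land close to that value.
print(estimate_heads_and_six(200_000))
```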
Now, here's an application of our rule that uses larger numbers:
You are wondering how poorly your day could go. You know that, at work, one employee (out of 400) will be randomly selected for additional performance reviews. And you know that there is a ⅓ chance that it rains furiously during your commute home. (Of course, whether you are selected or not for the performance review will not affect the weather.) What is the probability that you are picked for the additional performance reviews AND it rains furiously during your commute?
Let A = you are selected for additional performance reviews. Then, P(A) = 1/400 (since one person is randomly selected from a group of 400). Let B = it rains furiously on your way home. The question gives us that P(B) = ⅓. By our rule, P(A and B) = P(A)P(B) = 1/1200. So it is pretty unlikely that your day will be the worst possible.
In our next post, instead of looking at the probability of A and B occurring, we will look at the probability of A or B occurring.
You roll a fair six-sided die. Is the event of you rolling a multiple of 5 independent of you rolling an even number?
No. Suppose you roll a multiple of 5. Then, you must have rolled a 5, and 5 is not an even number! So if one event occurs, the other event is impossible. Now suppose you do not roll a multiple of 5. Then, you must have rolled a 1, 2, 3, 4, or 6. So it is still possible that you rolled an even number, and thus the probability of the second event remains greater than 0. Thus, whether the first event occurs affects the probability of the second event. So they are not independent.
You roll a fair six-sided die. Is the event of you rolling a prime number independent of you rolling an even number?
No. Here, the violation is a little trickier. Suppose you roll a prime number. Then, you must have rolled a 2, 3, or 5. Then, the probability of you rolling an even number is 1/3 since, of those three numbers, only 2 is even. Now suppose you did not roll a prime number. Then, you must have rolled a 1, 4, or 6. Then, the probability of you rolling an even number is 2/3 since, of those three numbers, 4 and 6 are even. So whether the first event occurs or not can affect the probability of the second event. So they are not independent.
There are two events, A and B, which are independent of each other. P(A) = .2 and P(B) = .5. What is P(A and B)?
We can calculate this via our rule: P(A and B) = P(A)P(B) = .2 * .5 = .1.
You flip a fair coin 7 times in a row. What is the probability that all 7 flips come up heads?
Let H1 = the first coin comes up heads, H2 = the second coin comes up heads, ..., H7 = the seventh coin comes up heads. Now, at first blush, it may seem like our rule does not cover this case. But, note that we can define a new event: D = the first two coins come up heads. And D is independent of H3. (Whether the first two coins come up heads or not just doesn't affect whether the third will). So, P(H1 and H2 and H3) = P(D and H3). And we know P(D) = P(H1)P(H2) = 1/2 * 1/2 = 1/4, since the first two coins coming up heads are independent events. So P(D and H3) = P(D)P(H3) = 1/4 * 1/2 = 1/8, since they too are independent. Following this line of reasoning, we get that P(H1 and H2 and ... and H6 and H7) = (1/2)^7 = 1/128.
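The chain of reasoning above amounts to multiplying ½ by itself once per flip. In Python, exact fractions make this trivial to verify (a sketch; the function name is my own):

```python
from fractions import Fraction

def prob_all_heads(flips):
    """P(all flips land heads) = (1/2) * (1/2) * ..., once per flip,
    since each flip is independent of all the others."""
    p = Fraction(1, 1)
    for _ in range(flips):
        p *= Fraction(1, 2)
    return p

print(prob_all_heads(7))  # → 1/128
```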
There are two independent events, A and B. Neither A nor B is guaranteed to happen. P(A) = .7. There are two values:
X = P(A and B)
Y = .7
Which of the following is true?
A. X is greater than Y
B. Y is greater than X
C. X and Y are equal
D. There is not enough information to tell.
B is the correct answer. Since neither event is certain, we know that P(A) < 1 and P(B) < 1. Since A and B are independent, we know that P(A and B) = P(A)P(B) = .7*P(B), since we know the value of P(A). Because P(B) < 1, we know that .7*P(B) < .7. So P(A and B) = .7*P(B) < .7. Thus, X is less than Y.
You're wondering whether you should go see a new action movie, Muscle Man: How One Man's Muscles Save the World (again). Now, you're not the biggest fan of action movies, but you do enjoy one from time to time. So you figure it'll be worth it if you can get a good seat or if the action sequences are amazing. If neither of those things happens, then it's not worth going for you. Now, you're wondering, should I go see the movie?
Well, something that matters to you is the likelihood of: (i) I get a good seat, or (ii) the action sequences are amazing. Let A = "I get a good seat" and B = "The action sequences are amazing." You really want to know the value of P(A or B). If it's really high, then the movie is probably worth it. If it's really low, then the movie probably isn't (and buying the ticket and so on just isn't worth it).
In this section, we'll talk about calculating P(A or B), which we read as "The probability that A or B occurs." But before we do so, we need to issue an important clarification: A or B means that A occurs or that B occurs or that both occur. In other words, if A and B both happen, then 'A or B' happens as well. This is somewhat at odds with how we often use the word "or," as in sentences like, "You can study hard or you can fail the test," with the implication being that you cannot do both. Excise that meaning from your mind; in probability, we say that "A or B" occurs if A occurs, or if B occurs, or if A and B occur.
Now, in order to calculate P(A or B), it will help to introduce the idea of mutually exclusive events:
Definition: Two events, A and B, are mutually exclusive if it is impossible for them to both occur.
Another way to put this (symbolically): P(A and B) = 0.
Here are some examples of mutually exclusive events:
- When flipping a coin, getting heads and getting tails are mutually exclusive events. It is impossible to get heads and tails from the same flip of a coin.
- Suppose you and your friend have both entered into a raffle that only picks one winner. Then, the event of you winning is mutually exclusive with your friend winning.
- Suppose you are taking a class in college and you need an A or B to graduate with honors. The event where you get an A is mutually exclusive with the event where you get a B.
The point of talking about mutually exclusive events is to make it easier to calculate probabilities of one event OR another event occurring. We can do such calculations via the following rule:
Mutually Exclusive Rule for P(A or B)
Let A, B be mutually exclusive events. Then, P(A or B) = P(A) + P(B).
In other words, the probability that A or B occurs is equal to the probability that A occurs plus the probability that B occurs.
And if you like to get a sense for why such rules work (rather than simply memorize the formula), see here for an illustration that helps make the rule more intuitive. Now, let's see this rule in action:
In rolling a fair, six-sided die, what is the probability that you will get a 1 or a 4?
We know that rolling a 1 and rolling a 4 are mutually exclusive events, since it is impossible for them both to occur. We know that P(Rolling a 1) = ⅙ and that P(Rolling a 4) = ⅙. Thus, by our above rule, P(Rolling a 1 or Rolling a 4) = P(Rolling a 1) + P(Rolling a 4) = ⅙ + ⅙ = ⅓.
Now, this rule is only a special case of a more general principle. In general, for all events, and not just mutually exclusive ones, the following is true:
General Rule for P(A or B)
Let A, B be two events. Then, P(A or B) = P(A) + P(B) - P(A and B).
I.e. the probability that A or B occurs is equal to the probability that A occurs plus the probability that B occurs minus the probability that A and B occur.
And again, if you like to see why such rules are true, click here. Here is an example of using this rule:
You are wondering whether to go to the cafe. You would go if you knew that Bertrand or Simone was going. There is a 45% chance that Bertrand will go. There is a 20% chance Simone will go, and there is a 15% chance that both Bertrand and Simone go. What is the likelihood that Bertrand or Simone will go to the cafe?
Let B = "Bertrand goes to the cafe" and let S = "Simone goes to the cafe." Then, the question gives us that P(B) = .45, P(S) = .2, and P(B and S) = .15. Following our rule, P(A or B) = P(A) + P(B) - P(A and B) = .45 + .2 - .15 = .5. Thus, there is a 1/2 chance that Bertrand or Simone will go.
In our next post, we will look at a strategy that can help us solve some tricky questions: instead of finding the probability of some event, try finding the probability that it does not occur.
In a bag, there are 5 red marbles, 2 blue marbles, and 1 pink marble. I will pick one marble from the bag, set it aside, and then pick another marble from the bag. Is the event of my drawing a blue marble on the first draw mutually exclusive with drawing a pink marble second? Is drawing a pink marble first mutually exclusive with drawing a pink marble second?
Drawing a blue marble first and a pink marble second are not mutually exclusive; it is possible to do both. But, drawing a pink marble first is mutually exclusive with drawing a pink marble second. After all, since there is only one pink marble and we do not replace the marbles after we draw them, once you draw the first pink marble, you've drawn the only one there is! You can't draw a second pink marble.
Jane is worried that her new neighbor both (i) likes bad music, and (ii) is willing to blare his preferred kind of music at all hours. She estimates the probability that her neighbor likes bad music at .4, and the probability of his constantly blaring music at .3. And she estimates the probability that (i) or (ii) is true at .6. What probability should she assign to the worst possible outcome: her neighbor both likes bad music and is willing to blare music constantly?
This is a different way in which we can apply the rule we just learned. We have been thinking of our rule as a way to calculate P(A or B). But if you are given P(A or B), P(A), and P(B), you can also use it to calculate P(A and B). Let's plug in the numbers our question gives: .4 + .3 - P(A and B) = .6, i.e. .7 - P(A and B) = .6. Subtracting .7 from both sides and multiplying by -1, we get: P(A and B) = .1.
Intuitively, this may seem a little odd, but we must remind ourselves that the formulas we use are true, and that we are allowed to manipulate them algebraically however we wish.
There are two events, A and B. The probability of just A occurring is r. The probability of just B occurring is s. The probability of neither A nor B occurring is t. What is the probability that both A and B occur?
Now, there are four possibilities in total: (1) A and B, (2) A and not-B, (3) not-A and B, and (4) not-A and not-B. We know that one of these four must happen, since they capture all the logical possibilities. Furthermore, the possibilities are all mutually exclusive, since it is impossible for any two of them to happen together. For example, suppose (2) and (4) were both true. Then A and not-A would both be true, which is impossible: either A occurs or it does not! Thus, we can apply our above rule: P(Possibility 1 or 2 or 3 or 4) = P(Possibility 1) + P(Possibility 2) + P(Possibility 3) + P(Possibility 4) = 1. The question gives us the probabilities of (2), (3), and (4); it asks for the probability of (1). Plugging in the values from the question, we get: P(Possibility 1) + r + s + t = 1, and thus, P(Possibility 1) = 1 - r - s - t.
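The answer 1 - r - s - t can be sanity-checked with concrete numbers. In the sketch below, the probabilities are invented values of my own for two independent events, chosen so that every one of the four possibilities can be computed directly:

```python
# Assumed values for two independent events A and B.
p_a, p_b = 0.5, 0.4

r = p_a * (1 - p_b)        # P(A and not-B)
s = (1 - p_a) * p_b        # P(not-A and B)
t = (1 - p_a) * (1 - p_b)  # P(neither A nor B)

both = 1 - r - s - t       # the formula derived above
# For independent events this should match P(A) * P(B) = 0.2.
print(round(both, 10))     # → 0.2
```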