Prerequisites: STATS 101A. In order for you to follow the discussions in this class, you should be familiar with the following concepts and calculations:
– Basic concepts of descriptive statistics including the mean, median, variance, and standard deviation.
– Sampling distributions, the Central Limit Theorem, the normal distribution, Student’s t-distribution, and the chi-square and F distributions. Basic concepts of inference and hypothesis testing.






0 Learning objectives

In STATS 101A, you should be able to:

  • Understand the concepts of an experiment, events and sample space;
  • Apply the addition, total probability and multiplication rules; 
  • Understand the meaning of independence.

1 Events

Let’s begin by defining some basic terms.
An experiment is an activity that produces distinct outcomes.
The sample space $S$ is the set of all possible outcomes.
An event $E$ is a set of outcomes from $S$.
Here is a simple example. Suppose you have a 6-sided die. Let’s define the experiment as rolling the die once and recording the number that comes up. The sample space $S$ is
S=\{1,2,3,4,5,6\}
There are many different events we can define here, for example:

  1. An event could be just a single number, e.g., $E=\{6\}$ (we roll a six) or $E=\{2\}$ (we roll a two).
  2.  An event could also be more complicated – for example, $E$ could be that we roll an even number. This is the same as defining $E=\{2,4,6\}$.
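To make the die example concrete, here is a small sketch representing the sample space and events as Python sets. The variable names (`roll_six`, `roll_even`, and so on) are illustrative choices, not part of the notes.

```python
# Sketch: the die-rolling experiment, with outcomes as Python sets.
S = {1, 2, 3, 4, 5, 6}                      # sample space: one roll of a 6-sided die

roll_six = {6}                              # E = {6}: we roll a six
roll_two = {2}                              # E = {2}: we roll a two
roll_even = {n for n in S if n % 2 == 0}    # E = {2, 4, 6}: we roll an even number

# Every event is a set of outcomes from S, i.e., a subset of S.
assert roll_six.issubset(S)
assert roll_even == {2, 4, 6}
```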

2 Probability definitions

We can now start to talk about probabilities. This is the central object in STATS 101A. Let $A$ be an event. $P(A)$ is then the probability of event $A$. By definition, the probability of an event is always between 0 and 1:
0 \leq P(A) \leq 1
We will define $B \mid A$ as a conditional event: it means that event $B$ occurs given that event $A$ has occurred.

The conditional probability $P(B \mid A)$ is the probability of $B$ occurring given that $A$ has occurred.

Given two events $A$ and $B,$ we denote the probability of $A$ or $B$ or both occurring as $P(A$ or $B)$. We will sometimes write this in a more formal mathematical way as $P(A \cup B)$ (“the probability of $A$ union $B$”; $A \cup B$ is the set of outcomes where at least one of $A$ or $B$ happens).

We denote the probability of $A$ and $B$ occurring as $P(A$ and $B)$. We refer to “$A$ and $B$” as a joint event: both must occur. In math, we will sometimes write this as $P(A \cap B)$ (“the probability of $A$ intersect $B$”; $A \cap B$ is the set of outcomes where $A$ and $B$ both happen).

The sample space $S$ is the set of all outcomes. The probability of $S$ is always 1:
P(S)=1
For an event $A$, we define the complement of $A$ as the set of outcomes for which $A$ does not occur. We will denote the complement of $A$ by “not $A$” or by $A^{C}$ (C stands for complement). The probability of the complement of $A$ is 1 minus the probability of $A$:
P(\operatorname{not} A)=1-P(A)
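As a quick numerical check of the facts above ($P(S)=1$ for a sample space and the complement rule), here is a sketch assuming a fair die with equally likely outcomes; the helper `prob` is a hypothetical name, not from the notes.

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}              # fair die: all outcomes equally likely

def prob(event):
    """P(event) under equally likely outcomes (hypothetical helper)."""
    return Fraction(len(event & S), len(S))

A = {2, 4, 6}                       # event: roll an even number
assert prob(S) == 1                 # P(S) = 1
assert prob(S - A) == 1 - prob(A)   # P(not A) = 1 - P(A)
```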

3 Rules for probabilities

The addition rule tells us, for events $A$ and $B,$ how to compute $P(A$ or $B)$ :
P(A \text { or } B)=P(A)+P(B)-P(A \text { and } B)
In words, we add the probability of $A$ and the probability of $B,$ and then subtract the probability of “$A$ and $B$” so that outcomes in both events are not counted twice.

If $A$ and $B$ are mutually exclusive (they cannot both occur), then $P(A$ and $B)=0,$ and the addition rule simplifies:
P(A \text { or } B)=P(A)+P(B)
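The addition rule can be verified on the die example; this is a hypothetical sketch assuming equally likely outcomes, with `prob` an illustrative helper.

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}

def prob(event):
    """P(event) under equally likely outcomes (hypothetical helper)."""
    return Fraction(len(event & S), len(S))

A = {2, 4, 6}                       # even roll
B = {5, 6}                          # roll at least 5
# Addition rule: P(A or B) = P(A) + P(B) - P(A and B)
assert prob(A | B) == prob(A) + prob(B) - prob(A & B)

C = {1, 3}                          # A and C are mutually exclusive
assert A & C == set()
assert prob(A | C) == prob(A) + prob(C)   # simplified rule applies
```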
The total probability rule tells us how to divide up a probability $P(A)$ using the joint probabilities $P(A$ and $B)$ and $P(A$ and $B^{C})$:
P(A)=P(A \text { and } B)+P\left(A \text { and } B^{C}\right)
This rule holds more generally. If we have $A,$ and any collection of mutually exclusive and collectively exhaustive events $B_{1}, \ldots, B_{k},$ then
P(A)=P\left(A \text { and } B_{1}\right)+P\left(A \text { and } B_{2}\right)+\cdots+P\left(A \text { and } B_{k}\right)
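The general form of the total probability rule can also be checked on the die: the blocks $B_{1}, B_{2}, B_{3}$ below are an illustrative partition (mutually exclusive and collectively exhaustive), and `prob` is a hypothetical helper assuming equally likely outcomes.

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}

def prob(event):
    """P(event) under equally likely outcomes (hypothetical helper)."""
    return Fraction(len(event & S), len(S))

A = {2, 3, 6}
# B_1, B_2, B_3: mutually exclusive and collectively exhaustive
partition = [{1, 2}, {3, 4}, {5, 6}]
# Total probability rule: P(A) = sum of P(A and B_i)
assert prob(A) == sum(prob(A & B_i) for B_i in partition)
```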
The multiplication rule links conditional probabilities and joint probabilities. For any two events $A$ and $B$, we have
P(A \text { and } B)=P(A) \times P(B \mid A)
or equivalently
P(B \mid A)=\frac{P(A \text { and } B)}{P(A)}
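The multiplication rule links the two quantities above; here is a sketch on the die example, where `cond_prob` is a hypothetical helper implementing the conditional-probability formula (it assumes $P(A)>0$).

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}

def prob(event):
    """P(event) under equally likely outcomes (hypothetical helper)."""
    return Fraction(len(event & S), len(S))

def cond_prob(B, A):
    """P(B | A) = P(A and B) / P(A), assuming P(A) > 0 (hypothetical helper)."""
    return prob(A & B) / prob(A)

A = {2, 4, 6}                       # even roll
B = {4, 5, 6}                       # roll at least 4
# Multiplication rule: P(A and B) = P(A) * P(B | A)
assert prob(A & B) == prob(A) * cond_prob(B, A)
```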

Problem 1.

Suppose $W$ and $B$ are events with $P(W)=0.48,$ $P(B)=0.18,$ and $P(W$ and $B)=0.16$. What is $P(W$ or $B)$?

Solution. Use the addition rule:
P(W \text { or } B)=P(W)+P(B)-P(W \text { and } B)=0.48+0.18-0.16=0.50

4 Independence

We say that events $A$ and $B$ are independent if knowing the occurrence of one event does not change the probability of the other event. Mathematically, there are three equivalent ways to say that $A$ and $B$ are independent:

1. $P(A$ and $B)=P(A) \times P(B)$. (The joint probability factorizes.)
2. $P(A \mid B)=P(A)$. (The conditional probability of $A$ given $B$ is the same as the unconditional probability of $A$.)
3. $P(B \mid A)=P(B)$. (The conditional probability of $B$ given $A$ is the same as the unconditional probability of $B$.)

If one of these conditions is true, the other two are true. If any one is false, the other two must be false.

Independence is not the same as mutual exclusivity. In fact, if $A$ and $B$ are mutually exclusive (and both have positive probability), then they are not independent. To see this, suppose that $A$ has occurred. Since at most one of $A$ and $B$ can occur, $B$ cannot occur. The probability of $B$ occurring given $A$ is therefore zero ($P(B \mid A)=0$), even though $P(B)$ might be different from $0$. (For a concrete example of this, try the 6-sided die example above with $A=$ “the die roll is $\leq 3$”, $B=$ “the die roll is $\geq 4$”. $A$ and $B$ are mutually exclusive and not independent.)
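Both claims can be checked numerically on the die. In the sketch below (hypothetical helpers, assuming equally likely outcomes), `even` and `low2` turn out to be independent, while the mutually exclusive pair from the example above is not.

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}

def prob(event):
    """P(event) under equally likely outcomes (hypothetical helper)."""
    return Fraction(len(event & S), len(S))

def independent(A, B):
    """Check condition 1: P(A and B) == P(A) * P(B) (hypothetical helper)."""
    return prob(A & B) == prob(A) * prob(B)

even = {2, 4, 6}
low2 = {1, 2}                       # roll is at most 2
assert independent(even, low2)      # 1/6 == (1/2) * (1/3)

low = {1, 2, 3}                     # die roll is <= 3
high = {4, 5, 6}                    # die roll is >= 4: mutually exclusive with low
assert not independent(low, high)   # P(low and high) = 0, but P(low)*P(high) = 1/4
```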
