Independence
In everyday language we refer to events that have nothing to do with each other as being independent.
Definition
Events $A$ and $B$ are independent if:
\[P(A\cap B)=P(A)\times P(B)\]
If $P(A)\neq 0$ and $P(B)\neq 0$, then the following are equivalent:
- $A$ and $B$ are independent.
- $P(B)=P(B\vert A)$
- $P(A)=P(A\vert B)$
See slide 31 for additional examples of proving independence using the definition above.
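As a quick illustration of the definition, the sketch below enumerates a finite sample space (a fair die roll; a hypothetical example, not taken from the slides) and checks the product condition and the equivalent conditional form directly.

```python
from fractions import Fraction

# Hypothetical example: roll a fair six-sided die.
# A = "roll is even", B = "roll is at most 4".
omega = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}
B = {1, 2, 3, 4}

def P(event):
    """Probability of an event under the uniform distribution on omega."""
    return Fraction(len(event & omega), len(omega))

# Definition: A and B are independent iff P(A ∩ B) = P(A) * P(B).
print(P(A & B))         # 1/3
print(P(A) * P(B))      # 1/2 * 2/3 = 1/3  -> A and B are independent

# Equivalent form: P(B | A) = P(A ∩ B) / P(A) should equal P(B).
print(P(A & B) / P(A))  # 2/3, which equals P(B)
```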
Independence for More Than Two Events
For a finite set of events there are two different types of independence:
Pairwise Independence
$A_1,\ldots,A_n$ are pairwise independent if every pair of events is independent: for all distinct $k,m$,
\[P(A_m\cap A_k)=P(A_m)P(A_k)\]
Mutual Independence
$A_1,\ldots,A_n$ are mutually independent if every intersection of a subcollection of the events has probability equal to the product of the individual probabilities: for all distinct indices $k_1,\ldots,k_m$,
\[P(A_{k_1}\cap\ldots\cap A_{k_m})=P(A_{k_1})\times\ldots\times P(A_{k_m})\]
Pairwise independence doesn't imply mutual independence. Generally, if it isn't stated, then we are talking about mutual independence.
For the proof and an example of why pairwise independence does not imply mutual independence, see slide 37 onward. That example also illustrates probability set notation.
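The sketch below works through the standard counterexample (two fair coin flips; this may or may not be the example used on the slides): each pair of the events $A$, $B$, $C$ is independent, yet the three are not mutually independent.

```python
from fractions import Fraction
from itertools import product

# Hypothetical counterexample: flip two fair coins.
# Sample space = all ordered pairs of H/T, each with probability 1/4.
omega = set(product("HT", repeat=2))

def P(event):
    return Fraction(len(event), len(omega))

A = {w for w in omega if w[0] == "H"}   # first flip is heads
B = {w for w in omega if w[1] == "H"}   # second flip is heads
C = {w for w in omega if w[0] == w[1]}  # the two flips match

# Pairwise independence: every pair satisfies P(X ∩ Y) = P(X) P(Y) = 1/4.
for X, Y in [(A, B), (A, C), (B, C)]:
    assert P(X & Y) == P(X) * P(Y) == Fraction(1, 4)

# Mutual independence fails: P(A ∩ B ∩ C) = 1/4, but P(A) P(B) P(C) = 1/8.
print(P(A & B & C))            # 1/4
print(P(A) * P(B) * P(C))      # 1/8
```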