In probability theory, we sometimes encounter problems in which it is difficult to compute a conditional probability directly. To see this, consider two events A and B. Their joint probability is given by:

\(\begin{eqnarray}P(A,B) = P(A|B)P(B) \label{eqA}\end{eqnarray}\)and we know that:

\(\begin{eqnarray}P(B|A) = \frac{P(A,B)}{P(A)} \label{eqB}\end{eqnarray}\)then using \eqref{eqA} and \eqref{eqB} we get:

\(\begin{eqnarray}P(B|A) = \frac{P(A|B)P(B)}{P(A)} \label{bayes}\end{eqnarray}\)Equation \eqref{bayes} is called Bayes’ rule. It is useful for calculating certain conditional probabilities: in many problems it is difficult to compute P(B|A) directly, but P(A|B), P(B), and P(A) are readily available, so Bayes’ rule lets us solve such problems. A few examples are given below:
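As a quick numerical illustration before the worked examples, Bayes’ rule is a one-line computation (a minimal Python sketch; the function name and the toy probabilities are my own, not taken from the examples below):

```python
def bayes(p_a_given_b, p_b, p_a):
    """Bayes' rule: return P(B|A) from P(A|B), P(B), and P(A)."""
    return p_a_given_b * p_b / p_a

# Toy numbers for illustration: P(A|B) = 0.9, P(B) = 0.2, P(A) = 0.3
print(bayes(0.9, 0.2, 0.3))  # ≈ 0.6
```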

Example-1

A communication system sends binary data {0 or 1}, which is then detected at the receiver. The receiver occasionally makes mistakes: a transmitted 0 may be detected as a 1, or a transmitted 1 may be detected as a 0. Suppose the communication system is described by the following set of conditional probabilities:

Pr(0 received | 0 transmitted) = 0.95 Pr(1 received | 0 transmitted) = 0.05

Pr(0 received | 1 transmitted) = 0.10 Pr(1 received | 1 transmitted) = 0.90

(a) Assuming 0s and 1s are equally likely to be transmitted (i.e. Pr(0 transmitted) = 1/2 and Pr(1 transmitted) = 1/2), find Pr(0 received) and Pr(1 received).

(b) Suppose a 0 is detected at the receiver. What is the probability that the transmitted bit was actually a 1? Also, if a 1 is detected at the receiver, what is the probability that the transmitted bit was actually a 0?

(c) What is the probability that the detected bit is not equal to the transmitted bit? This is the overall probability of error of the receiver.

Solution: (a) Pr(0 received) = Pr(0 transmitted) \(\times\) Pr(0 received | 0 transmitted) + Pr(1 transmitted) \(\times\) Pr(0 received | 1 transmitted) = (1/2)(0.95) + (1/2)(0.10) = 0.525

Pr(1 received) = Pr(0 transmitted) \(\times\) Pr(1 received | 0 transmitted) + Pr(1 transmitted) \(\times\) Pr(1 received | 1 transmitted) = (1/2)(0.05) + (1/2)(0.90) = 0.475

Note that we could also have obtained this as Pr(1 received) = 1 − Pr(0 received) = 0.475, since the receiver always outputs either a 0 or a 1.

(b) Pr(1 transmitted | 0 received)

\(=\frac{\text{Pr(1 transmitted}\; \cap \; \text{0 received)}}{\text{Pr(0 received)}}\)

\(=\frac{\text{Pr(1 transmitted)}\times\text{Pr(0 received | 1 transmitted)}}{\text{Pr(0 received)}}\)

\( =\frac{0.5\times0.1}{0.525}\approx0.095\)

Following the same approach, we get Pr(0 transmitted | 1 received)

\(=\frac{0.5\times0.05}{0.475} \approx 0.0526\)

(c) Pr(error) = Pr(1 transmitted) \(\times\) Pr(0 received | 1 transmitted) + Pr(0 transmitted) \(\times\) Pr(1 received | 0 transmitted) = (1/2)(0.10) + (1/2)(0.05) = 0.075. Note that the probability of error is a sum of joint probabilities, not the sum of the two conditional probabilities found in part (b).
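All three parts of Example 1 can be checked numerically. The sketch below is a direct transcription of the channel probabilities from the problem statement (the variable names are my own):

```python
p0, p1 = 0.5, 0.5                                   # priors: Pr(0 tx), Pr(1 tx)
p_rx = {('0', '0'): 0.95, ('1', '0'): 0.05,         # Pr(rx | tx), keyed (rx, tx)
        ('0', '1'): 0.10, ('1', '1'): 0.90}

# (a) total probability of each received symbol
pr0 = p0 * p_rx[('0', '0')] + p1 * p_rx[('0', '1')]    # 0.525
pr1 = p0 * p_rx[('1', '0')] + p1 * p_rx[('1', '1')]    # 0.475

# (b) Bayes' rule for the posterior of the transmitted bit
p_1tx_given_0rx = p1 * p_rx[('0', '1')] / pr0          # ≈ 0.095
p_0tx_given_1rx = p0 * p_rx[('1', '0')] / pr1          # ≈ 0.0526

# (c) overall error: sum of the two joint (not conditional) probabilities
p_err = p1 * p_rx[('0', '1')] + p0 * p_rx[('1', '0')]  # 0.075
```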

Example-2: Continuing the previous example, we will modify the communication system so that the detector at the receiver is allowed to make one of three possible decisions:

“0” the detector decides the received signal was a 0,

“1” the detector decides the received signal was a 1,

“E” the detector is not sure and declares the received signal an erasure (i.e., the receiver chooses not to choose).

The operation of the detector is described by the following set of conditional probabilities:

Pr(0 received | 0 transmitted) = 0.90 Pr(0 received | 1 transmitted) = 0.04

Pr(1 received | 0 transmitted) = 0.01 Pr(1 received | 1 transmitted) = 0.80

Pr(E received | 0 transmitted) = 0.09 Pr(E received | 1 transmitted) = 0.16

Again, assume that 0’s and 1’s are equally likely to be transmitted.

(a) What is the probability that a symbol is erased at the receiver?

(b) Given that a received symbol is declared an erasure, what is the probability that a 0 was actually transmitted?

(c) What is the probability of error of this receiver? That is, what is the probability that a 0 was transmitted and it is detected as a 1 or a 1 was transmitted and it is detected as a 0?

Solution: (a) Pr(E received) = Pr(E received | 0 transmitted) Pr(0 transmitted) + Pr(E received | 1 transmitted) Pr(1 transmitted) = (0.09)(0.5) + (0.16)(0.5) = 0.125

(b) Pr(0 transmitted | E received) =\( \frac{\text{ Pr(E received | 0 transmitted) Pr(0 transmitted)}}{\text{Pr(E received)}}\)

\(=\frac{(0.09)(0.5)}{0.125} = 0.36\)

(c) Pr(error) = Pr(1 transmitted) Pr(0 received | 1 transmitted) + Pr(0 transmitted) Pr(1 received | 0 transmitted) = (0.5)(0.04) + (0.5)(0.01) = 0.025
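As in Example 1, the erasure-channel numbers can be verified with a short numerical sketch (variable names mine; probabilities from the problem statement):

```python
p0, p1 = 0.5, 0.5                                        # priors: Pr(0 tx), Pr(1 tx)
p_rx = {('0', '0'): 0.90, ('0', '1'): 0.04,              # Pr(rx | tx), keyed (rx, tx)
        ('1', '0'): 0.01, ('1', '1'): 0.80,
        ('E', '0'): 0.09, ('E', '1'): 0.16}

p_erase = p0 * p_rx[('E', '0')] + p1 * p_rx[('E', '1')]  # (a) 0.125
p_0tx_given_E = p0 * p_rx[('E', '0')] / p_erase          # (b) 0.36
p_err = p1 * p_rx[('0', '1')] + p0 * p_rx[('1', '0')]    # (c) 0.025
```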

Example-3: We are in possession of two coins: one is fair and turns up heads with probability 1/2; the other is weighted so that heads shows up with probability 3/4 and tails with probability 1/4. The two coins look and feel identical, so we cannot tell which is which. To determine which coin is biased, we toss a coin 10 times and observe the number of heads that occur.

(a) If 7 heads were observed, what is the probability that the coin flipped was the fair coin?

(b) If 3 heads were observed, what is the probability that the coin flipped was the fair coin?

Solution: (a)

\(Pr(faircoin|7H)=\frac{Pr(faircoin\cap7H)}{Pr(7H)}=\frac{Pr(faircoin\cap7H)}{Pr(faircoin\cap7H)+Pr(unfaircoin\cap7H)}\)

We see that

\(Pr(faircoin\cap7H)=\bigg(\frac{1}{2}\bigg)\frac{{10 \choose 7}}{2^{10}}=\bigg(\frac{1}{2}\bigg){10 \choose 7}\bigg(\frac{1}{2}\bigg)^{7}\bigg(\frac{1}{2}\bigg)^{3}\)

\(Pr(unfaircoin\cap7H)=\bigg(\frac{1}{2}\bigg){10 \choose 7}\bigg(\frac{3}{4}\bigg)^{7}\bigg(\frac{1}{4}\bigg)^{3}\)

\( Pr(faircoin|7H)\approx 0.319. \)

(b)

Following the same method above:

\(Pr(faircoin|3H)=\frac{Pr(faircoin\cap3H)}{Pr(3H)}=\frac{Pr(faircoin\cap3H)}{Pr(faircoin\cap3H)+Pr(unfaircoin\cap3H)}\)

We see that

\(Pr(faircoin\cap3H)=\bigg(\frac{1}{2}\bigg)\frac{{10 \choose 3}}{2^{10}}=\bigg(\frac{1}{2}\bigg){10 \choose 3}\bigg(\frac{1}{2}\bigg)^{3}\bigg(\frac{1}{2}\bigg)^{7}\)

\(Pr(unfaircoin\cap3H)=\bigg(\frac{1}{2}\bigg){10 \choose 3}\bigg(\frac{3}{4}\bigg)^{3}\bigg(\frac{1}{4}\bigg)^{7}\)

\(Pr(faircoin|3H)\approx 0.974.\)
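Both parts of Example 3 follow from the same posterior formula, so they can be computed with one short function (a sketch; the function name and signature are my own):

```python
from math import comb

def posterior_fair(heads, tosses=10, p_fair=0.5, p_biased=0.75):
    """Posterior probability that the fair coin was flipped, given the
    observed number of heads; each coin has prior probability 1/2."""
    lik_fair = comb(tosses, heads) * p_fair**heads * (1 - p_fair)**(tosses - heads)
    lik_biased = comb(tosses, heads) * p_biased**heads * (1 - p_biased)**(tosses - heads)
    return 0.5 * lik_fair / (0.5 * lik_fair + 0.5 * lik_biased)

print(round(posterior_fair(7), 3))  # (a) 0.319
print(round(posterior_fair(3), 3))  # (b) 0.974
```

Intuitively, 3 heads in 10 tosses is far more consistent with the fair coin than with one biased toward heads, which is why the posterior in part (b) is so close to 1.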

One famous law often used in conjunction with Bayes’ rule is the Law of Total Probability, which is stated as follows:

Let \(B_1, B_2, \ldots, B_n\) be a set of mutually exclusive and exhaustive events. Then:

\(P(A) = \sum_{i=1}^{n} P(A|B_i)P(B_i)\)
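In code, the law is just a weighted sum over the partition (a minimal sketch; the function name is my own):

```python
def total_probability(cond_probs, priors):
    """P(A) = sum_i P(A|B_i) P(B_i) over a partition {B_i}."""
    assert abs(sum(priors) - 1.0) < 1e-9, "the B_i must be exhaustive"
    return sum(pa * pb for pa, pb in zip(cond_probs, priors))

# Example 1(a) revisited: Pr(0 received) over the partition {0 tx, 1 tx}
print(round(total_probability([0.95, 0.10], [0.5, 0.5]), 3))  # 0.525
```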