We start by defining the following convenient random variables:

\begin{itemize}
\item $A$: true answer sent by Alice.

\item $V$: value received by Bob.

\item $B$: answer interpreted by Bob.
\end{itemize}

From the statement, $V = A + \epsilon$, where $\epsilon \sim \mathcal{N}(0,\sigma^2)$ is the channel noise, and Bob decodes by thresholding the received value at $1/2$: he interprets the answer as $0$ if $V \le 1/2$ and as $1$ if $V > 1/2$.

Let $C$ be the event that Bob understands Alice correctly.

a. By the law of total probability (LOTP),

$$
P(C) = P(C|A=0) P(A=0) + P(C|A=1) P(A=1)
$$

Given that Alice sent message $i$, Bob understands her correctly exactly when he interprets the answer as $i$. Thus, $P(C|A=i) = P(B=i|A=i)$ for $i=0,1$.

Using this result and assuming that Alice is equally likely to send a ``yes'' or ``no'' answer, the LOTP simplifies to

$$
P(C) = \frac{1}{2} P(B=0|A=0) + \frac{1}{2} P(B=1|A=1)
$$

The probability $P(B=0|A=0)$ is given by

\begin{equation*}
\begin{split}
P(B=0|A=0)
&= P(V \le 1/2 | A=0) \\
&= P(A + \epsilon \le 1/2 | A=0) \\
&= P(\epsilon \le 1/2 | A=0) \\
&= P(\epsilon \le 1/2)
\end{split}
\end{equation*}

\noindent where the last equality is due to the fact that $\epsilon$ is independent of $A$.

Applying the location-scale transformation to $\epsilon$, we have that $Z = \epsilon/\sigma \sim \mathcal{N}(0,1)$.
Substituting this relation into $P(B=0|A=0)$,

$$
P(B=0 | A=0) = P\left(Z \le \frac{1}{2\sigma}\right) = \Phi \left( \frac{1}{2\sigma} \right)
$$

\noindent where $\Phi(\cdot)$ is the CDF of the standard Normal distribution.

Using a very similar derivation, one can also obtain

$$
P(B=1 | A=1) = \Phi \left( \frac{1}{2\sigma} \right)
$$
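
For completeness, the analogous steps are

\begin{equation*}
\begin{split}
P(B=1|A=1)
&= P(V > 1/2 | A=1) \\
&= P(1 + \epsilon > 1/2 | A=1) \\
&= P(\epsilon > -1/2) \\
&= P\left(Z > -\frac{1}{2\sigma}\right) \\
&= 1 - \Phi\left(-\frac{1}{2\sigma}\right)
= \Phi\left(\frac{1}{2\sigma}\right)
\end{split}
\end{equation*}

\noindent where the last equality uses the symmetry property $1 - \Phi(-x) = \Phi(x)$ of the standard Normal CDF.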

Substituting these results into the expression for $P(C)$,

$$
P(C) = \frac{1}{2} \Phi \left( \frac{1}{2\sigma} \right) + \frac{1}{2} \Phi \left( \frac{1}{2\sigma} \right) = \Phi \left( \frac{1}{2\sigma} \right)
$$
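
As a quick sanity check, a short Monte Carlo simulation of the channel agrees with the closed form. This is only a sketch, assuming NumPy and SciPy are available; the noise level \texttt{sigma} and the sample size are arbitrary illustrative choices, not values from the problem:

\begin{verbatim}
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
sigma = 0.5            # illustrative noise level (assumption)
n = 1_000_000          # number of simulated transmissions

A = rng.integers(0, 2, size=n)            # Alice's bit, 0/1 equally likely
V = A + rng.normal(0.0, sigma, size=n)    # received value, V = A + eps
B = (V > 0.5).astype(int)                 # threshold decoder at 1/2

print(np.mean(A == B))                    # simulated P(C)
print(norm.cdf(1 / (2 * sigma)))          # closed form, ~0.8413 here
\end{verbatim}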

b. If $\sigma$ is very small, then $1/(2\sigma)$ is very large and $P(C) = \Phi\left(\frac{1}{2\sigma}\right) \approx 1$. If $\sigma$ is very large, then $1/(2\sigma) \approx 0$ and $P(C) \approx \Phi(0) = 1/2$.
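
In the limit,

$$
\lim_{\sigma \to 0^+} \Phi\left(\frac{1}{2\sigma}\right) = 1,
\qquad
\lim_{\sigma \to \infty} \Phi\left(\frac{1}{2\sigma}\right) = \Phi(0) = \frac{1}{2}
$$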

These extreme cases make sense.

The case where $\sigma$ is very small approximates the ideal noise-free channel. In this situation the message is transmitted essentially undisturbed through the channel, so Bob receives almost exactly the value Alice sent and always interprets the message correctly.

The case with large $\sigma$ corresponds to a very noisy channel.
The value received by Bob is dominated by noise, so the interpreted message depends mostly on whether the noise realization at the time of transmission is positive or negative, regardless of what Alice sent. Since the Gaussian noise $\epsilon$ is positive half of the time and negative the other half, Bob understands the message correctly with probability $\approx 1/2$.
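
The two regimes can also be seen numerically; the $\sigma$ values below are arbitrary illustrative choices:

\begin{verbatim}
from scipy.stats import norm

# P(C) = Phi(1/(2*sigma)) across noise levels
for sigma in [0.05, 0.5, 5.0, 50.0]:
    print(f"sigma = {sigma:6.2f}  P(C) = {norm.cdf(1 / (2 * sigma)):.4f}")
# sigma = 0.05 -> P(C) ~ 1.0000 (near noise-free)
# sigma = 50   -> P(C) ~ 0.5040 (near coin flip)
\end{verbatim}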