Let’s name the prisoners \(A\), \(B\) and \(C\), where ours is \(A\). The two possible responses of the guard are \(G_B\) and \(G_C\). Prisoner \(A\) thinks that the probability that he will be released is \(P(A_r)=2/3\). He also fears that his chances of being released get worse with a response from the guard. Symbolically, he believes that \(P(A_r\given G_B)=P(A_r\given G_C) = 1/2\).

Now we compute the values of these conditional probabilities and see whether they confirm \(A\)’s worries or not.


To make use of Bayes’ theorem with the total probability theorem, we need a partitioning of the sample space. Switching our perspective from being released to being kept in prison, we can partition the sample space into the three events \(\{A_k, B_k, C_k\}\), where \(A_k\) is the event that \(A\) is kept in prison, and likewise for the other two prisoners. We can even refer to these events simply by the “names” of the prisoners.

Probability tree

In a standard Bayesian approach, we ask what an observation like \(G_B\) tells us about the probability of \(A\) being kept in prison. This probability is computed by dividing the probability of \(A\) being kept and the guard responding with \(G_B\) by the probability of the guard responding with \(G_B\). Reading these values from the tree,

\[\begin{align*} P(A\given G_B) &= \frac{P(A\cap G_B)}{P(G_B)}\\ &= \frac{P(G_B\given A) \cdot P(A)}{P(G_B\given A) \cdot P(A) + P(G_B\given B) \cdot P(B) + P(G_B\given C) \cdot P(C)}\\ &= \frac{\frac{1}{2}\cdot\frac{1}{3}}{\frac{1}{2}\cdot\frac{1}{3}+ 0\cdot \frac{1}{3} + 1\cdot\frac{1}{3}} = \frac{1}{3} \end{align*}\]
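The arithmetic above can be checked exactly with rational numbers. The dictionaries of priors and likelihoods below are just a transcription of the tree; the variable names are my own, not from the text.

```python
from fractions import Fraction

# Priors: each prisoner is equally likely to be the one kept in prison.
prior = {k: Fraction(1, 3) for k in "ABC"}

# Likelihoods of the guard answering "B is released", given who is kept:
# if A is kept, both B and C are released and the guard picks one at random;
# if B is kept, the guard cannot say B; if C is kept, he must say B.
p_gb_given = {"A": Fraction(1, 2), "B": Fraction(0), "C": Fraction(1)}

# Total probability of the guard answering G_B.
p_gb = sum(p_gb_given[k] * prior[k] for k in "ABC")

# Bayes' theorem: P(A | G_B) = P(G_B | A) P(A) / P(G_B).
posterior_a = p_gb_given["A"] * prior["A"] / p_gb
print(posterior_a)  # 1/3
```

Using `Fraction` keeps every step exact, so the result is literally \(1/3\) rather than a floating-point approximation.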

The same calculation is valid for \(P(A\given G_C)\) as well. Therefore, the probability of \(A\) being kept (or released) does not change upon receiving the guard’s response.
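The conclusion can also be sanity-checked by simulating the process itself, a Monte Carlo sketch under the same assumptions as the tree (the guard picks uniformly between \(B\) and \(C\) when both are released). The function name and trial count are illustrative choices.

```python
import random

def posterior_a_kept(trials=100_000, seed=0):
    """Estimate P(A kept | guard says "B is released") by simulation."""
    rng = random.Random(seed)
    said_b = 0            # trials where the guard answers G_B
    a_kept_and_said_b = 0
    for _ in range(trials):
        kept = rng.choice("ABC")
        if kept == "A":
            answer = rng.choice("BC")  # both B and C released: pick at random
        elif kept == "B":
            answer = "C"               # A and C released; guard cannot name A
        else:                          # kept == "C"
            answer = "B"
        if answer == "B":
            said_b += 1
            if kept == "A":
                a_kept_and_said_b += 1
    return a_kept_and_said_b / said_b

print(posterior_a_kept())  # close to 1/3, not 1/2
```

The estimate clusters around \(1/3\), matching the Bayes computation and contradicting \(A\)’s fear that the answer would push his probability of being kept up to \(1/2\).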