Equation (parameter-joint-repn-equation) defines the joint distribution represented by a Bayesian network in terms of the parameters $\theta(X_i \mid \textit{Parents}(X_i))$. This exercise asks you to derive the equivalence between the parameters and the conditional probabilities $\textbf{P}(X_i \mid \textit{Parents}(X_i))$ from this definition.
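
For reference, a sketch of the definition in question, assuming the standard product form of the referenced equation:

$$P(x_1,\ldots,x_n) \;=\; \prod_{i=1}^{n} \theta(x_i \mid \textit{parents}(X_i)),$$

where $\textit{parents}(X_i)$ denotes the values taken by $\textit{Parents}(X_i)$ in the assignment $x_1,\ldots,x_n$.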

  1. Consider a simple network $X\rightarrow Y\rightarrow Z$ with three Boolean variables. Use Equations (conditional-probability-equation) and (marginalization-equation) to express the conditional probability $P(z \mid y)$ as the ratio of two sums, each over entries in the joint distribution $\textbf{P}(X,Y,Z)$.

  2. Now use Equation (parameter-joint-repn-equation) to write this expression in terms of the network parameters $\theta(X)$, $\theta(Y \mid X)$, and $\theta(Z \mid Y)$.

  3. Next, expand out the summations in your expression from part (b), writing out explicitly the terms for the true and false values of each summed variable. Assuming that all network parameters satisfy the constraint $\sum_{x_i} \theta(x_i \mid \textit{parents}(X_i)) = 1$, show that the resulting expression reduces to $\theta(z \mid y)$ (for the overall shape of this derivation, see the sketch after this list).

  4. Generalize this derivation to show that $\theta(X_i \mid \textit{Parents}(X_i)) = \textbf{P}(X_i \mid \textit{Parents}(X_i))$ for any Bayesian network.
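
As a hint for parts (a)–(c), one possible shape of the derivation, a sketch under the definitions above and not the only route:

$$P(z \mid y) \;=\; \frac{\sum_x P(x,y,z)}{\sum_x \sum_{z'} P(x,y,z')} \;=\; \frac{\sum_x \theta(x)\,\theta(y \mid x)\,\theta(z \mid y)}{\sum_x \sum_{z'} \theta(x)\,\theta(y \mid x)\,\theta(z' \mid y)}.$$

Factoring $\theta(z \mid y)$ out of the numerator and applying $\sum_{z'} \theta(z' \mid y) = 1$ in the denominator leaves the common factor $\sum_x \theta(x)\,\theta(y \mid x)$ in both, which cancels to give $\theta(z \mid y)$.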