
Marginal Probability

Formula

\[ P(A)=\sum_{b} P(A, B=b) \quad \text{(discrete)} \]
\[ f_X(x)=\int f_{X,Y}(x,y)\,dy \quad \text{(continuous)} \]

Parameters

  • \(P(A, B=b)\): joint probability of the event \(A\) and each value \(b\) of the variable \(B\) being summed out
  • \(f_{X,Y}\): joint density
  • \(f_X\): marginal density of \(X\)
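
A minimal sketch of the discrete formula in Python, assuming the joint table is stored as a dictionary keyed by (a, b) pairs (a hypothetical layout; the numbers match the example further down):

from collections import defaultdict

# Hypothetical joint table P(A, B) keyed by (a, b); entries sum to 1.
joint = {
    (0, 0): 0.25, (0, 1): 0.25,
    (1, 0): 0.20, (1, 1): 0.30,
}

def marginal_A(joint):
    # P(A=a) = sum_b P(A=a, B=b): sum out the second variable.
    p_A = defaultdict(float)
    for (a, b), p in joint.items():
        p_A[a] += p
    return dict(p_A)

p_A = marginal_A(joint)
print(p_A)                                    # {0: 0.5, 1: 0.5}
assert abs(sum(p_A.values()) - 1.0) < 1e-12   # total probability preserved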

What it means

A marginal probability is the probability of a single variable or event on its own, obtained by summing (discrete) or integrating (continuous) the joint distribution over all the other variables.

What it's used for

  • Reducing joint distributions to single-variable distributions.
  • Computing denominators in conditional probabilities and Bayes' rule, since \(P(B)=\sum_a P(B \mid A=a)\,P(A=a)\) is itself a marginal (see the sketch below).
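
A minimal sketch of that use, with a hypothetical prior p_A and likelihood table p_B1_given_A (both invented for illustration):

# Hypothetical prior P(A) and likelihood P(B=1 | A=a) for a binary test.
p_A = {0: 0.99, 1: 0.01}
p_B1_given_A = {0: 0.05, 1: 0.90}

# Bayes' denominator is a marginal (law of total probability):
# P(B=1) = sum_a P(B=1 | A=a) * P(A=a)
p_B1 = sum(p_B1_given_A[a] * p_A[a] for a in p_A)

# Posterior P(A=1 | B=1) = P(B=1 | A=1) * P(A=1) / P(B=1)
posterior = p_B1_given_A[1] * p_A[1] / p_B1
print(round(p_B1, 4), round(posterior, 4))    # 0.0585 0.1538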

Key properties

  • Marginalization preserves total probability: the marginal sums (or integrates) to 1 whenever the joint does.
  • Sometimes called "summing out" a variable in the discrete case, or "integrating out" in the continuous case.

Common gotchas

  • Do not confuse marginalization with conditioning.
  • In continuous cases, use integration, not summation (see the sketch below).
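
A sketch of the continuous case using numerical quadrature (SciPy's quad, an assumed dependency), with the joint density chosen as an independent standard bivariate normal so the true marginal is known:

from math import exp, pi, sqrt
from scipy.integrate import quad

def f_XY(x, y):
    # Assumed joint density: independent standard bivariate normal.
    return exp(-(x**2 + y**2) / 2) / (2 * pi)

def f_X(x):
    # Marginal density of X: integrate the joint over all y.
    val, _err = quad(lambda y: f_XY(x, y), -float("inf"), float("inf"))
    return val

print(f_X(0.0))             # ~0.39894, matches the standard normal at 0
print(1 / sqrt(2 * pi))     # 0.39894...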

Example

If \(Y\) takes only the values 0 and 1, and \(P(X=1,Y=0)=0.2\) and \(P(X=1,Y=1)=0.3\), then \(P(X=1)=0.2+0.3=0.5\).

How to Compute (Pseudocode)

Input: joint distribution entries over the variables of interest
Output: marginal probability of the kept variable(s)

identify the variable(s) to keep and the variable(s) to sum or integrate out
for each value of the kept variable(s), sum (discrete) or integrate (continuous) the joint over all values of the removed variable(s)
check that the resulting marginal totals 1 (up to rounding error)
return the marginal distribution, or the single requested entry
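
A runnable Python sketch of these steps, assuming the joint is a dict keyed by tuples of variable values (a hypothetical layout) and that keep lists the index positions to retain:

def marginalize(joint, keep):
    # Marginalize a joint table onto the index positions in `keep`;
    # every other position is summed out.
    out = {}
    for values, p in joint.items():
        key = tuple(values[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

# Joint P(X, Y) with X at position 0 and Y at position 1.
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.2, (1, 1): 0.3}
p_X = marginalize(joint, keep=(0,))
print(p_X)                                    # {(0,): 0.5, (1,): 0.5}
assert abs(sum(p_X.values()) - 1.0) < 1e-12   # check from step 3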

Complexity

  • Time: \(O(k)\) for a single marginal entry, where \(k\) is the number of values summed out; marginalizing a full table scales with the number of joint entries
  • Space: \(O(m)\) for a marginal over \(m\) kept values; \(O(1)\) extra for a single entry
  • Assumptions: joint probabilities are already known or computed separately; continuous cases require analytic or numerical integration

See also