
Joint Probability

Formula

\[ P(A\cap B)=P(A,B) \]
\[ P(A,B)=P(A\mid B)P(B)=P(B\mid A)P(A) \]

Parameters

  • \(A,B\): events (or events derived from random variables)
  • \(P(A,B)\): probability both occur

What it means

Joint probability is the probability that two events happen together. It is tempting to read this as "multiply the two probabilities", but that shortcut is only exact under independence; in general the multiplication must go through a conditional probability, as in the chain rule above.

What it's used for

  • Building multivariate distributions.
  • Deriving Bayes' rule, marginals, and conditional probabilities.
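The derivations above can be sketched from a joint table. A minimal Python example, assuming a hypothetical 2×2 joint distribution over binary events \(A\) and \(B\) (the numbers are illustrative, chosen so the table sums to 1):

```python
# Hypothetical joint table over binary events A and B (illustrative numbers).
joint = {
    (True, True): 0.2,
    (True, False): 0.2,
    (False, True): 0.3,
    (False, False): 0.3,
}

# Marginals: sum the joint over the other variable.
p_A = sum(p for (a, _), p in joint.items() if a)  # P(A) = 0.4
p_B = sum(p for (_, b), p in joint.items() if b)  # P(B) = 0.5

# Conditional: joint divided by a (nonzero) marginal.
p_A_given_B = joint[(True, True)] / p_B           # P(A|B) = 0.2 / 0.5 = 0.4

# Bayes' rule: recover P(B|A) from P(A|B), P(B), and P(A).
p_B_given_A = p_A_given_B * p_B / p_A             # P(B|A) = 0.4 * 0.5 / 0.4 = 0.5
```

Note that every quantity on the card (marginals, conditionals, Bayes' rule) falls out of the joint table by summation and division alone.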

Key properties

  • Symmetric: \(P(A,B)=P(B,A)\).
  • Not generally equal to \(P(A)P(B)\) unless \(A\) and \(B\) are independent.

Common gotchas

  • Multiplying marginals directly is only valid under independence.
  • \(P(A,B)\) and \(P(A\mid B)\) are different objects.
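The first gotcha is easy to see numerically. A short sketch using a standard dependent-events setup (two cards drawn without replacement; the events and numbers are illustrative, not from the card):

```python
# A = first card drawn is an ace, B = second card drawn is an ace.
p_A = 4 / 52           # marginal P(A)
p_B = 4 / 52           # marginal P(B) is also 4/52 by symmetry
p_B_given_A = 3 / 51   # after one ace is removed, 3 aces remain among 51 cards

p_joint = p_A * p_B_given_A  # correct: P(A,B) = P(A)P(B|A) = 12/2652
p_naive = p_A * p_B          # multiplying marginals: 16/2704, too large here
```

The naive product overstates the joint probability because \(A\) and \(B\) are negatively dependent: conditioning on the first ace makes a second ace less likely.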

Example

If \(P(A)=0.4\) and \(P(B\mid A)=0.5\), then \(P(A,B)=0.5\cdot 0.4=0.2\).
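The same arithmetic, written out as a quick check:

```python
p_A = 0.4
p_B_given_A = 0.5
p_joint = p_B_given_A * p_A  # P(A,B) = P(B|A)P(A) = 0.5 * 0.4 = 0.2
```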

How to Compute (Pseudocode)

Input: event probabilities / joint distribution entries
Output: requested probability quantity

1. Identify the relevant events/variables and the joint, marginal, or conditional terms required.
2. Apply the chain rule \(P(A,B)=P(A\mid B)P(B)=P(B\mid A)P(A)\), using whichever direction has known terms.
3. Check that any denominator or normalization term is valid (nonzero when dividing to recover a conditional).
4. Return the computed probability.
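The steps above can be turned into a runnable sketch. The function names and validation are my own additions, not part of the card:

```python
def joint_probability(p_b_given_a: float, p_a: float) -> float:
    """Chain rule: P(A,B) = P(B|A) * P(A). Inputs assumed to be valid probabilities."""
    if not (0.0 <= p_a <= 1.0 and 0.0 <= p_b_given_a <= 1.0):
        raise ValueError("inputs must be probabilities in [0, 1]")
    return p_b_given_a * p_a

def conditional_from_joint(p_joint: float, p_b: float) -> float:
    """Reverse direction: P(A|B) = P(A,B) / P(B); the denominator P(B) must be nonzero."""
    if p_b == 0.0:
        raise ValueError("P(B) must be nonzero to condition on B")
    return p_joint / p_b
```

Usage mirrors the worked example on the card: `joint_probability(0.5, 0.4)` gives 0.2, and `conditional_from_joint(0.2, 0.5)` recovers \(P(A\mid B)=0.4\).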

Complexity

  • Time: Typically \(O(1)\) for a single event computation once required probabilities are available; larger table-based calculations scale with table size
  • Space: \(O(1)\) extra space for a single computation
  • Assumptions: Probability terms (joint/marginals/conditionals) are already known or computed separately

See also