
Marginals

In statistics and probability, "marginals" (or marginal distributions) are the probability distributions of a subset of variables taken from a joint probability distribution. Essentially, a marginal gives the distribution of a single variable (or a small set of variables) irrespective of the values the remaining variables take.

More formally, if we have a joint probability distribution P(X, Y) over two variables X and Y, the marginal distribution of X, denoted P(X), is obtained by summing (or integrating in the continuous case) over all possible values of Y:

  • For discrete variables: P(X = x) = Σy P(X = x, Y = y)
  • For continuous variables, stated in terms of probability density functions (since P(X = x) = 0 at any single point): f_X(x) = ∫ f_{X,Y}(x, y) dy
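The discrete case can be sketched with a small joint probability table. The following is a minimal illustration (the joint probabilities are made-up values that sum to 1); marginalizing is just summing the table along one axis:

```python
import numpy as np

# Hypothetical joint distribution P(X, Y): rows index values of X,
# columns index values of Y. Entries are illustrative and sum to 1.
joint = np.array([
    [0.10, 0.20, 0.10],
    [0.15, 0.25, 0.20],
])

# Marginal of X: sum over all values of Y (the columns, axis=1).
p_x = joint.sum(axis=1)   # [0.40, 0.60]

# Marginal of Y: sum over all values of X (the rows, axis=0).
p_y = joint.sum(axis=0)   # [0.25, 0.45, 0.30]

print(p_x, p_y)
```

Note that each marginal is itself a valid probability distribution: its entries are non-negative and sum to 1.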

The process of calculating the marginal distribution is often called "marginalizing out" the other variables.

The concept of marginals extends to distributions involving more than two variables. For example, given P(X, Y, Z), we can obtain the marginal distribution of X by marginalizing out both Y and Z: P(X = x) = Σy Σz P(X = x, Y = y, Z = z) (discrete case).

Marginal distributions provide insights into the individual behaviors of variables within a larger system, isolating them from the influence of other variables in the joint distribution. They are often used in Bayesian statistics, machine learning, and decision theory.

In areas outside of statistics, "marginals" can also refer to related concepts, such as marginal costs or marginal revenue in economics, which represent the change in cost or revenue resulting from producing one additional unit. While these concepts share a similar root relating to the effect of a small change, they are distinct from the probabilistic meaning described above.