Definition
The Hartley function is an information measure defined as the logarithm of the number of equally probable symbols or outcomes in a source. Formally, for a set of $N$ equiprobable possibilities, the Hartley function $H_0$ is given by
$$ H_0 = \log_b N, $$
where $b$ denotes the base of the logarithm, determining the unit of information (e.g., bits for base 2, nats for base e, hartleys for base 10).
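As a minimal illustration of the definition, the following Python sketch evaluates $H_0$ for a given number of equiprobable outcomes in any of the three common units; the helper name `hartley_information` is chosen for this example only.

```python
import math

def hartley_information(n: int, base: float = 2.0) -> float:
    """Hartley function H0 = log_base(n) for n equiprobable outcomes.

    base 2 -> bits, base e -> nats, base 10 -> hartleys.
    """
    if n < 1:
        raise ValueError("number of outcomes must be a positive integer")
    return math.log(n, base)

# A fair six-sided die: H0 = log2(6) ≈ 2.585 bits.
print(hartley_information(6))           # bits
print(hartley_information(6, math.e))   # nats
print(hartley_information(6, 10))       # hartleys
```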
Overview
Introduced by Ralph Hartley in 1928, the Hartley function constitutes one of the earliest quantitative formulations of information. It quantifies the maximum amount of information that can be conveyed when each possible symbol of a source is equally likely. The measure is additive for independent sources: the Hartley information of two independent sources equals the sum of their individual Hartley informations. Although superseded in many applications by Shannon entropy, which accounts for arbitrary probability distributions, the Hartley function remains a fundamental concept in the theoretical foundation of information theory and serves as a special case of Shannon entropy when all probabilities are equal.
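For example, a fair die ($N = 6$) and a fair coin ($N = 2$) considered jointly have $6 \times 2 = 12$ equiprobable outcomes, so
$$ H_0(\text{die, coin}) = \log_2 12 = \log_2 6 + \log_2 2 \approx 2.585 + 1 = 3.585 \ \text{bits}. $$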
Etymology/Origin
The term is named after American electronics researcher Ralph Hartley (1888–1970), who published the seminal paper “Transmission of Information” (1928). Hartley’s work laid the groundwork for later developments by Claude Shannon and others.
Characteristics
- Logarithmic Scale: The value grows logarithmically with the number of possible symbols, reflecting diminishing returns of adding more alternatives.
- Base Dependence: Choice of logarithm base determines the unit: base 2 yields bits, base e yields nats, and base 10 yields hartleys.
- Additivity: For independent sources $A$ and $B$ with $N_A$ and $N_B$ equiprobable outcomes, $H_0(A,B) = H_0(A) + H_0(B) = \log_b(N_A N_B)$.
- Maximum Information: For a source with a fixed alphabet size, the Hartley function is an upper bound on the Shannon entropy of any distribution over that alphabet; the bound is attained exactly when the symbol distribution is uniform.
- Relation to Entropy: When all symbol probabilities are equal ($p_i = 1/N$), the Shannon entropy $H = -\sum_i p_i \log_b p_i$ reduces to $\log_b N$ and thus coincides with the Hartley function; a short numerical check of this equality and of the additivity property appears after this list.
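The following self-contained Python sketch (helper names `hartley` and `shannon_entropy` are chosen for this example) checks the additivity property and the coincidence with Shannon entropy under a uniform distribution.

```python
import math

def hartley(n: int, base: float = 2.0) -> float:
    """Hartley function H0 = log_base(n)."""
    return math.log(n, base)

def shannon_entropy(probs, base: float = 2.0) -> float:
    """Shannon entropy H = -sum(p * log_base(p)) over nonzero probabilities."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Additivity: a die (6 outcomes) and a coin (2 outcomes) treated jointly.
assert math.isclose(hartley(6) + hartley(2), hartley(6 * 2))

# Uniform distribution: Shannon entropy reduces to the Hartley function.
n = 6
uniform = [1 / n] * n
assert math.isclose(shannon_entropy(uniform), hartley(n))
print("additivity and uniform-case equality hold")
```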
Related Topics
- Hartley entropy – another name for the Hartley function, emphasizing its role as an entropy measure.
- Shannon entropy – a generalization of the Hartley function to arbitrary probability distributions.
- Information theory – the broader field studying quantification, transmission, and processing of information.
- Source coding theorem – connects entropy measures, including Hartley’s, to limits on data compression.
- Logarithmic units of information – bits, nats, hartleys, each corresponding to a different logarithm base.