Definition
Iterated filtering is a family of simulation-based algorithms, primarily for maximum-likelihood inference, in partially observed stochastic dynamic systems, particularly nonlinear state-space models. The methods repeatedly apply particle filtering (sequential Monte Carlo) to randomly perturbed versions of the model parameters, progressively reducing the magnitude of the perturbations so that the parameter estimates converge toward a maximizer of the likelihood.
Overview
State-space models consist of an unobserved latent process that evolves over time according to a Markovian transition law, and an observation process that provides noisy measurements of the latent state. Direct likelihood evaluation for such models is often intractable because it requires integrating over all possible latent trajectories. Iterated filtering circumvents this by using a particle filter to approximate the likelihood for a given parameter set, stochastically perturbing the parameters, and repeating the filtering step. Over successive iterations, the magnitude of the perturbations is reduced according to a prescribed schedule, allowing the algorithm to converge toward the maximum-likelihood estimate (MLE). The principal variants are iterated filtering 1 (IF1) and iterated filtering 2 (IF2); the approach is also related to particle Markov chain Monte Carlo (PMCMC), which embeds particle filters within MCMC and can be combined with iterated filtering, for example to find good starting values for the chain.
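The loop described above can be sketched in a few lines of Python. The toy model, parameter values, and tuning constants below are illustrative assumptions for this article, not part of any published implementation; the structure follows the IF2 idea of letting each particle carry its own parameter copy, perturbed at every time step with a shrinking scale.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy state-space model (an illustrative assumption, not from the text):
#   latent:   x_t = x_{t-1} + theta + N(0, proc_sd^2)   (random walk with drift)
#   observed: y_t = x_t + N(0, obs_sd^2)
proc_sd, obs_sd, true_theta, T = 0.1, 0.1, 0.5, 50
x = np.cumsum(true_theta + proc_sd * rng.standard_normal(T))
y = x + obs_sd * rng.standard_normal(T)

def iterated_filtering(y, J=300, M=25, sigma0=0.2, cooling=0.9):
    """IF2-style sketch: each particle carries its own copy of theta,
    perturbed at every time step; the perturbation scale shrinks
    geometrically across filtering iterations."""
    theta = rng.normal(0.0, 1.0, J)            # initial parameter swarm
    for m in range(M):
        sigma = sigma0 * cooling ** m          # cooled random-walk sd
        xp = np.zeros(J)                       # latent-state particles
        for t in range(len(y)):
            theta = theta + sigma * rng.standard_normal(J)      # perturb parameters
            xp = xp + theta + proc_sd * rng.standard_normal(J)  # propagate states
            w = np.exp(-0.5 * ((y[t] - xp) / obs_sd) ** 2)      # observation weights
            idx = rng.choice(J, size=J, p=w / w.sum())          # resample
            xp, theta = xp[idx], theta[idx]    # states and parameters jointly
    return theta.mean()

theta_hat = iterated_filtering(y)
```

Resampling states and parameters together is what allows parameter values that explain the data well to proliferate; the geometric factor plays the role of the perturbation schedule discussed below.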
The approach was introduced in the mid-2000s by Edward L. Ionides, Carles Bretó, and Aaron A. King, who demonstrated its utility in ecological and epidemiological applications (e.g., modeling infectious disease dynamics). Since then, iterated filtering has been implemented in several statistical software packages, notably the pomp (Partially Observed Markov Processes) R package.
Etymology/Origin
The term combines “iterated,” indicating repeated application, with “filtering,” referring to the particle filtering (sequential Monte Carlo) technique used to approximate the conditional distribution of the latent state given the observed data. The phrase entered the statistical literature in the mid-2000s to describe algorithms that refine parameter estimates through successive filtering passes.
Characteristics
| Feature | Description |
|---|---|
| Algorithmic core | Particle filter with stochastic parameter perturbations (“random walk” of parameters) at each time step. |
| Perturbation schedule | Typically a decreasing sequence (e.g., geometric cooling) that controls the variance of the parameter random walk, allowing exploration early and refinement later. |
| Convergence goal | Approaches the MLE under regularity conditions; can also be used to approximate the likelihood surface for Bayesian inference when combined with MCMC. |
| Applicability | Suited for nonlinear, non‑Gaussian state‑space models where analytical likelihood is unavailable. Common in epidemiology, ecology, finance, and systems biology. |
| Computational demand | Intensive; requires many particles and iterations, especially for high‑dimensional parameter spaces. Parallel computing is often employed. |
| Statistical properties | Under suitable regularity conditions, iterated filtering converges to the maximum-likelihood estimate, which is consistent and asymptotically normal in standard settings. Convergence guarantees have been established for specific variants (e.g., IF2). |
| Software support | Implemented in R (pomp), Python (pyPOMP), and other platforms; extensions exist for handling time‑varying parameters and hybrid models. |
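As a concrete illustration of the perturbation schedule row above, a geometric cooling rule can be written in a few lines; the constants here are assumptions chosen for illustration.

```python
# Geometric cooling: the sd of the parameter random walk shrinks by a
# fixed factor each iteration, so early iterations explore the parameter
# space broadly and late iterations refine locally.
sigma0, cooling, M = 0.2, 0.9, 50      # illustrative constants
schedule = [sigma0 * cooling ** m for m in range(M)]
```

Other decreasing sequences (e.g., the hyperbolic rates familiar from stochastic approximation) serve the same purpose: the key requirement is that the perturbation variance shrinks slowly enough to keep exploring, yet eventually vanishes.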
Related Topics
- Particle filtering (sequential Monte Carlo) – the underlying technique for approximating the filtering distribution of latent states.
- State‑space models – mathematical framework consisting of latent state dynamics and observation equations.
- Maximum likelihood estimation – the inferential target of iterated filtering, which approximates the MLE for models whose likelihood cannot be evaluated directly.
- Particle Markov chain Monte Carlo (PMCMC) – a class of algorithms that combine particle filters with MCMC, sometimes using iterated filtering for proposal generation or initialization.
- Approximate Bayesian Computation (ABC) – another likelihood‑free inference method, often contrasted with iterated filtering.
- Stochastic Approximation – mathematical foundation for the decreasing perturbation schedule used in iterated filtering.
Iterated filtering remains an active area of research, with ongoing developments aimed at improving scalability, robustness to model misspecification, and integration with modern machine‑learning tools.