Bayesian Statistics

Military Search Theory

Bayesian search theory maintains a probability map over target locations and updates it after each search pass using detection likelihoods, optimally allocating search effort to maximize the cumulative probability of detection — from hunting submarines to finding lost aircraft.

p(x | z₁:ₜ) ∝ p(zₜ | x) · p(x | z₁:ₜ₋₁)

Bayesian search theory is the application of Bayesian inference to the problem of finding a target whose location is unknown. The searcher begins with a prior probability distribution over possible target locations, conducts search operations that may or may not detect the target, and updates the probability map after each operation using Bayes' theorem. The result is a posterior distribution that concentrates probability in unsearched or poorly searched areas — exactly the regions where the target is most likely to remain undetected.

The theory has its origins in military operations research during World War II and remains one of the most celebrated and practically consequential applications of Bayesian reasoning.

The Search Framework

Prior Probability Map
p(x)  →  Prior probability that the target is at location x

Detection Function
p(detect | x, effort) = 1 − exp(−w(x) / A(x))

where
w(x)  →  Search effort (e.g., time or sweeps) allocated to location x
A(x)  →  Search width parameter (depends on sensor, terrain, weather)

Bayesian Update After Unsuccessful Search
p(x | not detected) = p(x) · (1 − p(detect | x)) / Σ_x′ p(x′) · (1 − p(detect | x′))

After a search pass that fails to detect the target, the probability in the searched area decreases (because a target there would likely have been found), while the probability in unsearched areas increases. The map evolves dynamically as search operations proceed, always reflecting the cumulative evidence gathered so far.
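This update can be sketched in a few lines. The following is an illustrative example (the prior, effort values, and search-width parameters are made up, not from any real operation): it applies the exponential detection function and the miss update above to a small grid of candidate locations.

```python
import numpy as np

def p_detect(effort, A):
    """Koopman detection probability: 1 - exp(-w(x)/A(x))."""
    return 1.0 - np.exp(-effort / A)

def update_after_miss(prior, effort, A):
    """Posterior over locations given that the search found nothing."""
    miss_likelihood = 1.0 - p_detect(effort, A)   # p(not detected | x)
    posterior = prior * miss_likelihood
    return posterior / posterior.sum()            # normalize

# Hypothetical three-cell example: all effort spent on the most likely cell.
prior = np.array([0.5, 0.3, 0.2])
effort = np.array([2.0, 0.0, 0.0])
A = np.array([1.0, 1.0, 1.0])

posterior = update_after_miss(prior, effort, A)
# Probability drains out of the searched cell and flows to the others.
```

Note that the unsearched cells gain probability purely through renormalization: their likelihood of a miss is 1, so they inherit the mass that the searched cell loses.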

Origins: The Hunt for U-Boats

1942

The U.S. Navy's Anti-Submarine Warfare Operations Research Group (ASWORG), led by physicist Philip Morse, develops the mathematical foundations of search theory to combat the U-boat threat in the Atlantic.

1946

Bernard Koopman publishes classified reports on optimal search theory, later declassified and published as Search and Screening (1956), the foundational text of the field.

1968

The search for USS Scorpion — a nuclear submarine lost in the Atlantic — employs Bayesian methods developed by John Craven. The wreckage is located within 220 yards of the predicted position.

2009–2011

The search for Air France Flight 447 in the deep Atlantic uses Bayesian search theory. After initial searches fail, the posterior probability map is updated and the wreckage and black boxes are found in the high-probability region.

2014

The search for Malaysia Airlines Flight 370 applies Bayesian methods to satellite signal analysis (Burst Frequency Offset) to define the search area in the southern Indian Ocean.

Optimal Effort Allocation

Given a fixed total search effort W, the problem of how to distribute it across locations to maximize the probability of detection is a constrained optimization problem with a beautiful solution.

Optimal Effort Allocation (Koopman)
Maximize:   P(detect) = Σ_x p(x) · [1 − exp(−w(x) / A(x))]
Subject to: Σ_x w(x) = W

Solution (Equal Marginal Returns)
At the optimum, the marginal return from additional effort is equal across all locations:
p(x) · exp(−w(x) / A(x)) / A(x) = λ   for all x with w(x) > 0

The solution allocates more effort to locations where the prior probability is high and where the search conditions are favorable (small A(x), meaning high detection probability per unit effort). Locations with very low prior probability or very poor search conditions receive no effort at all. The Lagrange multiplier λ sets the threshold below which effort is not worthwhile.
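The stationarity condition can be inverted, w(x) = A(x)·ln(p(x) / (A(x)·λ)) clipped at zero, and λ found numerically from the budget constraint. Below is a minimal numeric sketch (the prior and budget are invented for illustration) that solves for λ by bisection:

```python
import numpy as np

def allocate(prior, A, W, iters=100):
    """Distribute total effort W so marginal detection returns are equal."""
    def w_of(lam):
        # From p(x)*exp(-w/A)/A = lam:  w = A*ln(p/(A*lam)), clipped at 0.
        return np.maximum(0.0, A * np.log(prior / (A * lam)))
    lo, hi = 1e-12, (prior / A).max()   # at hi, zero effort everywhere
    for _ in range(iters):
        lam = 0.5 * (lo + hi)
        if w_of(lam).sum() > W:
            lo = lam    # allocation overspends the budget: raise lambda
        else:
            hi = lam    # allocation underspends: lower lambda
    return w_of(0.5 * (lo + hi))

# Hypothetical four-cell map with uniform search conditions.
prior = np.array([0.5, 0.3, 0.15, 0.05])
A = np.ones(4)
w = allocate(prior, A, W=2.0)
# High-probability cells get the most effort; the last cell falls below
# the lambda threshold and gets none.
```

The clipping at zero is what produces the threshold behavior described above: once λ exceeds p(x)/A(x), location x drops out of the active set entirely.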

The Scorpion Story

When the USS Scorpion was lost in May 1968, the Navy consulted John Craven, chief scientist of the Special Projects Office. Craven assembled a group of experts — submarine specialists, oceanographers, mathematicians — and elicited probability distributions over possible locations based on different failure scenarios. He combined these into a prior probability map, used Bayesian updating as search data accumulated, and correctly predicted the wreckage location. This success, widely publicized after declassification, became the most famous real-world validation of Bayesian search theory.

The Air France 447 Case

Air France Flight 447 disappeared over the Atlantic on June 1, 2009. Despite multiple search campaigns, the wreckage was not found for nearly two years. In 2011, a team led by Lawrence Stone of Metron Scientific Solutions applied Bayesian search theory to integrate all available evidence: aircraft performance data, ocean drift models for debris, and the results of the failed searches (which provided information by ruling out areas).

The critical insight was that failed searches are not wasted effort — they are evidence. Each unsuccessful pass reduces the probability that the target is in the searched area and increases it elsewhere. The posterior map after incorporating the negative evidence from 2009–2010 searches pointed to a region that had not been adequately searched, and the wreckage was found there within a week of resuming operations.

"A search that doesn't find the target is not a failure — it is data. It tells us where the target is not, and by Bayes' theorem, it tells us where the target is more likely to be." — Lawrence D. Stone, Theory of Optimal Search (2004)

Search and Rescue Applications

The U.S. Coast Guard's Search and Rescue Optimal Planning System (SAROPS) implements Bayesian search theory operationally. The system maintains a Monte Carlo simulation of possible target trajectories (accounting for ocean currents, wind, and leeway), updates the probability map as search aircraft and vessels report their coverage, and recommends optimal search patterns for the next sortie.

SAROPS has been used in thousands of real search and rescue operations, and its Bayesian framework has been adopted by coast guards in several other nations. The system accounts for detection probabilities that vary with weather, visibility, sea state, target size, and sensor type — a level of realism that would be impossible without the flexible Bayesian updating framework.

Extensions to Moving Targets

When the target may be moving — a drifting liferaft, a submarine attempting to evade, or a lost hiker — the prior must be propagated forward in time using a motion model. The resulting framework is a hidden Markov model, with the target's location as the hidden state and search observations as the evidence. Particle filters and grid-based methods are used to maintain the posterior distribution over time.
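A grid-based version of this filter alternates a motion ("predict") step with the search update. The sketch below uses a deliberately simple, invented motion model (the target stays put or drifts one cell to the right) and a fixed detection probability, just to show the predict/update structure:

```python
import numpy as np

def predict(belief, drift=0.3):
    """Propagate belief one step: target stays put or drifts one cell right."""
    moved = np.roll(belief, 1)
    moved[0] = 0.0                       # nothing drifts in from off-grid
    new = (1 - drift) * belief + drift * moved
    return new / new.sum()               # condition on staying in the grid

def update_miss(belief, searched, p_d=0.8):
    """Condition on an unsuccessful search of the cells in `searched`."""
    like = np.ones_like(belief)
    like[searched] = 1.0 - p_d           # p(miss | target in searched cell)
    post = belief * like
    return post / post.sum()

belief = np.full(5, 0.2)                 # uniform prior over 5 cells
belief = update_miss(belief, [0])        # search cell 0, find nothing
belief = predict(belief)                 # target may have drifted meanwhile
```

The same predict/update loop, with particles in place of grid cells, is what a particle-filter implementation maintains for continuous target states.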

From War to Civilian Life

Bayesian search theory, born in the urgency of antisubmarine warfare, now saves lives in peacetime. Beyond maritime search and rescue, the same principles are used to locate lost hikers, plan search patterns for downed aircraft, guide undersea exploration for shipwrecks and archaeological sites, and even inform strategies for finding missing persons in urban environments.

Interactive Calculator

Each row represents a search action: cell (grid ID like A1, B2, etc.), searched (yes/no), and found (yes/no). The calculator starts with a uniform prior over all cells and updates the posterior when a search fails (not found), accounting for a detection probability of 0.8. Watch the posterior probability map shift as unsuccessful searches eliminate likely hiding locations.
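The calculator's update rule can be reproduced in a few lines. This is a sketch of the logic described above (uniform prior, detection probability 0.8, renormalization after each miss); the grid size and cell labels are illustrative:

```python
P_DETECT = 0.8

cells = ["A1", "A2", "B1", "B2"]
belief = {c: 1.0 / len(cells) for c in cells}   # uniform prior

def search_fails(belief, cell, p_d=P_DETECT):
    """Update the map after searching `cell` without finding the target."""
    belief = dict(belief)
    belief[cell] *= (1.0 - p_d)          # miss likelihood in searched cell
    total = sum(belief.values())
    return {c: v / total for c, v in belief.items()}

belief = search_fails(belief, "A1")
# A1 drops from 0.25 to 0.0625; each other cell rises to 0.3125.
```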

