Irving John Good (1916–2009), known universally as Jack Good, was a polymath whose career spanned codebreaking, Bayesian statistics, artificial intelligence, and combinatorics. Born Isidore Jacob Gudak in London to a Polish Jewish family, he worked under Alan Turing at Bletchley Park during World War II, where he applied Bayesian reasoning to break the German Enigma and Tunny ciphers. After the war, he became one of the most prolific and eclectic contributors to Bayesian thought, publishing over two thousand papers across an astonishing range of subjects.
Bletchley Park and the Bayesian War Effort
Good arrived at Bletchley Park in 1941 at the age of twenty-four, assigned to Hut 8 under Alan Turing and later to the Newmanry under Max Newman. The codebreaking work at Bletchley was fundamentally Bayesian: Turing had developed a measure he called the “ban” (and its submultiple the “deciban”), a logarithmic measure of the weight of evidence for one hypothesis versus another. Good became a key contributor to this framework, applying sequential Bayesian updating to the enormous combinatorial problem of deciphering encrypted messages.
Turing and Good measured evidence in “bans” and “decibans”: units based on the base-10 logarithm of the Bayes factor, a deciban being one tenth of a ban. A deciban represented roughly the smallest change in the weight of evidence that a human could perceive. This practical, quantitative approach to evidence was one of the earliest systematic applications of Bayesian reasoning, and it was carried out under the highest possible stakes: the outcome of a world war.
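The deciban arithmetic above can be sketched in a few lines. This is a minimal illustration, not code from Good or Turing; the function name and the example probabilities are chosen here for clarity, assuming the base-10 convention the text describes.

```python
from math import log10

def decibans(p_e_given_h, p_e_given_not_h):
    """Weight of evidence in decibans: ten times the base-10 log
    of the likelihood ratio P(E|H) / P(E|not-H)."""
    return 10 * log10(p_e_given_h / p_e_given_not_h)

# An observation four times as likely under H as under not-H
# carries about 6 decibans of evidence for H:
w = decibans(0.8, 0.2)  # 10 * log10(4) ≈ 6.02

# An observation equally likely under both hypotheses is worth 0 decibans:
neutral = decibans(0.5, 0.5)
```

One deciban thus corresponds to a likelihood ratio of about 1.26, which gives some feel for why it was taken as roughly the smallest humanly perceptible shift in evidence.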
Weight of Evidence
After the war, Good formalized the concept of weight of evidence, drawing on the Bletchley experience. He defined the weight of evidence provided by an observation E in favor of a hypothesis H against its negation as the logarithm of the likelihood ratio, log[P(E|H) / P(E|¬H)]. This measure has the crucial property of additivity: for independent pieces of evidence, the weights simply add, making it a natural tool for sequential updating.
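The additivity property can be checked directly: for independent observations, the weight of the combined evidence equals the sum of the individual weights, because the likelihood ratios multiply. A minimal sketch, again using the deciban (base-10) convention; the likelihood-ratio values are illustrative assumptions:

```python
from math import log10

def weight(likelihood_ratio):
    """Weight of evidence in decibans for a given likelihood ratio."""
    return 10 * log10(likelihood_ratio)

# Two independent observations with likelihood ratios 3 and 5.
w1, w2 = weight(3), weight(5)

# Independence means the likelihood ratios multiply,
# so the weights of evidence add:
combined = weight(3 * 5)
assert abs((w1 + w2) - combined) < 1e-9

# Sequential updating: posterior odds = prior odds times the product
# of the likelihood ratios (equivalently, add the weights on a log scale).
prior_odds = 1.0                      # even odds on H vs. ¬H (hypothetical)
posterior_odds = prior_odds * 3 * 5   # odds of 15 to 1 in favor of H
```

This turns Bayesian updating into bookkeeping: each new independent observation contributes its weight, and the running total summarizes all the evidence so far.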
“The Bayesian approach is the only one that is self-consistent, although it requires judgments that are partly subjective. But then, so does everything else in life.”
— I. J. Good
Contributions to Bayesian Statistics
Good's contributions to Bayesian statistics were extraordinarily diverse. He wrote on the estimation of probabilities, hierarchical Bayesian methods (which he called “type II” and “type III” priors), the combination of evidence, contingency tables, and the philosophy of probability. His 1950 book Probability and the Weighing of Evidence was one of the earliest postwar Bayesian texts, and his 1965 The Estimation of Probabilities: An Essay on Modern Bayesian Methods further developed hierarchical approaches.
Artificial Intelligence and Other Interests
Good was also a pioneer of artificial intelligence. His 1965 paper “Speculations Concerning the First Ultraintelligent Machine” introduced the concept of an “intelligence explosion”—the idea that a sufficiently intelligent machine could design an even more intelligent machine, triggering a chain reaction of intelligence. This concept, now central to discussions of AI safety, was decades ahead of its time. Good also contributed to number theory, fractals, philosophy, and chess programming.
Later Career
Good held positions at the University of Manchester, the Admiralty, and Trinity College, Oxford, before moving permanently to Virginia Tech in 1967, where he remained for the rest of his career. He was named University Distinguished Professor and continued publishing into his nineties.
1916: Born on 9 December in London as Isidore Jacob Gudak.
1941: Joined Bletchley Park, working under Alan Turing on Enigma decryption.
1943: Transferred to the Newmanry, applying Bayesian methods to the Tunny cipher.
1950: Published Probability and the Weighing of Evidence.
1965: Published on the concept of an ultraintelligent machine and on Bayesian estimation of probabilities.
1967: Joined Virginia Tech as University Distinguished Professor.
2009: Died on 5 April in Radford, Virginia, aged ninety-two.