Bradley Efron is an American statistician at Stanford University whose invention of the bootstrap in 1979 is widely regarded as one of the most important ideas in modern statistics. Beyond this landmark, Efron has made deep contributions to empirical Bayes methods, exponential families, and the philosophy of statistical inference, consistently seeking to understand the connections and tensions between frequentist and Bayesian approaches. His work demonstrates that the boundary between these schools is more permeable than their adherents often suppose.
Life and Career
1938: Born in St. Paul, Minnesota.
1960: Graduates from Caltech with a degree in mathematics and begins graduate study in statistics at Stanford.
1964: Earns his Ph.D. from Stanford under the supervision of Rupert Miller and Charles Stein, beginning a lifelong career on the Stanford faculty.
1979: Publishes "Bootstrap Methods: Another Look at the Jackknife" in the Annals of Statistics, introducing the bootstrap to the world.
1993: Co-authors An Introduction to the Bootstrap with Robert Tibshirani, making the method accessible to applied researchers across many fields.
2010: Publishes Large-Scale Inference: Empirical Bayes Methods for Estimation, Testing, and Prediction, synthesizing decades of work on the empirical Bayes approach to high-dimensional problems.
2016: Co-authors Computer Age Statistical Inference with Trevor Hastie, providing a panoramic view of modern inference from both frequentist and Bayesian perspectives.
The Bootstrap
The bootstrap is a resampling method that estimates the sampling distribution of a statistic by repeatedly drawing samples (with replacement) from the observed data. Before the bootstrap, obtaining standard errors and confidence intervals for complex statistics required either analytical derivations (often intractable) or asymptotic approximations (often unreliable in small samples). The bootstrap replaced both with a simple, general-purpose computational procedure.
Given data X = (x₁, ..., xₙ) and a statistic of interest θ̂ = t(X), the procedure is:
1. Draw X* = (x₁*, ..., xₙ*) by sampling with replacement from X
2. Compute θ̂* = t(X*)
3. Repeat B times to obtain θ̂*₁, ..., θ̂*_B
4. The distribution of θ̂* approximates the sampling distribution of θ̂
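The four steps above fit in a few lines of Python. This is a minimal sketch, not Efron's reference implementation; the simulated data, the choice of the median as the statistic, and B = 2000 are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_se(x, stat, B=2000):
    """Estimate the standard error of stat(x) by resampling with replacement."""
    n = len(x)
    # Steps 1-3: draw B resamples and evaluate the statistic on each
    thetas = np.array([stat(rng.choice(x, size=n, replace=True)) for _ in range(B)])
    # Step 4: the spread of the theta* values approximates the sampling
    # variability of theta-hat
    return thetas.std(ddof=1)

x = rng.normal(size=50)          # illustrative data
se = bootstrap_se(x, np.median)  # bootstrap standard error of the median
```

The same function works unchanged for any statistic, which is precisely the "general-purpose" quality that made the bootstrap so influential.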
While the bootstrap is fundamentally a frequentist idea, it has deep connections to Bayesian inference. Efron himself noted that the nonparametric bootstrap corresponds to a Bayesian posterior under a noninformative Dirichlet process prior on the data-generating distribution. The Bayesian bootstrap, introduced by Rubin in 1981, makes this connection explicit by drawing random weights from a Dirichlet distribution rather than resampling data points.
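Rubin's weighting scheme can be sketched as follows; the simulated data and the weighted mean as the statistic are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def bayesian_bootstrap(x, stat, B=2000):
    """Rubin (1981): draw Dirichlet(1, ..., 1) weights over the observations
    instead of resampling them, then evaluate a weighted statistic."""
    n = len(x)
    draws = np.empty(B)
    for b in range(B):
        w = rng.dirichlet(np.ones(n))  # random weights summing to one
        draws[b] = stat(x, w)
    return draws  # draws from the posterior distribution of the statistic

x = rng.normal(size=50)                           # illustrative data
post = bayesian_bootstrap(x, lambda x, w: w @ x)  # posterior draws of the mean
```

Note the contrast with the nonparametric bootstrap: no observation is ever duplicated or omitted; instead each resample reweights all n data points smoothly.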
Efron has spent much of his career exploring the relationship between the bootstrap and Bayesian inference. In many regular problems, bootstrap confidence intervals and Bayesian credible intervals give similar answers. The bootstrap can be viewed as an automatic, nonparametric approximation to Bayesian inference with a diffuse prior. However, the two approaches diverge when priors are informative or when the parameter space has boundaries, revealing genuine philosophical differences beneath the computational similarities.
Empirical Bayes Methods
Efron's other major contribution to Bayesian statistics is his development of empirical Bayes methods, particularly for large-scale inference problems. In empirical Bayes, the prior distribution is estimated from the data rather than specified in advance. This approach is especially powerful when many similar inference problems are solved simultaneously, as in genomics, where thousands of genes are tested for differential expression. The data from all genes collectively provide information about the prior, which then improves inference for each individual gene through shrinkage.
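A minimal normal-normal sketch illustrates the shrinkage idea, with simulated values standing in for, say, gene-level test statistics; the model and all parameter values here are illustrative assumptions, not a specific analysis from Efron's work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated parallel cases (illustrative): z_i ~ N(mu_i, 1) with mu_i ~ N(0, tau^2)
tau_true = 2.0
mu = rng.normal(0.0, tau_true, size=5000)
z = rng.normal(mu, 1.0)

# Empirical Bayes step: estimate the prior variance from the whole ensemble.
# Marginally Var(z) = 1 + tau^2, so tau^2 can be read off the data.
tau2_hat = max(z.var(ddof=1) - 1.0, 0.0)

# Posterior mean under the estimated prior shrinks each z toward zero
shrink = tau2_hat / (1.0 + tau2_hat)
mu_hat = shrink * z

# The shrunken estimates beat the raw z's in total squared error
mse_raw = np.mean((z - mu) ** 2)
mse_eb = np.mean((mu_hat - mu) ** 2)
```

The key point is that no prior was specified in advance: the ensemble of parallel cases supplied it, and every individual estimate improved as a result.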
Efron showed that empirical Bayes methods can control false discovery rates while providing calibrated posterior probabilities, bridging the gap between Bayesian estimation and frequentist hypothesis testing. His local false discovery rate approach gives each test case a probability of being a true null, combining the best features of both paradigms.
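A toy version of the two-groups local false discovery rate calculation looks like this. For simplicity it assumes the null fraction π₀ and the N(0, 1) null density are known; in practice Efron estimates both from the data, and the simulation parameters below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm, gaussian_kde

rng = np.random.default_rng(0)

# Two-groups simulation: most cases are true nulls with z ~ N(0, 1),
# a small fraction are non-null with z ~ N(3, 1)  (illustrative values)
n, pi0 = 4000, 0.95
is_null = rng.random(n) < pi0
z = np.where(is_null, rng.normal(0.0, 1.0, n), rng.normal(3.0, 1.0, n))

# Local false discovery rate: fdr(z) = pi0 * f0(z) / f(z), where f0 is the
# null density and f is the mixture density estimated from all z-values at once
f = gaussian_kde(z)
fdr = np.clip(pi0 * norm.pdf(z) / f(z), 0.0, 1.0)

# Each case gets its own posterior probability of being a true null;
# small fdr values flag likely discoveries
called = fdr < 0.2
```

Each fdr value is a Bayesian posterior probability, yet the procedure as a whole behaves like a frequentist multiple-testing method, which is exactly the bridge described above.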
Legacy
Efron's career demonstrates that the most productive statistical thinking often transcends the Bayesian-frequentist divide. The bootstrap gave frequentists a universal computational tool; empirical Bayes gave Bayesians a data-driven approach to prior specification; and Efron's theoretical work illuminated the deep structural connections between these traditions. His influence is reflected in the ubiquity of both bootstrapping and empirical Bayes methods across modern data science.
"Those who ignore statistics are condemned to reinvent it." — Bradley Efron