Jayanta Kumar Ghosh (1937–2017) was one of India's most distinguished statisticians and a leading figure in the development of Bayesian asymptotic theory. A professor at the Indian Statistical Institute in Kolkata and later at Purdue University, he made fundamental contributions to understanding the large-sample behavior of Bayesian procedures, including posterior consistency, asymptotic normality of posteriors (the Bernstein-von Mises theorem), and the consistency of Bayesian model selection. His work provided theoretical assurance that Bayesian methods, under appropriate conditions, converge to the truth as data accumulate.
Early Life and Education
Ghosh was born in Kolkata (then Calcutta), India, and studied at Presidency College and the Indian Statistical Institute, earning his PhD under the supervision of Hari Kinkar Nandi. The Indian Statistical Institute, founded by P. C. Mahalanobis, provided a rich intellectual environment that combined theoretical depth with practical engagement. Ghosh joined the ISI faculty and spent several decades there before moving to Purdue University in the United States.
Bayesian Asymptotics
Ghosh's most influential work concerns the asymptotic behavior of posterior distributions. He contributed to refining and extending the Bernstein-von Mises theorem, which states that under regularity conditions, the posterior distribution converges to a normal distribution centered at the maximum likelihood estimator as the sample size grows. This result is crucial because it shows that Bayesian and frequentist methods agree asymptotically, providing a bridge between the two approaches.
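The Bernstein-von Mises phenomenon can be seen in a toy example not drawn from Ghosh's own work: for Bernoulli data with a uniform Beta(1, 1) prior, the exact Beta posterior's mean and standard deviation approach the maximum likelihood estimate and the frequentist plug-in standard error sqrt(p̂(1 − p̂)/n) as the sample size grows. A minimal sketch, assuming this conjugate setup:

```python
# Toy illustration of the Bernstein-von Mises phenomenon: Bernoulli data
# with a Beta(1, 1) prior give an exact Beta(1 + s, 1 + n - s) posterior,
# whose mean and sd converge to the MLE and the plug-in standard error.
import math
import random

random.seed(42)
p_true = 0.3

for n in [50, 500, 50_000]:
    s = sum(random.random() < p_true for _ in range(n))
    a, b = 1 + s, 1 + n - s              # Beta posterior parameters
    post_mean = a / (a + b)
    post_sd = math.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))
    mle = s / n
    se = math.sqrt(mle * (1 - mle) / n)  # frequentist plug-in standard error
    print(f"n={n:6d}  posterior mean={post_mean:.4f}  MLE={mle:.4f}  "
          f"posterior sd={post_sd:.5f}  plug-in se={se:.5f}")
```

As n increases, the printed posterior summaries and the frequentist quantities become indistinguishable, which is exactly the agreement the theorem formalizes.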
The large-sample theory of Bayesian inference provides reassurance that Bayesian methods “work” even from a frequentist perspective: posteriors concentrate around the true parameter value, credible intervals have correct frequentist coverage asymptotically, and Bayes estimators are asymptotically efficient. These results have been essential for the acceptance of Bayesian methods by the broader statistical community.
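The claim about asymptotic frequentist coverage of credible intervals can be checked by simulation. A small Monte Carlo sketch (illustrative, not from Ghosh's papers), again assuming the conjugate Bernoulli/Beta setup, with equal-tailed 95% intervals taken from posterior draws:

```python
# Monte Carlo check: equal-tailed 95% credible intervals from a Beta
# posterior cover the true Bernoulli parameter at roughly the nominal
# frequentist rate for moderate n.
import random

random.seed(0)
p_true, n, reps, draws = 0.3, 200, 300, 2000
covered = 0
for _ in range(reps):
    s = sum(random.random() < p_true for _ in range(n))
    # Draw from the Beta(1 + s, 1 + n - s) posterior; take empirical quantiles.
    sample = sorted(random.betavariate(1 + s, 1 + n - s) for _ in range(draws))
    lo, hi = sample[int(0.025 * draws)], sample[int(0.975 * draws)]
    covered += lo <= p_true <= hi
coverage = covered / reps
print(f"empirical coverage of 95% credible intervals: {coverage:.3f}")
```

The empirical coverage lands close to the nominal 95%, in line with the large-sample theory described above.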
Model Selection Consistency
Ghosh made important contributions to understanding when Bayesian model selection procedures are consistent—that is, when they select the true model (or the best approximating model) with probability approaching one as the sample size grows. He investigated the conditions on priors and model classes that ensure consistency, work that is particularly relevant given the widespread use of Bayesian model selection in practice.
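Model selection consistency can also be illustrated with a simple sketch, not Ghosh's construction: compare a model M0 that fixes a Bernoulli parameter at 0.5 against a model M1 that leaves it free, using the BIC approximation to the log marginal likelihood. With data generated from p = 0.6, a consistent procedure should pick M1 with frequency approaching one as n grows:

```python
# Illustrative sketch of Bayesian model selection consistency via the BIC
# approximation to the log marginal likelihood. M0 fixes the Bernoulli
# parameter at 0.5; M1 leaves it free. The true parameter is 0.6.
import math
import random

random.seed(1)

def pick_m1(n: int, p_true: float = 0.6) -> bool:
    s = sum(random.random() < p_true for _ in range(n))
    p_hat = min(max(s / n, 1e-12), 1 - 1e-12)   # guard against log(0)
    ll0 = n * math.log(0.5)                      # log-likelihood under M0
    ll1 = s * math.log(p_hat) + (n - s) * math.log(1 - p_hat)
    # BIC comparison: M1 pays a (1/2) log n penalty for its one free parameter.
    return ll1 - ll0 > 0.5 * math.log(n)

for n in [50, 500, 5000]:
    rate = sum(pick_m1(n) for _ in range(200)) / 200
    print(f"n={n:5d}: picked true model M1 in {rate:.0%} of 200 runs")
```

At small n the penalty often favors the simpler wrong model, but the selection frequency for the true model climbs toward one, which is the consistency property discussed here.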
“The strength of the Bayesian approach lies not only in its coherence but in the fact that, under general conditions, it leads to optimal procedures in large samples.”
— J. K. Ghosh
Higher-Order Asymptotics and Other Work
Ghosh also worked on higher-order asymptotic expansions for Bayesian procedures, showing how Bayesian methods provide automatic corrections for bias and skewness that frequentist methods must achieve through explicit adjustments. His work with Ramamoorthi produced the influential monograph Bayesian Nonparametrics, and he contributed to robust Bayesian analysis, objective priors, and applications in genetics and bioinformatics.
Legacy
Ghosh was elected a Fellow of the Royal Society, received the Bhatnagar Prize, and held numerous other honors. He mentored many students who have become leading statisticians in their own right. His work on Bayesian asymptotics remains foundational, providing the theoretical underpinnings for the practical use of Bayesian methods across the sciences.
Born in Kolkata, India.
Received PhD from the Indian Statistical Institute.
Professor at the Indian Statistical Institute, Kolkata.
Moved to Purdue University as professor of statistics.
Elected Fellow of the Royal Society.
Published Bayesian Nonparametrics with R. V. Ramamoorthi.
Died on 30 September 2017, aged eighty.