Vector autoregressive (VAR) models are workhorses of multivariate time series analysis. A VAR of order p expresses each variable in a system as a linear function of its own past values and the past values of all other variables. The appeal is clear — VARs capture the dynamic interdependencies among economic, financial, or scientific variables without imposing the strong identification assumptions of structural models. But the curse of dimensionality looms: a VAR with k variables and p lags has k²p coefficients, and with even moderate k the parameter count explodes.
Bayesian methods resolve this through shrinkage priors that pull coefficients toward structured defaults. The most influential of these is the Minnesota prior, developed at the Federal Reserve Bank of Minneapolis by Robert Litterman and Thomas Doan in the 1980s, which encodes the belief that each variable follows a random walk and that own lags are more important than cross-variable lags.
Minnesota Prior (Litterman, 1986)

E[Aₗ(i,j)] = δᵢ if i = j and l = 1, and 0 otherwise
Var[Aₗ(i,j)] = λ²/l² for own lags (i = j), and λ²σᵢ²/(l²σⱼ²) for cross lags (i ≠ j)

Here δᵢ = 1 (random-walk prior) or 0 (white-noise prior), λ controls the overall shrinkage tightness, l is the lag order (later lags are shrunk more aggressively), and σᵢ², σⱼ² are the residual variances of variables i and j, which rescale the cross-lag tightness to account for differences in the variables' units.
The Minnesota Prior
The Minnesota prior embodies a simple economic intuition: most macroeconomic time series are well approximated by random walks. The prior centers the first own-lag coefficient at 1 (or a value close to it) and all other coefficients at 0. The variance of each prior decreases with the lag length (distant past matters less) and is smaller for cross-variable coefficients than for own-variable coefficients (your own history is more informative than others').
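To make this structure concrete, the following NumPy sketch builds the prior mean and standard deviation of every coefficient Aₗ(i,j) directly from the formulas above. The function name and defaults are illustrative, and the σᵢ inputs are assumed to come from univariate AR residual fits; this is a sketch under those assumptions, not a reference implementation.

```python
import numpy as np

def minnesota_prior_moments(k, p, sigma, lam=0.2, delta=1.0):
    """Prior mean and standard deviation for each VAR coefficient A_l(i, j).

    k     : number of variables
    p     : number of lags
    sigma : length-k array of residual std. devs (e.g. from univariate AR fits)
    lam   : overall tightness lambda
    delta : prior mean of the first own lag (1 = random walk, 0 = white noise)
    """
    mean = np.zeros((p, k, k))
    sd = np.zeros((p, k, k))
    for l in range(1, p + 1):          # lag, 1-based
        for i in range(k):             # equation (left-hand-side variable)
            for j in range(k):         # right-hand-side variable
                if i == j:
                    if l == 1:
                        mean[l - 1, i, j] = delta          # center first own lag at delta
                    sd[l - 1, i, j] = lam / l              # own lags: lambda / l
                else:
                    sd[l - 1, i, j] = lam * sigma[i] / (l * sigma[j])  # cross lags
    return mean, sd

# Example: 3 variables, 2 lags
mean, sd = minnesota_prior_moments(k=3, p=2, sigma=np.array([1.0, 0.5, 2.0]))
print(sd[0])   # lag-1 prior standard deviations, one row per equation
```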
This structure acts as a regularizer. In a 20-variable, 4-lag VAR with 1,600+ coefficients, the Minnesota prior prevents overfitting by anchoring the estimates to a parsimonious baseline while still allowing the data to pull coefficients away from zero when the evidence warrants it. The result is dramatically improved out-of-sample forecasting compared to unrestricted OLS estimates.
An unrestricted VAR estimated by OLS produces coefficient estimates that are unbiased but have enormous variance when parameters are numerous relative to observations. Out-of-sample, this variance dominates, producing erratic and unreliable forecasts. The Bayesian approach trades a small amount of bias (from the prior) for a large reduction in variance — a bias-variance tradeoff that almost always favors the Bayesian estimator in practice.
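The mechanics of that tradeoff are easiest to see one equation at a time. The sketch below computes the posterior mean of a single VAR equation under an independent normal prior with the residual variance held fixed; that simplification is an assumption made here for clarity (a full BVAR treats all equations and the error covariance jointly). The posterior mean is a precision-weighted combination of the data and the prior: tight priors pull the estimate toward the prior mean, diffuse priors recover OLS.

```python
import numpy as np

def posterior_mean_single_equation(X, y, prior_mean, prior_sd, sigma2):
    """Posterior mean of one VAR equation's coefficient vector.

    Model:  y = X @ beta + e,   e ~ N(0, sigma2 * I)
    Prior:  beta ~ N(prior_mean, diag(prior_sd**2))
    Posterior mean: (X'X/sigma2 + Omega^-1)^-1 (X'y/sigma2 + Omega^-1 m)
    """
    omega_inv = np.diag(1.0 / np.asarray(prior_sd) ** 2)   # prior precision Omega^-1
    precision = X.T @ X / sigma2 + omega_inv                # posterior precision
    rhs = X.T @ y / sigma2 + omega_inv @ prior_mean
    return np.linalg.solve(precision, rhs)
```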
Extensions and Modern Developments
Stochastic Volatility
Standard BVARs assume constant error variance Σ. Bayesian VARs with stochastic volatility allow Σₜ to evolve over time, capturing the well-documented changes in macroeconomic volatility (the Great Moderation, the 2008 crisis). These models, developed by Cogley and Sargent (2005) and Primiceri (2005), have become standard in central bank forecasting.
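As a rough illustration of the mechanism (not of the estimation procedure, which typically requires MCMC), the sketch below simulates residuals whose log variance follows a random walk; this is one common specification, assumed here for simplicity.

```python
import numpy as np

rng = np.random.default_rng(0)

T = 400          # sample length
omega = 0.1      # std. dev. of log-volatility innovations

h = np.zeros(T)      # log variance h_t
eps = np.zeros(T)    # residuals with time-varying variance
for t in range(1, T):
    h[t] = h[t - 1] + omega * rng.normal()        # random-walk log volatility
    eps[t] = np.exp(0.5 * h[t]) * rng.normal()    # eps_t ~ N(0, exp(h_t))
```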
Large BVARs
Modern applications push BVARs to hundreds of variables, exploiting the regularization power of hierarchical priors. Bańbura, Giannone, and Reichlin (2010) showed that large BVARs with appropriately calibrated Minnesota priors can match or outperform factor models and dynamic stochastic general equilibrium (DSGE) models in macroeconomic forecasting.
Key Developments

1980: Christopher Sims publishes "Macroeconomics and Reality," introducing VAR models to macroeconomics and arguing against the incredible identification assumptions of large structural models.
1986: Robert Litterman develops the Minnesota prior for Bayesian VAR estimation at the Federal Reserve Bank of Minneapolis, demonstrating superior forecasting performance.
2005: Primiceri introduces time-varying parameter VARs with stochastic volatility, enabling the study of evolving macroeconomic dynamics.
2010: Bańbura, Giannone, and Reichlin demonstrate that large BVARs with 130+ variables produce competitive forecasts, establishing BVARs as a leading approach for data-rich environments.
Applications
BVARs are the primary forecasting tool at many central banks worldwide, including the European Central Bank, the Federal Reserve, and the Bank of England. They are used for GDP growth forecasting, inflation prediction, impulse response analysis (how does a monetary policy shock propagate?), and scenario analysis. The Bayesian framework naturally produces predictive distributions — not just point forecasts — enabling probabilistic assessments of economic risks.
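To show how such predictive distributions come about, the sketch below simulates forecast paths forward from posterior draws of the VAR coefficients and error covariance and collects them into quantiles, the raw material of a fan chart. The array names and shapes are hypothetical, and intercepts are omitted for brevity.

```python
import numpy as np

def predictive_quantiles(y_hist, A_draws, Sigma_draws, horizon=8, q=(0.1, 0.5, 0.9)):
    """Simulate forecast paths from posterior draws of a VAR(p), no intercept.

    y_hist      : (T, k) observed data, most recent observation last
    A_draws     : (n_draws, p, k, k) posterior draws of the lag matrices
    Sigma_draws : (n_draws, k, k) posterior draws of the error covariance
    """
    rng = np.random.default_rng(0)
    n_draws, p, k, _ = A_draws.shape
    paths = np.empty((n_draws, horizon, k))
    for d in range(n_draws):
        lags = list(y_hist[-p:][::-1])     # lags[0] = y_T, lags[1] = y_{T-1}, ...
        for h in range(horizon):
            mean = sum(A_draws[d, l] @ lags[l] for l in range(p))
            y_new = rng.multivariate_normal(mean, Sigma_draws[d])
            paths[d, h] = y_new
            lags = [y_new] + lags[:-1]     # shift the lag window forward
    return np.quantile(paths, q, axis=0)   # shape (len(q), horizon, k)
```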
"The restrictions imposed by economic theory are 'incredible' — not in the colloquial sense, but in the literal sense of not being believable." — Christopher Sims, "Macroeconomics and Reality" (1980)