
Kleibergen, F. (2004). Invariant Bayesian Inference in Regression Models that is Robust against the Jeffreys-Lindley's Paradox. Journal of Econometrics, 123(2):227--258.


We obtain the prior and posterior probability of a nested regression model as the Hausdorff-integral of the prior and posterior on the parameters of an encompassing linear regression model over a lower-dimensional set that represents the nested model. The Hausdorff-integral is invariant and therefore avoids the Borel-Kolmogorov paradox. Basing the priors and prior probabilities of nested regression models on the prior on the parameters of an encompassing linear regression model reduces the discrepancies between classical and Bayesian inference, such as the Jeffreys-Lindley paradox. We illustrate the analysis with examples of linear restrictions (a linear regression model) and non-linear restrictions (a cointegration model and an autoregressive moving average model) on the parameters of an encompassing linear regression model.
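The Jeffreys-Lindley paradox that the paper addresses can be seen in a standard textbook setting (this numerical sketch is not taken from the paper; the point-null-versus-normal-prior setup and all function names are illustrative assumptions). Testing H0: mu = 0 against H1: mu ~ N(0, tau^2) with data mean xbar ~ N(mu, sigma^2/n), the Bayes factor in favour of H0 grows without bound as the prior variance tau^2 is made more diffuse, even when the classical test firmly rejects H0:

```python
import math

def normal_pdf(x, mean, var):
    """Density of a N(mean, var) distribution at x."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def bayes_factor_01(xbar, n, sigma2, tau2):
    """Bayes factor for H0: mu = 0 vs H1: mu ~ N(0, tau2),
    given sample mean xbar of n observations with known variance sigma2.
    Marginal of xbar is N(0, sigma2/n) under H0 and N(0, tau2 + sigma2/n) under H1."""
    m0 = normal_pdf(xbar, 0.0, sigma2 / n)
    m1 = normal_pdf(xbar, 0.0, tau2 + sigma2 / n)
    return m0 / m1

# Sample mean 2.5 standard errors from zero: two-sided p-value ~ 0.012,
# so a classical 5% test rejects H0.
n, sigma2 = 100, 1.0
xbar = 2.5 / math.sqrt(n)

# Yet the Bayes factor increasingly favours H0 as the prior flattens.
for tau2 in (1.0, 100.0, 10_000.0):
    print(f"tau^2 = {tau2:>8}: BF01 = {bayes_factor_01(xbar, n, sigma2, tau2):.2f}")
```

Tying the prior on the nested model to the prior on the encompassing model's parameters, as the paper proposes, removes this arbitrary dependence on the diffuseness of the prior under the alternative.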