Kleibergen, F. (2004). Invariant Bayesian inference in regression models that is robust against the Jeffreys-Lindley's paradox. Journal of Econometrics, 123(2):227--258.


We obtain the prior and posterior probability of a nested regression model as the Hausdorff integral of the prior and posterior on the parameters of an encompassing linear regression model over a lower-dimensional set that represents the nested model. The Hausdorff integral is invariant and therefore avoids the Borel-Kolmogorov paradox. Basing the priors and prior probabilities of nested regression models on the prior on the parameters of an encompassing linear regression model reduces the discrepancies between classical and Bayesian inference, such as the Jeffreys-Lindley paradox. We illustrate the analysis with examples of linear restrictions (a nested linear regression model) and of non-linear restrictions (a cointegration model and an autoregressive moving average model) on the parameters of an encompassing linear regression model.
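
What "the Hausdorff integral over a lower-dimensional set" amounts to can be sketched with the standard area formula for Hausdorff measure; the notation below (the parametrization h, the set Phi) is a schematic rendering under that assumption, not necessarily the paper's exact construction. If the nested model is the image of a smooth map, Theta_0 = { theta in R^k : theta = h(phi), phi in Phi, Phi a subset of R^m } with m < k, then the m-dimensional Hausdorff integral of a prior density p over Theta_0 is

    \int_{\Theta_0} p(\theta) \, d\mathcal{H}^m(\theta)
        = \int_{\Phi} p\bigl(h(\varphi)\bigr)
          \sqrt{\det\bigl(J_h(\varphi)' J_h(\varphi)\bigr)} \, d\varphi ,

where J_h is the k-by-m Jacobian of h. The right-hand side takes the same value for every smooth parametrization of the same set, so the integral does not depend on how the restriction happens to be written down; this is the sense in which an invariant integral sidesteps the Borel-Kolmogorov ambiguity of conditioning on a measure-zero set.

The Jeffreys-Lindley paradox itself is easy to reproduce numerically. The minimal sketch below (the function name bayes_factor_01 and all numbers are illustrative, not taken from the paper) tests the point null H0: mu = 0 on a normal mean against H1: mu ~ N(0, tau^2), and shows the Bayes factor swinging toward the null as the prior under H1 becomes more diffuse, even though the classical test rejects at the 5% level throughout:

    import numpy as np

    # Schematic illustration of the Jeffreys-Lindley paradox (not code from
    # the paper).  Data x_1, ..., x_n ~ N(mu, 1); the sample mean
    # xbar ~ N(mu, 1/n) is sufficient for mu.

    def bayes_factor_01(xbar, n, tau):
        """BF_01 = m(xbar | H0) / m(xbar | H1); both marginals are normal in xbar."""
        var0 = 1.0 / n             # variance of xbar under H0: mu = 0
        var1 = tau ** 2 + 1.0 / n  # marginal variance of xbar under H1: mu ~ N(0, tau^2)
        log_bf = 0.5 * np.log(var1 / var0) - 0.5 * xbar ** 2 * (1.0 / var0 - 1.0 / var1)
        return np.exp(log_bf)

    n = 100
    xbar = 1.96 / np.sqrt(n)       # z = 1.96: borderline 5% rejection for any n

    # A more diffuse prior under H1 (larger tau) drives the Bayes factor
    # toward H0, while the classical test keeps rejecting at the same level.
    for tau in (0.5, 1.0, 10.0, 100.0):
        print(f"tau = {tau:6.1f}   BF_01 = {bayes_factor_01(xbar, n, tau):8.2f}")

With n = 100 the Bayes factor climbs from roughly even odds at tau = 0.5 to strong support for H0 at tau = 100, while the z-statistic stays pinned at 1.96. That divergence is the discrepancy between classical and Bayesian inference which the paper's encompassing-prior construction is designed to reduce.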