Seminar

Differentially Private Inference via Noisy Optimization


  • Location
    Erasmus University Rotterdam, E building, room ET-14
    Rotterdam
  • Date and time

    March 21, 2024
    12:00 - 13:00

Abstract
We propose a general optimization-based framework for computing differentially private M-estimators and a new method for constructing differentially private confidence regions. First, we show that robust statistics can be used in conjunction with noisy gradient descent or noisy Newton methods to obtain optimal private estimators with global linear or quadratic convergence, respectively. We establish local and global convergence guarantees, under both local strong convexity and self-concordance, showing that our private estimators converge with high probability to a small neighborhood of the nonprivate M-estimators. Second, we tackle the problem of parametric inference by constructing differentially private estimators of the asymptotic variance of our private M-estimators. This naturally leads to approximate pivotal statistics for constructing confidence regions and conducting hypothesis tests. We demonstrate the effectiveness of a bias correction that leads to enhanced small-sample empirical performance in simulations. This is joint work with Casey Bradshaw and Po-Ling Loh.
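To illustrate the kind of procedure the abstract describes, the sketch below implements a generic noisy gradient descent for M-estimation: per-sample gradients are clipped to bound each step's sensitivity (a simple stand-in for the robust, bounded-gradient estimators in the talk), and Gaussian noise calibrated to that sensitivity is added at every iteration. This is a minimal illustration under basic composition, not the authors' method; the function names, the privacy accounting, and the hyperparameters are all assumptions made for the example.

```python
import numpy as np

def noisy_gradient_descent(grad_fn, theta0, n, steps, lr, clip, eps, delta, seed=None):
    """Differentially private M-estimation via noisy gradient descent (sketch).

    grad_fn(theta) must return per-sample gradients of shape (n, d).
    Clipping each per-sample gradient to L2 norm `clip` bounds the
    sensitivity of the averaged gradient by 2*clip/n; Gaussian noise of
    matching scale is added each step.  Privacy here is accounted with
    basic composition over `steps` iterations (an illustrative choice,
    not the accounting used in the paper).
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    # Gaussian-mechanism noise scale per step under basic composition
    sigma = (2.0 * clip / n) * np.sqrt(2.0 * steps * np.log(1.25 / delta)) / eps
    for _ in range(steps):
        g = grad_fn(theta)                                   # (n, d)
        norms = np.linalg.norm(g, axis=1, keepdims=True)
        g = g * np.minimum(1.0, clip / np.maximum(norms, 1e-12))  # clip to L2 ball
        noisy_grad = g.mean(axis=0) + rng.normal(0.0, sigma, size=theta.shape)
        theta = theta - lr * noisy_grad
    return theta

# Hypothetical usage: private mean estimation (squared loss, gradient theta - x_i)
rng = np.random.default_rng(0)
x = rng.normal(1.0, 1.0, size=(2000, 2))
theta_hat = noisy_gradient_descent(
    grad_fn=lambda theta: theta[None, :] - x,
    theta0=np.zeros(2), n=2000, steps=50, lr=0.5,
    clip=5.0, eps=1.0, delta=1e-5, seed=1,
)
```

With these (illustrative) settings the iterates contract toward the nonprivate sample mean and settle in a noise-dominated neighborhood of it, mirroring the "converges with high probability to a small neighborhood of the nonprivate M-estimator" guarantee stated in the abstract.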