Seminar

Data, Autocracies, and the Direction of Innovation


  • Series
    Seminar
  • Speaker(s)
    David Yang (Harvard University, United States)
  • Field
    Behavioral Economics
  • Location
    Erasmus University, Theil Building, Room C2-6
    Rotterdam
  • Date and time

    November 29, 2019
    12:00 - 13:00

Data is a key input for developing certain modern technologies, particularly in the realm of AI. Many AI technologies aim to predict human behavior, which could greatly benefit the survival of autocratic regimes. How does modern autocracy shape data-intensive innovation and the direction of technological change? We provide a framework that models: (1) data in the technological production process, as a non-rival, excludable input whose mass collection is often considered unethical; and (2) the political-economic incentives to collect data and develop AI algorithms that sustain autocracy. We support this conceptual framework with macro-level empirical evidence on AI innovation across countries, as well as micro-level evidence from China’s world-leading AI facial recognition industry. We examine facial recognition firms that receive data from the government in exchange for providing public security services. We find that providing services to the government sets these firms on a different path of data-intensive innovation, resulting in more product development in the commercial sector that relies on access to government-held data.