A Spatial Perspective on the Econometrics of Program Evaluation

Marynia Kolak, Luc Anselin

Research output: Contribution to journal › Article › peer-review


Empirical work in regional science has seen a growing interest in causal inference, leveraging insights from econometrics, statistics, and related fields. This has resulted in several conceptual as well as empirical papers. However, the role of spatial effects, such as spatial dependence (SD) and spatial heterogeneity (SH), is less well understood in this context. Such spatial effects violate the so-called stable unit treatment value assumption advanced by Rubin as part of the foundational framework for empirical treatment effect analysis. In this article, we consider the role of spatial effects more closely. We provide a brief overview of a number of attempts to extend existing econometric treatment effect evaluation methods to account for spatial aspects, and we outline and illustrate an alternative approach. Specifically, we propose a spatially explicit counterfactual framework that leverages spatial panel econometrics to account for both SD and SH in treatment choice, treatment variation, and treatment effects. We illustrate this framework with a replication of a well-known treatment effect analysis, that is, the evaluation of the effect of minimum legal drinking age laws on mortality for US states during the period 1970–1984, a classic textbook example of applied causal inference. We replicate the results available in the literature and compare these to a range of alternative specifications that incorporate spatial effects.
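The abstract's core idea, augmenting a panel treatment effect regression so that a unit's outcome can also respond to neighbors' treatment, can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' specification: it simulates a toy panel, builds a hypothetical ring-contiguity weights matrix `W`, adds the spatial lag of a staggered treatment (`W @ D`) to a two-way fixed effects regression, and recovers both the direct effect and the spillover effect by least squares.

```python
import numpy as np

rng = np.random.default_rng(0)
n_units, n_periods = 6, 8  # toy panel: 6 "states", 8 years

# Hypothetical row-standardized contiguity weights for units on a ring.
W = np.zeros((n_units, n_units))
for i in range(n_units):
    W[i, (i - 1) % n_units] = 0.5
    W[i, (i + 1) % n_units] = 0.5

# Staggered binary treatment (e.g., a law in effect), one row per unit.
D = np.zeros((n_units, n_periods))
for i in range(n_units):
    D[i, rng.integers(2, n_periods):] = 1.0

WD = W @ D  # spatial lag of treatment: neighbors' average exposure

# Simulate outcomes with unit and period effects, a direct treatment
# effect of 2.0, and a spillover effect of 0.8 from neighbors' treatment.
alpha = rng.normal(size=n_units)[:, None]
gamma = rng.normal(size=n_periods)[None, :]
y = alpha + gamma + 2.0 * D + 0.8 * WD \
    + 0.05 * rng.normal(size=(n_units, n_periods))

# Two-way fixed effects via dummies; stack the panel in row-major order.
units = np.repeat(np.arange(n_units), n_periods)
periods = np.tile(np.arange(n_periods), n_units)
X = np.column_stack([
    D.ravel(), WD.ravel(),
    np.eye(n_units)[units][:, 1:],      # unit dummies (one dropped)
    np.eye(n_periods)[periods][:, 1:],  # period dummies (one dropped)
    np.ones(n_units * n_periods),       # intercept
])
beta, *_ = np.linalg.lstsq(X, y.ravel(), rcond=None)
print(f"direct effect ~ {beta[0]:.2f}, spillover ~ {beta[1]:.2f}")
```

If the spillover coefficient on `WD` is nonzero, the stable unit treatment value assumption is violated: a unit's outcome depends on other units' treatment status, which is precisely the situation the spatially explicit counterfactual framework is designed to handle.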

Original language: English (US)
Pages (from-to): 128-153
Number of pages: 26
Journal: International Regional Science Review
Issue number: 1-2
State: Published - Jan 1 2020
Externally published: Yes


Keywords

  • spatial dependence
  • spatial econometrics
  • spatial heterogeneity
  • treatment effects

ASJC Scopus subject areas

  • Environmental Science (all)
  • Social Sciences (all)

