

Natural experiments: A Nobel Prize awarded research design for strengthening causal inference on global health challenges

Famke JM Mölenberg1, Francisca Vargas Lopes1,2

1 Department of Public Health, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
2 Erasmus Centre for Health Economics Rotterdam (EsCHER), Erasmus University Rotterdam, Rotterdam, The Netherlands



We would like to congratulate David Card, Joshua Angrist and Guido Imbens for winning the Nobel Memorial Prize in Economic Sciences 2021 for their pioneering work on natural experiments [1]. The committee acknowledged their work for “… shifting the focus in empirical research using observational data towards relying on quasi-experimental variation to establish causal effects.” On the occasion of their Nobel Prize, we would like to share some reflections from our PhD trajectories in public health, both focused on natural experiments.

In public health, the potential of natural experiments to address global health challenges has been discussed for some years [2-5]. Natural experiments allow the retrospective and prospective evaluation of policies, interventions or programs in real-world settings [2]. Importantly, they present a valuable alternative for evaluating changes to a system for which it would be unethical, unfeasible or simply impossible to conduct randomised controlled trials (RCTs). Although there is no widely accepted definition, the key element of natural experiments is that the change in exposure is caused by external shocks or factors outside researchers’ control, and that manipulation of the exposure by researchers is not possible [2]. This allows the identification of intervention and control groups. While under ideal circumstances there is an “as-if” random allocation to the intervention, it is not uncommon that potential confounding remains in the effect of exposures on outcomes of interest. By combining good knowledge of the allocation process, careful choice of methods, and transparent reporting and assumption testing, studies based on natural experiments can approximate causal evidence [2].
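To illustrate the logic of one widely used quasi-experimental method, the sketch below implements a difference-in-differences comparison on simulated data. This is a hypothetical example (the data, numbers, and the specific design are our own illustration, not taken from any study cited here): both groups share a background time trend, and the control group’s pre-to-post change serves as the counterfactual for the group affected by the external shock.

```python
import random
from statistics import mean

random.seed(42)

def simulate(group_trend, treatment_effect, n=500):
    """Simulate pre- and post-period outcomes for one group."""
    pre = [random.gauss(10.0, 1.0) for _ in range(n)]
    post = [random.gauss(10.0 + group_trend + treatment_effect, 1.0) for _ in range(n)]
    return pre, post

# Both groups share a background trend of +1.0; only the intervention group
# is additionally exposed to the policy, whose true effect is set to +2.0.
treated_pre, treated_post = simulate(group_trend=1.0, treatment_effect=2.0)
control_pre, control_post = simulate(group_trend=1.0, treatment_effect=0.0)

# Difference-in-differences: the control group's change estimates the shared
# counterfactual trend, which is subtracted from the treated group's change.
did = (mean(treated_post) - mean(treated_pre)) - (mean(control_post) - mean(control_pre))
print(f"Estimated policy effect: {did:.2f}")  # close to the true effect of 2.0
```

A simple comparison of the treated group before and after would have returned roughly +3.0, conflating the policy effect with the background trend; the validity of the subtraction rests on the parallel-trends assumption, which is exactly the kind of assumption testing the text above calls for.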

It is likely that, with this Nobel Prize, opportunities for evaluations by means of natural experiments will be further explored. On a global scale, numerous opportunities will arise from the sudden and disruptive changes linked to COVID-19, resulting from the global variation in national responses [6]. It is therefore important to understand the barriers to evaluating them. In this essay we build on learnings gained during our PhD trajectories focused on natural experiments. We discuss three key aspects hindering the potential of this type of research. We argue that, paradoxically, some level of control is needed to shape the conditions in which evaluations of natural experiments are possible.


There is considerable unpredictability in researching natural experiments, which may pose serious challenges for their evaluation. Unpredictability can relate to the implementation of the intervention (eg, timing, intensity and reach), but also to aspects of the conduct of the study (eg, suitability of datasets and statistical power). Studies using natural experiments in prospective evaluations may face difficulties aligning implementation, evaluation and funding timelines. For example, infrastructural interventions can be substantially delayed, while legislation is sometimes implemented sooner than anticipated; both impact heavily on timelines. Studies evaluating policies or interventions that have already been implemented will rely on previously collected data. This may sound like a secure route to minimise unpredictability, but exploratory data analysis is needed to assess whether the assumptions and other statistical requirements of the study design are met. Not rarely, evaluations are altered or discontinued when a meaningful evaluation turns out not to be possible in the way expected when the natural experiment was identified. To overcome these challenges, researchers should be involved in the early phases of intervention and policy planning, ensuring that key requirements for conducting evaluations through natural experiments are not missed. Based on our experience, early career researchers with relatively short contracts may benefit from joining existing collaborations in which the foundations for evaluation are already present. Furthermore, research environments need to accommodate the intrinsic uncertainty of these studies. Providing the incentives to react swiftly to societal changes that suddenly occur is key: additional data can often be collected now, or never. For example, quick and flexible sources of funding have become available over the past months to study the COVID-19 pandemic. Similar initiatives are needed to combat big global health challenges that have been around for a while, including the “obesity epidemic”, the persistence of social inequalities, and the climate crisis.

Photo: Natural experiments in cities allow evaluation of exposures that often cannot be randomised (Famke Mölenberg, personal collection, used with permission).


Even when data to evaluate the natural experiment are available, a complicating factor is creating a database that includes all the information needed. Linking datasets has been emphasised as an important way to foster the evaluation of natural experiments [7]. Even though databanks with linked administrative datasets are increasingly becoming available for entire countries or regions, some regions lack reliable data. Large secondary databases provide excellent opportunities to evaluate natural experiments based on already collected data, as long as researchers are sufficiently aware of their potential. Training on secondary data as a formal part of research education might increase the opportunities for evaluating natural experiments. In the absence of databanks, datasets need to be linked on a one-by-one basis. Informed consent forms – especially those signed years ago – are often not designed to accommodate the linkage of datasets for the retrospective evaluation of natural experiments. Existing ethical and regulatory frameworks for sharing and processing personal data are sometimes subject to debate, making the alignment of different stakeholders a main barrier to proceeding. Researchers need support from their own institutions to create multidisciplinary teams in which people from various backgrounds (eg, legal officers, policymakers, practitioners, researchers) jointly facilitate the timely linkage of databases within ethical and regulatory frameworks.
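The one-by-one linkage described above can be sketched in miniature. The example below is purely illustrative (the identifiers, variables and salted-hash scheme are our own assumptions, not a description of any real linkage infrastructure, which in practice involves trusted third parties and formal governance): two toy datasets are joined on a pseudonymised identifier so that the raw personal ID is never stored alongside the linked record.

```python
import hashlib

def pseudonymise(person_id: str, salt: str = "project-secret") -> str:
    """One-way salted hash of a personal identifier (illustration only)."""
    return hashlib.sha256((salt + person_id).encode()).hexdigest()

# Toy "exposure register" and "outcome dataset", keyed by pseudonym.
exposure = {pseudonymise(pid): rec for pid, rec in [
    ("A1", {"exposed": True}), ("A2", {"exposed": False})]}
outcomes = {pseudonymise(pid): rec for pid, rec in [
    ("A1", {"bmi": 27.1}), ("A3", {"bmi": 24.0})]}

# Inner join: keep only persons present in both datasets.
linked = {k: {**exposure[k], **outcomes[k]} for k in exposure.keys() & outcomes.keys()}
print(linked)
```

Even this trivial join surfaces the substantive issue in the text: only one of the three persons appears in both datasets, so the researcher must report how linkage failures could bias the evaluation.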


Natural experiments provide unique opportunities to strengthen the evidence base. To increase their adoption in public health, it is essential to improve the understanding of studies based on natural experiments. Over the years, we have received disappointing reviewer comments when submitting evaluations of natural experiments to public health journals. Some of the misunderstanding may result from different ways of conceptualising natural experiments [5]. The tendency in some journals to consider scientific rigor and associated uncertainties more important than implications for professional practice, as well as strict criteria allowing causal language solely for RCTs [8], means that the continuum of evidence from associations to causal conclusions is being ignored. Interdisciplinary research can be one way to learn from methodologies used by researchers in other fields [9]. Training of public health professionals, non-academic stakeholders, funders, and policymakers to understand the value and specificities of natural experiments is likely needed to increase the use of these evaluation strategies.

Finally, we would like to draw attention to the career perspectives of researchers working on projects that capitalise on natural experiments. In a system where publications are still key to obtaining new research funding and academic positions, evaluating small-scale interventions or conducting descriptive research might provide a more secure route to progress in academia [4]. This poses a serious risk that the existing “evaluative bias” will increase, whereby most evidence is available for the interventions that were easiest to study and to publish [10]. We need institutional changes through which researchers are acknowledged for the societal relevance of their studies, rather than primarily for the quantity of their publications. These risks are even larger for PhD candidates with strict timelines and output objectives, possibly demotivating early career researchers from devoting their projects to evaluating natural experiments. Ultimately this may leave fewer senior researchers with expertise in natural experiments to move this field forward in public health. A better understanding of their use and value is needed to ensure that brave researchers more often evaluate the interventions that hold large promise to inform decisions about population health.

In conclusion, natural experiments provide opportunities to inform policymaking on exposures that are impossible to randomise. While successful evaluations are available in the literature, much can be learned from the barriers that ultimately lead to unexplored opportunities, ceased projects, and unpublished manuscripts. As long as these barriers are not addressed jointly by the research and policy environments, opportunities to provide evidence on global health challenges with extensive societal impact will continue to be missed.


We would like to thank Prof. Dr. Alex Burdorf, Dr. Sam Harper, Prof. Dr. Frank J. van Lenthe, and Prof. Dr. Johan P. Mackenbach for their useful insights and feedback on an early draft of this manuscript.

Funding: FVL is supported by the Erasmus Initiative Smarter Choices for Better Health.

Authorship contributions: FJMM and FVL jointly conceptualised the manuscript. FJMM wrote the first draft, and FVL gave extensive feedback on the manuscript. FJMM and FVL have read and approved the final version of the manuscript.

Competing interests: The authors completed the ICMJE Declaration of Interest Form (available upon request from the corresponding author), and declare no conflicts of interest.


[1] The Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel. 2021. Nobel Prize Outreach AB 2021. Available: Accessed: 23 November 2021.

[2] P Craig, C Cooper, D Gunnell, S Haw, K Lawson, and S Macintyre. Using natural experiments to evaluate population health interventions: new Medical Research Council guidance. J Epidemiol Community Health. 2012;66:1182-6. DOI: 10.1136/jech-2011-200375. [PMID:22577181]

[3] M Petticrew, S Cummins, C Ferrell, A Findlay, C Higgins, and C Hoy. Natural experiments: an underused tool for public health? Public Health. 2005;119:751-7. DOI: 10.1016/j.puhe.2004.11.008. [PMID:15913681]

[4] D Ogilvie, J Adams, A Bauman, EW Gregg, J Panter, and KR Siegel. Using natural experimental studies to guide public health action: turning the evidence-based medicine paradigm on its head. J Epidemiol Community Health. 2020;74:203-8. DOI: 10.1136/jech-2019-213085. [PMID:31744848]

[5] F de Vocht, SV Katikireddi, C McQuire, K Tilling, M Hickman, and P Craig. Conceptualising natural and quasi experiments in public health. BMC Med Res Methodol. 2021;21:32. DOI: 10.1186/s12874-021-01224-x. [PMID:33573595]

[6] JV Been and A Sheikh. COVID-19 must catalyse key global natural experiments. J Glob Health. 2020;10:010104. DOI: 10.7189/jogh.10.010104. [PMID:32355555]

[7] Academy of Medical Sciences London. Improving the health of the public by 2040. Available: Accessed: 01 October 2021.

[8] MA Hernán. The C-Word: Scientific Euphemisms Do Not Improve Causal Inference From Observational Data. Am J Public Health. 2018;108:616-9. DOI: 10.2105/AJPH.2018.304337. [PMID:29565659]

[9] EC Matthay, E Hagan, LM Gottlieb, ML Tan, D Vlahov, and NE Adler. Alternative causal inference methods in population health research: Evaluating tradeoffs and triangulating evidence. SSM Popul Health. 2019;10:100526. DOI: 10.1016/j.ssmph.2019.100526. [PMID:31890846]

[10] D Ogilvie, M Egan, V Hamilton, and M Petticrew. Systematic reviews of health effects of social interventions: 2. Best available evidence: how low should you go? J Epidemiol Community Health. 2005;59:886-92. DOI: 10.1136/jech.2005.034199. [PMID:16166365]

Correspondence to:
Famke J.M. Mölenberg, PhD
Department of Public Health
Erasmus MC, University Medical Center
Rotterdam, P.O. Box 2040, 3000 CA
The Netherlands
[email protected]