Abstract
This article discusses several practical issues that arise when applying diagnostic principles to theory-based evaluation (e.g. with Process Tracing and Bayesian Updating). It is structured around three iterative application steps, focusing mostly on the third. While covering different ways evaluators fall victim to confirmation bias and conservatism, the article offers suggestions on which theories can be tested, what kind of empirical material can act as evidence, and how to estimate the Bayes formula values and update confidence, including when working with ranges and qualitative confidence descriptors. The article tackles evidence packages (one of the most problematic practical issues), proposing ways to (a) set the boundaries of single observations that can be considered independent and handled numerically; and (b) handle evidence packages when numerical probability estimates are not available. Some concepts are exemplified using a policy influence process in which an institution’s strategy has been influenced by a knowledge product from another organisation.
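The Bayesian updating the abstract refers to can be sketched in a few lines. The numbers below are purely illustrative (not taken from the article), and the sequential update over an evidence package assumes the observations are independent, which is exactly the boundary-setting problem the article addresses.

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior confidence in hypothesis H after observing evidence E.

    prior            -- P(H) before seeing E
    p_e_given_h      -- P(E | H), the evidence's sensitivity
    p_e_given_not_h  -- P(E | not H), the false-positive rate
    """
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1 - prior))


# Single observation: a piece of evidence more likely under H than under not-H.
posterior = bayes_update(0.5, 0.8, 0.2)  # confidence rises from 0.5 to 0.8

# Evidence package: update sequentially, treating observations as independent
# (illustrative likelihoods; in practice each pair must be estimated).
confidence = 0.5
for p_e_h, p_e_nh in [(0.8, 0.3), (0.6, 0.2)]:
    confidence = bayes_update(confidence, p_e_h, p_e_nh)

# Ranges: the posterior is monotone in each argument, so interval bounds on the
# inputs give a pessimistic and an optimistic posterior.
low = bayes_update(0.4, 0.7, 0.3)   # all inputs at their least favourable end
high = bayes_update(0.6, 0.9, 0.1)  # all inputs at their most favourable end
```

The sequential form shows why the independence boundary matters: if two observations in a package are really one observation counted twice, multiplying their likelihood ratios overstates the confidence gain.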
Original language | English |
---|---|
Pages (from-to) | 499-515 |
Number of pages | 17 |
Journal | Evaluation |
Volume | 26 |
Issue number | 4 |
DOIs | |
Publication status | Published - 1 Oct 2020 |
Keywords
- Bayesian Process Tracing
- Bayesian Updating
- confusion matrix
- diagnostic evaluation
- theory-based evaluation