Unpacking the modelling process via sensitivity auditing

Lo Piano, S. (ORCID: https://orcid.org/0000-0002-2625-483X), Sheikholeslami, R., Puy, A. and Saltelli, A. (2022) Unpacking the modelling process via sensitivity auditing. Futures, 144, 103041. ISSN 0016-3287
DOI: 10.1016/j.futures.2022.103041

Abstract/Summary

Acknowledging the conditionality of model-based evidence facilitates dialogue between model developers and model users, especially when models are used to guide decisions at the science-policy interface. In general, model users have limited access to verify the realism of a model, being exposed only to its plausibility and trustworthiness; modellers, instead, have an array of validation and verification techniques at their disposal. Ultimately, model credibility is what both developers and users aim for, also in the interest of guarding against the pitfall of over-interpreting model results. To this end, in this contribution we discuss sensitivity auditing, an extension of sensitivity analysis, which can help model developers and users overcome communication barriers and foster dialogue around modelling activities. The use of sensitivity auditing is not limited to models in a restricted sense: it can be applied to any policy-relevant instance of quantification, including metrics, rankings and indicators. We present six real-world applications of sensitivity auditing to instances of quantification in a range of socio-environmental systems, including public health, education, and the water-food nexus. These examples reveal the usefulness of sensitivity auditing in facilitating the proper use of numbers and models at the science-policy-society interface and in avoiding uncertainty laundering.