Approaches for reporting and interpreting statistically nonsignificant findings in evidence syntheses
A systematic review
Sharifan, A., Dobrescu, A., Harrod, C., Klerings, I., Ong, A. Y., Ngeh, E., Xiao, Y.-T., & Gartlehner, G. (2025). Approaches for reporting and interpreting statistically nonsignificant findings in evidence syntheses: A systematic review. Journal of Clinical Epidemiology, 190, Article 112083. Advance online publication. https://doi.org/10.1016/j.jclinepi.2025.112083
Objectives: To systematically review approaches for reporting and interpreting statistically nonsignificant findings with clinical relevance in evidence synthesis and to assess their methodological quality and the extent of their empirical validation.

Study Design and Setting: We searched Ovid MEDLINE ALL, Scopus, PsycINFO, Library of Guidance for Health Scientists, and MathSciNet for published studies in English from January 1, 2000, to January 30, 2025, for (1) best practices in guidance documents for evidence synthesis when interpreting clinically relevant nonsignificant findings, (2) statistical methods to support the interpretation, and (3) reporting practices. To identify relevant reporting guidelines, we also searched the Enhancing the QUAlity and Transparency Of health Research (EQUATOR) Network. The quality assessment applied the Mixed Methods Appraisal Tool, the Appraisal tool for Cross-Sectional Studies, and checklists for expert opinion and systematic reviews from the Joanna Briggs Institute. At least two reviewers independently conducted all procedures, and a large language model facilitated data extraction and quality appraisal.

Results: Of the 5332 records, 37 were eligible for inclusion. Of these, 15 were editorials or opinion pieces, nine addressed methods, eight were cross-sectional or mixed-methods studies, four were journal guidance documents, and one was a systematic review. Twenty-seven records met the quality criteria of the appraisal tool relevant to their study design or publication type, while 10 records, comprising one systematic review, two editorials or opinion pieces, and seven cross-sectional studies, did not.
Relevant methodological approaches to evidence synthesis included utilization of uncertainty intervals and their integration with various statistical measures (15 of 37, 41%), Bayes factors (six of 37, 16%), likelihood ratios (three of 37, 8%), effect conversion measures (two of 37, 5%), equivalence testing (two of 37, 5%), modified Fisher's test (one of 37, 3%), and the reverse fragility index (one of 37, 3%). Reporting practices included problematic "null acceptance" language (14 of 37, 38%), with some records discouraging the inappropriate claim of no effect based on nonsignificant findings (nine of 37, 24%). None of the proposed methods were empirically tested with interest holders.

Conclusion: Although various approaches have been proposed to improve the presentation and interpretation of statistically nonsignificant findings, a widely accepted consensus has not emerged, as these approaches have yet to be systematically tested for their practicality and validity. This review provides a comprehensive overview of available methodological approaches spanning both the frequentist and Bayesian statistical frameworks and identifies critical gaps in the empirical validation of some approaches, namely the lack of thresholds to guide the interpretation of results. These findings highlight the need for systematic testing of proposed methods with interest holders and the development of evidence-based guidance to support appropriate interpretation of nonsignificant results in evidence synthesis.

© 2025 The Author(s). Published by Elsevier Inc. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
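Among the approaches listed above, equivalence testing illustrates how a nonsignificant difference can be distinguished from demonstrated absence of a meaningful effect. As a minimal sketch only (not the reviewed authors' implementation), the two one-sided tests (TOST) procedure can be run in Python with SciPy; the data, sample sizes, and equivalence margin here are all hypothetical and would need to be chosen from clinical context:

```python
import numpy as np
from scipy import stats

# Hypothetical data: two groups whose true means differ only trivially
rng = np.random.default_rng(42)
a = rng.normal(loc=0.00, scale=1.0, size=1000)
b = rng.normal(loc=0.05, scale=1.0, size=1000)

# Assumed equivalence margin: differences within +/- 0.3 are deemed
# clinically unimportant (this threshold is context-specific)
margin = 0.3

# TOST = two one-sided t-tests against the shifted margins:
#   H0: mean(a) - mean(b) <= -margin  vs  H1: difference > -margin
p_lower = stats.ttest_ind(a, b - margin, alternative="greater").pvalue
#   H0: mean(a) - mean(b) >= +margin  vs  H1: difference < +margin
p_upper = stats.ttest_ind(a, b + margin, alternative="less").pvalue

# Equivalence is concluded only if BOTH one-sided tests reject,
# so the TOST p-value is the larger of the two
p_tost = max(p_lower, p_upper)
print(f"TOST p-value: {p_tost:.4f}")
if p_tost < 0.05:
    print("Difference falls within the equivalence margin.")
```

Unlike simply observing p > 0.05 in a standard test, a significant TOST result supports an affirmative claim that any effect is smaller than the prespecified margin, which is the distinction the reviewed literature draws around "null acceptance" language.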