Taylor & Francis Group
gacr_a_2366281_sm2073.docx (298.29 kB)

Taking it back: A pilot study of a rubric measuring retraction notice quality

Journal contribution posted on 2024-06-26, authored by Alyssa Shi, Brooke Bier, Carrigan Price, Luke Schwartz, Devan Wainright, Audra Whithaus, Alison Abritis, Ivan Oransky, and Misha Angrist

The frequency of scientific retractions has grown substantially in recent years. However, there is as yet no standardized retraction notice format to which journals and their publishers adhere, whether voluntarily or compulsorily. We developed a rubric specifying seven criteria to judge whether retraction notices are easily and freely accessible, informative, and transparent. We mined the Retraction Watch database and evaluated 768 retraction notices from two publishers (Springer and Wiley) across three years (2010, 2015, and 2020). Per our rubric, both publishers tended to score higher on measures of openness/availability, accessibility, and clarity about why a paper was retracted than on acknowledging institutional investigations, confirming whether there was consensus among the authors, and specifying which parts of a given paper warranted retraction. Springer retraction notices appeared to improve over time with respect to the rubric's seven criteria. We observed some discrepancies among raters, indicating the difficulty of developing a robust, objective rubric for evaluating retraction notices.

Funding

The author(s) reported there is no funding associated with the work featured in this article.

Accountability in Research