JMIR Publications recently published "Establishing Institutional Scores With the Rigor and Transparency Index: Large-scale Analysis of Scientific Reporting Quality" in the Journal of Medical Internet Research (JMIR), which reported that improving standards of rigor and transparency should lead to improved reproducibility across scientific publications, but that assessing transparency levels is difficult when the assessment is done manually by reviewers.
The main purpose of the study was to establish a quality benchmark for scientific reporting that can be applied to institutions and countries, and to demonstrate the need for quality reporting to ensure reproducibility in biomedicine, using papers drawn from the Reproducibility Project: Cancer Biology.
The authors discuss the further development of the previously proposed Rigor and Transparency Index (RTI), which attempts to assess the rigor and transparency of journals, institutions, and countries by scoring articles against recommendations found in established reporting guidelines (eg, NIH, MDAR, ARRIVE).
Using the papers from the Reproducibility Project: Cancer Biology, the authors were able to determine that the replication studies scored higher than the original papers on which they were based, all of which, according to the project, required additional information from the original authors before replication attempts could begin.
Unfortunately, the RTI levels of journals, institutions, and countries currently all fall below the average of the replication studies. If the RTI of these replication studies is taken as a target for future manuscripts, considerable work will be needed to ensure that the average manuscript contains enough information for replication attempts.
Dr. Anita Bandrowski of the University of California San Diego said, "Reproducible research is necessary for the advancement of science. However, in the last decade, many reports about the lack of reproducibility in research have shed light on a long-standing problem that has proved to be both troublesome and expensive."
In an effort to encourage reproducible science, many scientific organizations and journals have adopted transparency and openness standards, which focus on establishing best practices at the level of individual journals.
In a similar vein, the publisher-driven Materials Design, Analysis, and Reporting (MDAR) framework is a cross-publisher initiative designed to improve reporting in life science research at the level of individual manuscripts.
This framework provides a standardized minimal reporting set whose principles were used, in part, to create the original RTI, a journal-level quality metric focused on research methods and transparency.
In particular, the authors here present a new version of the RTI, which represents the mean SciScore over a subset of papers, and show how it can be used to assess the reliability of reporting at research institutions.
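To make the metric concrete, here is a minimal sketch of an institution-level RTI computed as described above: the mean SciScore over an entity's papers. The function name, grouping logic, and per-paper scores are illustrative assumptions, not the authors' actual implementation or data.

```python
from collections import defaultdict
from statistics import mean

def rigor_transparency_index(scores):
    """Illustrative RTI: the mean SciScore over a set of papers.

    `scores` is a list of per-paper SciScores (roughly 0-10).
    Raises ValueError when no papers have been scored.
    """
    if not scores:
        raise ValueError("need at least one scored paper")
    return round(mean(scores), 2)

# Hypothetical per-paper SciScores grouped by institution.
papers = [
    ("Univ A", 4.0), ("Univ A", 6.5), ("Univ A", 5.5),
    ("Univ B", 7.0), ("Univ B", 8.5),
]

by_institution = defaultdict(list)
for institution, score in papers:
    by_institution[institution].append(score)

for institution, scores in sorted(by_institution.items()):
    print(institution, rigor_transparency_index(scores))
```

The same aggregation applies unchanged at the journal or country level; only the grouping key differs.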
While it would not be meaningful to simply treat all papers scoring a "2" as irreproducible and all papers scoring an "8" as reproducible, since fields and their methods vary, it can be stated that higher scores are associated with more detailed methods reporting and can therefore more readily support replication attempts.
Joe Menke et al., Establishing Institutional Scores With the Rigor and Transparency Index: Large-scale Analysis of Scientific Reporting Quality, Journal of Medical Internet Research (2022). DOI: 10.2196/37324
Published by JMIR Publications
Citation: Rigor and transparency index: A large-scale analysis of the quality of scientific reporting (2022, July 26). Retrieved 26 July 2022 from https://medicalxpress.com/news/2022-07-rigor-transparency-index-large-scale-analysis.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.