How can we improve the evidence that serves as the foundation of safe and effective critical care practice? Scientists, clinicians, and the public have recently focused on problems affecting the quality and trustworthiness of research in many areas of science. Ensuring rigor (adherence to high standards of research methods) and reproducibility (obtaining the same results when experiments are repeated) is crucial to all of science. However, the stakes are highest in research that ultimately affects patient care. The National Institutes of Health (NIH),1 the National Academies of Sciences, Engineering, and Medicine (NASEM),2 and the popular press3,4 have all expressed concerns about rigor and reproducibility. NIH and NASEM have proposed actions aimed at improving the design, conduct, and reporting of research. Consideration of these recommendations will improve any research project, but they are particularly applicable to clinical researchers interested in building a high-quality, trustworthy knowledge base for critical care practice.

The NIH has developed a set of guidelines that are aimed at improving rigor and reproducibility.5  The NIH guidelines took effect in January 2016, and they require research proposals to explicitly address 4 areas: scientific premise, design, consideration of relevant biological variables, and authentication of key biological and/or chemical resources. The scientific premise provides the foundation for new research; a shaky foundation undermines confidence in the new research built upon it. Although it may seem obvious that any research project should be undergirded by previous knowledge in the field, developing a scientific premise requires that researchers carefully consider both the strength and the quality of existing evidence and that they articulate how existing knowledge informs the proposed project. Stating the scientific premise is not a defense of previous research. Rather, it is an examination of the strengths and weaknesses of prior studies that informs the research questions, study design, and analysis of proposed research.

The design of a research study contributes to both rigor and reproducibility. Rigor depends on scrupulous adherence to scientific methods and to the requirements of specific research designs. Both qualitative and quantitative research have standardized approaches to enhance rigor. The NIH guidelines require investigators to explain how choices in study design and methods will lead to “robust and unbiased” results. Justifying decisions about the research plan is important for researchers in clinical settings as well. For example, is the number of study participants large enough to yield a reliable answer to the research question, and how was that number determined? Descriptions of the design and methods must be sufficiently detailed to permit others to understand what will be (or was) done and to replicate the research.
Standardized reporting systems are excellent templates for ensuring that appropriate details of clinical research are explained. The Consolidated Standards of Reporting Trials (CONSORT) statement provides a roadmap for reporting the elements of randomized trials,6 and the Standards for Quality Improvement Reporting Excellence (SQUIRE) guidelines provide a similar reporting structure for quality improvement projects.7 The EQUATOR (Enhancing the QUAlity and Transparency Of health Research) network maintains a library of reporting systems for qualitative and quantitative research on its website.8 Although designed to improve reporting of qualitative research, quantitative research, and quality improvement projects, these systems also provide a strong scaffold for project planning. Considering the reporting elements during development of the study improves the likelihood that the final report will provide the information necessary for reproducibility. Standardized approaches to planning and reporting research contribute to rigor and reproducibility.

The new focus on relevant biological variables in the NIH guidelines is directly applicable to all critical care research. Sex is one biological variable that is specifically called out by the NIH guidelines.1,9 Whereas past efforts have focused on increased inclusion of women as research participants, the new guidelines ask investigators to factor sex into the research design and analysis in a way that permits sex-based comparisons. The goal is to understand how sex influences underlying disease mechanisms and response to interventions. However, sex is not the only biological variable that may be crucial to critical care research. Other biological variables, such as age, underlying comorbid conditions, or body mass index, may be very important and are often understudied. Enhanced attention to the inclusion of biological variables in study design and analysis will inform personalized care for critically ill patients. In addition, other social, behavioral, and environmental factors may play important roles in critical illness; variables such as gender also require additional investigation.

The NASEM report2  on research integrity identifies both misconduct and “detrimental research practices” as threats to rigor and reproducibility. Fabrication, falsification, and plagiarism are clearly misconduct. Detrimental research practices may fall short of clear misconduct but have negative effects on science, society, and health care nonetheless. Examples of detrimental research practices include failure to comply with data sharing policies, misleading statistical analyses, and not disclosing negative findings.

The NIH guidelines and the NASEM report recognize that individual researchers bear responsibility for research integrity. However, the NASEM report identifies crucial roles for institutions where research is conducted, research sponsors, federal agencies, and research journals. Many resources are available to help individuals learn more about rigor and reproducibility. The NIH has established a clearinghouse for rigor and reproducibility training, and it currently offers 2 excellent web-based series of tutorials.10  Institutions must create environments that support critical care research through education, mentoring, monitoring, and modeling rigor and transparency. Every institution should examine how it fosters best practices in research.

Critical care research has some inherent checks and balances that support rigor. Requirements for protections of research participants’ rights, including review and oversight by an institutional review board, focus attention on study design and data integrity. Approval by internal research councils provides an additional level of scrutiny for study quality. Clinical data are frequently used as a source of research data, and one might argue that clinical data may be less susceptible to bias.

On the other hand, reproducibility is a constant problem for critical care research. Even with stringent adherence to the study protocol, critical care environments—and critically ill patients—introduce variability into research. Some variability can be identified and controlled, but other factors, which may influence the results, are difficult to ascertain. Studies conducted in a single unit or a single institution are particularly vulnerable to these unrecognized variables that can jeopardize reproducibility. In moving research findings to the clinical setting, we should carefully evaluate rigor and avoid becoming overly enthusiastic about popular news coverage of unexpected findings from studies with less rigorous designs. We should also temper our enthusiasm for changing practice on the basis of results of a single study, particularly if the study was conducted at a single site; what produced spectacular results in one study may not turn out to be reproducible or ready for broad translation to clinical practice.
We are committed to having the American Journal of Critical Care publish interdisciplinary critical care research that is high quality, rigorous, and reproducible; these attributes are essential for research that is designed to improve practice. Several of our standard processes support this commitment. Our author guidelines assist potential authors with structuring their submissions to address rigor and reproducibility. We encourage authors to use standardized reporting templates such as CONSORT and SQUIRE. We support authors in disclosure of nonsignificant and negative results. Our review processes also support rigor and reproducibility. We carefully review the quality of manuscripts before engaging peer review, and for each paper, we select multiple peer reviewers who are experts in the clinical condition and the research methods used. Guidelines for reviewers are available on the journal’s website. We have a process in place for rating the quality of peer reviews. As scientific and disciplinary norms change, we are further committed to evaluating and improving our processes so that readers are confident in the quality and trustworthiness of what they read in the American Journal of Critical Care.

The statements and opinions contained in this editorial are solely those of the coeditors in chief.

REFERENCES

1. National Institutes of Health. Rigor and reproducibility. Accessed April 26, 2017.
2. The National Academies of Sciences, Engineering, and Medicine. Fostering Integrity in Research. Washington, DC: The National Academies Press; 2017.
3. Dumas-Mallet E, Smith A, Boraud T, Gonon F. Poor replication validity of biomedical association studies reported by newspapers. PLoS ONE. 2017;12:e0172650.
4. Harris R. Rigor Mortis: How Sloppy Science Creates Worthless Cures, Crushes Hope, and Wastes Billions. New York, NY: Basic Books; 2017.
5. National Institutes of Health. Guidance: Rigor and Reproducibility in Grant Applications. Accessed April 26, 2017.
6. Consolidated Standards of Reporting Trials. CONSORT website. Accessed April 26, 2017.
7. Standards for QUality Improvement Reporting Excellence (SQUIRE). Accessed April 26, 2017.
8. EQUATOR network. Enhancing the QUAlity and Transparency Of health Research. Accessed April 26, 2017.
9. Clayton JA. Studying both sexes: a guiding principle for biomedicine. FASEB J. 2016;30(2):519-524.
10. National Institutes of Health. Clearinghouse for training modules to enhance data reproducibility. Accessed April 26, 2017.

Footnotes

FINANCIAL DISCLOSURES

None reported.