# Reporting Guidelines for Scientific Studies
## Overview
Reporting guidelines are evidence-based recommendations for what information should be included when reporting specific types of research studies. They provide checklists and flow diagrams to ensure complete, accurate, and transparent reporting, which is essential for readers to assess study validity and for other researchers to replicate the work.
The EQUATOR Network (Enhancing the QUAlity and Transparency Of health Research) maintains a comprehensive library of reporting guidelines. Using appropriate reporting guidelines improves manuscript quality and increases the likelihood of publication acceptance.
## Why Use Reporting Guidelines?
### Benefits
**For authors:**
- Ensures nothing important is forgotten
- Increases acceptance rates
- Improves manuscript organization
- Reduces reviewer requests for additional information
**For readers and reviewers:**
- Enables critical appraisal of study validity
- Facilitates systematic reviews and meta-analyses
- Improves understanding of what was actually done
**For science:**
- Enhances reproducibility
- Reduces research waste
- Improves transparency
- Enables better evidence synthesis
### When to Use
- **During study design**: Many guidelines include protocol versions (e.g., SPIRIT for trial protocols)
- **During manuscript drafting**: Use checklist to ensure all items are covered
- **Before submission**: Verify adherence and often submit checklist with manuscript
- **At journal submission**: Many journals require a completed reporting guideline checklist as part of the submission
## Major Reporting Guidelines by Study Type
### CONSORT - Randomized Controlled Trials
**Full name:** Consolidated Standards of Reporting Trials
**When to use:** Any randomized controlled trial (RCT), including pilot and feasibility trials
**Latest version:** CONSORT 2010 (updated statement)
**Key components:**
- **Checklist**: 25 items covering title, abstract, introduction, methods, results, discussion
- **Flow diagram**: Participant flow through enrollment, allocation, follow-up, and analysis
**Main checklist items:**
1. Title identifies study as randomized trial
2. Structured abstract
3. Scientific background and rationale
4. Specific objectives and hypotheses
5. Trial design description (parallel, crossover, factorial, etc.)
6. Eligibility criteria for participants
7. Settings and locations of data collection
8. Interventions described in sufficient detail for replication
9. Primary and secondary outcomes defined
10. Sample size determination and power calculation
11. Randomization sequence generation
12. Allocation concealment mechanism
13. Blinding implementation
14. Statistical methods
15. Participant flow with reasons for dropouts
16. Recruitment dates and follow-up dates
17. Baseline characteristics table
18. Analysis results for each outcome
19. Harms and adverse events
20. Trial limitations
21. Generalizability
22. Interpretation consistent with results
23. Trial registration number
24. Full protocol access
25. Funding sources
**Extensions for specific designs:**
- CONSORT for cluster randomized trials
- CONSORT for non-inferiority and equivalence trials
- CONSORT for pragmatic trials
- CONSORT for crossover trials
- CONSORT for N-of-1 trials
- CONSORT for stepped wedge designs
**Where to access:** http://www.consort-statement.org/
### STROBE - Observational Studies
**Full name:** Strengthening the Reporting of Observational Studies in Epidemiology
**When to use:** Cohort studies, case-control studies, and cross-sectional studies
**Latest version:** STROBE 2007 (widely adopted standard)
**Key study designs covered:**
- **Cohort**: Follow exposed and unexposed groups forward in time
- **Case-control**: Compare exposure history between cases and controls
- **Cross-sectional**: Measure exposure and outcome simultaneously
**Main checklist items (22 items):**
1. Title and abstract indicate study design
2. Background and rationale
3. Objectives
4. Study design with rationale
5. Setting, locations, and dates
6. Eligibility criteria and selection methods
7. Variables clearly defined (outcomes, exposures, confounders)
8. Data sources and measurement methods
9. Bias management strategies
10. Study size justification
11. Handling of quantitative variables
12. Statistical methods including confounding and interactions
13. Sensitivity analyses
14. Participant flow with reasons for non-participation
15. Descriptive data including follow-up time
16. Outcome data
17. Main results with unadjusted and adjusted estimates
18. Other analyses (subgroups, sensitivity analyses)
19. Key results summary
20. Limitations with potential bias discussion
21. Interpretation and generalizability
22. Funding sources and role
**Extensions:**
- STROBE-ME (Molecular Epidemiology)
- RECORD (Routinely collected health data)
- STROBE-RDS (Respondent-driven sampling)
**Where to access:** https://www.strobe-statement.org/
### PRISMA - Systematic Reviews and Meta-Analyses
**Full name:** Preferred Reporting Items for Systematic Reviews and Meta-Analyses
**When to use:** Systematic reviews with or without meta-analysis
**Latest version:** PRISMA 2020 (significant update)
**Key components:**
- **Checklist**: 27 items covering all sections
- **Flow diagram**: Study selection process
**Main sections:**
1. **Title**: Identify as systematic review/meta-analysis
2. **Abstract**: Structured summary
3. **Introduction**: Rationale and objectives
4. **Methods**:
- Eligibility criteria
- Information sources (databases, dates)
- Search strategy (full strategy for at least one database)
- Selection process
- Data collection process
- Data items extracted
- Risk of bias assessment
- Effect measures
- Synthesis methods
- Reporting bias assessment
- Certainty assessment (e.g., GRADE)
5. **Results**:
- Study selection flow diagram
- Study characteristics
- Risk of bias assessment results
- Synthesis results (meta-analysis if applicable)
- Reporting biases
- Certainty of evidence
6. **Discussion**:
- Limitations
- Interpretation
- Implications
**Extensions:**
- PRISMA for Abstracts
- PRISMA for Protocols (PRISMA-P)
- PRISMA for Network Meta-Analyses
- PRISMA for Scoping Reviews (PRISMA-ScR)
- PRISMA for Individual Patient Data
- PRISMA for Diagnostic Test Accuracy
- PRISMA for Equity-focused reviews
**Where to access:** http://www.prisma-statement.org/
### SPIRIT - Study Protocols for Clinical Trials
**Full name:** Standard Protocol Items: Recommendations for Interventional Trials
**When to use:** Protocols for randomized trials and other planned intervention studies
**Latest version:** SPIRIT 2013
**Purpose:** Ensure trial protocols contain complete descriptions before trial begins
**Main checklist domains (33 items in total):**
- Administrative information (title, trial registration, funding)
- Introduction (background, rationale, objectives)
- Methods:
  - Trial design
  - Study setting
  - Eligibility criteria
  - Interventions in detail
  - Outcomes (primary and secondary)
  - Participant timeline
  - Sample size calculation
  - Recruitment strategy
  - Allocation and randomization
  - Blinding
  - Data collection methods
  - Data management
  - Statistical methods
  - Monitoring (data monitoring committee)
  - Harms reporting
  - Auditing
- Ethics and dissemination:
  - Ethics approval
  - Consent procedures
  - Confidentiality
  - Dissemination plans
**Where to access:** https://www.spirit-statement.org/
### STARD - Diagnostic Accuracy Studies
**Full name:** Standards for Reporting of Diagnostic Accuracy Studies
**When to use:** Studies evaluating diagnostic test accuracy
**Latest version:** STARD 2015
**Key checklist items (selected from the 30-item checklist):**
1. Study design identification
2. Background information and objectives
3. Study design description
4. Participant selection criteria and recruitment
5. Data collection methods
6. Index test description and execution
7. Reference standard description
8. Rationale for choosing reference standard
9. Test result definition and cutoffs
10. Flow of participants with timing
11. Baseline demographic and clinical characteristics
12. Cross-tabulation of index test results by reference standard
13. Estimates of diagnostic accuracy with confidence intervals
14. Handling of indeterminate results
15. Adverse events from testing
**Flow diagram:** Shows participant flow and test results
**Where to access:** https://www.equator-network.org/reporting-guidelines/stard/
### TRIPOD - Prediction Model Studies
**Full name:** Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis
**When to use:** Studies developing, validating, or updating prediction models
**Latest version:** TRIPOD 2015
**Types of studies:**
- Model development only
- Model development with validation
- External validation of existing model
- Model update
**Key checklist items (selected from the 22-item checklist):**
1. Title identifies study as prediction model study
2. Abstract summarizes key elements
3. Background and objectives
4. Data source and participants
5. Outcome definition
6. Predictors (candidate and selected)
7. Sample size justification
8. Missing data handling
9. Model building procedure
10. Model specification (equation or algorithm)
11. Model performance measures
12. Risk groups if used
13. Participant flow diagram
14. Model development results
15. Model performance
16. Model updating if applicable
**Where to access:** https://www.tripod-statement.org/
### ARRIVE - Animal Research
**Full name:** Animal Research: Reporting of In Vivo Experiments
**When to use:** All in vivo animal studies
**Latest version:** ARRIVE 2.0 (2020 update)
**Two sets of items:**
**ARRIVE Essential 10** (minimum requirements):
1. Study design
2. Sample size calculation
3. Inclusion and exclusion criteria
4. Randomization
5. Blinding
6. Outcome measures
7. Statistical methods
8. Experimental animals (species, strain, sex, age)
9. Experimental procedures
10. Results and interpretation
**ARRIVE Recommended Set** (additional items for full reporting):
- Abstract, background, objectives
- Ethics statement
- Housing and husbandry
- Animal care and monitoring
- Interpretation and generalizability
- Protocol registration
- Data access
**Where to access:** https://arriveguidelines.org/
### CARE - Case Reports
**Full name:** CAse REport Guidelines
**When to use:** Case reports and case series
**Latest version:** CARE 2013
**Key checklist items (selected from the 13-item checklist):**
1. Title with "case report"
2. Abstract summarizing case
3. Introduction with case background
4. Patient information (demographics, primary concern)
5. Clinical findings
6. Timeline of events
7. Diagnostic assessment
8. Therapeutic intervention
9. Follow-up and outcomes
10. Discussion with strengths and limitations
11. Patient perspective
12. Informed consent
**Where to access:** https://www.care-statement.org/
### SQUIRE - Quality Improvement Studies
**Full name:** Standards for QUality Improvement Reporting Excellence
**When to use:** Healthcare quality improvement reports
**Latest version:** SQUIRE 2.0 (2015)
**Main sections (18 items):**
1. Title and abstract
2. Introduction (problem description, available knowledge, rationale, objectives)
3. Methods (context, intervention, study design, measures, analysis, ethical review)
4. Results (intervention, outcomes)
5. Discussion (summary, interpretation, limitations, conclusions)
6. Other information (funding)
**Where to access:** http://www.squire-statement.org/
### CHEERS - Economic Evaluations
**Full name:** Consolidated Health Economic Evaluation Reporting Standards
**When to use:** Health economic evaluations
**Latest version:** CHEERS 2022 (major update from 2013)
**Key checklist items (selected from the 28-item checklist):**
1. Title identification as economic evaluation
2. Abstract
3. Background and objectives
4. Target population and subgroups
5. Setting and location
6. Study perspective
7. Comparators
8. Time horizon
9. Discount rate
10. Selection of outcomes
11. Measurement of effectiveness
12. Measurement and valuation of costs
13. Currency and price adjustments
14. Choice of model
15. Assumptions
16. Analytical methods
**Where to access:** https://www.equator-network.org/reporting-guidelines/cheers/
### SRQR - Qualitative Research
**Full name:** Standards for Reporting Qualitative Research
**When to use:** Qualitative and mixed methods research
**Latest version:** SRQR 2014
**Main sections:**
- Title and abstract
- Introduction (problem formulation, purpose)
- Methods (qualitative approach, researcher characteristics, context, sampling strategy, ethical issues, data collection, data analysis, trustworthiness)
- Results (synthesis and interpretation, links to empirical data)
- Discussion (limitations, implications)
**Alternative:** COREQ (Consolidated criteria for reporting qualitative research) for interviews and focus groups
**Where to access:** https://www.equator-network.org/reporting-guidelines/srqr/
## How to Use Reporting Guidelines
### During Study Planning
1. **Identify relevant guideline** based on study design
2. **Review checklist items** that require planning (e.g., randomization, blinding)
3. **Design study** to ensure all required elements will be captured
4. **Consider protocol guidelines** (e.g., SPIRIT for trials)
### During Manuscript Drafting
1. **Download checklist** from guideline website
2. **Work through each item** systematically
3. **Note where each item is addressed** in manuscript (page/line numbers)
4. **Revise manuscript** to include missing items
5. **Use flow diagrams** as appropriate
### Before Submission
1. **Complete formal checklist** with page numbers
2. **Confirm all items** are adequately addressed
3. **Include checklist** with submission if the journal requires it
4. **Note guideline adherence** in cover letter or methods
### Example Checklist Entry
```
Item 7: Eligibility criteria for participants, and the settings and locations where the data were collected
Page 6, lines 112-125: "Participants were community-dwelling adults aged 60-85 years with mild cognitive impairment (MCI) as defined by Petersen criteria. Exclusion criteria included dementia diagnosis, major psychiatric disorders, or unstable medical conditions. Recruitment occurred from three memory clinics in Boston, MA, between January 2022 and December 2023."
```
## Finding the Right Guideline
### EQUATOR Network Search
**Website:** https://www.equator-network.org/
**How to use:**
1. Select your study design from the wizard
2. Browse by health research category
3. Search for specific keywords
4. Filter by guideline status (development stage)
### By Study Design
| If your study is a... | Use this guideline |
|----------------------|-------------------|
| Randomized controlled trial | CONSORT |
| Cohort, case-control, or cross-sectional study | STROBE |
| Systematic review or meta-analysis | PRISMA |
| Protocol for a trial | SPIRIT |
| Diagnostic accuracy study | STARD |
| Prediction model study | TRIPOD |
| Animal study | ARRIVE |
| Case report | CARE |
| Quality improvement study | SQUIRE |
| Economic evaluation | CHEERS |
| Qualitative research | SRQR or COREQ |
### Multiple Guidelines
**Some studies may require multiple guidelines:**
**Example 1:** Pilot RCT with qualitative component
- CONSORT for quantitative arm
- SRQR for qualitative component
**Example 2:** Systematic review of diagnostic tests
- PRISMA for review methods
- STARD considerations for included studies
## Extensions and Adaptations
Many reporting guidelines have extensions for specific contexts:
### CONSORT Extensions (examples)
- **CONSORT for Abstracts**: Structured abstracts for RCT reports
- **CONSORT for Harms**: Reporting adverse events
- **CONSORT-EHEALTH**: eHealth interventions
- **CONSORT-SPI**: Social and psychological interventions
### PRISMA Extensions (examples)
- **PRISMA-P**: Protocols for systematic reviews
- **PRISMA for Abstracts**: Conference abstracts
- **PRISMA-NMA**: Network meta-analyses
- **PRISMA-IPD**: Individual patient data reviews
- **PRISMA-S**: Search strategies
- **PRISMA-DTA**: Diagnostic test accuracy reviews
### STROBE Extensions (examples)
- **STROBE-ME**: Molecular epidemiology
- **RECORD**: Routinely collected health data
## Creating Flow Diagrams
### CONSORT Flow Diagram
**Four stages:**
1. **Enrollment**: Assessed for eligibility
2. **Allocation**: Randomly assigned to groups
3. **Follow-up**: Received intervention, lost to follow-up
4. **Analysis**: Included in analysis
**Example:**
```
Assessed for eligibility (n=250)
   │
   ├── Excluded (n=50)
   │     • Did not meet criteria (n=30)
   │     • Declined to participate (n=15)
   │     • Other reasons (n=5)
   ↓
Randomized (n=200)
   ┌─────────────────────┬─────────────────────┐
   ↓                     ↓                     ↓
Allocated to          Allocated to          Allocated to
Intervention A        Intervention B        Control
(n=67)                (n=66)                (n=67)
   ↓                     ↓                     ↓
Lost to follow-up     Lost to follow-up     Lost to follow-up
(n=3)                 (n=5)                 (n=2)
   ↓                     ↓                     ↓
Analyzed              Analyzed              Analyzed
(n=64)                (n=61)                (n=65)
```
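Flow diagrams are usually drawn from the templates on the guideline websites, but they can also be generated programmatically so the counts stay in sync with the data. The sketch below is a minimal, illustrative example using the Python `graphviz` package (an assumed tool choice, not part of any guideline); the arm names and counts are placeholders, and lost-to-follow-up is drawn inline rather than in the side boxes used by the official template.

```python
# Minimal sketch of a CONSORT-style participant flow using the graphviz package.
# Counts and arm names are placeholders taken from the example above.
from graphviz import Digraph

dot = Digraph(comment="CONSORT participant flow")
dot.attr("node", shape="box")

dot.node("elig", "Assessed for eligibility (n=250)")
dot.node("excl", "Excluded (n=50)")
dot.node("rand", "Randomized (n=200)")
dot.edge("elig", "excl")
dot.edge("elig", "rand")

arms = [("Intervention A", 67, 3, 64), ("Intervention B", 66, 5, 61), ("Control", 67, 2, 65)]
for i, (arm, allocated, lost, analyzed) in enumerate(arms):
    dot.node(f"alloc{i}", f"Allocated to {arm} (n={allocated})")
    dot.node(f"lost{i}", f"Lost to follow-up (n={lost})")
    dot.node(f"anal{i}", f"Analyzed (n={analyzed})")
    dot.edge("rand", f"alloc{i}")
    dot.edge(f"alloc{i}", f"lost{i}")
    dot.edge(f"lost{i}", f"anal{i}")

dot.render("consort_flow", format="pdf", cleanup=True)  # writes consort_flow.pdf
```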
### PRISMA Flow Diagram
**Stages:**
1. **Identification**: Records from databases and registers
2. **Screening**: Records screened, excluded
3. **Included**: Studies included in review and synthesis
**New features in PRISMA 2020:**
- Separate tracking for database and register searches
- Tracking of duplicate removal
- Clear distinction between reports and studies
## Common Mistakes and How to Avoid Them
### Mistake 1: Not Using Guidelines at All
**Impact:** Missing critical information, lower chance of acceptance
**Solution:** Identify and use appropriate guideline from study planning stage
### Mistake 2: Using Guidelines Only After Manuscript is Complete
**Impact:** May realize key data were not collected or documented
**Solution:** Review guidelines during study design and data collection
### Mistake 3: Incomplete Checklist Completion
**Impact:** Missed items remain unreported
**Solution:** Systematically address every single checklist item
### Mistake 4: Using Outdated Guidelines
**Impact:** Missing recent improvements in reporting standards
**Solution:** Always check for latest version on official guideline website
### Mistake 5: Using Wrong Guideline for Study Design
**Impact:** Important design-specific elements not reported
**Solution:** Carefully match study design to appropriate guideline
### Mistake 6: Not Submitting Checklist When Required
**Impact:** Editorial desk rejection or delays
**Solution:** Check journal submission guidelines and include checklist
### Mistake 7: Generic Reporting Without Specificity
**Impact:** Insufficient detail for replication or appraisal
**Solution:** Provide specific, detailed information for each item
## Journal Requirements
### Many Journals Now Require:
1. **Statement of adherence** to reporting guidelines in Methods
2. **Completed checklist** uploaded as supplementary file
3. **Page/line numbers** on checklist indicating where items are addressed
4. **Flow diagrams** as figures in manuscript
### Example Methods Statement:
```
"This study is reported in accordance with the Strengthening the Reporting of
Observational Studies in Epidemiology (STROBE) statement. A completed STROBE
checklist is provided as Supplementary File 1."
```
### Journals with Strong Requirements:
- PLOS journals (require checklists for specific designs)
- BMJ (requires CONSORT, PRISMA, and others)
- The Lancet (requires adherence statements)
- JAMA and JAMA Network journals (require checklists)
- Nature portfolio journals (encourage guidelines)
## Resources
### Official Guideline Websites
- **EQUATOR Network**: https://www.equator-network.org/
- **CONSORT**: http://www.consort-statement.org/
- **STROBE**: https://www.strobe-statement.org/
- **PRISMA**: http://www.prisma-statement.org/
- **SPIRIT**: https://www.spirit-statement.org/
- **ARRIVE**: https://arriveguidelines.org/
- **CARE**: https://www.care-statement.org/
### Training Materials
- EQUATOR Network provides webinars and training resources
- Many guidelines have explanatory papers published in medical journals
- Universities often provide workshops on reporting guidelines
### Software Tools
- **Some reference managers** can insert reporting guideline citations
- **Covidence, RevMan** for systematic review reporting
- **PRISMA flow diagram generator**: http://prisma.thetacollaborative.ca/
## Checklist: Using Reporting Guidelines
**Before starting your study:**
- [ ] Identified appropriate reporting guideline(s)
- [ ] Reviewed checklist items requiring prospective planning
- [ ] Designed study to capture all required elements
- [ ] Registered protocol if applicable
**During manuscript drafting:**
- [ ] Downloaded latest version of guideline checklist
- [ ] Systematically addressed each checklist item
- [ ] Created required flow diagram
- [ ] Noted where each item is addressed (page/line)
**Before submission:**
- [ ] Completed formal checklist with page numbers
- [ ] Verified all items adequately addressed
- [ ] Included adherence statement in Methods
- [ ] Prepared checklist as supplementary file if required
- [ ] Checked journal-specific requirements
- [ ] Mentioned guideline adherence in cover letter
## Venue-Specific Reporting Requirements
### Reporting Standards by Venue Type
| Venue Type | Guideline Use | Transparency Requirements |
|-----------|--------------|---------------------------|
| **Medical journals** | Mandatory (CONSORT, STROBE, etc.) | Checklist required at submission |
| **PLOS/BMC** | Mandatory for study types | Checklist uploaded as supplement |
| **Nature/Science** | Recommended | Methods completeness emphasized |
| **ML conferences** | No formal guidelines | Reproducibility details required |
### ML Conference Reporting Standards
**NeurIPS/ICML/ICLR reproducibility requirements:**
- **Datasets**: Names, versions, access methods, preprocessing
- **Code**: Availability statement; GitHub common
- **Hyperparameters**: All settings reported (learning rate, batch size, etc.)
- **Seeds**: Random seeds for reproducibility
- **Computational resources**: GPUs used, training time
- **Statistical significance**: Error bars, confidence intervals, multiple runs
- **Broader Impact** statement (NeurIPS): Societal implications
**What to include (typically in an appendix; see the sketch after this list):**
- Complete hyperparameter settings
- Training details and convergence criteria
- Hardware specifications
- Software versions (PyTorch 2.0, etc.)
- Dataset splits and any preprocessing
- Evaluation metrics and protocols
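Much of this bookkeeping can be automated. The following is a rough sketch, assuming a PyTorch workflow, of fixing the relevant random seeds and dumping the hyperparameters, library versions, and hardware into a file saved with the results; the helper and file names are illustrative, not a standard API.

```python
# Illustrative reproducibility bookkeeping: fix seeds and record the configuration
# so the settings reported in the appendix match what the code actually used.
import json
import platform
import random

import numpy as np
import torch


def set_seed(seed: int) -> None:
    """Seed the Python, NumPy, and PyTorch RNGs (our own helper, not a library API)."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)


config = {
    "seed": 42,
    "learning_rate": 3e-4,
    "batch_size": 128,
    "epochs": 50,
    "python": platform.python_version(),
    "torch": torch.__version__,
    "device": torch.cuda.get_device_name(0) if torch.cuda.is_available() else "cpu",
}

set_seed(config["seed"])

# Saved alongside the results so the reported details can be checked against the artifact.
with open("run_config.json", "w") as f:
    json.dump(config, f, indent=2)
```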
### Enforcement and Evaluation
**What gets checked:**
- **Medical journals**: Checklist uploaded; adherence statement in Methods; systematic completeness
- **PLOS/BMC**: Mandatory checklists for certain designs; reproducibility emphasized
- **High-impact journals (e.g., Nature, Science)**: Methods sufficiency for replication (checklist often not required)
- **ML conferences**: Reproducibility checklist (NeurIPS); code availability increasingly expected
**Common issues leading to rejection:**
- Missing required checklists (medical journals)
- Insufficient methods detail for reproduction
- Missing key information (randomization, blinding, power calculation)
- No data/code availability statement when required
**Methods statement examples:**
**Journal (STROBE):**
```
This study followed STROBE reporting guidelines. Checklist provided in Supplement 1.
```
**ML conference (reproducibility):**
```
Code available at github.com/user/project. All hyperparameters in Appendix A.
Training used 4×A100 GPUs (~20 hours). Seeds: {42, 123, 456}.
```
### Pre-Submission Reporting Checklist
**For clinical trials (medical journals):**
- [ ] CONSORT checklist complete with page numbers
- [ ] Trial registration number in abstract and methods
- [ ] CONSORT flow diagram included
- [ ] Statistical analysis plan described
- [ ] Adherence statement in Methods
**For observational studies (medical/epidemiology):**
- [ ] STROBE checklist complete
- [ ] Study design clearly stated
- [ ] Statistical methods detailed
- [ ] Confounders addressed
- [ ] Adherence statement in Methods
**For systematic reviews:**
- [ ] PRISMA checklist complete
- [ ] PRISMA flow diagram included
- [ ] Protocol registered (PROSPERO)
- [ ] Search strategy documented
- [ ] Risk of bias assessment included
**For ML conference papers:**
- [ ] All datasets named with versions
- [ ] Code availability stated (GitHub link if available)
- [ ] Hyperparameters listed (appendix acceptable)
- [ ] Random seeds reported
- [ ] Computational resources specified
- [ ] Error bars/confidence intervals shown (see the sketch after this checklist)
- [ ] Broader Impact statement (if required)
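As one way to satisfy the error-bar item above, the short sketch below computes a mean and 95% confidence interval over accuracies from multiple seeded runs; the scores are made-up values, and the t-interval is just one reasonable choice when the number of runs is small.

```python
# Hypothetical example: summarize results across seeded runs with a mean and 95% CI
# instead of reporting a single-run point estimate.
import numpy as np
from scipy import stats

# Test accuracies from three runs with seeds {42, 123, 456} (made-up values).
scores = np.array([0.842, 0.851, 0.847])

mean = scores.mean()
sem = stats.sem(scores)  # standard error of the mean
ci_low, ci_high = stats.t.interval(0.95, df=len(scores) - 1, loc=mean, scale=sem)

print(f"Accuracy: {mean:.3f} (95% CI {ci_low:.3f}-{ci_high:.3f}, n={len(scores)} seeds)")
```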