Initial commit

Zhongwei Li
2025-11-30 08:30:18 +08:00
commit 74bee324ab
335 changed files with 147377 additions and 0 deletions

# Research Grants Skill
## Overview
Comprehensive skill for writing competitive research grant proposals focused on four major U.S. funding agencies:
- **NSF** (National Science Foundation)
- **NIH** (National Institutes of Health)
- **DOE** (Department of Energy)
- **DARPA** (Defense Advanced Research Projects Agency)
## What This Skill Provides
### Agency-Specific Guidance
Detailed reference materials for each funding agency including:
- Mission and priorities
- Review criteria and scoring
- Proposal structure and page limits
- Budget requirements
- Submission processes
- Tips for competitive applications
### Core Components
- **Specific Aims Pages** (NIH): Template and detailed guide for the critical 1-page aims page
- **Project Summaries** (NSF): Template for the required Overview, Intellectual Merit, and Broader Impacts
- **Broader Impacts**: Comprehensive strategies for NSF's equally weighted review criterion
- **Budget Justification**: Templates and examples for personnel, equipment, travel, and supplies
- **Review Criteria**: Understanding what reviewers look for at each agency
### Templates
Ready-to-use templates for:
- NSF Project Summary
- NIH Specific Aims Page
- Budget Justifications
- (Additional templates in development)
## How to Use This Skill
### Quick Start
When writing a grant proposal, specify the agency and grant type:
```
> Help me write an NSF proposal for computational biology research
> I need to draft NIH R01 Specific Aims for my cancer research project
> What should I include in a DOE ARPA-E concept paper?
> I'm applying for a DARPA program - help me structure the proposal
```
### Detailed Guidance
For in-depth help on specific components:
```
> Help me write compelling broader impacts for my NSF proposal
> Review my NIH Specific Aims page
> What should I include in my budget justification?
> How do I respond to reviewer comments in an NIH resubmission?
```
### Agency Comparison
```
> What are the key differences between NSF and NIH proposals?
> Should I apply to DOE or DARPA for my energy technology project?
```
## Key Features
### NSF Proposals
- **Intellectual Merit + Broader Impacts** (equally weighted)
- Strategies for substantive, measurable broader impacts
- Integration of research and education
- Broadening participation in STEM
- 15-page project description limits (most programs)
### NIH Proposals
- **Specific Aims Page**: The most critical page (detailed 1-page guide included)
- **Research Strategy**: Significance, Innovation, Approach sections
- **Preliminary Data**: Essential for R01 applications
- Rigor and reproducibility requirements
- Modular vs. detailed budgets
- Resubmission strategies (A1 applications)
### DOE Proposals
- **Energy relevance** and alignment with DOE mission
- **Technology readiness levels** (TRLs)
- National laboratory collaborations
- Cost sharing requirements (especially ARPA-E)
- Commercialization pathways
- User facilities access
### DARPA Proposals
- **DARPA-hard problems**: High-risk, high-reward
- **Heilmeier Catechism**: The 8 critical questions
- Program Manager engagement (critical!)
- Phase-based structure with milestones
- Technology transition planning
- Demonstration and prototypes
## Reference Materials
### Agency Guidelines
- `references/nsf_guidelines.md` - Comprehensive NSF guidance
- `references/nih_guidelines.md` - NIH mechanisms and review criteria
- `references/doe_guidelines.md` - DOE offices and programs
- `references/darpa_guidelines.md` - DARPA structure and strategy
### Specialized Guides
- `references/broader_impacts.md` - NSF broader impacts strategies
- `references/specific_aims_guide.md` - NIH Specific Aims page mastery
- `references/budget_preparation.md` - Budget development (coming soon)
- `references/review_criteria.md` - Comparative review criteria (coming soon)
- `references/timeline_planning.md` - Project management (coming soon)
### Templates
- `assets/nsf_project_summary_template.md`
- `assets/nih_specific_aims_template.md`
- `assets/budget_justification_template.md`
## Success Metrics
Typical success rates by agency:
- **NSF**: 15-30% (varies by program)
- **NIH R01**: ~20% overall (~27% for Early Stage Investigators)
- **DOE Office of Science**: 20-40% (varies by program)
- **ARPA-E**: 2-5% (concept papers to awards)
- **DARPA**: Highly variable by program
## Common Use Cases
### First-Time Applicants
```
> I've never written a grant before. Help me understand NSF proposal structure.
> What are the most common mistakes in first NIH R01 applications?
```
### Experienced Investigators
```
> Help me strengthen the innovation section for my NIH resubmission
> I need to address broader impacts more substantively for NSF
> What's the best way to show technology transition for DARPA?
```
### Career Development
```
> Help me write a competitive NSF CAREER proposal
> What should I emphasize in an NIH K99/R00 application?
```
### Multi-Agency Strategy
```
> Should I submit this to NSF or NIH?
> Can I submit similar proposals to DOE and DARPA?
```
## Best Practices
### Start Early
- NSF/NIH proposals: start 3-6 months before the deadline
- DOE/DARPA proposals: start 4-6 months ahead (especially if national labs are involved)
### Get Feedback
- Mock review sessions
- Colleagues in and outside your field
- Institutional grant support offices
- Program officers (when appropriate)
### Understand Review Criteria
- NSF: Intellectual Merit + Broader Impacts (equal weight)
- NIH: Significance, Investigator, Innovation, Approach, Environment (scored 1-9)
- DOE: Technical merit, qualifications, budget, relevance
- DARPA: Innovation, impact, team, feasibility, transition
### Common Success Factors
✅ Clear, compelling significance and innovation
✅ Strong preliminary data (NIH, DOE)
✅ Detailed, rigorous methodology
✅ Realistic timeline and budget
✅ Specific, measurable outcomes
✅ Strong team with relevant expertise
✅ Integration of broader impacts (NSF)
✅ Technology transition plan (DOE, DARPA)
## Integration with Other Skills
This skill works well with:
- **Scientific Writing**: For clear, compelling prose
- **Literature Review**: For background sections
- **Research Lookup**: For finding relevant citations
- **Peer Review**: For self-assessment before submission
## Updates and Additions
This skill is continuously updated with:
- Current agency priorities
- Recent policy changes
- New funding mechanisms
- Additional templates and examples
### Coming Soon
- More budget examples
- Timeline templates
- Collaboration letter templates
- Data management plan templates
- Facilities and equipment description templates
## Tips for Maximum Effectiveness
### For NSF Proposals
1. Start with Specific Aims/Objectives (even though not required)
2. Develop broader impacts with same rigor as research plan
3. Use figures and diagrams liberally (make it skimmable)
4. Address both review criteria explicitly
5. Get feedback from outside your immediate field
### For NIH Proposals
1. Perfect your Specific Aims page first (10+ drafts)
2. Include substantial preliminary data
3. Address rigor and reproducibility explicitly
4. Identify potential problems proactively with alternatives
5. Make sure your aims are independent but synergistic
### For DOE Proposals
1. Emphasize energy relevance and impact
2. Include quantitative metrics (cost, efficiency, emissions)
3. Develop pathway to deployment or commercialization
4. Consider national laboratory partnerships
5. Address technology readiness levels
### For DARPA Proposals
1. Contact the Program Manager early (essential!)
2. Attend Proposers Day events
3. Focus on breakthrough innovation (10x, not 10%)
4. Answer the Heilmeier Catechism explicitly
5. Develop clear transition strategy
## Resources Beyond This Skill
### Official Resources
- NSF: https://www.nsf.gov/funding/
- NIH: https://grants.nih.gov/
- DOE: https://science.osti.gov/grants/
- DARPA: https://www.darpa.mil/work-with-us/opportunities
### Institutional Resources
- Your institution's Office of Sponsored Research
- Grant writing workshops
- Internal review programs
- Successful proposal archives
### Professional Development
- Grant writing courses and webinars
- Agency-specific guidance documents
- Professional society resources
- Mentoring networks
## Questions or Issues?
This skill is designed to be comprehensive but may not cover every specific situation. When using this skill:
1. **Be specific** about your agency, program, and grant type
2. **Provide context** about your research area and career stage
3. **Ask follow-up questions** for clarification
4. **Request examples** for specific sections you're working on
## Version History
- **v1.0** (January 2025): Initial release with NSF, NIH, DOE, DARPA guidance
- Comprehensive reference materials for all four agencies
- Templates for key proposal components
- Specific Aims and Broader Impacts detailed guides
---
**Remember**: Grant writing is both an art and a science. This skill provides the frameworks, strategies, and best practices—but your unique research vision, preliminary data, and team expertise are what will ultimately win funding. Start early, seek feedback, revise extensively, and don't be discouraged by rejection. Even the most successful scientists face many declined proposals before achieving funding success.
Good luck with your proposals! 🎯

---
name: research-grants
description: "Write competitive research proposals for NSF, NIH, DOE, and DARPA. Agency-specific formatting, review criteria, budget preparation, broader impacts, significance statements, innovation narratives, and compliance with submission requirements."
allowed-tools: [Read, Write, Edit, Bash]
---
# Research Grant Writing
## Overview
Research grant writing is the process of developing competitive funding proposals for federal agencies and foundations. Master agency-specific requirements, review criteria, narrative structure, budget preparation, and compliance for NSF (National Science Foundation), NIH (National Institutes of Health), DOE (Department of Energy), and DARPA (Defense Advanced Research Projects Agency) submissions.
**Critical Principle: Grants are persuasive documents that must simultaneously demonstrate scientific rigor, innovation, feasibility, and broader impact.** Each agency has distinct priorities, review criteria, formatting requirements, and strategic goals that must be addressed.
## When to Use This Skill
This skill should be used when:
- Writing research proposals for NSF, NIH, DOE, or DARPA programs
- Preparing project descriptions, specific aims, or technical narratives
- Developing broader impacts or significance statements
- Creating research timelines and milestone plans
- Preparing budget justifications and personnel allocation plans
- Responding to program solicitations or funding announcements
- Addressing reviewer comments in resubmissions
- Planning multi-institutional collaborative proposals
- Writing preliminary data or feasibility sections
- Preparing biosketches, CVs, or facilities descriptions
## Visual Enhancement with Scientific Schematics
**When creating documents with this skill, always consider adding scientific diagrams and schematics to enhance visual communication.**
If your document does not already contain schematics or diagrams:
- Use the **scientific-schematics** skill to generate AI-powered publication-quality diagrams
- Simply describe your desired diagram in natural language
- Nano Banana Pro will automatically generate, review, and refine the schematic
**For new documents:** Scientific schematics should be generated by default to visually represent key concepts, workflows, architectures, or relationships described in the text.
**How to generate schematics:**
```bash
python scripts/generate_schematic.py "your diagram description" -o figures/output.png
```
The AI will automatically:
- Create publication-quality images with proper formatting
- Review and refine through multiple iterations
- Ensure accessibility (colorblind-friendly, high contrast)
- Save outputs in the figures/ directory
**When to add schematics:**
- Research methodology and workflow diagrams
- Project timeline Gantt charts
- Conceptual framework illustrations
- System architecture diagrams (for technical proposals)
- Experimental design flowcharts
- Broader impacts activity diagrams
- Collaboration network diagrams
- Any complex concept that benefits from visualization
For detailed guidance on creating schematics, refer to the scientific-schematics skill documentation.
---
## Agency-Specific Overview
### NSF (National Science Foundation)
**Mission**: Promote the progress of science and advance national health, prosperity, and welfare
**Key Features**:
- Intellectual Merit + Broader Impacts (equally weighted)
- 15-page project description limit (most programs)
- Emphasis on education, diversity, and societal benefit
- Collaborative research encouraged
- Open data and open science emphasis
- Merit review process with panel + ad hoc reviewers
### NIH (National Institutes of Health)
**Mission**: Enhance health, lengthen life, and reduce illness and disability
**Key Features**:
- Specific Aims (1 page) + Research Strategy (12 pages for R01)
- Significance, Innovation, Approach as core review criteria
- Preliminary data typically required for R01s
- Emphasis on rigor, reproducibility, and clinical relevance
- Modular budgets ($25K modules, up to $250K direct costs per year) for most R01s
- Multiple resubmission opportunities
### DOE (Department of Energy)
**Mission**: Ensure America's security and prosperity through energy, environmental, and nuclear challenges
**Key Features**:
- Focus on energy, climate, computational science, basic energy sciences
- Often requires cost sharing or industry partnerships
- Emphasis on national laboratory collaboration
- Strong computational and experimental integration
- Energy innovation and commercialization pathways
- Varies by office (ARPA-E, Office of Science, EERE, etc.)
### DARPA (Defense Advanced Research Projects Agency)
**Mission**: Make pivotal investments in breakthrough technologies for national security
**Key Features**:
- High-risk, high-reward transformative research
- Focus on "DARPA-hard" problems (what happens if you succeed, and who cares?)
- Emphasis on prototypes, demonstrations, and transition paths
- Often requires multiple phases (feasibility, development, demonstration)
- Strong project management and milestone tracking
- Teaming and collaboration often required
- Varies dramatically by program manager and BAA (Broad Agency Announcement)
## Core Components of Research Proposals
### 1. Executive Summary / Project Summary / Abstract
Every proposal needs a concise overview that communicates the essential elements of the research to both technical reviewers and program officers.
**Purpose**: Provide a standalone summary that captures the research vision, significance, and approach
**Length**:
- NSF: 1 page (Project Summary with separate Overview, Intellectual Merit, Broader Impacts)
- NIH: 30 lines (Project Summary/Abstract)
- DOE: Varies (typically 1 page)
- DARPA: Varies (often 1-2 pages)
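These limits are easy to drift past during editing. A minimal pre-submission length check is sketched below, using the figures quoted above; the characters-per-line wrapping width is an illustrative assumption, and the actual limits must always be confirmed against the current solicitation:

```python
import textwrap

# Limits as quoted in this guide; confirm against the current solicitation.
SUMMARY_LIMITS = {
    "NSF": {"pages": 1},    # Project Summary: 1 page
    "NIH": {"lines": 30},   # Project Summary/Abstract: 30 lines
    "DOE": {"pages": 1},    # typically 1 page; varies by office
    "DARPA": {"pages": 2},  # often 1-2 pages; set by the BAA
}

def nih_line_count(text: str, chars_per_line: int = 95) -> int:
    """Estimate how many lines an abstract occupies when wrapped.

    chars_per_line is an illustrative assumption (roughly 11-pt type with
    standard margins); adjust it to match your actual template.
    """
    lines = 0
    for paragraph in text.strip().split("\n"):
        wrapped = textwrap.wrap(paragraph, width=chars_per_line) or [""]
        lines += len(wrapped)
    return lines

def check_nih_abstract(text: str) -> bool:
    """True if the abstract fits the 30-line NIH limit under the wrap model."""
    return nih_line_count(text) <= SUMMARY_LIMITS["NIH"]["lines"]
```

Run this against a draft abstract before pasting it into the submission system, since the system's own line counting is what ultimately governs.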
**Essential Elements**:
- Clear statement of the problem or research question
- Why this problem matters (significance, urgency, impact)
- Novel approach or innovation
- Expected outcomes and deliverables
- Qualifications of the team
- Broader impacts or translational pathway
**Writing Strategy**:
- Open with a compelling hook that establishes importance
- Use accessible language (avoid jargon in opening sentences)
- State specific, measurable objectives
- Convey enthusiasm and confidence
- Ensure every sentence adds value (no filler)
- End with transformative vision or impact statement
**Common Mistakes to Avoid**:
- Being too technical or detailed (save for project description)
- Failing to articulate "why now" or "why this team"
- Vague objectives or outcomes
- Neglecting broader impacts or significance
- Generic statements that could apply to any proposal
### 2. Project Description / Research Strategy
The core technical narrative that presents the research plan in detail.
**Structure Varies by Agency:**
**NSF Project Description** (typically 15 pages):
- Introduction and background
- Research objectives and questions
- Preliminary results (if applicable)
- Research plan and methodology
- Timeline and milestones
- Broader impacts (integrated throughout or separate section)
- Prior NSF support (if applicable)
**NIH Research Strategy** (12 pages for R01):
- Significance (why the problem matters)
- Innovation (what's novel and transformative)
- Approach (detailed research plan)
- Preliminary data
- Research design and methods
- Expected outcomes
- Potential problems and alternative approaches
**DOE Project Narrative** (varies):
- Background and significance
- Technical approach and innovation
- Qualifications and experience
- Facilities and resources
- Project management and timeline
**DARPA Technical Volume** (varies):
- Technical challenge and innovation
- Approach and methodology
- Schedule and milestones
- Deliverables and metrics
- Team qualifications
- Risk assessment and mitigation
For detailed agency-specific guidance, refer to:
- `references/nsf_guidelines.md`
- `references/nih_guidelines.md`
- `references/doe_guidelines.md`
- `references/darpa_guidelines.md`
### 3. Specific Aims (NIH) or Objectives (NSF/DOE/DARPA)
Clear, testable goals that structure the research plan.
**NIH Specific Aims Page** (1 page):
- Opening paragraph: Gap in knowledge and significance
- Long-term goal and immediate objectives
- Central hypothesis or research question
- 2-4 specific aims with sub-aims
- Expected outcomes and impact
- Payoff paragraph: Why this matters
**Structure for Each Aim:**
- Aim statement (1-2 sentences, starts with action verb)
- Rationale (why this aim, preliminary data support)
- Working hypothesis (testable prediction)
- Approach summary (brief methods overview)
- Expected outcomes and interpretation
**Writing Strategy**:
- Make aims independent but complementary
- Ensure each aim is achievable within timeline and budget
- Provide enough detail to judge feasibility
- Include contingency plans or alternative approaches
- Use parallel structure across aims
- Clearly state what will be learned from each aim
For detailed guidance, refer to `references/specific_aims_guide.md`.
### 4. Broader Impacts (NSF) / Significance (NIH)
Articulate the societal, educational, or translational value of the research.
**NSF Broader Impacts** (critical component, equal weight with Intellectual Merit):
NSF explicitly evaluates broader impacts. Address at least one of these areas:
1. **Advancing discovery and understanding while promoting teaching, training, and learning**
- Integration of research and education
- Training of students and postdocs
- Curriculum development
- Educational materials and resources
2. **Broadening participation of underrepresented groups**
- Recruitment and retention strategies
- Partnerships with minority-serving institutions
- Outreach to underrepresented communities
- Mentoring programs
3. **Enhancing infrastructure for research and education**
- Shared facilities or instrumentation
- Cyberinfrastructure and data resources
- Community-wide tools or databases
- Open-source software or methods
4. **Broad dissemination to enhance scientific and technological understanding**
- Public outreach and science communication
- K-12 educational programs
- Museum exhibits or media engagement
- Policy briefs or stakeholder engagement
5. **Benefits to society**
- Economic impact or commercialization
- Health, environment, or national security benefits
- Informed decision-making
- Workforce development
**Writing Strategy for NSF Broader Impacts**:
- Be specific with concrete activities, not vague statements
- Provide timeline and milestones for broader impacts activities
- Explain how impacts will be measured and assessed
- Connect to institutional resources and existing programs
- Show commitment through preliminary efforts or partnerships
- Integrate with research plan (not tacked on)
**NIH Significance**:
- Addresses important problem or critical barrier to progress
- Improves scientific knowledge, technical capability, or clinical practice
- Potential to lead to better outcomes, interventions, or understanding
- Rigor of prior research in the field
- Alignment with NIH mission and institute priorities
For detailed guidance, refer to `references/broader_impacts.md`.
### 5. Innovation and Transformative Potential
Articulate what is novel, creative, and paradigm-shifting about the research.
**Innovation Elements to Highlight**:
- **Conceptual Innovation**: New frameworks, models, or theories
- **Methodological Innovation**: Novel techniques, approaches, or technologies
- **Integrative Innovation**: Combining disciplines or approaches in new ways
- **Translational Innovation**: New pathways from discovery to application
- **Scale Innovation**: Unprecedented scope or resolution
**Writing Strategy**:
- Clearly state what is innovative (don't assume it's obvious)
- Explain why current approaches are insufficient
- Describe how your innovation overcomes limitations
- Provide evidence that innovation is feasible (preliminary data, proof-of-concept)
- Distinguish incremental from transformative advances
- Balance innovation with feasibility (not too risky)
**Common Mistakes**:
- Claiming novelty without demonstrating knowledge of prior work
- Confusing "new to me" with "new to the field"
- Over-promising without supporting evidence
- Being too incremental (minor variation on existing work)
- Being too speculative (no path to success)
### 6. Research Approach and Methods
Detailed description of how the research will be conducted.
**Essential Components**:
- Overall research design and framework
- Detailed methods for each aim/objective
- Sample sizes, statistical power, and analysis plans
- Timeline and sequence of activities
- Data collection, management, and analysis
- Quality control and validation approaches
- Potential problems and alternative strategies
- Rigor and reproducibility measures
**Writing Strategy**:
- Provide enough detail for reproducibility and feasibility assessment
- Use subheadings and figures to improve organization
- Justify choice of methods and approaches
- Address potential limitations proactively
- Include preliminary data demonstrating feasibility
- Show that you've thought through the research process
- Balance detail with readability (use supplementary materials for extensive details)
**For Experimental Research**:
- Describe experimental design (controls, replicates, blinding)
- Specify materials, reagents, and equipment
- Detail data collection protocols
- Explain statistical analysis plans
- Address rigor and reproducibility
**For Computational Research**:
- Describe algorithms, models, and software
- Specify datasets and validation approaches
- Explain computational resources required
- Address code availability and documentation
- Describe benchmarking and performance metrics
**For Clinical or Translational Research**:
- Describe study population and recruitment
- Detail intervention or treatment protocols
- Explain outcome measures and assessments
- Address regulatory approvals (IRB, IND, IDE)
- Describe clinical trial design and monitoring
For detailed methodology guidance by discipline, refer to `references/research_methods.md`.
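The "sample sizes, statistical power, and analysis plans" item above can be sketched with a normal-approximation estimate. This is a back-of-envelope illustration of the arithmetic reviewers expect you to have done, not a substitute for an exact power analysis with the appropriate test:

```python
import math

def z_quantile(p: float) -> float:
    """Standard-normal quantile, via bisection on the erf-based CDF."""
    lo, hi = -10.0, 10.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if 0.5 * (1 + math.erf(mid / math.sqrt(2))) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def n_per_group(effect_size: float, alpha: float = 0.05,
                power: float = 0.8) -> int:
    """Approximate n per arm for a two-sided two-sample comparison.

    effect_size is Cohen's d (mean difference / pooled SD). Uses the
    normal approximation, which slightly understates the exact t-test n.
    """
    z_a = z_quantile(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_b = z_quantile(power)          # ~0.84 for 80% power
    return math.ceil(2 * ((z_a + z_b) / effect_size) ** 2)
```

For a medium effect (d = 0.5) at 80% power this gives roughly 63 per group; quoting a number like this, with the assumed effect size justified by preliminary data, is what the rigor criterion is looking for.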
### 7. Preliminary Data and Feasibility
Demonstrate that the research is achievable and the team is capable.
**Purpose**:
- Prove that the proposed approach can work
- Show that the team has necessary expertise
- Demonstrate access to required resources
- Reduce perceived risk for reviewers
- Provide foundation for proposed work
**What to Include**:
- Pilot studies or proof-of-concept results
- Method development or optimization
- Access to unique resources (samples, data, collaborators)
- Relevant publications from your team
- Preliminary models or simulations
- Feasibility assessments or power calculations
**NIH Requirements**:
- R01 applications typically require substantial preliminary data
- R21 applications may have less stringent requirements
- New investigators may have less preliminary data
- Preliminary data should directly support proposed aims
**NSF Approach**:
- Preliminary data less commonly required than NIH
- May be important for high-risk or novel approaches
- Can strengthen proposal for competitive programs
**Writing Strategy**:
- Present most compelling data that supports your approach
- Clearly connect preliminary data to proposed aims
- Acknowledge limitations and how proposed work will address them
- Use figures and data visualizations effectively
- Avoid over-interpreting or overstating preliminary findings
- Show trajectory of your research program
### 8. Timeline, Milestones, and Management Plan
Demonstrate that the project is well-planned and achievable within the proposed timeframe.
**Essential Elements**:
- Phased timeline with clear milestones
- Logical sequence and dependencies
- Realistic timeframes for each activity
- Decision points and go/no-go criteria
- Risk mitigation strategies
- Resource allocation across time
- Coordination plan for multi-institutional teams
**Presentation Formats**:
- Gantt charts showing overlapping activities
- Year-by-year breakdown of activities
- Quarterly milestones and deliverables
- Table of aims/tasks with timeline and personnel
**Writing Strategy**:
- Be realistic about what can be accomplished
- Build in time for unexpected delays or setbacks
- Show that timeline aligns with budget and personnel
- Demonstrate understanding of regulatory timelines (IRB, IACUC)
- Include time for dissemination and broader impacts
- Address how progress will be monitored and assessed
**DARPA Emphasis**:
- Particularly important for DARPA proposals
- Clear technical milestones with measurable metrics
- Quarterly deliverables and reporting
- Phase-based structure with exit criteria
- Demonstration and transition planning
For detailed guidance, refer to `references/timeline_planning.md`.
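The quarter-by-quarter milestone presentation described above can be prototyped as a plain-text Gantt chart before committing to a polished figure. The task names and quarter spans below are hypothetical placeholders:

```python
def gantt(tasks, total_quarters):
    """Render tasks as a text Gantt chart.

    tasks: list of (name, start_q, end_q), quarters 1-indexed inclusive.
    """
    header = "Task".ljust(28) + " ".join(f"Q{q}" for q in range(1, total_quarters + 1))
    rows = [header]
    for name, start, end in tasks:
        # "##" marks active quarters, "--" inactive ones
        cells = ["##" if start <= q <= end else "--"
                 for q in range(1, total_quarters + 1)]
        rows.append(name.ljust(28) + " ".join(cells))
    return "\n".join(rows)

# Hypothetical 3-year (12-quarter) project plan
chart = gantt(
    [
        ("Aim 1: method development", 1, 4),
        ("Aim 2: validation studies", 3, 8),
        ("Aim 3: dissemination", 7, 12),
        ("Broader impacts activities", 2, 12),
    ],
    total_quarters=12,
)
print(chart)
```

A sketch like this also makes dependency problems visible early (e.g., Aim 2 starting before Aim 1's methods are validated), which is exactly what reviewers probe.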
### 9. Team Qualifications and Collaboration
Demonstrate that the team has the expertise, experience, and resources to succeed.
**Essential Elements**:
- PI qualifications and relevant expertise
- Co-I and collaborator roles and contributions
- Track record in the research area
- Complementary expertise across team
- Institutional support and resources
- Prior collaboration history (if applicable)
- Mentoring and training plan (for students/postdocs)
**Writing Strategy**:
- Highlight most relevant publications and accomplishments
- Clearly define roles and responsibilities
- Show that team composition is necessary (not just convenient)
- Demonstrate successful prior collaborations
- Address how team will be managed and coordinated
- Explain institutional commitment and support
**Biosketches / CVs**:
- Follow agency-specific formats (NSF, NIH, DOE, DARPA differ)
- Highlight most relevant publications and accomplishments
- Include synergistic activities and collaborations
- Show trajectory and productivity
- Address any career gaps or interruptions
**Letters of Collaboration**:
- Specific commitments and contributions
- Demonstrates genuine partnership
- Includes resource sharing or access agreements
- Signed and on letterhead
For detailed guidance, refer to `references/team_building.md`.
### 10. Budget and Budget Justification
Develop realistic budgets that align with the proposed work and agency guidelines.
**Budget Categories** (typical):
- **Personnel**: Salary and fringe for PI, co-Is, postdocs, students, staff
- **Equipment**: Items >$5,000 (varies by agency)
- **Travel**: Conferences, collaborations, fieldwork
- **Materials and Supplies**: Consumables, reagents, software
- **Other Direct Costs**: Publication costs, participant incentives, consulting
- **Indirect Costs (F&A)**: Institutional overhead (rates vary)
- **Subawards**: Costs for collaborating institutions
**Agency-Specific Considerations**:
**NSF**:
- Full budget justification required
- Cost sharing generally not required (but may strengthen proposal)
- Up to 2 months summer salary for faculty
- Graduate student support encouraged
**NIH**:
- Modular budgets for ≤$250K direct costs per year (R01)
- Detailed budgets for >$250K or complex awards
- Salary cap applies (~$221,900 for 2024)
- Limited to 1 month (8.33% FTE) for most PIs
**DOE**:
- Often requires cost sharing (especially ARPA-E)
- Detailed budget with quarterly breakdown
- Requires institutional commitment letters
- National laboratory collaboration budgets separate
**DARPA**:
- Detailed budgets by phase and task
- Requires supporting cost data for large procurements
- Often requires cost-plus or firm-fixed-price structures
- Travel budget for program meetings
**Budget Justification Writing**:
- Justify each line item in terms of the research plan
- Explain effort percentages for personnel
- Describe specific equipment and why necessary
- Justify travel (conferences, collaborations)
- Explain consultant roles and rates
- Show how budget aligns with timeline
For detailed budget guidance, refer to `references/budget_preparation.md`.
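A sketch of the roll-up arithmetic behind these categories, using the NIH modular figures quoted above; the fringe and F&A rates and all dollar amounts are illustrative assumptions, so substitute your institution's negotiated rates:

```python
import math

FRINGE_RATE = 0.30     # assumed fringe benefits rate (illustrative)
FA_RATE = 0.55         # assumed F&A (indirect) rate (illustrative)
NIH_MODULE = 25_000    # NIH modular budgets are built in $25K modules
MODULAR_CAP = 250_000  # detailed budget required above $250K direct/year

def direct_costs(salaries, equipment=0, travel=0, supplies=0, other=0):
    """Direct costs for one year: loaded personnel plus other categories."""
    personnel = sum(salaries) * (1 + FRINGE_RATE)
    return personnel + equipment + travel + supplies + other

def total_costs(direct, equipment=0):
    """Direct plus indirect. Equipment is excluded from the F&A base here,
    which is a simplification of the actual MTDC rules."""
    return direct + (direct - equipment) * FA_RATE

def nih_modules(direct):
    """Number of $25K modules to request (modular budgets only)."""
    if direct > MODULAR_CAP:
        raise ValueError("over the modular cap: prepare a detailed budget")
    return math.ceil(direct / NIH_MODULE)

# Hypothetical year-1 budget for a single-PI project
direct = direct_costs(
    salaries=[120_000, 45_000],  # PI (partial effort) + graduate student
    equipment=10_000, travel=4_000, supplies=12_000, other=2_000,
)
```

With these assumed figures the year stays under the modular cap at 10 modules; checking this early tells you whether a modular or detailed NIH budget is required, which changes the paperwork substantially.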
## Review Criteria by Agency
Understanding how proposals are evaluated is critical for writing competitive applications.
### NSF Review Criteria
**Intellectual Merit** (primary):
- What is the potential for the proposed activity to advance knowledge?
- How well-conceived and organized is the proposed activity?
- Is there sufficient access to resources?
- How well-qualified is the individual, team, or institution to conduct proposed activities?
**Broader Impacts** (equally important):
- What is the potential for the proposed activity to benefit society?
- To what extent does the proposal address broader impacts in meaningful ways?
**Additional Considerations**:
- Integration of research and education
- Diversity and inclusion
- Results from prior NSF support (if applicable)
### NIH Review Criteria
**Scored Criteria** (1-9 scale, 1 = exceptional, 9 = poor):
1. **Significance**
- Addresses important problem or critical barrier
- Improves scientific knowledge, technical capability, or clinical practice
- Aligns with NIH mission
2. **Investigator(s)**
- Well-suited to the project
- Track record of accomplishments
- Adequate training and expertise
3. **Innovation**
- Novel concepts, approaches, methodologies, or interventions
- Challenges existing paradigms
- Addresses important problem in creative ways
4. **Approach**
- Well-reasoned and appropriate
- Rigorous and reproducible
- Adequately accounts for potential problems
- Feasible within timeline
5. **Environment**
- Institutional support and resources
- Scientific environment contributes to probability of success
**Additional Review Considerations** (not scored but discussed):
- Protections for human subjects
- Inclusion of women, minorities, and children
- Vertebrate animal welfare
- Biohazards
- Resubmission response (if applicable)
- Budget and timeline appropriateness
### DOE Review Criteria
Varies by program office, but generally includes:
- Scientific and/or technical merit
- Appropriateness of proposed method or approach
- Competency of personnel and adequacy of facilities
- Reasonableness and appropriateness of budget
- Relevance to DOE mission and program goals
### DARPA Review Criteria
**DARPA-specific considerations**:
- Overall scientific and technical merit
- Potential contribution to DARPA mission
- Relevance to stated program goals
- Plans and capability to accomplish technology transition
- Qualifications and experience of proposed team
- Realism of proposed costs and availability of funds
**Key Questions DARPA Asks**:
- **What if you succeed?** (Impact if the research works)
- **What if you're right?** (Implications of your hypothesis)
- **Who cares?** (Why it matters for national security)
For detailed review criteria by agency, refer to `references/review_criteria.md`.
## Writing Principles for Competitive Proposals
### Clarity and Accessibility
**Write for Multiple Audiences**:
- Technical reviewers in your field (will scrutinize methods)
- Reviewers in related but not identical fields (need context)
- Program officers (look for alignment with agency goals)
- Panel members reading 15+ proposals (need clear organization)
**Strategies**:
- Use clear section headings and subheadings
- Start sections with overview paragraphs
- Define technical terms and abbreviations
- Use figures, diagrams, and tables to clarify complex ideas
- Avoid jargon when possible; explain when necessary
- Use topic sentences to guide readers
### Persuasive Argumentation
**Build a Compelling Narrative**:
- Establish the problem and its importance
- Show gaps in current knowledge or approaches
- Present your solution as innovative and feasible
- Demonstrate that you're the right team
- Show that success will have significant impact
**Structure of Persuasion**:
1. **Hook**: Capture attention with significance
2. **Problem**: Establish what's not known or not working
3. **Solution**: Present your innovative approach
4. **Evidence**: Support with preliminary data
5. **Impact**: Show transformative potential
6. **Team**: Demonstrate capability to deliver
**Language Choices**:
- Use active voice for clarity and confidence
- Choose strong verbs (investigate, elucidate, discover vs. look at, study)
- Be confident but not arrogant (avoid "obviously," "clearly")
- Acknowledge uncertainty appropriately
- Use precise language (avoid vague terms like "several," "various")
### Visual Communication
**Effective Use of Figures**:
- Conceptual diagrams showing research framework
- Preliminary data demonstrating feasibility
- Timelines and Gantt charts
- Workflow diagrams showing methodology
- Expected results or predictions
**Design Principles**:
- Make figures self-explanatory with complete captions
- Use consistent color schemes and fonts
- Ensure readability (large enough fonts, clear labels)
- Integrate figures with text (refer to specific figures)
- Follow agency-specific formatting requirements
### Addressing Risk and Feasibility
**Balance Innovation and Risk**:
- Acknowledge potential challenges
- Provide alternative approaches
- Show preliminary data reducing risk
- Demonstrate expertise to handle challenges
- Include contingency plans
**Common Concerns**:
- Too ambitious for timeline/budget
- Technically infeasible
- Team lacks necessary expertise
- Preliminary data insufficient
- Methods not adequately described
- Lack of innovation or significance
### Integration and Coherence
**Ensure All Parts Align**:
- Budget supports activities in project description
- Timeline matches aims and milestones
- Team composition matches required expertise
- Broader impacts connect to research plan
- Letters of support confirm stated collaborations
**Avoid Contradictions**:
- Preliminary data vs. stated gaps
- Claimed expertise vs. publication record
- Stated aims vs. actual methods
- Budget vs. stated activities
## Common Proposal Types
### NSF Proposal Types
- **Standard Research Proposals**: Most common, up to $500K and 5 years
- **CAREER Awards**: Early career faculty, integrated research/education, $400-500K over 5 years
- **Collaborative Research**: Multiple institutions, separately submitted, shared research plan
- **RAPID**: Urgent research opportunities, up to $200K, no preliminary data required
- **EAGER** (EArly-concept Grants for Exploratory Research): High-risk, high-reward exploratory research, up to $300K
### NIH Award Mechanisms
- **R01**: Research Project Grant, $250K+ per year, 3-5 years, most common
- **R21**: Exploratory/Developmental Research, up to $275K over 2 years, no preliminary data
- **R03**: Small Grant Program, up to $100K over 2 years
- **R15**: Academic Research Enhancement Awards (AREA), for primarily undergraduate institutions
- **R35**: MIRA (Maximizing Investigators' Research Award), program-specific
- **P01**: Program Project Grant, multi-project integrated research
- **U01**: Research Project Cooperative Agreement, NIH involvement in conduct
**Fellowship Mechanisms**:
- **F30**: Predoctoral MD/PhD Fellowship
- **F31**: Predoctoral Fellowship
- **F32**: Postdoctoral Fellowship
- **K99/R00**: Pathway to Independence Award
- **K08**: Mentored Clinical Scientist Research Career Development Award
### DOE Programs
- **Office of Science**: Basic research in physical sciences, biological sciences, computing
- **ARPA-E**: Transformative energy technologies, requires cost sharing
- **EERE**: Applied research in renewable energy and energy efficiency
- **National Laboratories**: Collaborative research with DOE labs
### DARPA Programs
- **Varies by Office**: BTO, DSO, I2O, MTO, STO, TTO
- **Program-Specific BAAs**: Broad Agency Announcements for specific thrusts
- **Young Faculty Award (YFA)**: Early career researchers, up to $500K
- **Director's Fellowship**: High-risk, paradigm-shifting research
For detailed program guidance, refer to `references/funding_mechanisms.md`.
## Resubmission Strategies
### NIH Resubmission (A1)
**Introduction to Resubmission** (1 page):
- Summarize major criticisms from previous review
- Describe specific changes made in response
- Use bullet points for clarity
- Be respectful of reviewers' comments
- Highlight substantial improvements
**Strategies**:
- Address every major criticism
- Make changes visible (but don't use track changes in final)
- Strengthen weak areas (preliminary data, methods, significance)
- Consider changing aims if fundamentally flawed
- Get external feedback before resubmitting
- Use full 37-month window if needed for new data
**When Not to Resubmit**:
- Fundamental conceptual flaws
- Lack of innovation or significance
- Missing key expertise or resources
- Extensive revisions needed (consider new submission)
### NSF Resubmission
**NSF allows resubmission after revision**:
- Address reviewer concerns in revised proposal
- No formal "introduction to resubmission" section
- May be reviewed by same or different panel
- Consider program officer feedback
- May need to wait for next submission cycle
For detailed resubmission guidance, refer to `references/resubmission_strategies.md`.
## Common Mistakes to Avoid
### Conceptual Mistakes
1. **Failing to Address Review Criteria**: Not explicitly discussing significance, innovation, approach, etc.
2. **Mismatch with Agency Mission**: Proposing research that doesn't align with agency goals
3. **Unclear Significance**: Failing to articulate why the research matters
4. **Insufficient Innovation**: Incremental work presented as transformative
5. **Vague Objectives**: Goals that are not specific or measurable
### Writing Mistakes
1. **Poor Organization**: Lack of clear structure and flow
2. **Excessive Jargon**: Inaccessible to broader review panel
3. **Verbosity**: Unnecessarily complex or wordy writing
4. **Missing Context**: Assuming reviewers know your field deeply
5. **Inconsistent Terminology**: Using different terms for same concept
### Technical Mistakes
1. **Inadequate Methods**: Insufficient detail to judge feasibility
2. **Overly Ambitious**: Too much proposed for timeline/budget
3. **No Preliminary Data**: For mechanisms requiring demonstrated feasibility
4. **Poor Timeline**: Unrealistic or poorly justified schedule
5. **Misaligned Budget**: Budget doesn't support proposed activities
### Formatting Mistakes
1. **Exceeding Page Limits**: Automatic rejection
2. **Wrong Font or Margins**: Non-compliant formatting
3. **Missing Required Sections**: Incomplete application
4. **Poor Figure Quality**: Illegible or unprofessional figures
5. **Inconsistent Citations**: Formatting errors in references
### Strategic Mistakes
1. **Wrong Program or Mechanism**: Proposing to inappropriate opportunity
2. **Weak Team**: Insufficient expertise or missing key collaborators
3. **No Broader Impacts**: For NSF, failing to adequately address this equally weighted criterion
4. **Ignoring Program Priorities**: Not aligning with current emphasis areas
5. **Late Submission**: Technical issues or rushed preparation
## Workflow for Grant Development
### Phase 1: Planning and Preparation (2-6 months before deadline)
**Activities**:
- Identify appropriate funding opportunities
- Review program announcements and requirements
- Consult with program officers (if appropriate)
- Assemble team and confirm collaborations
- Develop preliminary data (if needed)
- Outline research plan and specific aims
- Review successful proposals (if available)
**Outputs**:
- Selected funding opportunity
- Assembled team with defined roles
- Preliminary outline of specific aims
- Gap analysis of needed preliminary data
### Phase 2: Drafting (2-3 months before deadline)
**Activities**:
- Write specific aims or objectives (start here!)
- Develop project description/research strategy
- Create figures and data visualizations
- Draft timeline and milestones
- Prepare preliminary budget
- Write broader impacts or significance sections
- Request letters of support/collaboration
**Outputs**:
- Complete first draft of narrative sections
- Preliminary budget with justification
- Timeline and management plan
- Requested letters from collaborators
### Phase 3: Internal Review (1-2 months before deadline)
**Activities**:
- Circulate draft to co-investigators
- Seek feedback from colleagues and mentors
- Request institutional review (if required)
- Mock review session (if possible)
- Revise based on feedback
- Refine budget and budget justification
**Outputs**:
- Revised draft incorporating feedback
- Refined budget aligned with revised plan
- Identified weaknesses and mitigation strategies
### Phase 4: Finalization (2-4 weeks before deadline)
**Activities**:
- Final revisions to narrative
- Prepare all required forms and documents
- Finalize budget and budget justification
- Compile biosketches, CVs, and current & pending
- Collect letters of support
- Prepare data management plan (if required)
- Write project summary/abstract
- Proofread all materials
**Outputs**:
- Complete, polished proposal
- All required supplementary documents
- Formatted according to agency requirements
### Phase 5: Submission (1 week before deadline)
**Activities**:
- Institutional review and approval
- Upload to submission portal
- Verify all documents and formatting
- Submit 24-48 hours before deadline
- Confirm successful submission
- Receive confirmation and proposal number
**Outputs**:
- Submitted proposal
- Submission confirmation
- Archived copy of all materials
**Critical Tip**: Never wait until the deadline. Portals crash, files corrupt, and emergencies happen. Aim for 48 hours early.
## Integration with Other Skills
This skill works effectively with:
- **Scientific Writing**: For clear, compelling prose
- **Literature Review**: For comprehensive background sections
- **Peer Review**: For self-assessment before submission
- **Research Lookup**: For finding relevant citations and prior work
- **Data Visualization**: For creating effective figures
## Resources
This skill includes comprehensive reference files covering specific aspects of grant writing:
- `references/nsf_guidelines.md`: NSF-specific requirements, formatting, and strategies
- `references/nih_guidelines.md`: NIH mechanisms, review criteria, and submission requirements
- `references/doe_guidelines.md`: DOE programs, emphasis areas, and application procedures
- `references/darpa_guidelines.md`: DARPA BAAs, program offices, and proposal strategies
- `references/broader_impacts.md`: Strategies for compelling broader impacts statements
- `references/specific_aims_guide.md`: Writing effective specific aims pages
- `references/budget_preparation.md`: Budget development and justification
- `references/review_criteria.md`: Detailed review criteria by agency
- `references/timeline_planning.md`: Creating realistic timelines and milestones
- `references/team_building.md`: Assembling and presenting effective teams
- `references/resubmission_strategies.md`: Responding to reviews and revising proposals
Load these references as needed when working on specific aspects of grant writing.
## Templates and Assets
- `assets/nsf_project_summary_template.md`: NSF project summary structure
- `assets/nih_specific_aims_template.md`: NIH specific aims page template
- `assets/timeline_gantt_template.md`: Timeline and Gantt chart examples
- `assets/budget_justification_template.md`: Budget justification structure
- `assets/biosketch_templates/`: Agency-specific biosketch formats
## Scripts and Tools
- `scripts/compliance_checker.py`: Verify formatting requirements
- `scripts/budget_calculator.py`: Calculate budgets with inflation and fringe
- `scripts/deadline_tracker.py`: Track submission deadlines and milestones
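As an illustration, the back-planning logic a deadline tracker might implement can be sketched in a few lines. The phase offsets mirror the workflow phases described above; the function name, interface, and offsets are illustrative assumptions, not the actual script's API:

```python
from datetime import date, timedelta

# Days before the agency deadline at which each phase should be complete
# (illustrative offsets drawn from the workflow phases above).
PHASES = [
    ("Planning and preparation", 180),
    ("Drafting complete", 60),
    ("Internal review complete", 30),
    ("Finalization complete", 14),
    ("Target submission (48 h early)", 2),
]

def back_plan(deadline: date):
    """Return (phase name, target date) pairs counted back from the deadline."""
    return [(name, deadline - timedelta(days=offset)) for name, offset in PHASES]

for name, target in back_plan(date(2026, 2, 5)):
    print(f"{target:%Y-%m-%d}  {name}")
```

Note that the last milestone lands two days before the deadline, matching the "submit 24-48 hours early" advice in the workflow section.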
---
**Final Note**: Grant writing is both an art and a science. Success requires not only excellent research ideas but also clear communication, strategic positioning, and meticulous attention to detail. Start early, seek feedback, and remember that even the best researchers face rejection—persistence and revision are key to funding success.

# Budget Justification Template
## Overview
A budget justification provides detailed explanation for each budget line item, demonstrating that costs are necessary, reasonable, and directly related to the proposed research. The justification should be detailed enough for reviewers to understand and assess cost reasonableness.
**Key Principles**:
- Justify EVERY line item in terms of the research plan
- Explain calculations clearly
- Show that costs are necessary for the proposed work
- Demonstrate cost-effectiveness where possible
- Follow agency-specific formats and requirements
---
## Personnel (Salaries and Wages)
### Senior Personnel
**Principal Investigator: [Name, Title]**
**Effort**: [X] calendar months ([Y]% FTE) per year
**Justification**:
The PI will provide overall scientific leadership, supervise all research activities, mentor graduate students and postdocs, analyze data, prepare manuscripts, and report to the funding agency. The PI will be responsible for [specific activities related to aims]. [X] months of effort is necessary given the scope of the project and the PI's other commitments ([describe other activities briefly]).
**Calculation**:
- Year 1: [Annual salary] × [% effort] × [inflation factor if applicable] = $[amount]
- Years 2-5: [include escalation if applicable]
**Example**:
*Principal Investigator: Dr. Jane Smith, Associate Professor of Biology*
*Effort*: 2.5 calendar months (21% FTE) per year
*Justification*: Dr. Smith will provide overall project leadership including: (1) supervising all experimental work and data analysis for Aims 1-3, (2) weekly mentoring meetings with 3 graduate students and 2 postdocs, (3) coordinating with collaborators at partner institutions, (4) analyzing multi-omics datasets and interpreting results, (5) preparing manuscripts and presenting at conferences, and (6) managing budget and reporting to NIH. 2.5 months effort is necessary for a project of this scope involving multiple aims, techniques, and personnel. Dr. Smith's remaining effort supports teaching (3 months), other research projects (4 months), and administrative duties (2.5 months).
*Calculation*:
- Year 1: $120,000 × 0.2083 = $25,000
- Years 2-5: 3% annual increase
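Escalation arithmetic like this is easy to automate (the skill's `budget_calculator.py` script is meant for exactly this). A minimal sketch, with an illustrative function name; rounding conventions and rates vary by institution:

```python
def salary_request(base_salary, effort_fte, fringe_rate, years, escalation=0.03):
    """Project salary and fringe for each budget year with annual escalation."""
    rows = []
    for year in range(years):
        # Year 1 uses the base salary; later years compound the escalation rate.
        salary = base_salary * (1 + escalation) ** year * effort_fte
        fringe = salary * fringe_rate
        rows.append({"year": year + 1,
                     "salary": round(salary),
                     "fringe": round(fringe),
                     "total": round(salary) + round(fringe)})
    return rows

# PI example from the text: $120,000 base salary, 2.5 calendar months
# (2.5/12 FTE), 26% fringe, 3% annual escalation, 5 budget years.
for row in salary_request(120_000, 2.5 / 12, 0.26, 5):
    print(row)
```

Year 1 reproduces the $25,000 figure in the example above; Years 2-5 apply the stated 3% increase.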
---
**Co-Investigator: [Name, Title]**
**Effort**: [X] calendar months ([Y]% FTE) per year
**Justification**:
Dr. [Name] will be responsible for [specific aspects of project related to their expertise]. This includes [specific activities for which aims]. Co-I effort is essential because [expertise/resources they provide that PI lacks].
**Example**:
*Co-Investigator: Dr. Robert Johnson, Professor of Bioinformatics*
*Effort*: 1 calendar month (8.3% FTE) per year
*Justification*: Dr. Johnson will lead the computational analysis for Aim 1, including multi-omics data integration, machine learning-based subtype classification, and biomarker identification. His expertise in unsupervised clustering methods and experience with similar T2D datasets is essential for this aim. Specific responsibilities include: (1) developing analysis pipelines, (2) training graduate student in bioinformatics methods, (3) interpreting computational results, and (4) co-authoring manuscripts.
*Calculation*: Year 1: $150,000 × 0.0833 = $12,500
---
### Postdoctoral Scholars
**Postdoctoral Researcher (1.0 FTE)**
**Justification**:
One full-time postdoctoral researcher is essential to conduct [which experiments/aims]. The postdoc will be responsible for [specific technical activities], data analysis, and mentoring graduate students. Specific duties include: [list 4-6 key responsibilities tied to specific aims]. We will recruit a candidate with expertise in [required skills/background].
**Calculation**:
- Year 1: NIH NRSA stipend level Year 0-2 ($54,840) + fringe benefits (26%) = $69,098
- Years 2-3: Adjusted for postdoc experience level
- Years 4-5: Senior postdoc rate
**Example**:
*Postdoctoral Researcher (1.0 FTE)*
*Justification*: One full-time postdoc is essential to execute the cellular and molecular experiments in Aims 2-3. The postdoc will: (1) generate and characterize patient-derived iPSC lines, (2) differentiate iPSCs into β-cells, hepatocytes, and adipocytes, (3) perform functional assays (insulin secretion, glucose uptake, cytokine profiling), (4) conduct proteomics sample preparation and analysis, (5) integrate cellular data with clinical outcomes, and (6) mentor graduate students in cell culture techniques. We will recruit a candidate with expertise in stem cell biology and diabetes research. The postdoc will have opportunity for career development through institutional K99/R00 preparation programs.
*Calculation*:
- Year 1: $54,840 (NIH Year 0) + $14,258 (26% fringe) = $69,098
- Year 2: $56,784 (NIH Year 1) + $14,764 = $71,548
- Year 3: $59,292 (NIH Year 2) + $15,416 = $74,708
---
### Graduate Students
**Graduate Research Assistants ([Number] students)**
**Justification**:
[Number] graduate students are required to [specific roles and aims]. Each student will focus on [division of labor among students]. This project provides excellent training opportunities in [techniques/approaches], preparing students for careers in [field]. Students will be recruited from our [department/program] with preference for candidates from underrepresented groups through our partnerships with [specific programs].
**Calculation**:
- Stipend: $[amount]/student/year (following university RA rates)
- Tuition: $[amount]/student/year
- Total per student: $[amount]
- Number of students: [N]
- Total: $[amount] per year
**Example**:
*Graduate Research Assistants (3 students)*
*Justification*: Three PhD students are required to execute the experimental work across all three aims:
- Student 1 will lead Aim 1 work on multi-omics profiling and subtype classification
- Student 2 will conduct Aim 2 mechanistic studies using patient-derived cells
- Student 3 will perform Aim 3 treatment response analyses in cell models and humanized mice
This project provides excellent interdisciplinary training in genomics, cell biology, and translational diabetes research. Students will present annually at the American Diabetes Association and co-author peer-reviewed publications. We will recruit students from our Biological Sciences PhD program, with priority recruitment from underrepresented groups through our IMSD program (NIH R25).
*Calculation*:
- Stipend: $32,000/student/year (12 months at university RA rate)
- Tuition and fees: $18,000/student/year
- Total per student: $50,000/year
- 3 students × 5 years = $750,000 total
(Note: In modular budget, include under Personnel narrative; in detailed budget, may be split between Personnel and Other)
---
### Research Staff
**Research Technician ([Title], [% FTE])**
**Justification**:
A [full/part]-time research technician is necessary to [specific technical support]. The technician will [specific duties], allowing the PI and postdoc to focus on [higher-level activities]. Essential responsibilities include: [list key duties related to aims].
**Calculation**:
- Annual salary: $[amount] for [% FTE]
- Fringe benefits ([%]): $[amount]
- Total: $[amount]/year
**Example**:
*Research Technician (1.0 FTE)*
*Justification*: A full-time research technician is necessary to provide technical support for high-throughput assays and maintain cell lines and mouse colonies. Specific responsibilities include: (1) maintaining iPSC, hepatocyte, and adipocyte cultures (>50 patient-derived lines), (2) performing routine insulin secretion, glucose uptake, and ELISA assays, (3) managing humanized mouse colony and performing metabolic phenotyping, (4) preparing samples for omics analysis, and (5) maintaining laboratory equipment and ordering supplies. The technician will enable the postdoc and graduate students to focus on experimental design, data analysis, and manuscript preparation.
*Calculation*:
- Year 1: $45,000 (base salary) + $11,700 (26% fringe) = $56,700
- Years 2-5: 3% annual increase
---
## Fringe Benefits
**Rate**: [X]% for [category of personnel]
**Justification**:
Fringe benefit rates are based on our institution's federally negotiated rates. Rates differ by personnel category:
- Faculty: [X]%
- Postdocs: [X]%
- Graduate students: [X]% (if applicable)
- Staff: [X]%
Rates include [what's covered: health insurance, retirement, life insurance, etc.].
**Total Fringe**: $[amount] per year
---
## Equipment ($5,000 or more per unit)
**[Equipment Item Name and Model]**
**Cost**: $[amount]
**Justification**:
This equipment is essential for [which aims/experiments]. We currently do not have access to [this capability] at our institution. The [equipment] will be used to [specific applications in the project]. [Estimated usage: hours/week or % time on this project]. This equipment will support [how many students/researchers] and will remain useful for future projects in [area].
**Example**:
*BD FACSAria III Cell Sorter with 4-laser configuration*
*Cost*: $425,000
*Justification*: A high-speed cell sorter is essential for Aim 2 experiments requiring isolation of specific cell populations from patient-derived heterogeneous cultures (β-cells, hepatocytes, adipocytes) for downstream proteomics and functional analysis. Our current institutional sorter has a 6-month wait time and lacks the 4-laser capability needed for our 8-color panel. This sorter will be used 15 hours/week for this project and will support 3 graduate students and 1 postdoc. The equipment will be housed in the Department of Biology core facility and will be available to 15 other laboratories after this project, ensuring long-term institutional value. Equipment cost includes installation, training, and 5-year service contract.
---
## Travel
### Domestic Travel
**Purpose**: [Conference/meeting/collaboration]
**Justification**:
Travel is requested for [purpose: presenting results, collaboration, training]. The PI and/or [personnel] will attend [specific conferences/meetings] annually to disseminate findings and network with the research community. These meetings are essential for [specific benefits: feedback, collaborations, recruiting, staying current].
**Calculation**:
- [Conference name]: $[airfare] + $[hotel, X nights] + $[meals/incidentals] + $[registration] = $[total]
- Number of trips/year: [N]
- Total domestic travel: $[amount]/year
**Example**:
*Domestic Travel*
*Justification*: Annual travel for the PI, postdoc, and 2 graduate students to present research findings and network with the diabetes research community.
Trips include:
1. American Diabetes Association Scientific Sessions (annual, June): Premier venue for diabetes research dissemination. PI and 2 trainees will present posters/talks, attend workshops, and meet with collaborators. ($2,500/person × 3 people = $7,500)
2. Endocrine Society Annual Meeting (alternate years): Important for reaching clinical endocrinology audience. PI will present translational findings. ($2,200)
3. Cold Spring Harbor Metabolism & Disease Conference (Year 3): Specialized meeting for in-depth scientific exchange. Postdoc will present mechanistic findings. ($1,800)
*Total*: $9,700/year (Years 1-2, 4-5); $11,500/year (Year 3)
### Foreign Travel
**Purpose**: [International conference/collaboration]
**Justification**:
[If requesting foreign travel, provide strong justification for why international meeting is necessary]
**Example**:
*Foreign Travel*
*Justification*: PI will attend the International Diabetes Federation Congress (every 2 years, Years 2 and 4) to present findings to international clinical and research audience. This is the largest global diabetes meeting and essential for international collaborations and dissemination. Our data on molecular subtypes has direct relevance for diverse patient populations globally.
*Cost*: $4,500/trip (airfare $1,500, hotel 4 nights $1,200, meals $800, registration $1,000)
*Total*: $4,500 (Years 2, 4)
---
## Materials and Supplies
### [Category]
**Justification**:
[Description of supplies needed and why]
**Calculation**:
[Itemize major categories with estimated costs]
**Total**: $[amount]/year
**Example**:
*Laboratory Supplies and Reagents*
*Justification*: Supplies are required for cell culture, molecular biology, and metabolic assays across all three aims.
*Breakdown*:
- Cell culture reagents (media, growth factors, serum): $15,000/year
- Maintaining >50 patient-derived iPSC, hepatocyte, and adipocyte lines
- Differentiation protocols requiring specialized media
- Molecular biology supplies (RNA extraction, qPCR, Western blotting): $12,000/year
- Processing samples from cell assays and mouse tissues
- Validation experiments for omics findings
- Metabolomics and proteomics sample prep: $18,000/year
- Sample processing for Aim 1 multi-omics profiling (n=2,000 patients)
- Sample preparation for mass spectrometry (Aims 1-2)
- Mouse metabolic phenotyping supplies: $10,000/year
- Glucose tolerance tests, insulin tolerance tests
- Blood collection and plasma analysis
- Tissue harvest and processing
- Immunoassays and ELISAs: $8,000/year
- Insulin, c-peptide, GLP-1, cytokine measurements
- ~500 assays/year across aims
- General lab supplies (pipette tips, tubes, glassware): $7,000/year
*Total*: $70,000/year
---
## Participant/Trainee Support Costs
(For undergraduate researchers, workshop participants, etc.)
**Stipends**: $[amount]
**Justification**:
[Number] undergraduate researchers will participate in summer research for 10 weeks annually. Stipends of $[amount] per student provide support for [what stipend covers].
**Travel**: $[amount]
**Justification**:
Travel support for undergraduates to present research at [conference].
**Subsistence**: $[amount] (if applicable)
**Other**: $[amount]
**Total**: $[amount]/year
**Example**:
*Undergraduate Summer Research Program*
*Stipends*: 10 undergraduates × $5,000 = $50,000/year
*Justification*: Ten undergraduates will participate in 10-week summer research experiences, working with graduate students on specific sub-projects. Students will be recruited from partner HBCUs (50% of participants) and our institution's McNair Scholars program. Stipends ($5,000 per student for 10 weeks) provide support during full-time research commitment.
*Travel*: 10 students × $1,500 = $15,000/year
*Justification*: Support for undergraduates to present research at the Annual Biomedical Research Conference for Minority Students (ABRCMS). This is a critical professional development opportunity, particularly for students from underrepresented groups.
*Total Participant Support*: $65,000/year
(Note: Participant support costs are not subject to indirect costs)
---
## Other Direct Costs
### Publication Costs
**Cost**: $[amount]/year
**Justification**:
We anticipate publishing [N] peer-reviewed articles over the 5-year project period in open-access journals to ensure broad dissemination. Average open-access fees are approximately $[amount] per article. Funds will cover article processing charges for publications resulting from this work.
**Example**:
*Publication Costs*: $6,000/year
*Justification*: We anticipate 2 publications per year (10 total over 5 years) in high-impact open-access journals. Average article processing charges are $3,000-$4,000 (e.g., Nature Communications, Cell Reports, Diabetes). We budget $6,000/year to ensure broad, immediate dissemination of findings as required by NIH public access policy. Additional publications in traditional subscription journals will not require fees.
### Consultant Services
**[Consultant Name/Role]**: $[amount]
**Justification**:
Dr. [Name] will serve as consultant for [specific expertise needed]. [He/She] will [specific consulting activities], requiring approximately [X] days per year at a rate of $[amount]/day. This expertise is essential for [why you can't do this yourself] and will ensure [benefit to project].
**Example**:
*Statistical Consultant*: $15,000/year
*Justification*: Dr. Sarah Chen, Professor of Biostatistics at Johns Hopkins, will provide statistical consulting for machine learning-based subtype classification (Aim 1) and clinical outcome analysis (Aim 3). She will advise on study design, sample size calculations, analysis approaches, and interpretation of complex multi-omics datasets. Her expertise in diabetes clinical trials and unsupervised clustering is essential for rigorous analysis. Services will require approximately 10 days/year at $1,500/day (standard consulting rate). Dr. Chen has agreed to this arrangement (see letter of commitment).
### Other
List any other direct costs (subawards, animal costs, computing time, etc.)
---
## Consortium/Contractual Costs
(For collaborating institutions)
**[Institution Name] Subaward**
**Total costs**: $[amount] per year
**Justification**:
[Collaborating institution] will perform [specific work related to which aims]. Dr. [PI name at institution] will lead these efforts. This collaboration is essential because [why this expertise/resource is needed and not available at your institution].
**Work to be performed**:
- [Task 1]
- [Task 2]
- [Task 3]
Detailed budget and justification from [institution] are included as a subaward/consortium application.
**Example**:
*University of California San Diego Subaward*
*Total costs*: $100,000/year
*Justification*: UCSD will perform all mass spectrometry-based metabolomics and proteomics analyses for Aims 1-2. Dr. Michael Williams, Director of the UCSD Metabolomics Core, will lead these efforts. This collaboration is essential because our institution lacks the specialized mass spectrometry platforms (Orbitrap Fusion, QTOF) and expertise required for these analyses. UCSD has extensive experience with T2D metabolomics and proteomics, having processed >5,000 clinical samples.
*Work to be performed*:
- Sample processing and metabolite/protein extraction (Years 1-3)
- LC-MS/MS analysis on Orbitrap Fusion and QTOF platforms
- Data processing, quality control, and statistical analysis
- Quarterly meetings to discuss results and plan analyses
*Budget includes*: Personnel (50% technician, 10% Dr. Williams), supplies, and instrument time. Detailed subaward budget attached.
*Note*: UCSD will charge F&A at its own federally negotiated rate; under NIH policy, the 8% F&A cap applies only to foreign subawards.
---
## Indirect Costs (Facilities & Administrative)
**Rate**: [X]% of Modified Total Direct Costs (MTDC)
**MTDC Excludes**: Equipment, capital expenditures, charges for patient care, participant support costs, rental costs of off-site facilities, scholarships and fellowships, and the portion of each subaward in excess of $25,000.
**Justification**:
Indirect cost rate is based on our institution's federally negotiated rate agreement with [DHHS/agency], effective [dates]. This rate covers institutional costs for facilities (building depreciation, operations, maintenance) and administration (sponsored projects office, accounting, library, etc.) that support research.
**Example**:
*Facilities & Administrative Costs*: 57% of MTDC (on-campus rate)
*Justification*: Our institution's federally negotiated F&A rate with DHHS is 57% for on-campus research, effective July 1, 2023 - June 30, 2027. This rate covers facilities costs (building depreciation, utilities, operations and maintenance) and administrative costs (sponsored projects administration, accounting, library, general administration).
*Calculation example (Year 1)*:
- Total direct costs: $550,000
- Less: Equipment ($425,000), participant support ($65,000), subaward costs in excess of $25,000 ($8,000)
- MTDC base: $52,000
- Indirect costs: $52,000 × 0.57 = $29,640
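The MTDC arithmetic above is easy to get wrong in a multi-year spreadsheet. A minimal sketch in Python, using the illustrative Year 1 figures from the example (the function name and exclusion labels are ours, not agency terminology):

```python
def indirect_costs(total_direct, exclusions, fa_rate):
    """Compute F&A on a Modified Total Direct Cost (MTDC) base.

    exclusions: dict mapping a label to the dollar amount removed from
    the MTDC base (equipment, participant support, excluded subaward
    portion, etc., per the institution's rate agreement).
    """
    mtdc_base = total_direct - sum(exclusions.values())
    return mtdc_base, mtdc_base * fa_rate

# Year 1 figures from the worked example above (illustrative only)
base, fa = indirect_costs(
    total_direct=550_000,
    exclusions={
        "equipment": 425_000,
        "participant_support": 65_000,
        "subaward_exclusion": 8_000,
    },
    fa_rate=0.57,
)
print(f"MTDC base: ${base:,}; indirect costs: ${fa:,.0f}")
```

Keeping the exclusions in one place makes it straightforward to audit the base against the rate agreement before submission.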
---
## Summary Budget Table
| Category | Year 1 | Year 2 | Year 3 | Year 4 | Year 5 | Total |
|----------|--------|--------|--------|--------|--------|-------|
| Personnel | $XXX | $XXX | $XXX | $XXX | $XXX | $XXX |
| Fringe Benefits | $XXX | $XXX | $XXX | $XXX | $XXX | $XXX |
| Equipment | $XXX | $0 | $0 | $0 | $0 | $XXX |
| Travel | $XXX | $XXX | $XXX | $XXX | $XXX | $XXX |
| Materials & Supplies | $XXX | $XXX | $XXX | $XXX | $XXX | $XXX |
| Other Direct Costs | $XXX | $XXX | $XXX | $XXX | $XXX | $XXX |
| Participant Support | $XXX | $XXX | $XXX | $XXX | $XXX | $XXX |
| Consortium/Subawards | $XXX | $XXX | $XXX | $XXX | $XXX | $XXX |
| **Total Direct Costs** | $XXX | $XXX | $XXX | $XXX | $XXX | $XXX |
| Indirect Costs (F&A) | $XXX | $XXX | $XXX | $XXX | $XXX | $XXX |
| **TOTAL COSTS** | $XXX | $XXX | $XXX | $XXX | $XXX | $XXX |
---
## Tips for Strong Budget Justifications
**Do**:
- Tie every cost directly to specific aims and activities
- Provide detailed calculations showing your work
- Explain why the amount is necessary and reasonable
- Use institutional or national standards for rates
- Show cost-effectiveness where possible
- Include escalation (inflation) for out-years
- Be specific about equipment models, conference names, etc.
**Don't**:
- Use vague language ("miscellaneous supplies")
- Forget to justify every line item
- Over-budget for contingency
- Include costs unrelated to the proposed work
- Underestimate costs (creates problems if funded)
- Forget agency-specific cost limitations (salary caps, F&A exclusions)
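The escalation tip above can be sketched the same way. A hypothetical example in Python; the 3% rate and the category amounts are illustrative assumptions, not agency requirements (check your sponsor's allowed escalation rate):

```python
def escalate(year1_amount, years, rate=0.03):
    """Project a Year-1 cost across the award period with annual escalation."""
    return [round(year1_amount * (1 + rate) ** y, 2) for y in range(years)]

# Illustrative Year-1 amounts by category (not from any real budget)
year1 = {"Personnel": 200_000, "Travel": 8_000, "Supplies": 30_000}

projected = {cat: escalate(amount, years=5) for cat, amount in year1.items()}
yearly_totals = [round(sum(vals[y] for vals in projected.values()), 2)
                 for y in range(5)]
print(projected["Travel"])   # travel budget by year
print(yearly_totals)         # column totals for the summary table
```

The per-year lists map directly onto the columns of a summary budget table like the one above, so the out-year figures stay consistent with the Year 1 justification.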
## Agency-Specific Notes
**NIH**:
- Salary cap applies (Executive Level II: ~$221,900 for 2024)
- Modular budgets (≤$250K direct costs per year, in $25K modules) require less detail
- Participant support costs excluded from F&A
**NSF**:
- No salary cap
- Salary generally limited to 2 months per year across all NSF awards (typically summer salary for 9-month faculty)
- Cost sharing not required (except specific programs)
**DOE**:
- Often requires detailed budgets by quarter
- May require cost sharing
- Equipment often requires special justification
**DARPA**:
- Detailed costs by phase and task
- Often requires supporting cost data
- May need rates approved (DCAA audit for industry)

# NIH Specific Aims Page Template
**CRITICAL**: Exactly 1 page, 0.5-inch margins, 11-point font minimum
---
## Opening Paragraph: The Hook (3-5 sentences)
[Establish the importance of your research area with compelling statistics or biological significance]
**Template:**
[Disease/Problem] affects [number] people annually and [consequence - mortality, morbidity, cost]. Despite [current treatments/knowledge], [major limitation or gap]. [Why this limitation matters for patients/science]. [Opportunity or need for new approaches].
**Example:**
Type 2 diabetes (T2D) affects 37 million Americans and costs $327 billion annually in healthcare expenditures. Despite available therapies, fewer than 50% of patients achieve glycemic control, and complications including cardiovascular disease, neuropathy, and kidney failure remain common. Existing treatments primarily target insulin resistance and β-cell function, yet fail to address the underlying molecular heterogeneity driving variable therapeutic responses. Identifying molecular subtypes of T2D and their corresponding treatment vulnerabilities represents a critical unmet need for precision medicine approaches.
---
## Second Paragraph: Gap and Rationale (4-6 sentences)
[Define what's known, what's unknown, and why the gap matters]
**Template:**
Prior studies have established [current knowledge - 1-2 sentences]. However, [what remains unknown - the gap]. [Why current approaches are insufficient]. [Critical barrier to progress]. Understanding [the gap] is essential because [impact of filling the gap].
**Example:**
Prior studies have identified numerous genetic and environmental risk factors for T2D, and recent work has revealed metabolic heterogeneity among patients. However, molecular classification schemes have relied primarily on clinical phenotypes (age at onset, BMI, insulin levels) rather than underlying pathophysiology, limiting their therapeutic utility. Current approaches cannot predict which patients will respond to specific therapies, leading to inefficient trial-and-error treatment selection. Understanding the molecular drivers of T2D heterogeneity and their relationships to drug responses is essential for developing predictive biomarkers and targeted treatment strategies.
---
## Third Paragraph: Goal, Objective, Hypothesis, Rationale (5-7 sentences)
**Long-term goal**: [Overarching research program direction]
**Objective**: The objective of this application is to [specific goal of THIS grant - what you will accomplish].
**Central hypothesis**: [Testable prediction that unifies your aims].
This hypothesis is based on [rationale]: our preliminary data showing [key finding 1], [key finding 2], and [key finding 3] (Figures 1-2, Table 1). [Why this evidence supports the hypothesis].
**Example:**
Our long-term goal is to develop precision medicine approaches for type 2 diabetes based on molecular disease subtypes. The objective of this application is to define the molecular basis of T2D heterogeneity and identify subtype-specific therapeutic vulnerabilities. Our central hypothesis is that T2D comprises distinct molecular subtypes driven by different combinations of β-cell dysfunction, insulin resistance, and inflammation, and that these subtypes respond differentially to existing therapies. This hypothesis is based on our preliminary multi-omics profiling of 500 T2D patients revealing five distinct clusters with different genetic architectures, metabolic signatures, and clinical trajectories (Fig. 1). Retrospective analysis showed these subtypes had dramatically different responses to metformin and GLP-1 agonists (Fig. 2), and functional studies in islets confirmed subtype-specific mechanisms (Fig. 3). These findings suggest a molecular classification could guide treatment selection.
---
## Specific Aim 1: [Action Verb - What You Will Do]
[Brief rationale: why this aim is important, background context - 1-2 sentences]
**Working hypothesis**: [Testable prediction for this aim]
**Approach**: We will (1) [first set of experiments/methods], (2) [second set], and (3) [third set]. [Key model systems, sample sizes, or technical approaches].
**Expected outcomes**: We expect to [specific predictions], which will [how this advances knowledge or enables subsequent aims].
**Example:**
## Specific Aim 1: Define molecular subtypes of T2D through integrated multi-omics analysis
Current clinical classification of T2D lacks molecular granularity. Our preliminary clustering analysis identified 5 subtypes, but requires validation and mechanistic characterization.
**Working hypothesis**: T2D comprises at least five molecular subtypes with distinct genomic, transcriptomic, proteomic, and metabolomic signatures.
**Approach**: We will (1) perform multi-omics profiling (genome, transcriptome, proteome, metabolome) on 2,000 T2D patients from three independent cohorts, (2) apply unsupervised clustering and machine learning to identify robust subtypes, and (3) validate subtypes in 1,000 independent patients. We will develop a streamlined classification algorithm using the minimal set of biomarkers sufficient for subtype assignment.
**Expected outcomes**: We will define 5-7 molecular T2D subtypes, characterize their multi-omics signatures, and develop a clinically deployable classifier. This foundation will enable investigation of subtype-specific mechanisms (Aim 2) and treatment responses (Aim 3).
---
## Specific Aim 2: [Action Verb - What You Will Do]
[Brief rationale and background - 1-2 sentences]
**Working hypothesis**: [Testable prediction]
**Approach**: [Detailed methods - 3-5 sentences outlining key experiments, models, techniques, and sample sizes]
**Expected outcomes**: [Specific predictions and impact]
**Example:**
## Specific Aim 2: Elucidate pathophysiological mechanisms underlying each molecular subtype
Molecular subtypes likely reflect distinct disease mechanisms, but causal pathways remain unknown.
**Working hypothesis**: Each T2D subtype is driven by a distinct combination of β-cell dysfunction, hepatic insulin resistance, adipose tissue inflammation, and incretin deficiency.
**Approach**: Using patient-derived iPSCs, primary adipocytes, and liver organoids from each subtype, we will (1) assess β-cell function (insulin secretion dynamics, ER stress, apoptosis), (2) measure insulin signaling in hepatocytes and adipocytes using phosphoproteomics and glucose uptake assays, (3) profile immune cell infiltration and inflammatory cytokines in adipose tissue, and (4) measure GLP-1 secretion and receptor expression. We will perform integrative analysis relating cellular phenotypes to clinical outcomes in n=100 patients per subtype.
**Expected outcomes**: We will define the primary pathophysiological defects in each subtype and identify targetable vulnerabilities. This mechanistic understanding will inform selection of appropriate therapies in Aim 3.
---
## Specific Aim 3: [Action Verb - What You Will Do]
[Brief rationale - 1-2 sentences]
**Working hypothesis**: [Testable prediction]
**Approach**: [Methods - 3-5 sentences]
**Expected outcomes**: [Predictions and impact]
**Example:**
## Specific Aim 3: Determine subtype-specific responses to existing T2D therapies
Current treatment algorithms do not account for molecular heterogeneity, leading to suboptimal outcomes.
**Working hypothesis**: T2D subtypes exhibit differential responses to metformin, GLP-1 agonists, SGLT2 inhibitors, and insulin, based on their underlying pathophysiology.
**Approach**: We will (1) conduct retrospective analysis of treatment responses in 5,000 patients with known subtypes from electronic health records, (2) validate findings in a prospective observational cohort (n=500, 18-month follow-up), and (3) test predicted drug sensitivities in patient-derived cell models and humanized mice (n=15 per subtype per drug). Primary outcomes are HbA1c reduction, with secondary outcomes including weight, hypoglycemia, and cardiovascular risk markers.
**Expected outcomes**: We will identify optimal first-line therapies for each subtype and develop a treatment algorithm. Retrospective data suggest subtype-guided therapy could improve HbA1c control by 0.8-1.2% compared to standard care. Results will inform an investigator-initiated clinical trial (resources available through our Clinical Research Center).
---
## Closing Paragraph: Impact and Significance (3-5 sentences)
[Summarize expected outcomes, how it advances the field, and positive impact]
**Template:**
The proposed research is significant because [why it matters]. Results will [specific advances - knowledge, tools, treatments]. We expect findings will [broader impact on field or health]. This work will [transformative potential or next steps].
**Example:**
The proposed research is significant because it will establish a molecular taxonomy of type 2 diabetes and identify subtype-specific treatment strategies, addressing a critical barrier to precision medicine in this prevalent disease. Results will provide mechanistic insights into T2D heterogeneity, immediately applicable biomarkers for patient stratification, and evidence-based treatment algorithms. We expect findings will enable personalized therapeutic approaches that substantially improve glycemic control and reduce complications for the 37 million Americans with T2D. This work will establish new paradigms for precision medicine in complex metabolic diseases and provide the foundation for a prospective subtype-guided treatment trial that could transform clinical practice.
---
## Formatting Checklist
- [ ] Exactly 1 page (not 1.1, not 0.9)
- [ ] 0.5-inch margins (all sides)
- [ ] 11-point Arial/Helvetica or equivalent
- [ ] Readable line spacing
- [ ] Aim statements are bold or underlined
- [ ] Gene names italicized (*TP53*)
- [ ] Figures (if included) are legible
- [ ] All abbreviations defined at first use
## Content Checklist
- [ ] Opens with compelling importance statement
- [ ] Includes epidemiological data or significance metrics
- [ ] Clearly defines the gap in knowledge
- [ ] States long-term goal
- [ ] States specific objective of THIS application
- [ ] Presents testable central hypothesis (or research questions)
- [ ] Mentions preliminary data supporting feasibility
- [ ] Includes 2-4 specific aims
- [ ] Each aim has: rationale, hypothesis, approach, expected outcomes
- [ ] Aims are testable and achievable
- [ ] Aims are independent but synergistic
- [ ] Expected outcomes are specific
- [ ] Closes with impact statement
- [ ] Passes the "skim test" (aim statements tell the story)
## Tips for Success
1. **Write 10+ drafts** - This page is too important to rush
2. **Get extensive feedback** - From colleagues, mentors, people outside your field
3. **Read it aloud** - Check for flow and clarity
4. **Study funded examples** - Look at successful aims pages in your field
5. **Test on non-experts** - Can someone in a different field understand the importance?
6. **Check every word** - Every sentence must earn its place on this precious page

# NSF Project Summary Template
**IMPORTANT**: NSF requires three labeled sections in the project summary (max 1 page):
1. Overview
2. Intellectual Merit
3. Broader Impacts
---
## Overview
[Write a paragraph suitable for public dissemination that explains:
- The research question or problem
- The approach or methods
- Expected outcomes
- Significance
This should be accessible to a broad audience including non-scientists. Avoid jargon.]
**Example:**
This project investigates how coastal wetlands respond to rising sea levels and increased storm intensity caused by climate change. Using a combination of field observations, remote sensing, and computer modeling across 20 sites along the Atlantic coast, we will determine whether wetlands can migrate inland fast enough to keep pace with sea level rise. Results will inform coastal management policies and help predict the fate of critical ecosystems that protect shorelines and support fisheries. This work will train 5 graduate students and 10 undergraduates, with priority recruitment from underrepresented groups through partnerships with minority-serving institutions.
---
## Intellectual Merit
[Address the question: What is the potential for the proposed activity to advance knowledge?
Include:
- Why the research is important scientifically
- What knowledge gap it addresses
- What will be learned
- Novel aspects of the approach
- How it advances the field]
**Example:**
This research addresses a critical gap in understanding coastal wetland resilience under accelerating climate change. Current models of wetland migration fail to account for biological constraints on vegetation establishment and feedbacks between sediment dynamics and plant growth. We will develop the first integrated model coupling hydrological, ecological, and geomorphological processes across multiple spatial scales. Our novel approach combines high-resolution LiDAR elevation data with experimental manipulations of sediment and salinity to parameterize vegetation response functions. Expected outcomes include quantitative predictions of wetland migration rates under different sea level rise scenarios, identification of landscape features that facilitate or impede migration, and new theory on ecosystem tipping points. This work will transform our ability to predict and manage coastal ecosystem responses to climate change.
---
## Broader Impacts
[Address the question: What is the potential for the proposed activity to benefit society?
Must address at least one of NSF's five broader impacts areas with specific, measurable activities:
1. Advance discovery while promoting teaching, training, and learning
2. Broaden participation of underrepresented groups
3. Enhance infrastructure for research and education
4. Broadly disseminate to enhance scientific understanding
5. Benefit society
Be SPECIFIC with concrete activities, timelines, and assessment plans.]
**Example:**
This project will generate significant broader impacts through three integrated activities:
**1. Education and Training**: We will train 5 PhD students and 10 undergraduates in interdisciplinary coastal science, emphasizing field methods, remote sensing, and quantitative modeling. Undergraduates will participate through summer research internships (10 weeks, $5,000 stipends) with mentorship from graduate students. We will recruit 50% of undergraduates from groups underrepresented in STEM through partnerships with 4 historically Black colleges and universities (HBCUs). Students will present results at the Annual Biogeographical Research Conference and co-author peer-reviewed publications.
**2. Stakeholder Engagement and Policy Impact**: We will partner with 5 state coastal management agencies and The Nature Conservancy to translate research findings into management tools. Annual workshops will bring together 30 coastal managers, conservation practitioners, and researchers to co-develop decision-support frameworks. Results will inform state sea level rise adaptation plans, wetland restoration prioritization, and land acquisition strategies affecting 500,000 acres of coastal habitat.
**3. Public Science Communication**: We will create a publicly accessible web-based visualization tool showing projected wetland changes under different climate scenarios for the entire Atlantic coast. The tool will be promoted through social media, state agency websites, and science museums, with expected reach of 50,000 users. We will also develop bilingual (English/Spanish) educational materials for K-12 teachers, piloted in 10 schools serving predominantly underrepresented students.
Impact will be assessed through pre/post surveys of student participants, tracking of research participants into STEM careers, documentation of policy adoptions by management agencies, and analytics on public engagement platform usage.
---
## Formatting Requirements
- **Page Limit**: 1 page maximum
- **Margins**: 1 inch all sides
- **Font**: 11-point or larger (Times Roman, Arial, Palatino, Computer Modern)
- **Section Headers**: Must use exactly these three labels:
- Overview
- Intellectual Merit
- Broader Impacts
- **Public Accessibility**: Overview section suitable for general public
## Common Mistakes to Avoid
**Don't**:
- Omit any of the three required section headings
- Make broader impacts vague ("will train students")
- Use jargon in the Overview
- Exceed 1 page
- Forget to mention preliminary data or team qualifications
- Make broader impacts an afterthought (they're equally important)

**Do**:
- Make all three sections substantive
- Be specific about broader impacts activities
- Write the Overview for a broad audience
- Convey enthusiasm and significance
- Proofread carefully (this is the first thing reviewers see)

# Broader Impacts: Strategies and Best Practices
## Overview
**Broader Impacts** are one of two review criteria for NSF proposals, carrying equal weight with Intellectual Merit. Despite this, broader impacts are often treated as an afterthought—a critical mistake that costs otherwise strong proposals their funding.
**NSF Definition**: "The potential to benefit society and contribute to the achievement of specific, desired societal outcomes"
**Key Principle**: Broader impacts must be **specific, measurable, and integrated** with your research plan—not vague aspirations tacked onto the end.
## The Five Pillars of Broader Impacts
NSF evaluates broader impacts across five main areas. **You don't need to address all five**, but you should address at least one substantively with concrete activities, timelines, and assessment plans.
### 1. Advance Discovery While Promoting Teaching, Training, and Learning
**What This Means**: Integrate research and education to inspire the next generation of scientists and enhance scientific literacy.
**Effective Strategies**:
**Curriculum Development**:
- Create new courses incorporating research findings
- Develop course modules or laboratory exercises
- Design online learning materials (MOOCs, videos, interactive tools)
- Contribute to textbooks or educational resources
*Example*: "We will develop a 10-week computational biology module for undergraduate education, incorporating real datasets from this project. The module will include Jupyter notebooks with guided analysis, video tutorials, and assessment tools. Materials will be piloted at our institution (reaching 50 students annually) and made freely available through CourseSource for national adoption."
**Student Training**:
- Undergraduate research experiences
- Graduate student mentoring
- Postdoctoral training
- High school intern programs
- Research experiences for teachers (RET)
*Example*: "The project will support 3 PhD students and 6 undergraduate researchers over 5 years. Undergraduates will participate through our existing summer research program (10 weeks, $5,000 stipends) and will present findings at the annual undergraduate research symposium and regional conferences."
**Pedagogical Innovation**:
- Problem-based learning modules
- Active learning strategies
- Research-intensive courses
- Service learning projects
- Maker spaces or hands-on workshops
*Example*: "We will transform our introductory physics course (250 students/year) by implementing studio-style physics instruction based on results from this research. The new curriculum will include 3D visualization tools for electromagnetic fields, inquiry-based problem sets, and peer instruction protocols."
**Professional Development**:
- Workshops for faculty or teachers
- Training programs for early-career researchers
- Mentoring programs
- Career development resources
*Example*: "We will host annual 3-day workshops for 25 community college faculty, providing training in genome editing techniques. Participants will receive hands-on experience with CRISPR methods developed in this project, complete teaching modules for their courses, and ongoing support through a virtual learning community."
### 2. Broaden Participation of Underrepresented Groups
**What This Means**: Increase participation of groups underrepresented in STEM, including women, racial/ethnic minorities, persons with disabilities, and those from economically disadvantaged backgrounds.
**Effective Strategies**:
**Partnerships with Minority-Serving Institutions**:
- Collaborate with HBCUs (Historically Black Colleges and Universities)
- Partner with HSIs (Hispanic-Serving Institutions)
- Work with TCUs (Tribal Colleges and Universities)
- Engage with community colleges
*Example*: "We will establish formal research partnerships with 4 regional HBCUs (North Carolina A&T, Howard University, Morehouse College, and Spelman College). Each summer, 2 students from partner institutions will participate in 10-week research internships, including stipends ($6,000), housing, travel to field sites, and participation in our weekly research seminar series. A faculty liaison from each partner institution will co-mentor students and facilitate year-round engagement."
**Recruitment and Retention**:
- Targeted recruitment at conferences (SACNAS, ABRCMS, NSBE, SWE)
- Scholarship programs for underrepresented students
- Bridge programs for community college transfers
- Retention support (mentoring, peer networks, professional development)
*Example*: "We will recruit 50% of summer undergraduate researchers from groups underrepresented in computer science through partnerships with SACNAS and the National Society of Black Engineers. Participants will receive mentoring from graduate students with similar backgrounds, attend professional development workshops, and join our diversity-in-computing learning community that provides year-round support and networking."
**Culturally Relevant Engagement**:
- Research addressing community-identified needs
- Community-based participatory research
- Engagement with indigenous communities
- Bilingual materials and outreach
*Example*: "In partnership with the Navajo Nation, we will conduct participatory research on water quality in reservation communities. Community members will co-design the research questions, participate in data collection, and contribute indigenous knowledge about local hydrology. Results will be shared through community presentations in both English and Navajo, and will inform tribal water management policies."
**Addressing Systemic Barriers**:
- Flexible schedules for non-traditional students
- Childcare support for participants
- Accessible facilities and materials
- Financial support (stipends, travel, equipment)
- Mentoring networks and affinity groups
*Example*: "To support participation of students from low-income backgrounds, we will provide laptop computers, software licenses, and internet hotspots to all research participants. We will also offer flexible work schedules, remote participation options, and supplemental funding for students with childcare or eldercare responsibilities."
### 3. Enhance Infrastructure for Research and Education
**What This Means**: Build facilities, tools, databases, or networks that enable future research and education across the broader community.
**Effective Strategies**:
**Shared Research Infrastructure**:
- Multi-user instrumentation
- Core facilities
- Field stations or observatories
- Computational resources
- Cyberinfrastructure
*Example*: "We will establish a regional Cryo-Electron Microscopy facility serving 15 institutions in the Southwest. The facility will provide training and access to state-of-the-art imaging capabilities currently unavailable in the region. We will operate a user program with subsidized rates for academic users and offer annual training workshops for 50 researchers."
**Data and Software Resources**:
- Open-access databases
- Software tools and platforms
- Analysis pipelines
- Standardized protocols
- Data repositories
*Example*: "We will develop and maintain EcoDataHub, an open-source platform for ecological time-series analysis. The platform will include automated data cleaning, standardized analysis workflows, interactive visualization tools, and cloud computing integration. Software will be documented, version-controlled on GitHub, and supported through user forums and quarterly webinars. We expect 1,000+ users within 3 years based on community surveys."
**Biological or Physical Resources**:
- Living stock centers (model organisms, cell lines)
- Specimen collections
- Reagent repositories
- Seed banks or tissue collections
*Example*: "We will establish a publicly accessible repository of 500 sequenced bacterial strains isolated from extreme environments. Each strain will include full genome sequence, phenotypic characterization, and growth protocols. Materials will be available through the ATCC with metadata deposited in NCBI BioProject."
**Standards and Protocols**:
- Community standards
- Best practices guides
- Benchmarking datasets
- Quality control metrics
- Interoperability frameworks
*Example*: "Working with 20 international laboratories, we will develop and validate standardized protocols for single-cell RNA sequencing analysis. The resulting guidelines will address batch effects, quality control, normalization methods, and statistical best practices. Protocols will be published in peer-reviewed literature and deposited in protocols.io."
### 4. Broadly Disseminate to Enhance Scientific and Technological Understanding
**What This Means**: Communicate research to broader audiences including the public, K-12 students, policymakers, and stakeholders to enhance scientific literacy and informed decision-making.
**Effective Strategies**:
**K-12 Education Outreach**:
- School visits and science demonstrations
- After-school programs
- Science fairs and competitions
- Teacher professional development
- Classroom resources and lesson plans
*Example*: "We will partner with 10 local middle schools (serving 75% students from low-income families) to deliver hands-on robotics workshops. Each school will receive robot kits, and we will train teachers to lead a 12-week after-school robotics club. Students will apply concepts from this research (sensor fusion, autonomous navigation) to design robots for real-world challenges. The program will reach 200 students annually."
**Public Engagement**:
- Museum partnerships and exhibits
- Science cafés and public lectures
- Science festivals
- Citizen science projects
- Community workshops
*Example*: "We will collaborate with the Museum of Science and Industry to create a permanent interactive exhibit on climate modeling. The exhibit will allow visitors to manipulate climate variables and observe predicted outcomes using simplified versions of our models. We anticipate 500,000 annual visitors. We will also host quarterly 'Climate Science Saturday' public lectures reaching 2,000 community members annually."
**Media and Communications**:
- Blog posts and articles
- Podcasts or videos
- Social media engagement
- Press releases for major findings
- Popular science writing
*Example*: "We will produce a 6-episode podcast series exploring the intersection of artificial intelligence and creativity, featuring interviews with artists, musicians, and computer scientists. Episodes will be freely available on major platforms, with transcripts and educational materials on our website. Based on our existing podcast (15,000 downloads/episode), we expect to reach 100,000+ listeners."
**Policy Engagement**:
- Science policy fellowships
- Congressional briefings
- White papers for decision-makers
- Stakeholder workshops
- Regulatory science contributions
*Example*: "We will organize annual workshops bringing together researchers, water utilities, environmental regulators, and community advocates to discuss implications of our research for drinking water policy. Findings will be synthesized into policy briefs distributed to state and federal agencies. PI will participate in the AAAS Science and Technology Policy Fellowship to engage directly with EPA rulemaking."
**Citizen Science**:
- Community-based data collection
- Participatory research design
- Volunteer monitoring programs
- Crowdsourcing platforms
*Example*: "We will launch a citizen science program enlisting 500 volunteers across the Midwest to monitor pollinator populations using our smartphone app. Participants will receive training materials, identification guides, and regular feedback on their observations. Data will contribute directly to our research while building public understanding of pollinator ecology. Results will be visualized on an interactive public dashboard."
### 5. Benefit Society
**What This Means**: Apply research to address societal needs, improve quality of life, strengthen national security, or enhance economic competitiveness.
**Effective Strategies**:
**Health and Well-Being**:
- Clinical applications
- Public health improvements
- Healthcare accessibility
- Mental health resources
- Environmental health
*Example*: "Our diagnostic tool will reduce costs of malaria diagnosis from $10 to $0.50 per test, enabling deployment in resource-limited settings. We will partner with PATH and Médecins Sans Frontières to conduct field trials in 3 African countries and develop manufacturing partnerships for at-scale production. We project this technology could reach 10 million patients annually within 5 years."
**Economic Development**:
- Technology commercialization
- Job creation
- Industry partnerships
- Workforce development
- Startup formation
*Example*: "We will establish an industry partnership program with 5 regional manufacturing companies to transfer our advanced materials synthesis methods. Through quarterly technical workshops and on-site consultations, we will help companies integrate these processes into production lines, potentially creating 50-100 high-skill jobs over 5 years. Two graduate students will complete internships at partner companies."
**Environmental Sustainability**:
- Climate change mitigation or adaptation
- Conservation and biodiversity
- Pollution reduction
- Sustainable agriculture
- Renewable energy
*Example*: "Our soil carbon sequestration practices will be implemented on 1,000 acres of working farmland in partnership with 15 Iowa farmers. We will provide training, monitoring support, and carbon credit market access. If successful, practices could sequester 100,000 tons of CO2 equivalent annually if adopted across 10% of Midwest cropland, while increasing farmer income by $50-100/acre through carbon credits."
**National and Homeland Security**:
- Defense applications
- Cybersecurity
- Critical infrastructure protection
- Emergency response
- Intelligence capabilities
*Example*: "We will work with the Department of Homeland Security to adapt our threat detection algorithms for transportation security screening. Technology will be piloted at 3 major airports, with the goal of reducing false-positive rates by 40% while maintaining security effectiveness, decreasing passenger wait times and improving screening efficiency."
**Social and Cultural Benefits**:
- Preservation of cultural heritage
- Accessibility and inclusion
- Social justice
- Arts and humanities
- Quality of life improvements
*Example*: "Our 3D scanning and virtual reality platform will be used to digitally preserve 20 culturally significant sites threatened by climate change and development. Virtual reconstructions will be made freely available to descendant communities, schools, and the public through a web-based interface and VR experiences. We will partner with indigenous groups to ensure culturally appropriate representation."
## Best Practices for Broader Impacts
### Be Specific and Concrete
**Vague** ❌:
"This research will train the next generation of scientists."
**Specific** ✅:
"This project will support 3 PhD students, 2 postdocs, and 12 undergraduate researchers over 5 years. Undergraduates will be recruited through our partnership with the Louis Stokes Alliance for Minority Participation, with a goal of 50% participation from underrepresented groups. Students will receive training in advanced microscopy, data analysis, and scientific communication, and will present their research at the annual Emerging Researchers National Conference."
### Include Timelines and Milestones
**Vague** ❌:
"We will develop educational materials."
**Specific** ✅:
"Year 1: Develop draft curriculum modules and pilot with 50 students
Year 2: Revise based on assessment data and expand to 150 students across 3 institutions
Years 3-5: National dissemination through CourseSource, workshops at 2 professional conferences, and online repository. Target: Adoption by 20 institutions reaching 1,000 students annually by Year 5."
### Measure and Assess Impact
**Include**:
- Quantitative metrics (number of participants, downloads, users)
- Qualitative assessment (surveys, interviews, focus groups)
- Learning outcomes or behavioral changes
- Longitudinal tracking
- Comparison to baseline or control groups
**Example**:
"We will assess program effectiveness through: (1) Pre/post surveys measuring science self-efficacy using validated instruments, (2) Tracking participant persistence in STEM majors through institutional records, (3) Focus groups with participants and teachers, (4) Analysis of student work products. We expect to see a 30% increase in science self-efficacy scores and 90% retention in STEM majors among participants compared to 65% institutional baseline."
### Leverage Existing Infrastructure
**Don't reinvent the wheel**—build on existing programs and partnerships:
- Institutional programs (REU sites, AGEP, LSAMP, etc.)
- Community partnerships already established
- Shared facilities or resources
- Professional societies and organizations
**Example**:
"We will integrate with our institution's existing NSF REU site in Materials Science, adding 2 additional positions focused on our research area. This leverages established recruitment pipelines with 15 partner institutions, professional development programming, and assessment infrastructure while expanding opportunities for undergraduate researchers."
### Demonstrate Institutional Commitment
**Show that broader impacts will continue beyond grant period**:
- Institutional cost-sharing or support
- Integration into ongoing programs
- Sustainability plan
- Letters of commitment from partners
**Example**:
"The university has committed $50,000 annually in cost-share to sustain the high school outreach program beyond the grant period. The program will be integrated into our Center for STEM Education, ensuring administrative support, space, and continuity. Our partner school districts have committed teacher time and classroom access (see letters of commitment in supplementary documents)."
### Align with Research Plan
**Integration examples**:
- Students work on research questions from the proposal
- Educational materials use data generated by the research
- Outreach communicates research findings
- Community needs inform research questions
**Poor Integration** ❌:
Research on quantum computing + Unrelated marine biology outreach for middle schoolers
**Good Integration** ✅:
Research on quantum computing + Develop quantum computing curriculum modules + Summer program where students program quantum simulators + Public lectures on quantum technologies
## Common Broader Impacts Mistakes
### Mistake 1: Generic and Vague Statements
❌ "This project will train graduate students and postdocs."
❌ "Results will be broadly disseminated through publications and conferences."
❌ "We will engage in outreach activities."
These are baseline expectations, not broader impacts.
### Mistake 2: No Plan or Timeline
❌ "We hope to develop educational materials that could be used nationally."
✅ "Year 1: Develop and pilot 5 teaching modules. Year 2: Assess effectiveness and refine. Year 3: Publish in Journal of Chemical Education. Years 4-5: Disseminate through workshops at 3 national conferences and online repository. Target: Adoption by 30 institutions by Year 5."
### Mistake 3: No Assessment
❌ "We will run a summer camp for underrepresented students."
✅ "We will run a 4-week summer camp for 30 students (60% from underrepresented groups). We will assess impact through pre/post content knowledge tests, science identity surveys, and tracking of STEM course enrollment. We expect 80% of participants to enroll in advanced science courses the following year."
### Mistake 4: Unrealistic Scope
❌ "We will establish a national network of 100 schools, develop a comprehensive K-12 curriculum, create a museum exhibit, launch a nationwide citizen science program, and commercialize our technology" (with no budget or personnel allocated).
Be realistic about what you can accomplish with the resources and time available.
### Mistake 5: Poor Integration
❌ Research on plant genomics + Unrelated robotics outreach
✅ Research on plant genomics + Develop plant biology curriculum + Engage community gardens in phenotyping citizen science
### Mistake 6: Treating as Afterthought
❌ Half-page generic statement at end of proposal with no budget allocation
✅ Integrated throughout proposal, dedicated personnel (0.5 month PI time, 10% grad student, summer coordinator), allocated budget ($15K/year), detailed plan, and assessment strategy
### Mistake 7: No Track Record
If proposing extensive broader impacts activities but have no history of such work, reviewers will be skeptical.
✅ Show preliminary efforts, leverage existing programs, include collaborators with relevant expertise, cite successful prior broader impacts work
## Budgeting for Broader Impacts
**NSF expects dedicated resources to be allocated to broader impacts activities.**
**Typical Budget Items**:
- **Personnel**: Program coordinator, graduate students, undergraduate assistants
- **Participant support**: Stipends, travel, housing for students/teachers
- **Materials and supplies**: Educational materials, outreach equipment, workshop supplies
- **Travel**: Conference presentations of broader impacts work, site visits to partners
- **Subawards**: Payments to partnering institutions or organizations
- **Evaluation**: External evaluator for assessment
**Example Budget**:
- Summer program coordinator (2 months/year): $15,000/year
- Undergraduate stipends (10 students × $5,000): $50,000/year
- Materials and supplies for workshops: $5,000/year
- Travel for recruitment and partner meetings: $3,000/year
- External evaluator: $8,000/year
- **Total: $81,000/year (16% of $500K budget)**
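A quick script can sanity-check a budget like this before it goes into the justification, confirming that the line items sum to the stated total and computing what share of the overall award they represent. This is only a sketch using the example figures above; substitute your own line items and award size.

```python
# Sanity-check a broader impacts budget: sum the line items and
# compute their share of the total annual award.
budget = {
    "Summer program coordinator (2 months/year)": 15_000,
    "Undergraduate stipends (10 students x $5,000)": 50_000,
    "Workshop materials and supplies": 5_000,
    "Recruitment and partner-meeting travel": 3_000,
    "External evaluator": 8_000,
}

total = sum(budget.values())   # annual broader impacts cost
award = 500_000                # total annual award in the example
share = total / award          # fraction of the award

print(f"Total: ${total:,}/year ({share:.0%} of ${award:,} budget)")
# Total: $81,000/year (16% of $500,000 budget)
```

Catching a mismatch between itemized costs and the stated total at this stage is far cheaper than having a reviewer or sponsored-programs office flag it later.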
## Resources for Broader Impacts
### NSF Resources
- **NSF Broader Impacts Website**: https://www.nsf.gov/od/oia/special/broaderimpacts/
- **BI Examples Repository**: https://www.cmu.edu/uro/resources for undergraduate research/best practices/broader-impacts.html
- **Broader Impacts Toolkit**: Many universities provide institutional resources
### Assessment Tools
- **STEM-OP (STEM Outreach Program)**: Survey instruments for outreach assessment
- **STELAR Network**: Resources for informal STEM education
- **Evaluation frameworks**: Logic models, theory of change
### Partner Organizations
- **SACNAS**: Society for Advancement of Chicanos/Hispanics and Native Americans in Science
- **ABRCMS**: Annual Biomedical Research Conference for Minority Students
- **NSBE, SWE, AISES**: Professional societies for underrepresented groups
- **Science museums and centers**: Partner for public engagement
- **School districts and community organizations**: For K-12 outreach
---
**Key Takeaway**: Effective broader impacts are specific, measurable, assessed, integrated with the research plan, and demonstrate institutional commitment. They should be planned with the same rigor as the research itself, with dedicated resources, timelines, milestones, and evaluation strategies. Generic statements about "training students" or "disseminating results" are insufficient—NSF expects concrete plans that demonstrably benefit society.

# DARPA (Defense Advanced Research Projects Agency) Grant Writing Guidelines
## Agency Overview
**Mission**: Make pivotal investments in breakthrough technologies for national security
**Tagline**: "Creating breakthrough technologies and capabilities for national security"
**Annual Budget**: ~$4 billion
**Website**: https://www.darpa.mil
**Key Characteristics**:
- High-risk, high-reward research
- Focused on revolutionary breakthroughs, not incremental advances
- Technology transition to military and commercial applications
- Program managers with broad autonomy
- ~3-5 year programs with defined end goals
- Strong emphasis on prototypes and demonstrations
- "DARPA-hard" problems that others won't or can't tackle
**The DARPA Difference**:
- NOT basic research (that's ONR, AFOSR, ARO)
- NOT development and procurement (that's service acquisition)
- Focused on proof-of-concept to prototype stage
- Tolerates and expects failure in pursuit of breakthroughs
- Rapid transition to operational use
## DARPA Organization
### Six Technical Offices
#### 1. BTO (Biological Technologies Office)
**Focus**: Biology as technology, human-machine interfaces, synthetic biology
**Example Programs**:
- Neural interfaces and brain-computer interfaces
- Synthetic biology and living foundries
- Pandemic prevention and response
- Human performance enhancement
- Biotechnology for manufacturing
#### 2. DSO (Defense Sciences Office)
**Focus**: High-risk, high-payoff research in physical and mathematical sciences
**Example Programs**:
- Novel materials and chemistry
- Quantum technologies
- Electromagnetics and photonics
- Mathematics and algorithms
- Fundamental limits of physics
#### 3. I2O (Information Innovation Office)
**Focus**: Information advantage through computing, communications, and cyber
**Example Programs**:
- Artificial intelligence and machine learning
- Cybersecurity and cyber resilience
- Communications and networking
- Data analytics and processing
- Human-computer interaction
#### 4. MTO (Microsystems Technology Office)
**Focus**: Microelectronics, photonics, and heterogeneous microsystems
**Example Programs**:
- Advanced electronics and integrated circuits
- Photonics and optical systems
- Novel computational architectures
- RF and millimeter-wave systems
- MEMS and sensors
#### 5. STO (Strategic Technology Office)
**Focus**: Technologies for space, air, maritime, and ground systems
**Example Programs**:
- Autonomous systems (air, ground, sea, space)
- Advanced propulsion and power
- Space technologies
- Electronic warfare
- Long-range precision fires
#### 6. TTO (Tactical Technology Office)
**Focus**: Near-term technologies for ground, maritime, and expeditionary forces
**Example Programs**:
- Tactical autonomy
- Advanced weapons
- Urban operations
- Maneuver and logistics
- Special operations support
## How DARPA Works
### Program Manager-Centric Model
**Program Managers (PMs)**:
- ~100 PMs across DARPA
- Hired on 3-5 year rotations from academia, industry, government labs
- Have significant autonomy to create and run programs
- Identify "DARPA-hard" problems and solutions
- Manage portfolios of 10-20 projects
**PM Lifecycle**:
1. **Develop vision**: Identify transformative opportunity
2. **Create program**: Design research thrusts and metrics
3. **Issue BAA**: Broad Agency Announcement for proposals
4. **Select teams**: Choose performers and structure program
5. **Manage program**: Track milestones, adjust course, transition technology
6. **Transition**: Hand off successful technologies to services or industry
**Implication for Proposers**:
- PMs have the vision—your job is to execute it
- Contact PM before proposing (almost always required)
- Understand PM's technical vision and goals
- Build relationship with PM (within ethical bounds)
### The "DARPA-Hard" Test
**Three Questions Every DARPA Program Must Answer**:
1. **What are you trying to do?**
- Articulate objectives using absolutely no jargon
- Clear, specific technical goal
2. **How is it done today, and what are the limits of current practice?**
- What's the current state of the art?
- Why are current approaches insufficient?
- What fundamental barriers exist?
3. **What is new in your approach, and why do you think it will be successful?**
- What's the breakthrough insight or capability?
- Why hasn't this been done before?
- What's changed to make it possible now?
**Additional Considerations**:
- **Who cares?** (What's the national security impact?)
- **What if you're right?** (What becomes possible?)
- **What if you're wrong?** (Is the risk acceptable?)
- **What if you succeed?** (Is there a transition path?)
**DARPA Seeks**:
- **High Risk**: 50% chance of failure is acceptable
- **High Reward**: 10x improvement, not 10% improvement
- **Measurable**: Clear metrics of success
- **Transitional**: Path to operational use or commercial adoption
## Types of DARPA Solicitations
### 1. Broad Agency Announcements (BAAs)
**Most Common Mechanism**: Open solicitations for specific program areas
**Characteristics**:
- Issued by program managers for specific programs
- Describe technical objectives and research thrusts
- Multiple submission deadlines or rolling submission
- Full proposals typically 20-40 pages
- Often require abstract or white paper first
**Types of BAAs**:
**Program BAAs**: For specific named programs
- Clear technical objectives and metrics
- Defined research areas (thrusts)
- Specified deliverables and milestones
- Known PM with clear vision
**Office-Wide BAAs**: General solicitations by technical office
- Broader scope, less prescriptive
- Looking for transformative ideas
- More flexibility in approach
- May have multiple areas of interest
### 2. Small Business Innovation Research (SBIR)
**For Small Businesses**:
- **Phase I**: $150K-$250K, 6-9 months (feasibility)
- **Phase II**: $1M-$2M, 2 years (development)
- **Phase III**: Non-SBIR funds (commercialization)
### 3. Proposers Days and Special Notices
**Proposers Day**: Pre-solicitation event
- PM presents program vision and objectives
- Q&A with potential proposers
- Networking for team formation
- Often required or strongly encouraged to attend
**Special Notices**: Requests for Information (RFIs), teaming opportunities
## DARPA Proposal Structure
**Note**: Format varies by BAA. **Always follow the specific BAA instructions precisely.**
### Typical Structure
#### Volume 1: Technical and Management Proposal (20-40 pages)
**Section 1: Executive Summary** (1-2 pages)
- Overview of proposed research
- Technical approach and innovation
- Expected outcomes and deliverables
- Team qualifications
- Alignment with BAA objectives
**Section 2: Goals and Impact** (2-3 pages)
- Statement of the problem
- Importance and national security relevance
- Current state of the art and limitations
- How your work will advance the state of the art
- Impact if successful (What if true? Who cares?)
- Alignment with DARPA program goals
**Section 3: Technical Approach and Innovation** (10-20 pages)
- Detailed technical plan organized by phase or thrust
- Novel approaches and why they will work
- Technical risks and mitigation strategies
- Preliminary results or proof-of-concept data
- Technical barriers and how to overcome them
- Innovation and differentiation from existing work
**Organized by Phase** (typical):
**Phase 1 (Feasibility)**: 12-18 months
- Technical objectives and milestones
- Approach and methodology
- Expected outcomes
- Metrics for success
- Go/no-go criteria for Phase 2
**Phase 2 (Development)**: 18-24 months
- Building on Phase 1 results
- System integration and optimization
- Testing and validation
- Prototype development
- Metrics and evaluation
**Phase 3 (Demonstration)**: 12-18 months (if applicable)
- Field testing or operational demonstration
- Transition activities
- Handoff to transition partner
**Section 4: Capabilities and Resources** (2-3 pages)
- Team qualifications and expertise
- Facilities and equipment
- Relevant prior work and publications
- Subcontractor and collaborator roles
- Organizational structure
**Section 5: Statement of Work (SOW)** (3-5 pages)
- Detailed task breakdown
- Deliverables for each task
- Milestones and metrics
- Timeline (Gantt chart)
- Dependencies and critical path
- Government furnished property or information (if applicable)
**Section 6: Schedule and Milestones** (1-2 pages)
- Integrated master schedule
- Key decision points
- Deliverable schedule
- Go/no-go criteria
- Reporting and meeting schedule
**Section 7: Technology Transition Plan** (2-3 pages)
- Potential transition partners (military services, industry)
- Pathway to operational use or commercialization
- Market or operational analysis
- Transition activities during the program
- IP and licensing strategy (if applicable)
#### Volume 2: Cost Proposal (separate)
**Detailed Budget**:
- Costs by phase, task, and year
- Labor (personnel, hours, rates)
- Materials and supplies
- Equipment
- Travel
- Subcontracts
- Other direct costs
- Indirect costs (overhead, G&A)
- Fee or profit (for industry)
**Cost Narrative**:
- Justification for each cost element
- Labor categories and rates
- Basis of estimate
- Cost realism analysis
- Supporting documentation
**Supporting Documentation**:
- Cost accounting standards
- Approved indirect rate agreements
- Subcontractor quotes or cost proposals
#### Additional Volumes (if required)
**Attachments**:
- Quad charts (1-slide summary)
- Relevant publications or technical papers
- Letters of commitment from collaborators
- Facilities descriptions
- Equipment lists
## Review Criteria
### DARPA Evaluation Factors (Typical)
**Primary Criteria** (usually equal weight):
1. **Overall Scientific and Technical Merit**
- Technical soundness and feasibility
- Innovation and novelty
- Likelihood of achieving objectives
- Technical approach and methodology
- Understanding of problem and prior art
- Risk and risk mitigation
2. **Potential Contribution and Relevance to DARPA Mission**
- Alignment with program objectives
- National security impact
- Advancement over state of the art
- Potential for revolutionary breakthrough
- "What if true? Who cares?" test
3. **Cost Realism and Reasonableness**
- Budget aligned with technical plan
- Costs justified and realistic
- Value for investment
- Cost versus benefit analysis
4. **Capabilities and Related Experience**
- Team qualifications and track record
- Facilities and resources adequate
- Relevant prior work
- Ability to deliver on time and on budget
- Management approach
5. **Technology Transition**
- Pathway to operational use or market
- Transition partnerships
- Market analysis (if applicable)
- Plans for follow-on development
- IP strategy supporting transition
### The "Heilmeier Catechism"
**DARPA uses this set of questions** (created by former DARPA director George Heilmeier):
1. What are you trying to do? Articulate your objectives using absolutely no jargon.
2. How is it done today, and what are the limits of current practice?
3. What is new in your approach and why do you think it will be successful?
4. Who cares? If you succeed, what difference will it make?
5. What are the risks?
6. How much will it cost?
7. How long will it take?
8. What are the mid-term and final "exams" to check for success?
**Your proposal should clearly answer all eight questions.**
## DARPA Proposing Strategy
### Before Writing
**1. Contact the Program Manager**
- Email PM to introduce yourself and idea
- Request call to discuss fit with program
- Attend Proposers Day if available
- Ask clarifying questions about BAA
**2. Form a Strong Team**
- DARPA values multidisciplinary teams
- Include complementary expertise
- Mix of academia, industry, government labs
- Clearly defined roles
- Prior collaboration history (if possible)
**3. Understand the Vision**
- What is the PM trying to achieve?
- What technical barriers need to be overcome?
- What does success look like?
- What are the program metrics?
**4. Identify Transition Path**
- Who will use the technology?
- What's the path from prototype to product?
- Who are potential transition partners?
- What's the market or operational need?
### Writing the Proposal
**Lead with Impact**:
- Open with the "so what?"
- National security or economic impact
- What becomes possible if you succeed?
**Be Concrete and Specific**:
- Clear technical objectives with metrics
- Measurable milestones
- Quantitative targets (10x improvement, not "better")
- Specific deliverables
**Demonstrate Innovation**:
- What's the breakthrough?
- Why hasn't this been done before?
- What's changed to make it possible now?
- How is this different from evolutionary approaches?
**Address Risk Head-On**:
- Identify technical risks explicitly
- Explain mitigation strategies
- Show that you've thought through failure modes
- DARPA expects risk—don't hide it, manage it
**Show You Can Execute**:
- Detailed project plan with milestones
- Team with relevant track record
- Realistic schedule and budget
- Go/no-go decision points
- Management approach for complex programs
**Emphasize Transition**:
- Who will use the results?
- Path to operationalization or commercialization
- Engagement with potential users during program
- IP strategy that enables transition
### Common Mistakes
1. **Incremental Research**: Proposing 10% improvement instead of 10x
2. **Academic Focus**: Pure research without application focus
3. **No Transition Plan**: No pathway to use or commercialization
4. **Ignoring PM Vision**: Not aligned with program objectives
5. **Vague Metrics**: "Improve" or "enhance" instead of quantitative targets
6. **Underestimating Risk**: Claiming low risk (DARPA wants high risk, high reward)
7. **Weak Team**: Insufficient expertise or poorly defined roles
8. **No Differentiation**: Similar to existing efforts without clear advantage
9. **Ignoring BAA**: Not following proposal format or requirements
10. **Late Contact with PM**: Waiting until proposal due date to engage
## DARPA Contracting and Performance
### Award Types
**Procurement Contracts**: Most common for industry
- Firm Fixed Price (FFP)
- Cost Plus Fixed Fee (CPFF)
- Cost Plus Incentive Fee (CPIF)
**Grants and Cooperative Agreements**: For universities and nonprofits
- Grants: Minimal government involvement
- Cooperative Agreements: Substantial government involvement
**Other Transaction Agreements (OTAs)**: Flexible arrangements
- For research not requiring FAR compliance
- Faster, more flexible terms
- Common for consortia and partnerships
### Program Execution
**Kickoff Meeting**: Program launch with all performers
- PM presents program vision and goals
- Performers present approaches
- Technical exchange and collaboration
**Quarterly Reviews**: Progress reviews (virtual or in-person)
- Technical progress against milestones
- Challenges and solutions
- Path forward
- PM feedback and course corrections
**Annual or Phase Reviews**: Major assessment points
- Comprehensive technical review
- Go/no-go decisions
- Budget and schedule adjustments
**Site Visits**: PM and team visit performer sites
- See technical work firsthand
- Deep dive on specific areas
- Team building and collaboration
**Technical Interchange Meetings (TIMs)**: Deep dives on technical topics
- Cross-performer collaboration
- Sharing of results and approaches
- Problem-solving sessions
### Deliverables and Reporting
**Monthly Reports**: Brief progress updates
- Technical progress
- Budget status
- Issues and concerns
**Quarterly Reports**: Detailed technical reporting
- Accomplishments against milestones
- Data and results
- Upcoming activities
- Publications and IP
**Final Report**: Comprehensive program summary
- Technical achievements
- Lessons learned
- Transition activities
- Future directions
**Technical Data and Prototypes**: Specified in contract
- Software and code
- Hardware prototypes
- Data sets
- Documentation
## DARPA Culture and Expectations
### High Risk is Expected
- DARPA programs should have ~50% probability of failure
- Failure is acceptable if lessons are learned
- "Fail fast" to redirect resources
- Transparency about challenges valued
### Rapid Pivots
- PM may redirect program based on results
- Flexibility to pursue unexpected opportunities
- Willingness to stop unproductive efforts
- Adaptability is key
### Transition Focus
- Technology must have a path to use
- Engagement with transition partners during program
- Demonstrate prototypes and capabilities
- Handoff to services or industry
### Collaboration and Teaming
- Performers expected to collaborate
- Share results and insights (within IP bounds)
- Attend all program meetings
- Support overall program goals, not just own project
## Recent DARPA Priorities and Programs
### Key Technology Areas (2024-2025)
**Artificial Intelligence and Autonomy**:
- Trustworthy AI
- AI reasoning and understanding
- Human-AI teaming
- Autonomous systems across domains
**Quantum Technologies**:
- Quantum computing and algorithms
- Quantum sensing and metrology
- Quantum communications
- Post-quantum cryptography
**Biotechnology**:
- Pandemic prevention and response
- Synthetic biology
- Human performance
- Bio-manufacturing
**Microelectronics and Computing**:
- Advanced chip design and manufacturing
- Novel computing architectures
- 3D heterogeneous integration
- RF and millimeter-wave systems
**Hypersonics and Advanced Materials**:
- Hypersonic weapons and defense
- Advanced materials and manufacturing
- Thermal management
- Propulsion
**Space Technologies**:
- Space domain awareness
- On-orbit servicing and manufacturing
- Small satellite technologies
- Space-based intelligence
**Network Technologies**:
- Secure communications
- Resilient networks
- Spectrum dominance
- Cyber defense
## Tips for Competitive DARPA Proposals
### Do's
- **Contact PM early**: Before writing, discuss your idea
- **Attend Proposers Day**: Essential for understanding the program
- **Form a strong team**: Complementary expertise, clear roles
- **Be bold and ambitious**: 10x goals, not 10% improvements
- **Quantify everything**: Specific metrics and targets
- **Address transition**: Clear path to operational use
- **Identify risks explicitly**: And explain mitigation
- **Show preliminary results**: Proof of concept or feasibility
- **Follow the BAA exactly**: Format, page limits, content requirements
- **Emphasize innovation**: What's revolutionary about your approach?
### Don'ts
- **Don't propose incremental research**: DARPA wants breakthroughs
- **Don't ignore national security relevance**: "Who cares?" matters
- **Don't be vague**: Specific objectives, metrics, deliverables
- **Don't hide risk**: DARPA expects and values high-risk research
- **Don't forget transition**: Technology must have a path to use
- **Don't propose basic research**: That's for ONR, AFOSR, ARO
- **Don't exceed page limits**: Automatic rejection
- **Don't ignore PM feedback**: They're setting the direction
- **Don't propose alone if a team is needed**: DARPA values strong teams
- **Don't submit without PM contact**: Critical to gauge fit
## Resources
- **DARPA Website**: https://www.darpa.mil
- **DARPA Opportunities**: https://www.darpa.mil/work-with-us/opportunities
- **BAA Listings**: https://beta.sam.gov (search "DARPA")
- **DARPA Social Media**: Twitter @DARPA (PMs often announce programs)
- **SBIR/STTR**: https://www.darpa.mil/work-with-us/for-small-businesses
- **Heilmeier Catechism**: https://www.darpa.mil/about-us/timeline/heilmeier-catechism
### Key Contacts
- **DARPA Contracting**: via BAA points of contact
- **Program Managers**: Contact info in BAAs and program pages
- **SBIR/STTR Office**: sbir@darpa.mil
---
**Key Takeaway**: DARPA seeks revolutionary breakthroughs that advance national security, not incremental research. Successful proposals articulate clear, measurable objectives (answering "what if true?"), demonstrate innovative approaches to "DARPA-hard" problems, include strong multidisciplinary teams, proactively address technical risks, and provide realistic paths to transition. Early engagement with the Program Manager is essential—DARPA is a PM-driven agency where understanding the vision is critical to success.

# DOE (Department of Energy) Grant Writing Guidelines
## Agency Overview
**Mission**: Ensure America's security and prosperity by addressing energy, environmental, and nuclear challenges through transformative science and technology solutions
**Annual Budget**: ~$50 billion (includes national laboratories, energy programs, nuclear security)
**Website**: https://www.energy.gov
**Key Characteristics**:
- Focus on energy, climate, environmental, computational, and physical sciences
- Operates 17 national laboratories (largest science infrastructure in US)
- Strong emphasis on industry partnerships and commercialization
- Basic science through applied research and development
- Cost sharing often required
- National security and energy security priorities
## Major DOE Offices and Programs
### Office of Science (SC)
**Budget**: ~$8 billion (largest supporter of physical sciences research in US)
**Mission**: Deliver scientific discoveries and major scientific tools to transform our understanding of nature and advance energy, economic, and national security
**Program Offices**:
1. **Advanced Scientific Computing Research (ASCR)**
- High-performance computing
- Applied mathematics
- Computational sciences
- Exascale computing
2. **Basic Energy Sciences (BES)**
- Materials science and engineering
- Chemical sciences
- Condensed matter and materials physics
- User facilities (light sources, neutron sources)
3. **Biological and Environmental Research (BER)**
- Biological systems science
- Climate and environmental sciences
- Environmental Molecular Sciences Laboratory (EMSL)
4. **Fusion Energy Sciences (FES)**
- Plasma physics
- Fusion energy development
- ITER collaboration
5. **High Energy Physics (HEP)**
- Particle physics
- Accelerator science
- Quantum information science
6. **Nuclear Physics (NP)**
- Nuclear structure and dynamics
- Relativistic heavy ions
- Fundamental symmetries
**Funding Mechanisms**:
- **Early Career Research Program**: $750K over 5 years for early career scientists
- **Funding Opportunity Announcements (FOAs)**: Program-specific solicitations
- **Laboratory Directed Research and Development (LDRD)**: For national lab staff
### ARPA-E (Advanced Research Projects Agency-Energy)
**Mission**: Advance high-potential, high-impact energy technologies that are too early for private-sector investment
**Characteristics**:
- High-risk, high-reward transformative energy technologies
- Requires cost sharing (typically 20% for universities, more for industry)
- Emphasis on pathway to commercialization
- Strong project management and milestones
- Budget: ~$500M annually
**Program Types**:
- **Focused Programs**: Specific technology areas (announced via FOAs)
- **OPEN**: General solicitation across all energy technologies
- **SCALEUP**: Bridging from lab to market
**Typical Funding**:
- $1-10M per project
- 1-3 years duration
- Technology transition focus
### Office of Energy Efficiency and Renewable Energy (EERE)
**Mission**: Accelerate development and deployment of clean energy technologies
**Program Areas**:
- **Solar Energy Technologies Office (SETO)**
- **Wind Energy Technologies Office (WETO)**
- **Water Power Technologies Office (WPTO)**
- **Geothermal Technologies Office (GTO)**
- **Building Technologies Office (BTO)**
- **Advanced Manufacturing Office (AMO)**
- **Vehicle Technologies Office (VTO)**
- **Bioenergy Technologies Office (BETO)**
- **Hydrogen and Fuel Cell Technologies Office (HFTO)**
**Funding Mechanisms**:
- FOAs for specific technology areas
- Small Business Innovation Research (SBIR)
- Technology Commercialization Fund (TCF)
### Office of Fossil Energy and Carbon Management (FECM)
**Focus**: Carbon capture, utilization, and storage; hydrogen; critical minerals
### Office of Nuclear Energy (NE)
**Focus**: Advanced reactor technologies, nuclear fuel cycle, university programs
## DOE Proposal Structure
DOE proposal requirements vary significantly by program office and FOA. **Always read the specific FOA carefully.**
### Common Elements
#### Project Narrative (varies, typically 10-20 pages)
**Typical Structure**:
1. **Executive Summary / Abstract** (1 page)
- Project objectives and technical approach
- Expected outcomes and impact
- Team qualifications
- Alignment with DOE mission
2. **Background and Motivation** (2-3 pages)
- Current state of technology or knowledge
- Problem or opportunity
- Why DOE investment is needed
- Alignment with program goals
3. **Technical Approach and Innovation** (5-10 pages)
- Detailed technical plan
- Methodology and approach
- Innovation and novelty
- Risk assessment and mitigation
- Go/no-go decision points
- Performance metrics
4. **Impact and Energy Relevance** (1-2 pages)
- Expected technical outcomes
- Energy impact (cost, efficiency, emissions)
- Pathway to deployment or commercialization
- Economic benefits
- Timeline to market (for applied programs)
5. **Management Plan** (1-2 pages)
- Team organization and roles
- Timeline and milestones
- Risk management
- Communication and reporting
6. **Qualifications and Resources** (1-2 pages)
- Team expertise and experience
- Relevant prior work
- Facilities and equipment
- National lab or industry partners
#### Budget and Budget Justification
**Federal Share (DOE Funds Requested)**:
- Specify DOE funding requested by year
- Break down by category (labor, equipment, travel, etc.)
- Detailed justification for each item
**Cost Share** (often required):
- Specify source (cash vs. in-kind)
- Document commitment (letters from sponsors)
- Typical requirements:
- Universities: 20% (ARPA-E)
- Industry: 50% or more
- National labs: Varies
**Budget Categories**:
- Labor (personnel with hours/rates)
- Fringe benefits
- Travel
- Equipment and capital items
- Materials and supplies
- Other direct costs
- Subawards/subcontracts
- Indirect costs (F&A)
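Indirect costs (F&A) are commonly charged on a modified total direct cost (MTDC) base, which under the federal uniform guidance (2 CFR 200) excludes equipment, capital items, and the portion of each subaward above $25K. A minimal sketch of that arithmetic, with illustrative rates and dollar figures (check your institution's negotiated rate agreement and the FOA for the actual base and rate):

```python
# Sketch of an F&A calculation on an MTDC base (2 CFR 200 convention).
# All figures are illustrative assumptions, not program requirements.

SUBAWARD_MTDC_CAP = 25_000  # only the first $25K of each subaward is in the base

def indirect_on_mtdc(direct, rate):
    """direct: dict of budget categories; rate: negotiated F&A rate (e.g. 0.55)."""
    base = (
        direct["labor"]
        + direct["fringe"]
        + direct["travel"]
        + direct["supplies"]
        + sum(min(s, SUBAWARD_MTDC_CAP) for s in direct["subawards"])
        # equipment and capital items are excluded from the MTDC base
    )
    return base * rate

budget = {
    "labor": 200_000,
    "fringe": 60_000,
    "travel": 10_000,
    "supplies": 20_000,
    "equipment": 50_000,
    "subawards": [100_000],
}
fa = indirect_on_mtdc(budget, 0.55)
total_direct = sum(sum(v) if isinstance(v, list) else v for v in budget.values())
print(f"F&A: ${fa:,.0f}; total request: ${total_direct + fa:,.0f}")
```

Here the base is $315K (labor + fringe + travel + supplies + the first $25K of the subaward), so F&A is $173,250 on $440K of direct costs.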
#### Biographical Sketches
**Format**: Often DOE-specific or NSF-style
- Professional preparation
- Appointments
- Relevant publications (5-10 most relevant)
- Synergistic activities
- Collaborators
#### Work Breakdown Structure (WBS)
**Often Required**: Detailed breakdown of tasks, milestones, and deliverables
- Task structure aligned with budget
- Quarterly or annual milestones
- Deliverables for each task
- Responsible parties
#### Letters of Commitment
**Required for**:
- Cost share partners
- Collaborating institutions
- National laboratory partnerships
- Industry partners
- Access to facilities or resources
**Must Include**:
- Specific commitment (funding, personnel, equipment)
- Signed by authorized representative
- On institutional letterhead
#### Facilities and Equipment
**Describe**:
- Available facilities relevant to project
- Major equipment accessible
- Computational resources
- Unique capabilities
#### Data Management Plan (DMP)
**Increasingly Required**:
- Types of data to be generated
- Standards and formats
- Access and sharing policies
- Long-term preservation
- Compliance with DOE policies
## Review Criteria
### Office of Science (SC) General Criteria
Proposals typically evaluated on:
1. **Scientific and/or Technical Merit** (35-40%)
- Importance and relevance of research
- Scientific or technical innovation
- Clarity of objectives and expected outcomes
2. **Appropriateness of Proposed Method or Approach** (25-30%)
- Technical feasibility
- Likelihood of success
- Adequacy of project design
- Rigor of technical approach
3. **Competency of Personnel and Adequacy of Facilities** (20-25%)
- Qualifications of PI and team
- Track record in relevant areas
- Access to necessary facilities and equipment
- Institutional support
4. **Reasonableness and Appropriateness of Budget** (10-15%)
- Budget aligned with proposed work
- Appropriate allocation of resources
- Cost effectiveness
5. **Relevance to DOE Mission and Program Goals** (10-15%)
- Alignment with program priorities
- Contribution to DOE mission
- Potential impact on energy/environment
### ARPA-E Review Criteria
**ARPA-E uses concept paper → full application process**
**Concept Paper Review** (typically 3-5 pages):
- Technical innovation and impact
- Potential for transformative advance
- Relevance to energy applications
- Feasibility (team, approach)
**Full Application Review** (if invited):
1. **Impact** (40%)
- Potential to dramatically improve energy technology
- Energy and economic impact
- Transformative vs. incremental
- Pathway to market adoption
2. **Innovation/Technical Merit** (30%)
- Novel approach or technology
- Technical rigor and feasibility
- Likelihood of meeting targets
- Risk and risk mitigation
3. **Qualifications** (20%)
- Team expertise and experience
- Resources and capabilities
- Management plan
- Track record
4. **Workplan** (10%)
- Clear milestones and go/no-go points
- Realistic timeline
- Appropriate budget
- Risk management
### Technology-to-Market (T2M) Evaluation (ARPA-E)
**Critical Component**: Path to commercialization
**Assessed**:
- Market opportunity and size
- Competitive landscape
- Barriers to adoption
- Go-to-market strategy
- Partnership and commercialization plan
- Economic viability
**Common Mistakes**:
- Underestimating time to market
- Ignoring competing technologies
- Unrealistic cost projections
- No clear adoption pathway
## DOE-Specific Considerations
### National Laboratory Collaboration
**Benefits**:
- Access to unique facilities and expertise
- Leveraging world-class capabilities
- Credibility and track record
**Mechanisms**:
- **Subcontract**: Lab is subcontractor to university/company
- **Cooperative Research and Development Agreement (CRADA)**: Partnership with industry
- **User Facility Proposal**: Access to major DOE user facilities
- **Strategic Partnership Project (SPP)**: Formal collaboration
**Process**:
- Identify appropriate lab partner early
- Contact lab scientist to discuss collaboration
- Develop work scope and budget together
- Obtain lab approval (can take 2-3 months)
- Include letter of commitment
**Major National Labs**:
- Argonne (ANL), Brookhaven (BNL), Lawrence Berkeley (LBNL)
- Oak Ridge (ORNL), Pacific Northwest (PNNL), SLAC
- Sandia (SNL), Los Alamos (LANL), Lawrence Livermore (LLNL)
- National Renewable Energy Lab (NREL), Idaho (INL), Fermilab
### User Facilities
**DOE operates 28 major user facilities** open to researchers
**Types**:
- **Light Sources**: X-ray and neutron scattering (APS, NSLS-II, ALS, etc.)
- **Nanoscale Science Centers**: Fabrication and characterization
- **High-Performance Computing**: Supercomputing centers (OLCF, NERSC, ALCF)
- **Genomic Science**: JGI, EMSL
- **Accelerators and Detectors**: Particle and nuclear physics facilities
**Access**:
- Submit user proposal (separate from research proposal)
- Peer-reviewed allocation of beam time or computing hours
- No cost for non-proprietary research
- Can include user facility access in grant proposals
### Cost Sharing Requirements
**Varies by Program**:
- **Office of Science**: Generally not required (except specific FOAs)
- **ARPA-E**: Required (typically 20% universities, 50%+ industry)
- **EERE**: Often required (varies by program)
- **FECM**: Often required
**Types**:
- **Cash**: Direct contribution of funds
- **In-kind**: Personnel time, equipment use, materials
- **Third-party**: Contribution from collaborator or sponsor
**Requirements**:
- Must be documented and verifiable
- Cannot be used for other federal awards
- Must be from non-federal sources (generally)
- Need letters of commitment
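Note that cost share percentages are typically applied to the total project cost, not to the federal award, and computing 20% of the federal funds instead is a common budgeting mistake. A quick sketch of the arithmetic (assuming the total-project-cost convention; verify against the specific FOA):

```python
def cost_share_required(federal_funds, share_fraction):
    """Recipient cost share when the percentage applies to TOTAL project cost.

    total = federal + share and share = share_fraction * total,
    so share = federal * share_fraction / (1 - share_fraction).
    """
    return federal_funds * share_fraction / (1 - share_fraction)

federal = 1_000_000  # DOE funds requested (illustrative)
share = cost_share_required(federal, 0.20)
print(f"Required cost share: ${share:,.0f}")            # $250,000, not $200,000
print(f"Total project cost:  ${federal + share:,.0f}")  # $1,250,000
```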
### Technology Readiness Levels (TRLs)
**DOE uses TRL scale 1-9** for technology development programs
**TRL Definitions**:
- **TRL 1-3**: Basic research (idea → proof of concept)
- **TRL 4-6**: Development (component → system prototype)
- **TRL 7-9**: Demonstration and deployment (prototype → commercial)
**Funding by TRL**:
- **Office of Science**: TRL 1-3 (basic research)
- **ARPA-E**: TRL 2-5 (proof of concept → prototype)
- **EERE**: TRL 4-8 (development → demonstration)
**Specify in Proposal**:
- Current TRL of technology
- Target TRL at project end
- Path from current to target
### Intellectual Property and Data Rights
**Standard Terms**:
- Awardee generally retains IP rights
- Government retains license for government purposes
- Must report inventions to DOE
- May have data sharing requirements
**Industry Partners**:
- Negotiate IP and data rights in advance
- Protected CRADA information (5 years)
- Background IP vs. foreground IP
### Teaming and Partnerships
**Encouraged for**:
- University-national lab partnerships
- University-industry partnerships
- Multi-institutional teams
- International collaborations (with approval)
**Teaming Partner Lists**: ARPA-E and other programs often provide teaming lists or events
## Submission Process
### Finding Opportunities
**Sources**:
- **EERE Exchange**: https://eere-exchange.energy.gov
- **ARPA-E**: https://arpa-e.energy.gov
- **Office of Science FOAs**: https://science.osti.gov/grants/Funding-Opportunities
- **Grants.gov**: Federal grants database
- **FedConnect**: Subscribe to FOA announcements
### Application Systems
**Varies by Office**:
- **EERE Exchange**: EERE programs
- **PAMS (Portfolio Analysis and Management System)**: Office of Science
- **ARPA-E eXCHANGE**: ARPA-E submissions
- **Grants.gov**: Some programs
**Registration Required** (can take 2-4 weeks):
- SAM.gov (System for Award Management)
- Grants.gov
- DOE program-specific systems
### Proposal Development Timeline
**Recommended Timeline**:
- **3-6 months before deadline**: Identify FOA, assemble team, contact lab partners
- **2-3 months**: Develop technical approach, secure commitments
- **1-2 months**: Draft proposal, prepare budget
- **2-4 weeks**: Internal review, revisions
- **1 week**: Final preparation, institutional approvals
- **48 hours early**: Submit (don't wait for deadline)
### Required Registrations
**Before First Submission**:
1. **SAM.gov**: System for Award Management (2-3 weeks)
2. **Grants.gov**: Account and authorization (1 week)
3. **FedConnect**: Optional, for notifications
4. **PAMS/EERE Exchange**: Program-specific (immediate)
**Institutional Requirements**:
- Authorized Organizational Representative (AOR)
- Institutional approvals
- Cost accounting systems
## Review and Award Process
### Timeline
**Varies by Program**:
- **Office of Science**: 3-6 months
- **ARPA-E**: 4-6 months (after full application invitation)
- **EERE**: 3-6 months
**Steps**:
1. Administrative compliance check
2. Peer review (external reviewers)
3. Program manager evaluation
4. Selection for award negotiation
5. Budget negotiation
6. Award issuance
### Reviewer Feedback
**Provided**:
- Reviewer comments (often anonymized)
- Strengths and weaknesses
- Scores by criterion
**Not Always Provided**: Some programs provide limited feedback
### Success Rates
**Varies Widely**:
- **Office of Science Early Career**: ~10-15%
- **ARPA-E OPEN**: ~2-5% (concept papers → awards)
- **EERE FOAs**: 10-30% (depends on program)
- **Office of Science FOAs**: 20-40% (varies)
## Writing Tips for Competitive DOE Proposals
### Do's
**Align with DOE mission** - Energy, environment, or national security relevance
**Emphasize impact** - How will this advance energy technology or science?
**Quantify outcomes** - Energy savings, efficiency gains, cost reductions
**Show pathway to deployment** - For applied programs, how will technology reach market?
**Leverage DOE capabilities** - National labs, user facilities, unique resources
**Include strong management plan** - Milestones, go/no-go, risk mitigation
**Demonstrate team qualifications** - Track record in relevant area
**Be specific about innovation** - What's new and why it matters
**Address technology readiness** - Current TRL and path forward
**Secure cost share commitments** - If required, get letters early
### Don'ts
**Don't ignore FOA requirements** - Each FOA is different, read carefully
**Don't underestimate timeline** - Allow time for registrations and approvals
**Don't forget cost share** - If required, must be documented
**Don't overlook lab partnerships** - Can strengthen proposal significantly
**Don't be vague about impact** - Need quantitative energy/economic metrics
**Don't ignore commercialization** - For applied programs, market path is critical
**Don't submit without institutional approval** - Need AOR sign-off
**Don't wait for deadline** - Systems crash, submit 48 hours early
**Don't propose basic science to ARPA-E** - Or applied research to Office of Science
**Don't forget TRL discussion** - Important for technology programs
### Common Mistakes
1. **Wrong Program**: Proposing to inappropriate office or program
2. **Insufficient Energy Relevance**: Not clearly tied to DOE mission
3. **Weak Commercialization Plan**: For ARPA-E and EERE, lack of market strategy
4. **Unrealistic Milestones**: Overly optimistic timelines
5. **Poor Budget Justification**: Budget doesn't align with technical plan
6. **Missing Cost Share**: If required, not documented properly
7. **Weak Team**: Insufficient expertise or track record
8. **Ignoring Competing Technologies**: Not addressing competitive landscape
## Recent DOE Priorities (2024-2025)
### Key Focus Areas
- **Clean Energy Transition**: Renewable energy, storage, grid modernization
- **Carbon Management**: Carbon capture, utilization, storage, removal
- **Critical Materials**: Supply chain security, recycling, substitutes
- **Advanced Manufacturing**: Energy-efficient processes, sustainable materials
- **Quantum Information Science**: Computing, sensing, communications
- **Fusion Energy**: Accelerating fusion development
- **Hydrogen Economy**: Production, storage, utilization
- **Nuclear Energy**: Advanced reactors, microreactors, fuel cycle
- **Climate Adaptation**: Climate modeling, resilience, impacts
- **Energy Equity**: Environmental justice, workforce development
### Major Initiatives
- **Energy Earthshots**: Ambitious R&D goals (Hydrogen Shot, Long Duration Storage, Carbon Negative, etc.)
- **Bipartisan Infrastructure Law**: $62B for DOE programs
- **Inflation Reduction Act**: Clean energy tax credits and programs
- **CHIPS and Science Act**: Microelectronics, quantum, clean energy manufacturing
## Resources
- **DOE Office of Science**: https://science.osti.gov
- **ARPA-E**: https://arpa-e.energy.gov
- **EERE**: https://www.energy.gov/eere
- **DOE National Laboratories**: https://www.energy.gov/national-laboratories
- **EERE Exchange**: https://eere-exchange.energy.gov
- **Grants.gov**: https://www.grants.gov
- **SAM.gov**: https://sam.gov
---
**Key Takeaway**: DOE proposals require strong alignment with energy and national security missions, clear pathway to impact (especially for applied programs), and often benefit from partnerships with national laboratories or industry. Cost sharing, technology readiness levels, and commercialization strategies are critical considerations for competitive proposals.

# NIH (National Institutes of Health) Grant Writing Guidelines
## Agency Overview
**Mission**: To seek fundamental knowledge about the nature and behavior of living systems and to apply that knowledge to enhance health, lengthen life, and reduce illness and disability
**Annual Budget**: ~$47 billion (largest biomedical research funder globally)
**Website**: https://www.nih.gov
**Key Characteristics**:
- 27 Institutes and Centers (ICs), each with specific research focus
- Supports biomedical and behavioral research
- Strong emphasis on rigor, reproducibility, and translation
- Clinical trials and human subjects research
- Patient-oriented and population health research
## NIH Institutes and Centers (Major ICs)
- **NCI** - National Cancer Institute
- **NHLBI** - National Heart, Lung, and Blood Institute
- **NIDDK** - National Institute of Diabetes and Digestive and Kidney Diseases
- **NIAID** - National Institute of Allergy and Infectious Diseases
- **NIGMS** - National Institute of General Medical Sciences
- **NINDS** - National Institute of Neurological Disorders and Stroke
- **NIMH** - National Institute of Mental Health
- **NICHD** - National Institute of Child Health and Human Development
- **NEI** - National Eye Institute
- **NIEHS** - National Institute of Environmental Health Sciences
- **NIA** - National Institute on Aging
- **NIAAA** - National Institute on Alcohol Abuse and Alcoholism
- **NIDA** - National Institute on Drug Abuse
- **NHGRI** - National Human Genome Research Institute
- **NCCIH** - National Center for Complementary and Integrative Health
**Plus**: NIBIB, NIDCD, NIDCR, NINR, FIC, NLM, and others
## Core Review Criteria
NIH proposals are evaluated using **scored criteria** (1-9 scale, 1 = exceptional, 9 = poor) and **additional review considerations** (not scored but discussed).
### Scored Criteria (Overall Impact Score)
#### 1. Significance
**Definition**: Does the project address an important problem or critical barrier to progress?
**Key Questions**:
- Will the project improve scientific knowledge, technical capability, or clinical practice?
- How will successful completion move the field forward?
- Does it address important scientific question or health need?
- Is there a clear rationale based on literature or preliminary data?
**What Reviewers Look For**:
- Clear statement of the problem and its importance
- Evidence that solving this problem will advance the field
- Strong conceptual framework
- Potential for broad impact (not just narrow niche)
- Alignment with NIH and Institute mission
**Writing Strategy**:
- Open with compelling statement of health burden or knowledge gap
- Cite epidemiological data, morbidity/mortality statistics
- Show that current approaches are insufficient
- Demonstrate how your work will make a difference
- Connect to clinical or translational outcomes when possible
#### 2. Investigator(s)
**Definition**: Are the investigators appropriately trained and well-suited to carry out this work?
**Key Questions**:
- Do they have appropriate expertise and track record?
- Is the proposed leadership approach appropriate for the project?
- Do they have prior experience in the research area?
- For Early Stage Investigators (ESI), is appropriate mentoring/support available?
**What Reviewers Look For**:
- Publications in the relevant area
- Preliminary data demonstrating capability
- Productivity and consistency
- Appropriate team composition
- For new investigators: strong mentorship and institutional support
- Career trajectory aligned with proposed work
**Writing Strategy**:
- Highlight most relevant publications (not total number)
- Show progression and focus in research program
- Demonstrate that you have necessary skills
- If new area, show collaborations or training
- For multi-PI, clearly define complementary roles
- Show stability and institutional commitment
#### 3. Innovation
**Definition**: Does the application challenge existing paradigms or develop new methodologies, technologies, or interventions?
**Key Questions**:
- Does the project employ novel concepts, approaches, or methodologies?
- Are the aims original and innovative?
- Does it challenge existing paradigms or address an innovative hypothesis?
- Does it refine, improve, or develop new instrumentation or methods?
**What Reviewers Look For**:
- Departure from standard approaches
- Novel application of methods to new problems
- Development of new technologies or tools
- Paradigm-shifting concepts
- Creative experimental design
- NOT just new to you, but new to the field
**Writing Strategy**:
- Explicitly state what is innovative
- Contrast with existing approaches and limitations
- Explain why innovation is necessary
- Provide preliminary data supporting feasibility
- Balance novelty with achievability
- Avoid over-claiming (incremental work ≠ transformative)
#### 4. Approach
**Definition**: Are the overall strategy, methodology, and analyses well-reasoned, appropriate, and rigorous?
**Key Questions**:
- Are the research design and methods appropriate for the proposed aims?
- Are potential problems, alternative strategies, and benchmarks for success presented?
- Is the timeline reasonable and is there adequate statistical power?
- Are the data management and analysis plans appropriate?
- Is rigor and transparency evident in the experimental design?
**What Reviewers Look For**:
- Detailed, specific methodology
- Appropriate experimental design (controls, replicates, randomization, blinding)
- Statistical justification (power calculations, sample size)
- Potential pitfalls identified with alternatives
- Feasibility demonstrated with preliminary data
- Logical flow from aims through methods to expected outcomes
- Rigor and reproducibility measures
**Writing Strategy**:
- Provide sufficient detail to judge feasibility
- Use subheadings for organization
- Include flowcharts or diagrams
- Address authentication of key biological resources
- Discuss biological variables (sex, age, etc.)
- Identify potential problems proactively
- Provide contingency plans
- Show that timeline is realistic
- Include preliminary data throughout
#### 5. Environment
**Definition**: Will the scientific environment contribute to the probability of success?
**Key Questions**:
- Do the proposed studies benefit from unique features of the scientific environment?
- Are the institutional support, equipment, and resources available?
- Are collaborative arrangements and contributions from colleagues appropriate?
- Is the environment conducive to the proposed research?
**What Reviewers Look For**:
- Access to necessary facilities (core facilities, equipment, patient populations)
- Institutional commitment and support
- Collaborative networks
- Track record of institutional productivity
- Training environment (for training grants)
- Sufficient space and resources
**Writing Strategy**:
- Highlight unique institutional resources
- Describe relevant core facilities with capabilities
- Show institutional investment in your research area
- Include letters documenting access to resources
- Describe collaborative environment
- For clinical research, show access to patient populations
### Additional Review Considerations (Not Scored)
These factors are discussed but do not contribute to the numerical score:
#### Protection of Human Subjects
- IRB approval status and process
- Risks to subjects justified by potential benefits
- Protections against risks adequate
- Informed consent process appropriate
- Data and safety monitoring plan (for trials)
- Inclusion of women, minorities, and children (see below)
#### Inclusion of Women, Minorities, and Children
- Adequate plan for inclusion of all groups
- Justification if any group excluded
- Statistical power adequate to detect differences
- Outreach and recruitment plans appropriate
#### Vertebrate Animals
- IACUC approval status
- Proposed procedures appropriate and humane
- Minimization of discomfort, distress, pain
- Euthanasia method appropriate
- Justification of species and numbers
#### Biohazards
- Appropriate safeguards and containment
- Training and expertise adequate
#### Resubmission (A1 applications)
- Are concerns from previous review adequately addressed?
- Has the application been substantially improved?
#### Budget and Period of Support
- Is budget reasonable for proposed work?
- Is timeline appropriate?
#### Resource Sharing Plans
- Data sharing plan adequate
- Model organism sharing plan (if applicable)
- Genomic data sharing plan (if applicable)
## Proposal Structure and Page Limits
### Specific Aims (1 page)
**Most important page of the entire application.** Reviewers often form their initial impression from this page alone.
**Structure** (see detailed template in `specific_aims_guide.md`):
**Opening Paragraph** (3-5 sentences):
- Long-term goal of your research program
- Health burden or knowledge gap
- Critical need that motivates the work
**Objective and Central Hypothesis** (1 paragraph):
- Objective of THIS grant
- Central hypothesis or research question
- Rationale (brief mention of preliminary data)
**Specific Aims** (2-4 aims):
- Each aim: 1 paragraph (half page max)
- Aim statement (1-2 sentences, starts with action verb)
- Working hypothesis or research question
- Rationale (why this aim, what preliminary data supports it)
- Approach summary (brief methods)
- Expected outcomes and interpretation
**Payoff Paragraph** (closing):
- Expected outcomes of the overall project
- How findings will advance the field
- Positive impact on health (if relevant)
- Next steps or future directions
**Critical Rules**:
- Exactly 1 page (0.5-inch margins, 11-point Arial or similar)
- Must stand alone (reviewers read this first)
- Clear, specific aims that are testable
- Aims should be independent but synergistic
- Avoid jargon (panel members may not be in your subfield)
- Every sentence must earn its place
### Research Strategy (12 pages for R01)
**Section A: Significance** (typically 2-3 pages)
**Purpose**: Convince reviewers the problem is important and worth solving
**Content**:
- State the problem and its importance (health burden, knowledge gap)
- Review current state of knowledge (focused literature review)
- Identify limitations of current approaches
- Explain conceptual advance your work will provide
- Describe potential impact on the field or health outcomes
- Explain alignment with NIH mission and Institute priorities
**Writing Tips**:
- Start broad (importance of the problem) then narrow (specific gap)
- Use epidemiological data (prevalence, mortality, costs)
- Cite key literature systematically
- Identify the specific barrier or gap your work addresses
- End with how your work will advance the field
**Section B: Innovation** (typically 1-2 pages)
**Purpose**: Articulate what is novel and transformative
**Content**:
- Describe innovative elements of the proposed research
- Explain novel concepts, approaches, or methodologies
- Contrast with existing approaches and their limitations
- Explain why innovation is necessary (not just different)
- Demonstrate that innovation is achievable (preliminary data)
**Writing Tips**:
- Be explicit about what is innovative (don't assume it's obvious)
- Distinguish incremental from transformative advances
- Provide evidence that novel approach can work
- Don't confuse "new to me" with "new to the field"
- Avoid over-claiming
**Section C: Approach** (typically 8-10 pages)
**Purpose**: Provide detailed research plan demonstrating feasibility
**Organization** (for each Specific Aim):
**Aim [Number]: [Aim Title]**
**Rationale and Preliminary Data**:
- Why this aim is important
- Preliminary results supporting feasibility
- Key figures and data
**Research Design**:
- Overall experimental design
- Subject/sample populations and numbers
- Randomization, blinding, controls
- Timeline for this aim
**Methods** (organized by sub-aim or experiment):
- Detailed procedures and protocols
- Materials, reagents, equipment
- Data collection procedures
- Biological variables considered
**Data Analysis**:
- Statistical approaches
- Sample size justification and power calculations
- How results will be interpreted
**Expected Outcomes**:
- What you expect to find
- How results will be interpreted
- Alternative outcomes and what they would mean
**Potential Pitfalls and Alternative Approaches**:
- What could go wrong (be proactive)
- Contingency plans
- Alternative strategies if initial approach doesn't work
**Timeline**:
- Sequence of activities for this aim
- Estimated completion time
**Writing Tips**:
- Use consistent organization across aims
- Include subheadings for clarity
- Integrate preliminary data throughout (not just at beginning)
- Provide figures, flowcharts, and tables
- Address rigor and reproducibility explicitly
- Justify choice of methods and approaches
- Be specific about numbers, timelines, and analysis
- Show that you've thought through the research process
**Rigor and Reproducibility** (addressed throughout Approach):
NIH requires explicit discussion of:
- **Scientific rigor in experimental design**: Controls, replicates, blinding, randomization
- **Authentication of key biological resources**: Cell lines, antibodies, organisms
- **Consideration of biological variables**: Sex, age, strain, etc.
- **Statistical power**: Adequate sample sizes
- **Transparency**: Data management, protocols, reporting
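The sample-size justifications and power calculations mentioned above usually rest on a standard formula. A minimal sketch using the normal approximation for a two-sample, two-sided comparison of means (exact t-based methods in statistical software give slightly larger n; the effect size is an illustrative assumption):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for a two-sample,
    two-sided comparison of means with standardized effect size d."""
    z = NormalDist()  # standard normal
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value for two-sided alpha
    z_beta = z.inv_cdf(power)           # quantile for the desired power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# To detect a medium effect (Cohen's d = 0.5) at alpha = 0.05 with 80% power:
print(n_per_group(0.5))  # 63 per group (exact t-test methods give ~64)
```

In a proposal, state the assumed effect size and its source (preliminary data or literature), since reviewers scrutinize that assumption more than the arithmetic.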
### Bibliography (no page limit)
- Include all references cited
- Use consistent format (PubMed citations preferred)
- Include DOI or PMID when available
### Protection of Human Subjects or Vertebrate Animals (varies)
**Human Subjects Section**:
- Risks to subjects
- Protection against risks
- Potential benefits
- Importance of knowledge to be gained
- Inclusion of women and minorities
- Inclusion of children
- Data and safety monitoring
**Vertebrate Animals Section**:
- Justification of species and numbers
- Minimization of pain and distress
- Euthanasia method
## Key NIH Application Types
### R01 - Research Project Grant
**Description**: Standard NIH grant mechanism for established investigators
**Characteristics**:
- **Budget**: Modular (up to $250K direct costs/year) or detailed budget
- **Duration**: Typically 3-5 years
- **Eligibility**: Any eligible institution
- **Preliminary data**: Usually required (shows feasibility)
- **Page limits**: 12 pages Research Strategy
**Typical Timeline**:
- Prepare: 2-6 months
- Review: ~9 months from submission
- Earliest start: 9-12 months after submission
**Success Rate**: ~20% overall (varies by Institute)
**When to Apply**: When you have preliminary data and clear research direction
### R21 - Exploratory/Developmental Research Grant
**Description**: Encourages new exploratory and developmental research
**Characteristics**:
- **Budget**: Up to $275K total direct costs over 2 years (no more than $200K in any single year)
- **Duration**: Maximum 2 years
- **Preliminary data**: Not required (though can strengthen)
- **Page limits**: 6 pages Research Strategy
- **No-cost extensions**: Not allowed
**Purpose**:
- Pilot or feasibility studies
- Testing new methods or technologies
- Secondary analysis of existing data
- Exploratory clinical studies
**When to Apply**: When you need pilot data before R01, or for high-risk ideas
### R03 - Small Grant Program
**Description**: Small-scale research projects
**Characteristics**:
- **Budget**: Up to $50K/year direct costs (up to $100K total)
- **Duration**: Maximum 2 years
- **Page limits**: 6 pages Research Strategy
**Purpose**: Limited scope projects, pilot studies, secondary data analysis
### K Awards - Career Development Awards
**Purpose**: Support career development of researchers
**Major K Award Types**:
**K99/R00 - Pathway to Independence**:
- Two phases: K99 (mentored, 1-2 years) → R00 (independent, up to 3 years)
- For postdocs transitioning to independence
- Provides protected time and research support
- Competitive (~15% funded)
**K08 - Mentored Clinical Scientist Award**:
- For clinicians (MD, DO, DDS, etc.)
- 3-5 years protected time for research training
- Requires mentoring team
- Up to $100K direct costs/year
**K23 - Mentored Patient-Oriented Research Career Development Award**:
- For patient-oriented research
- Similar structure to K08
**All K Awards Require**:
- Career development plan
- Research plan (6-12 pages)
- Mentoring plan and letters from mentors
- Training plan
- Institutional commitment (75% protected time typically)
### Other Common Mechanisms
**R15 (AREA)**: For primarily undergraduate institutions
**P01**: Multi-project program project grants (large collaborative)
**U01**: Cooperative agreement (NIH involvement in conduct)
**R34**: Clinical trial planning grant
**DP1/DP2**: NIH Director's Pioneer/New Innovator Awards (special)
## Budget Preparation
### Modular Budgets (R01s up to $250K direct/year)
**Characteristics**:
- Requested in $25K increments (modules)
- Maximum 10 modules ($250K) per year
- Detailed budget not required
- Budget justification: Narrative (Personnel, Consortium, Other)
- Years 2-5: Brief justification if >$125K or increase >25%
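The module arithmetic above can be sketched in a few lines (an illustrative helper only, not an NIH tool; the function name and error behavior are assumptions of this sketch):

```python
import math

MODULE = 25_000      # modular increment, in dollars
MAX_MODULES = 10     # $250K/year ceiling for modular budgets

def modules_requested(annual_direct_costs: float) -> int:
    """Round annual direct costs up to the next $25K module (hypothetical helper)."""
    n = math.ceil(annual_direct_costs / MODULE)
    if n > MAX_MODULES:
        raise ValueError("Exceeds the modular cap; a detailed budget is required")
    return n

# e.g., $210K in estimated direct costs rounds up to 9 modules ($225K requested)
print(modules_requested(210_000))  # 9
```

Note that budgets are requested in whole modules, so $210K of estimated costs becomes a $225K request.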
**Personnel Justification**:
- List all personnel with roles, effort (% calendar months)
- Typical: PI (2-3 months = 16-25%), postdoc (12 months), grad student, tech
- Justify effort for each person
- Note: Salary cap applies (~$221,900 for 2024)
**Consortium/Contractual Costs**:
- Consortium sites charge their own negotiated F&A rate (limited to 8% for foreign consortiums); consortium F&A does not count toward the modular direct-cost limit
**Other Costs**:
- Describe significant equipment, animals, patient costs, etc.
### Detailed Budgets (>$250K direct/year)
**Required Sections**:
- Personnel (with individual salary details)
- Equipment (≥$5,000 per item)
- Travel (domestic and foreign)
- Participant/Trainee Support Costs
- Other Direct Costs (materials, supplies, publications, consultants)
- Consortium/Contractual Costs (with detailed sub-budgets)
- Total Direct Costs
- Indirect Costs (F&A)
**Budget Justification**:
- Detailed narrative for each category
- Justify need for each item/person
- Explain calculations
### NIH Salary Cap
**Annual Update**: NIH sets maximum salary for grants
- 2024 Level: ~$221,900 (Executive Level II)
- Applies to all personnel
- Fringe benefits calculated on capped salary
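The cap works by substituting the capped amount for any higher institutional base salary before applying effort. A minimal sketch (the helper name is hypothetical; the cap value is the approximate 2024 Executive Level II figure cited above):

```python
def capped_salary_charge(institutional_salary: float, effort_fraction: float,
                         cap: float = 221_900) -> float:
    """Direct salary chargeable to an NIH grant: effort applied to the
    lesser of the institutional base salary and the NIH salary cap."""
    return effort_fraction * min(institutional_salary, cap)

# A PI earning $300K at 20% effort is charged against the cap, not base salary
print(capped_salary_charge(300_000, 0.20))  # 44380.0
```

The difference between the capped charge and the effort-weighted actual salary must be covered by the institution.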
### Allowable Costs
**Generally Allowed**:
- Salaries and wages
- Fringe benefits
- Equipment
- Supplies (consumables <$5,000)
- Travel (domestic and international)
- Consultant services
- Consortium/subaward costs
- Animal purchase and care
- Patient care costs (clinical trials)
- Alterations and renovations (with prior approval)
- Publication costs
**Generally Not Allowed** (without special justification):
- Office equipment (computers, printers, furniture)
- Administrative costs
- Tuition (except for K awards and training grants)
## Application Submission
### Deadlines
**Standard Dates** (most programs):
- February 5
- June 5
- October 5
**AIDS-Related Research**:
- January 7
- May 7
- September 7
**K Awards and Fellowship**: Different dates, typically 3 times/year
**Submission Time**: 5:00 PM local time of applicant organization
### Submission Systems
**eRA Commons**: Required for NIH submission
- Create account through institution
- Assign roles (PI, authorized organizational representative)
**ASSIST (Application Submission System & Interface for Submission Tracking)**:
- NIH's electronic submission system
- Create application, upload documents, submit
**Grants.gov**: Alternative submission route (not recommended)
### Just-in-Time Information
**After initial review** (if in fundable range), NIH requests:
- Other Support (updated)
- IRB/IACUC approval (or documentation that approval will be obtained)
- Vertebrate Animals/Human Subjects training certifications
**Timing**: Usually 6-9 months after submission
## Review Process
### Timeline
**Total Time**: ~9 months from submission to funding decision
**Stages**:
1. **Submission**: Deadline (Month 0)
2. **Referral**: Assignment to IC and study section (Month 1)
3. **Review**: Study section meeting (Months 3-4)
4. **Council**: Advisory council review (Months 6-7)
5. **Funding Decision**: Program officer and IC (Months 7-9)
### Study Sections
**Types**:
- **Standing Study Sections**: Permanent panels meeting 3x/year
- **Special Emphasis Panels (SEPs)**: Ad hoc panels for specific RFAs or topics
- **Scientific Review Groups (SRGs)**: Chartered study sections
**Process**:
- 3 assigned reviewers per application (prepare written critiques)
- ~15-25 applications discussed per study section
- ~50-100 applications assigned to each study section
**Participants**:
- Scientific Review Officer (SRO): NIH staff, manages process
- Reviewers: External scientists with expertise
- Grants management specialist
- Program officer (sometimes attends, doesn't vote)
### Scoring
**Preliminary Scoring** (before meeting):
- All panel members score 1-9 (1 = exceptional, 9 = poor)
- Applications in lower half typically "triaged" (not discussed)
- Top ~50% discussed at meeting
**Discussion** (at study section meeting):
- Assigned reviewers present their assessments
- Panel discusses strengths and weaknesses
- Open discussion among all panel members
- Questions about rigor, innovation, feasibility
**Final Scoring** (after discussion):
- All panel members score 1-9
- Scores averaged and multiplied by 10
- **Final Impact Score**: 10-90 (lower is better)
- 10-20: Exceptional
- 21-30: Outstanding
- 31-40: Excellent (often fundable)
- 41-50: Very good (may be fundable)
- 51+: Less competitive
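The score arithmetic above can be expressed directly (a simplified sketch; NIH rounds the mean of all voting members' scores times 10 to a whole number, and the function below is a hypothetical illustration of that rule):

```python
def impact_score(panel_scores: list[int]) -> int:
    """Final Impact Score: mean of panel members' 1-9 scores, times 10,
    rounded to the nearest integer (range 10-90; lower is better)."""
    assert all(1 <= s <= 9 for s in panel_scores), "scores must be 1-9"
    return round(10 * sum(panel_scores) / len(panel_scores))

# Six panel members scoring 3, 3, 4, 3, 2, 3 yield an Impact Score of 30
print(impact_score([3, 3, 4, 3, 2, 3]))  # 30
```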
**Individual Criterion Scores**: Also scored 1-9
- Significance
- Investigator(s)
- Innovation
- Approach
- Environment
### Percentile Ranking
**After all study sections meet**, most competing applications receive a percentile rank
- Computed relative to applications reviewed by the same study section, typically pooled over the current and two preceding review rounds
- Percentile is typically more important than the Impact Score for funding decisions
- Lower percentile = better (1st percentile = top 1%)
**Example**: The same percentile can lead to different outcomes at different Institutes:
- A 15th-percentile application at NIGMS would likely be funded
- The same percentile at NCI might fall outside the payline
- Funding depends on the competitiveness and available budget of the assigned IC
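As a rough illustration of percentile ranking (NIH's actual formula uses rank-based interpolation within the study section's review base; the function and the base scores below are hypothetical):

```python
def percentile_rank(score: int, base_scores: list[int]) -> float:
    """Approximate percentile: share of the comparison base whose impact
    score is as good or better (i.e., lower or equal). Simplified sketch."""
    better_or_equal = sum(1 for s in base_scores if s <= score)
    return 100 * better_or_equal / len(base_scores)

base = [15, 22, 28, 31, 35, 40, 44, 50, 57, 63]   # hypothetical review base
print(percentile_rank(31, base))  # 40.0
```

The key point survives the simplification: the same Impact Score can land at different percentiles depending on how the rest of the review base scored.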
### Summary Statement
**Received**: ~30 days after study section meeting
**Contents**:
- Overall Impact/Priority Score and Percentile
- Individual criterion scores
- Resume and Summary of Discussion
- Detailed critiques from 3 assigned reviewers
- Additional comments from other panel members
- Human Subjects, Animals, Biohazards reviews
**Interpreting**:
- Focus on consistent themes across reviewers
- Identify major vs. minor criticisms
- Note what reviewers found strong
- Use for resubmission planning
## Resubmission (A1 Applications)
### NIH Resubmission Policy
**One Resubmission Allowed**: Can resubmit once (A1) after initial review (A0)
- After A1 review, cannot resubmit again
- Must submit new application if A1 not funded
**No Limits on New Applications**: Can submit completely new application anytime
### Introduction to Resubmission (1 page)
**Required Section**: Separate 1-page introduction responding to previous review
**Structure**:
- **Header**: "INTRODUCTION TO RESUBMISSION"
- **Summary of Criticisms**: Brief overview of major criticisms
- **Response to Criticisms**: Point-by-point response with page references
- **Use bullet points** for clarity
**Example Format**:
```
INTRODUCTION TO RESUBMISSION
The previous review raised the following concerns:
1. Inadequate preliminary data demonstrating feasibility of Aim 2
2. Statistical power insufficient for Aim 3
3. Lack of detail about quality control procedures
We have addressed these concerns as follows:
1. Preliminary data for Aim 2 (Response, p. 8-9; Research Strategy, p. 18-20)
• Generated pilot data showing [specific result]
• Optimized protocol achieving [specific outcome]
• New Figure 3 demonstrates feasibility
2. Statistical power for Aim 3 (Research Strategy, p. 24-25)
• Increased sample size from n=15 to n=25 per group
• Updated power calculations show >90% power
• Budget adjusted accordingly
3. Quality control procedures (Research Strategy, p. 12, 19, 26)
• Added detailed QC protocols for each method
• Implemented validation criteria and acceptance thresholds
• Described authentication of key reagents
```
**Tips**:
- Be respectful and professional (avoid defensiveness)
- Address every major criticism explicitly
- Indicate where changes are in revised application
- Show substantial revision, not minor tweaks
- Acknowledge valid criticisms and explain how addressed
- If disagree with criticism, explain politely with evidence
### Resubmission Strategy
**Decision Tree**:
**Impact Score ≤40 (Percentile ≤20)**: Strong application, likely competitive
- Address specific criticisms
- Strengthen weak areas
- Add preliminary data if criticized
- Consider minor scope adjustments
**Impact Score 41-50 (Percentile 21-40)**: Moderate application, needs improvement
- Substantial revision needed
- May need new preliminary data
- Consider revising aims if criticized
- Strengthen innovation or significance
- May want to wait for new data before resubmitting
**Impact Score ≥51 (Percentile ≥41)**: Weak application, major revision needed
- Consider whether resubmission is worthwhile
- May be better to develop new application
- If resubmitting: major restructuring likely needed
- Gather substantial new preliminary data
- Consider changing scope or aims
**Common Resubmission Improvements**:
1. **Add preliminary data**: Especially for Aim 2 or 3 if criticized
2. **Clarify methods**: Provide more detail, address technical concerns
3. **Increase rigor**: Better controls, larger n, statistical justification
4. **Revise specific aims**: If fundamentally flawed
5. **Add collaborators**: If expertise questioned
6. **Strengthen significance**: Better literature review, clearer impact
7. **Refocus innovation**: Clarify what's novel and why it matters
**Timing**:
- The A1 must be submitted within 37 months of the original (A0) submission
- Use time wisely to generate new data
- Don't rush resubmission with minor changes
## NIH Funding Trends and Priorities (2024-2025)
### Current Priorities
- **Health Disparities and Health Equity**: Addressing disparities in disease burden
- **Alzheimer's Disease and Dementia**: Prevention, treatment, care
- **Substance Use and Mental Health**: Opioid crisis, addiction, mental health
- **Infectious Diseases**: Pandemic preparedness, antimicrobial resistance, vaccines
- **Cancer**: Cancer Moonshot initiatives
- **BRAIN Initiative**: Understanding the brain
- **All of Us Research Program**: Precision medicine
- **Climate Change and Health**: Environmental impacts on health
- **Artificial Intelligence**: AI for biomedical research and healthcare
### Success Rates by Career Stage
**Overall**: ~20% (varies by IC and mechanism)
**Established Investigators**: ~23%
**Early Stage Investigators (ESI)**: ~27% (higher due to ESI policy)
- ESI: Within 10 years of final degree, no prior R01-equivalent
**New Investigators**: ~24%
- New: No prior R01-equivalent (regardless of time since degree)
**Multiple PI**: ~18% (slightly lower than single PI)
### Paylines
**Varies by IC**: Each Institute sets own funding priorities
**Example Paylines (FY2023)**:
- NIGMS: ~23rd percentile
- NCI: ~12th percentile (highly competitive)
- NHLBI: ~11th percentile
- NIAID: ~15th percentile
- NIMH: ~12th percentile
**ESI Boost**: Most ICs fund ESIs at higher percentile than established investigators
**Check IC Websites**: Paylines and funding policies updated annually
## Tips for Competitive NIH Applications
### Do's
- ✅ **Start with the Specific Aims page** - Most important page, revise extensively
- ✅ **Include substantial preliminary data** - Demonstrate feasibility (esp. for R01)
- ✅ **Be explicit about innovation** - Don't assume reviewers will recognize it
- ✅ **Address rigor and reproducibility** - Controls, power, authentication, variables
- ✅ **Provide detailed methods** - Enough detail to assess feasibility
- ✅ **Identify pitfalls proactively** - Show you've thought through challenges
- ✅ **Use figures and diagrams** - Clarify complex ideas, show preliminary data
- ✅ **Connect to health** - NIH's mission is health-related
- ✅ **Write clearly** - Panel members may not be in your exact subfield
- ✅ **Get external review** - Mock review from colleagues and mentors
### Don'ts
- ❌ **Don't exceed page limits** - Automatic rejection
- ❌ **Don't be vague about methods** - "Standard protocols" is insufficient
- ❌ **Don't ignore sample size** - Power calculations required
- ❌ **Don't overpromise** - Be realistic about what's achievable
- ❌ **Don't forget human subjects/animals sections** - Common mistake
- ❌ **Don't submit an R01 without preliminary data** - This rarely succeeds
- ❌ **Don't assume reviewers know your work** - Provide context
- ❌ **Don't ignore sex as a biological variable** - NIH policy requires consideration
- ❌ **Don't submit at the deadline** - Technical issues happen frequently
- ❌ **Don't resubmit without substantial changes** - Minor revisions rarely succeed
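The power calculations reviewers expect can be approximated with a standard normal-approximation formula for a two-sided, two-sample comparison of means (a sketch under that assumption; exact calculations would use the noncentral t distribution, and the function name is hypothetical):

```python
import math
from statistics import NormalDist

def n_per_group(effect_size: float, alpha: float = 0.05, power: float = 0.9) -> int:
    """Normal-approximation sample size per group to detect a standardized
    mean difference (Cohen's d) in a two-sided, two-sample test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # critical value for alpha
    z_beta = NormalDist().inv_cdf(power)            # quantile for desired power
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# Detecting a large effect (d = 0.8) with 90% power needs ~33 subjects/group
print(n_per_group(0.8))  # 33
```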
## NIH Resources
- **NIH Homepage**: https://www.nih.gov
- **NIH RePORTER (funded grants)**: https://reporter.nih.gov
- **Grants & Funding**: https://grants.nih.gov
- **eRA Commons**: https://commons.era.nih.gov
- **ASSIST**: https://public.era.nih.gov/assist
- **Application Forms and Instructions**: https://grants.nih.gov/grants/how-to-apply-application-guide.html
- **NIH Data Sharing Policy**: https://sharing.nih.gov
- **Rigor and Reproducibility**: https://grants.nih.gov/reproducibility/index.htm
---
**Key Takeaway**: NIH applications succeed through clear articulation of an important health-related problem, preliminary data demonstrating feasibility, detailed rigorous approach, and innovative methods. The Specific Aims page is the most critical component—invest time in crafting a compelling narrative that immediately conveys significance and feasibility.

# NSF (National Science Foundation) Grant Writing Guidelines
## Agency Overview
**Mission**: To promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense
**Annual Budget**: ~$9-10 billion
**Website**: https://www.nsf.gov
**Key Characteristics**:
- Supports all fields of fundamental science and engineering (except medical sciences)
- Emphasis on education and workforce development
- Strong commitment to diversity, equity, and inclusion
- Promotes open science and data sharing
- Collaborative research across institutions encouraged
## NSF Directorates
1. **BIO** - Biological Sciences
2. **CISE** - Computer and Information Science and Engineering
3. **EDU** - STEM Education (formerly EHR, Education and Human Resources)
4. **ENG** - Engineering
5. **GEO** - Geosciences
6. **MPS** - Mathematical and Physical Sciences
7. **SBE** - Social, Behavioral, and Economic Sciences
8. **TIP** - Technology, Innovation and Partnerships (established 2022, NSF's newest directorate)
9. **OPP** - Office of Polar Programs
10. **OISE** - Office of International Science and Engineering
## Core Review Criteria
NSF uses two equally weighted criteria for all proposals:
### Intellectual Merit
**Definition**: The potential to advance knowledge
**Evaluation Questions**:
- How important is the proposed activity to advancing knowledge and understanding within its own field or across different fields?
- How well-qualified is the proposer (individual or team) to conduct the project?
- To what extent does the proposed activity suggest and explore creative, original, or potentially transformative concepts?
- How well-conceived and organized is the proposed activity?
- Is there sufficient access to resources?
**Writing Strategy**:
- Lead with the research question and its importance
- Demonstrate deep knowledge of the field
- Articulate the knowledge gap clearly
- Present innovative approach to address the gap
- Show preliminary results or proof-of-concept
- Demonstrate team qualifications
- Present feasible, well-organized plan
### Broader Impacts
**Definition**: The potential to benefit society and contribute to the achievement of specific, desired societal outcomes
**Evaluation Questions**:
- What is the potential for the proposed activity to:
- Benefit society or advance desired societal outcomes?
- Broaden participation of underrepresented groups?
- Enhance infrastructure for research and education?
- Enhance scientific and technological understanding?
- Foster partnerships between academia, industry, and others?
**Critical Point**: Broader Impacts are NOT an afterthought. They carry equal weight with Intellectual Merit and must be substantive, specific, and measurable.
**Five Pillars of Broader Impacts** (address at least one substantively):
1. **Advance discovery and understanding while promoting teaching, training, and learning**
- Integrate research into courses
- Develop new curriculum materials
- Train undergraduate, graduate, and postdoctoral researchers
- Provide research experiences for students
- Create educational resources (videos, software, databases)
- Offer workshops or training programs
*Example*: "We will develop a 10-module online course on computational genomics, incorporating data from this project, to be offered to 500+ students annually across 15 partner institutions. Course materials will be open-access and include Jupyter notebooks for hands-on analysis."
2. **Broaden participation of underrepresented groups (in STEM)**
- Partner with minority-serving institutions (HBCUs, HSIs, TCUs)
- Recruit students from underrepresented groups
- Provide mentoring and support programs
- Address systemic barriers to participation
- Create inclusive research environments
- Engage underrepresented communities in research
*Example*: "We will establish a summer research program for 8 undergraduates annually from 4 partner HBCUs, providing stipends, housing, and year-round mentoring. Program will include professional development workshops and pathways to graduate school."
3. **Enhance infrastructure for research and education**
- Develop shared instrumentation or facilities
- Create cyberinfrastructure, software, or databases
- Build collaborative networks
- Establish living stock centers or repositories
- Develop standards or protocols
- Create open-source tools
*Example*: "We will develop and maintain an open-source software platform for analyzing spatial transcriptomics data, with comprehensive documentation, tutorials, and user support forum. Software will be deposited on GitHub and indexed in bio.tools."
4. **Disseminate to enhance scientific and technological understanding**
- Public outreach and science communication
- Engagement with K-12 students and teachers
- Museum exhibits or science festivals
- Media engagement (podcasts, videos, articles)
- Policy briefs for decision-makers
- Community science projects
*Example*: "We will partner with the City Science Museum to create a hands-on exhibit on AI and climate modeling, reaching 50,000+ annual visitors. Exhibit will include interactive simulations and bilingual materials. We will also host quarterly 'Science Saturdays' for local K-12 students."
5. **Benefit society**
- Economic development and competitiveness
- Health and quality of life improvements
- Environmental sustainability
- National security
- Societal well-being
- Workforce development
*Example*: "Our drought prediction models will be integrated into USDA's decision support system, benefiting 15,000+ farmers in the Southwest. We will work with extension agents to provide training and accessible interfaces for non-technical users."
**Common Broader Impacts Mistakes**:
- ❌ Vague statements: "We will train graduate students" (everyone does this)
- ❌ No plan: Aspirational goals without concrete activities
- ❌ No metrics: No way to assess success
- ❌ Tacked on: Not integrated with research plan
- ❌ Unrealistic: Grand claims without resources or expertise
- ✅ Specific and measurable: Clear activities, timelines, and assessment
## Proposal Sections and Page Limits
### Project Summary (1 page)
**Required Structure** (NSF mandates three labeled sections):
**Overview** (first paragraph):
- Research question and approach in accessible language
- Suitable for public dissemination
**Intellectual Merit**:
- Potential to advance knowledge
- Innovative aspects
- Qualifications of team
**Broader Impacts**:
- Societal benefits and specific activities
- How success will be measured
**Formatting**: Must use section headings exactly as shown above
### Project Description (15 pages for most programs)
**No required structure, but typical organization**:
1. **Introduction / Background** (1-2 pages)
- Research question and significance
- Current state of knowledge
- Knowledge gaps
- Preliminary results (if applicable)
2. **Research Objectives** (0.5-1 page)
- Specific, measurable goals
- Hypotheses or research questions
3. **Research Plan / Methodology** (8-10 pages)
- Detailed approach for each objective
- Methods and techniques
- Timeline and milestones
- Expected outcomes
- Potential challenges and alternatives
4. **Broader Impacts** (1-2 pages)
- Can be integrated throughout OR separate section
- Specific activities and timelines
- Assessment and evaluation plan
5. **Results from Prior NSF Support** (if applicable, up to 5 pages)
- Required if PI or co-PI has had NSF award in past 5 years
- Intellectual merit of prior work
- Broader impacts of prior work
- Publications and products
**Formatting Requirements**:
- Font: 11-point or larger (Times Roman, Arial, Palatino, Computer Modern)
- Margins: 1 inch all sides
- Line spacing: No more than 6 lines per inch
- Page size: 8.5 x 11 inches
- No smaller fonts in figures (must be legible)
### References Cited (no page limit)
- Each reference must include:
- Names of all authors
- Article and journal title
- Volume, page numbers, year
- DOI if available
- Use consistent format (doesn't have to match specific style)
- Sufficient information for reviewers to locate references
### Biographical Sketch (3 pages max per person)
**Required NSF Format** (as of 2023 PAPPG):
**Section A: Professional Preparation**
- Undergraduate, graduate, postdoctoral institutions
- Majors and degrees with years
**Section B: Appointments and Positions**
- Last 5 positions, current first
**Section C: Products** (up to 5 most relevant to proposal)
- Publications, datasets, software, patents, etc.
- Can include products in preparation
**Section D: Synergistic Activities** (up to 5)
- Service, teaching, mentoring, outreach
- Demonstrates broader engagement beyond research
### Current and Pending Support (no page limit)
- All current and pending support for PI and co-PIs
- Include project/proposal title, source, award amount, dates
- Describe overlap with proposed project (if any)
- Must be updated until award/decline
### Facilities, Equipment, and Other Resources (no page limit)
- Describe available facilities (labs, computational, libraries)
- Major equipment accessible to project
- Other resources (personnel, core facilities, partnerships)
- Demonstrate institutional commitment
### Data Management and Sharing Plan (2 pages max)
**Required for all proposals** (as of 2023 PAPPG)
**Must address**:
1. **Types of data**: What data will be generated?
2. **Standards**: Formats, metadata, standards for data and metadata
3. **Access**: How and when will data be shared?
4. **Reuse**: Who can access and under what conditions?
5. **Repository**: Where will data be archived long-term?
6. **Protection**: Privacy, confidentiality, intellectual property considerations
**NSF Expectations**:
- Data should be made publicly available in a timely manner
- Use discipline-specific repositories when available
- Justify any restrictions on data sharing
- Plan for data preservation beyond project period
### Postdoctoral Researcher Mentoring Plan (1 page max)
**Required if funding postdocs**
**Must address**:
- Career development objectives
- Mentoring activities (research, teaching, professional skills)
- Metrics for success
- Mentoring plan should be specific, not generic
## Special NSF Proposal Types
### CAREER (Faculty Early Career Development Program)
**Eligibility**: Untenured, tenure-track (or equivalent) faculty holding a doctoral degree; a PI may compete in at most three CAREER competitions
**Requirements**:
- Integration of research and education
- Demonstrate potential for leadership
- Department chair letter required
- 5-year project plan
- Typical budget: $400,000-$500,000
**Key Elements**:
- Ambitious research plan
- Innovative educational component
- Strong integration (not just parallel tracks)
- Path to independence and leadership
- Institutional commitment
**Review Criteria**: Same two criteria (Intellectual Merit, Broader Impacts) but with emphasis on:
- Integration of research and education
- Innovative educational component
- Potential for leadership in field
**Common CAREER Mistakes**:
- Education component feels tacked on
- Overly ambitious research plan
- Weak integration between research and education
- Generic mentoring or teaching plans
- Insufficient preliminary data
### Collaborative Research
**Structure**: Multiple proposals submitted separately from different institutions, reviewed as a single project
**Requirements**:
- Lead institution designated
- All proposals must have identical titles (except institution name)
- Project descriptions should be substantially similar
- Clear division of labor
- Coordination plan
**Budget**: Each institution submits own budget for their portion
**Review**: Reviewed together as single integrated project
**Benefits**: Brings together complementary expertise and resources
### RAPID (Rapid Response Research)
**Purpose**: Support time-sensitive research opportunities
**Examples**:
- Natural disasters
- Disease outbreaks
- Unique astronomical events
- Rare opportunities for data collection
**Requirements**:
- Urgent need justification
- Up to $200,000
- Up to 1 year duration
- Simplified review process (program officer discretion)
- No preliminary data required
**Submission**: Contact program officer first, then submit proposal
### EAGER (Early-concept Grants for Exploratory Research)
**Purpose**: Support exploratory work on untested, but potentially transformative, ideas
**Requirements**:
- High-risk, high-reward research
- Radically different approaches
- Up to $300,000
- Up to 2 years
- Program officer approval required before submission
- No panel review (program officer decision)
**Key**: Must be truly exploratory and high-risk, not incremental
## Budget Considerations
### Allowable Costs
**Personnel**:
- Senior personnel: Up to 2 months (summer salary) for 9-month faculty
- Postdoctoral scholars: Full salary and benefits
- Graduate students: Stipend plus tuition remission (tuition is typically budgeted as a direct cost and excluded from the MTDC base)
- Undergraduate students: Hourly or stipend
- Technical and administrative staff
**Fringe Benefits**: Follow institutional rates
**Equipment**: Items ≥$5,000 per unit
- Must be justified
- Shared equipment requires letters from collaborators
**Travel**:
- Domestic and international scientific meetings
- Collaboration and fieldwork
- Justification required
**Participant Support Costs**: For workshops, training, conferences
- Stipends, travel, subsistence for participants
- Not subject to indirect costs
**Other Direct Costs**:
- Publication costs
- Consulting services
- Computer services
- Materials and supplies
- Subawards to collaborating institutions
**Indirect Costs (F&A)**: Institutional negotiated rate applies to modified total direct costs (MTDC)
- MTDC excludes equipment, capital expenditures, participant support costs, tuition remission, and the portion of each subaward beyond its first $25,000
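A minimal sketch of the MTDC base and the resulting indirect charge (the budget field names and the 55% rate are hypothetical assumptions of this example, not NSF values):

```python
def mtdc_base(direct_costs):
    """Modified Total Direct Costs: total direct costs minus equipment,
    participant support, tuition, and the portion of each subaward
    beyond its first $25K. Simplified sketch with hypothetical fields."""
    excluded = (direct_costs.get("equipment", 0)
                + direct_costs.get("participant_support", 0)
                + direct_costs.get("tuition", 0))
    sub_excess = sum(max(0, amt - 25_000)
                     for amt in direct_costs.get("subawards", []))
    total = (sum(v for k, v in direct_costs.items() if k != "subawards")
             + sum(direct_costs.get("subawards", [])))
    return total - excluded - sub_excess

budget = {"personnel": 180_000, "equipment": 30_000,
          "participant_support": 10_000, "supplies": 20_000,
          "subawards": [60_000]}
base = mtdc_base(budget)
print(base)                 # 225000
print(round(base * 0.55))   # indirect charge at a hypothetical 55% rate: 123750
```

Note that only the first $25K of the $60K subaward enters the base, which is why the $300K total direct budget yields a $225K MTDC base.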
### Cost Sharing
**NSF Policy**: Cost sharing is not required and should not be voluntary
**Exceptions**: Some programs require cost sharing (check program solicitation)
**When Included**: Must be documented, verifiable, auditable, and necessary for project
## Submission and Review Process
### Submission Deadlines
**Varies by program**:
- Some programs have specific deadlines (e.g., twice per year)
- Some programs accept proposals anytime (check with program officer)
- CAREER: July deadline (directorate-specific)
**Submission Windows**: NSF deadlines are typically 5 PM submitter's local time
### Submission Portal
**Research.gov** or **Grants.gov**: NSF accepts both
**Process**:
1. Institutional authorization required
2. Upload all required documents
3. Verify PDF compilation
4. Submit (aim for 48 hours early)
5. Receive confirmation and proposal number
### Review Process
**Timeline**: Typically 6 months from submission to decision
**Steps**:
1. **Administrative Review**: NSF checks compliance (1-2 weeks)
2. **Program Officer Assignment**: Assigned to appropriate program (1-2 weeks)
3. **Reviewer Selection**: Panel and/or ad hoc reviewers identified (2-4 weeks)
4. **Review**: Reviewers assess proposals (4-8 weeks)
5. **Panel Discussion**: Panel meets (virtual or in-person) to discuss proposals (1 week)
6. **Program Officer Recommendation**: Based on reviews and panel discussion (2-4 weeks)
7. **Division/Directorate Approval**: Final decision (2-4 weeks)
**Review Formats**:
- **Panel Review**: 10-20 proposals discussed at panel meeting
- **Ad hoc Review**: External reviewers submit written reviews
- **Hybrid**: Combination of panel and ad hoc reviews
**Number of Reviewers**: Typically 3-5 reviewers per proposal
### Review Outcomes
**Possible Decisions**:
- **Funded**: Congratulations! Award forthcoming
- **Declined**: Not recommended for funding
- **Returned Without Review**: Non-compliant with requirements
**Feedback**: Panel summary and individual reviews provided regardless of outcome
**Success Rates**: Vary by program, typically 15-30%
## Communicating with Program Officers
### When to Contact
**Appropriate**:
- Before submission: Discuss fit with program, feasibility of idea
- After reviews: Discuss feedback, resubmission strategy
- During project: Report significant changes, request no-cost extensions
**How to Contact**:
- Email program officer (contact info in program solicitation)
- Request 15-30 minute phone call
- Prepare concise summary of research idea (1 page)
### What to Ask
**Good Questions**:
- Is my research appropriate for this program?
- Are there upcoming solicitations or special initiatives?
- What are key areas of emphasis for the program?
- Is the scope and budget appropriate?
- After reviews: What are key issues to address in resubmission?
**Avoid**:
- Asking for guarantee of funding
- Arguing with review outcome
- Inappropriate requests for information about reviewers
## Resubmission Strategy
### NSF Resubmission Policies
**No Formal Resubmission Category**: NSF treats resubmissions as new proposals
**Can Resubmit**:
- To same program (after addressing reviews)
- To different program (if better fit)
- After substantial revision
**No Introduction Section**: Unlike NIH, NSF proposals do not include a formal section responding to prior reviews
**Strategy**:
- Carefully review panel summary and individual reviews
- Address all major criticisms
- Strengthen weak areas (preliminary data, broader impacts, methods)
- Consider discussing with program officer
- May want to wait for next funding cycle to gather more data
**Tracking**: Resubmitted proposals are sometimes assigned to the same reviewers as the original submission
## Recent NSF Policy Updates
### 2023-2024 Changes
1. **Data Management and Sharing Plan**: Now required for all proposals (2 pages max)
2. **Biographical Sketch Format**: Updated to include "Products" instead of "Publications"
3. **Open Science**: Increased emphasis on open-access publications and data
4. **Plan for Dissemination**: Some programs require explicit dissemination plans
5. **Mentoring Plans**: Enhanced requirements for postdoc mentoring plans
### NSF Priorities (2024-2025)
- **Climate and Clean Energy**: Climate change mitigation and adaptation
- **Quantum Information Science**: Quantum computing, sensing, networking
- **AI and Machine Learning**: Trustworthy AI, AI for science
- **Biotechnology**: Synthetic biology, bioengineering
- **Microelectronics**: Semiconductor research and workforce
- **STEM Education**: Broadening participation, innovative pedagogy
- **Convergence Accelerators**: Use-inspired research with pathway to impact
## NSF Big Ideas and Special Initiatives
### NSF "Big Ideas"
1. **Harnessing the Data Revolution (HDR)**
2. **The Future of Work at the Human-Technology Frontier**
3. **Navigating the New Arctic**
4. **Windows on the Universe**
5. **The Quantum Leap**
6. **Understanding the Rules of Life**
7. **Mid-scale Research Infrastructure**
### Major NSF Initiatives
- **National AI Research Institutes**: $20M over 5 years per institute
- **Science and Technology Centers (STCs)**: Large-scale collaborative centers
- **Engineering Research Centers (ERCs)**: Engineering innovation ecosystems
- **Materials Research Science and Engineering Centers (MRSECs)**: Materials research
- **NSF Graduate Research Fellowship Program (GRFP)**: Student fellowships
## Tips for Competitive NSF Proposals
### Do's
**Start with specific aims/objectives** - Crystal clear research goals
**Make broader impacts substantive** - Specific activities, not platitudes
**Use figures effectively** - Conceptual diagrams, preliminary data, timelines
**Be realistic about scope** - Achievable within 3-5 years
**Address both review criteria explicitly** - Don't make reviewers search
**Get external feedback** - Mock review before submission
**Follow formatting requirements exactly** - Non-compliant proposals are returned without review
**Explain jargon and acronyms** - Panel members may not be in your subfield
**Integrate research and education** - Show natural connections
**Demonstrate team qualifications** - Track record in proposed area
### Don'ts
**Don't exceed page limits** - Automatic return without review
**Don't use smaller fonts in figures** - Must be legible
**Don't make broader impacts generic** - "Train students" is not enough
**Don't ignore prior NSF support** - Must report if you've had NSF funding
**Don't be overly ambitious** - Reviewers will see through unrealistic plans
**Don't skip data management plan** - Required for all proposals
**Don't forget biosketches for all personnel** - Common mistake
**Don't submit at deadline** - Technical issues happen
**Don't ignore program solicitation** - Requirements vary by program
**Don't assume reviewers know your work** - Provide context
## Resources and Links
- **NSF Homepage**: https://www.nsf.gov
- **Award Search**: https://www.nsf.gov/awardsearch/
- **Proposal & Award Policies & Procedures Guide (PAPPG)**: https://www.nsf.gov/publications/pub_summ.jsp?ods_key=pappg
- **FastLane**: https://www.fastlane.nsf.gov/ (retired; proposal submission has moved to Research.gov)
- **Research.gov**: https://www.research.gov/
- **Broader Impacts Resources**: https://www.nsf.gov/od/oia/special/broaderimpacts/
- **NSF Funding Statistics**: https://www.nsf.gov/statistics/
---
**Key Takeaway**: NSF values both scientific excellence (Intellectual Merit) and societal benefit (Broader Impacts) equally. Successful proposals demonstrate innovative, feasible research that advances knowledge while contributing to education, diversity, infrastructure, or societal well-being in specific, measurable ways.

# NIH Specific Aims Page: The Complete Guide
## Overview
The **Specific Aims page** is the most important page of your entire NIH grant application. It's the first thing reviewers read, often determines their initial impression, and may be the only page read by some panel members before scoring begins.
**Length**: Exactly 1 page
**Margins**: 0.5 inches (all sides)
**Font**: 11-point Arial, Helvetica, or similar (no smaller)
**Line spacing**: No more than six lines of text per vertical inch
**Purpose**:
- Communicate your research vision clearly and compellingly
- Establish significance and innovation
- Demonstrate feasibility
- Show that you can accomplish meaningful work in the proposed timeframe
- Make reviewers excited to fund your work
## Anatomy of a Specific Aims Page
### Essential Components (in order)
1. **Opening Hook** (2-4 sentences)
2. **Gap/Problem Statement** (2-4 sentences)
3. **Long-Term Goal** (1 sentence)
4. **Objective** (1-2 sentences)
5. **Central Hypothesis** (1 sentence) [or Research Questions]
6. **Rationale** (2-3 sentences with preliminary data mention)
7. **Specific Aims** (2-4 aims, ~½ page total)
8. **Expected Outcomes and Impact** (2-4 sentences)
## Detailed Structure
### Opening Paragraph: The Hook
**Purpose**: Establish importance and grab attention
**What to include**:
- Broad context (disease burden, biological importance, technological need)
- Epidemiological data or statistics that establish scale
- Why this problem matters for health or science
- Create urgency
**Length**: 2-4 sentences
**Writing tips**:
- Start strong with compelling statement
- Use concrete numbers (prevalence, mortality, costs)
- Avoid jargon in first sentence
- Make it accessible to non-specialists on panel
**Examples**:
*Clinical Example*:
"Pancreatic ductal adenocarcinoma (PDAC) is the third leading cause of cancer death in the United States, with a devastating 5-year survival rate of only 11%. Despite decades of research, therapeutic options remain limited, and most patients present with advanced, unresectable disease. The lack of effective early detection methods and targeted therapies represents a critical unmet medical need affecting over 62,000 Americans diagnosed annually."
*Basic Science Example*:
"Mitochondrial dysfunction is a hallmark of aging and age-related diseases, yet the mechanisms linking mitochondrial decline to cellular senescence remain poorly understood. Emerging evidence suggests that mitochondrial-nuclear communication pathways play a central role in longevity determination across species, from yeast to mammals. Understanding how cells sense and respond to mitochondrial stress could reveal new therapeutic targets for age-related diseases affecting millions worldwide."
### Second Paragraph: Gap and Context
**Purpose**: Define what's known, what's unknown, and why it matters
**What to include**:
- Current state of knowledge (brief literature context)
- Specific gap or barrier to progress
- Why this gap is critical to address
- Why current approaches are insufficient
**Length**: 3-5 sentences
**Structure**:
1. What we know (1-2 sentences)
2. What we don't know / what's limiting progress (1-2 sentences)
3. Why this gap matters (1 sentence)
**Examples**:
"Prior studies have identified numerous genetic mutations associated with PDAC development, including KRAS, TP53, SMAD4, and CDKN2A. However, the tumor microenvironment (TME), comprising immune cells, fibroblasts, and extracellular matrix, is increasingly recognized as a critical determinant of therapeutic resistance. Current models fail to recapitulate the complex TME architecture and cell-cell interactions that drive therapy resistance in vivo, limiting our ability to develop effective treatments. Understanding how the TME protects tumor cells from chemotherapy is essential for designing combination therapies that overcome resistance."
### Third Paragraph: Long-Term Goal, Objective, Hypothesis, Rationale
**Purpose**: Set up your specific approach and justification
**Structure**:
**Long-Term Goal** (1 sentence):
- Your overarching research program direction
- Broader than this specific proposal
- Provides context for this work
*Example*: "The long-term goal of our research is to elucidate the molecular mechanisms by which the tumor microenvironment promotes therapeutic resistance in pancreatic cancer."
**Objective** (1-2 sentences):
- Specific objective of THIS grant
- What you will accomplish in 3-5 years
- More focused than long-term goal
*Example*: "The objective of this application is to define the role of cancer-associated fibroblasts (CAFs) in mediating gemcitabine resistance and to develop combination therapies targeting CAF-tumor interactions."
**Central Hypothesis** (1 sentence):
- Testable prediction
- Should unify the specific aims
- Based on preliminary data or logical reasoning
- Clear and specific
*Example*: "Our central hypothesis is that CAF-secreted factors activate protective autophagy in tumor cells, conferring resistance to gemcitabine, and that dual inhibition of CAF signaling and autophagy will restore drug sensitivity."
**Alternative: Research Questions** (if hypothesis-testing isn't appropriate):
- 2-3 focused questions
- Should correspond to specific aims
*Example*: "This project will address the following questions: (1) What factors secreted by CAFs promote tumor cell survival during chemotherapy? (2) How do tumor cells integrate CAF signals to activate protective responses? (3) Can targeting CAF-tumor interactions enhance therapeutic efficacy in preclinical models?"
**Rationale** (2-3 sentences):
- Why you think the hypothesis is true
- Mention key preliminary data (very briefly)
- Logical basis for your approach
- Why this approach will work
*Example*: "This hypothesis is based on our preliminary data showing that CAF-conditioned medium protects tumor cells from gemcitabine-induced apoptosis by 60% (Fig. 1), and that this protection is blocked by autophagy inhibitors (Fig. 2). Proteomic analysis of CAF secretomes identified 15 candidate factors enriched in drug-resistant contexts (Table 1). These findings suggest a targetable pathway linking CAF signaling to tumor cell survival that could be exploited therapeutically."
### Specific Aims (Main Section)
**How many aims**: 2-4 aims (3 is most common for R01)
- **Too few (1)**: Insufficient work, appears risky
- **Just right (2-3)**: Focused, achievable, synergistic
- **Too many (4+)**: Overly ambitious, unlikely to complete
**Structure for each aim**:
1. **Aim Statement** (1-2 sentences, bold or underlined)
2. **Rationale and Background** (1-3 sentences)
3. **Working Hypothesis** (1 sentence, if applicable)
4. **Approach Summary** (2-4 sentences)
5. **Expected Outcomes and Interpretation** (1-2 sentences)
**Length per aim**: ~4-6 sentences (¼ to ⅓ page)
**Relationships between aims**:
- **Independent**: Failure of one aim doesn't doom the others
- **Synergistic**: Aims build on each other or address complementary questions
- **Progressive**: Aim 1 enables Aim 2, Aim 2 enables Aim 3 (be careful—creates risk)
#### Example Aim Structure:
**Aim 1: Identify CAF-secreted factors that mediate gemcitabine resistance.**
*Rationale*: CAF-conditioned medium confers significant protection against gemcitabine (Fig. 1), suggesting secreted factors are responsible. We have identified 15 candidate proteins enriched in CAF secretomes from resistant versus sensitive contexts (Table 1).
*Working Hypothesis*: CAFs secrete specific growth factors and cytokines (including IL-6, CXCL12, and HGF) that activate pro-survival pathways in tumor cells.
*Approach*: We will (1) validate candidate factors using neutralizing antibodies in co-culture assays, (2) measure activation of downstream signaling pathways (STAT3, PI3K/AKT, MAPK) in tumor cells, and (3) perform CRISPR screens in CAFs to identify factors required for resistance phenotype. We will use patient-derived CAFs and tumor cells to ensure clinical relevance.
*Expected Outcomes*: We expect to identify 3-5 CAF-secreted factors sufficient and necessary for gemcitabine resistance, and define their signaling mechanisms. These will serve as therapeutic targets for Aims 2-3.
---
**Aim 2: Determine the mechanisms by which CAF signals activate protective autophagy in tumor cells.**
*Rationale*: Our data show that CAF-mediated resistance requires autophagy (Fig. 2), but the signaling pathways linking CAF factors to autophagy activation remain unknown.
*Working Hypothesis*: CAF-secreted factors activate mTOR-independent autophagy through AMPK and ULK1 phosphorylation.
*Approach*: We will (1) measure autophagy flux in tumor cells exposed to CAF factors using LC3 turnover assays and electron microscopy, (2) define signaling pathways using phosphoproteomic analysis and pharmacologic inhibitors, and (3) validate pathways using genetic knockdowns (shRNA/CRISPR) of key nodes. Studies will be performed in 2D and 3D co-culture systems.
*Expected Outcomes*: We will define the signaling cascade from CAF factors to autophagy activation, identifying druggable nodes for combination therapy. Results will inform Aim 3 therapeutic strategies.
---
**Aim 3: Evaluate combination therapies targeting CAF-tumor interactions in preclinical models.**
*Rationale*: Single-agent therapies targeting CAFs or autophagy have shown limited efficacy clinically, suggesting combination approaches are needed.
*Working Hypothesis*: Dual inhibition of CAF signaling and autophagy will synergistically restore gemcitabine sensitivity in vivo.
*Approach*: Using patient-derived xenograft (PDX) models and genetically engineered mouse models (GEMM) of PDAC, we will test combinations of (1) gemcitabine + CAF pathway inhibitors identified in Aim 1, (2) gemcitabine + autophagy inhibitors, and (3) triple combinations. We will assess tumor growth, survival, and mechanism (IHC, RNA-seq) in n=10-15 mice per group.
*Expected Outcomes*: We expect combination therapies will reduce tumor growth by ≥60% compared to gemcitabine alone, with synergistic effects. The most effective regimen will be advanced toward clinical translation through an investigator-initiated trial (we have IND-enabling resources available at our institution).
### Closing Paragraph: Impact and Significance
**Purpose**: Leave reviewers with enthusiasm and clear understanding of importance
**What to include**:
- Expected outcomes of the overall project
- How findings will advance the field
- Positive impact on health or science
- Next steps or future directions
- Why this matters
**Length**: 2-4 sentences
**Writing tips**:
- Be confident but not arrogant
- Connect back to opening (full circle)
- Emphasize transformative potential
- Avoid over-promising
**Examples**:
"The proposed research is significant because it will define a novel mechanism of chemotherapy resistance in pancreatic cancer and identify new therapeutic targets to overcome this resistance. Results will provide mechanistic insights into CAF-tumor interactions that drive drug resistance, immediately applicable to clinical trial design. We expect findings will enable rational design of combination therapies that improve outcomes for PDAC patients, who currently have few effective treatment options. This work will establish new paradigms for targeting the tumor microenvironment in solid cancers."
## Writing Principles
### Clarity and Accessibility
**Write for a mixed audience**:
- Some panel members will be experts in your area
- Others will be in related but not identical fields
- Program officers and council members will read it
- Some reviewers will only read this page before scoring
**Strategies**:
- Define technical terms at first use
- Explain abbreviations (except very common ones)
- Use clear, direct language
- Avoid excessive jargon
- Make logical flow obvious
### Confidence Without Arrogance
**Confident** ✅:
- "Our preliminary data demonstrate..."
- "We have established a robust model system..."
- "This approach will elucidate..."
**Arrogant** ❌:
- "We are uniquely qualified..."
- "Only our lab can do this..."
- "This will revolutionize the field..."
**Tentative** ❌:
- "We hope to..."
- "We will try to..."
- "It is possible that..."
### Active and Specific
**Aim statements should**:
- Start with action verbs (Determine, Identify, Elucidate, Define, Characterize, Validate, Develop)
- Be specific and testable
- Indicate what will be learned
**Weak Aim** ❌:
"Aim 1: Study the role of protein X in disease Y"
**Strong Aim** ✅:
"Aim 1: Determine how protein X phosphorylation regulates disease Y progression using genetic and pharmacologic approaches"
### Show Feasibility
**Throughout the aims page**:
- Mention preliminary data (figures, tables)
- Reference established methods
- Show you have necessary resources
- Demonstrate expertise
- Indicate prior success
**Don't**:
- Relegate all preliminary data to Research Strategy
- Make it seem like you're starting from scratch
- Propose overly ambitious aims without support
## Common Mistakes
### Mistake 1: Too Much Background
❌ Half page of background before getting to aims
✅ Focused background that motivates your specific approach
The aims page is NOT a mini review article. Provide only enough background to establish importance and gaps.
### Mistake 2: Vague Objectives
❌ "We will study the mechanisms of disease X"
❌ "We will investigate the role of protein Y"
✅ "We will identify the phosphorylation sites on protein Y that regulate its interaction with Z using mass spectrometry and mutagenesis"
### Mistake 3: Overly Ambitious Scope
❌ Four aims, each of which could be a separate R01
❌ Proposing to solve multiple major questions in the field
❌ "Boil the ocean" approach
✅ Focused aims that are clearly achievable in 3-5 years
### Mistake 4: Dependent Aims
❌ Aim 2 and Aim 3 both require Aim 1 to succeed
✅ Aims are synergistic but independent (failure of one doesn't doom the others)
### Mistake 5: No Preliminary Data Mentioned
❌ Seems like a fishing expedition
❌ Reviewers wonder if it's feasible
✅ Brief mentions of preliminary data throughout (refer to figures)
### Mistake 6: Weak Impact Statement
❌ "This will advance our understanding of X"
❌ "Results will be published and presented"
✅ "This will identify new therapeutic targets for disease X, affecting 500,000 patients annually, and provide the foundation for investigator-initiated clinical trials"
### Mistake 7: Jargon-Heavy First Paragraph
❌ Opening sentence full of abbreviations and specialized terminology
❌ Assumes all reviewers are experts in your subfield
✅ Opening that's comprehensible to broad scientific audience
### Mistake 8: No Clear Hypothesis
❌ Just listing aims without unifying framework
❌ Purely descriptive aims
✅ Clear, testable hypothesis that unifies the aims
### Mistake 9: Forgetting Page Limits
❌ Using 1.1 pages (will be deleted or rejected)
❌ Tiny fonts to cram in more content (violations)
✅ Exactly 1 page with compliant formatting
### Mistake 10: Not Telling a Story
❌ Disconnected aims that feel like 3 separate projects
❌ No logical flow or coherence
✅ Unified narrative with aims building on each other
## Advanced Tips
### Use Visual Elements
**Figures on Specific Aims Page**:
- NIH allows figures on aims page
- Can be very effective to show key preliminary data
- Must be legible (font size requirements apply)
- Don't let the figure crowd out text
- Typical: 1 small figure or panel showing most critical data
**Tables**:
- Can summarize preliminary data compactly
- Show patient characteristics, gene lists, etc.
- Must be readable
### Strategic Use of Bold/Italics
**Appropriate**:
- Bold aim statements to make them stand out
- Italicize gene names (standard convention)
- Underline key points (sparingly)
**Avoid**:
- Excessive formatting that looks cluttered
- All caps (looks like shouting)
- Colors (may not print/display correctly)
### The "Skim Test"
**Your aims page should pass the skim test**:
- Someone reading just aim statements should understand the project
- Bold aim statements that can be read independently
- Each paragraph has clear topic sentence
- Logical flow is apparent even when skimming
**Exercise**: Ask a colleague to read only the bold/underlined text—can they understand the project?
### Tailoring to Career Stage
**Early Stage Investigators**:
- Show you've thought through challenges
- Demonstrate strong mentorship and institutional support
- Emphasize innovation while ensuring feasibility
- Don't over-promise
**Established Investigators**:
- Show how this extends your research program
- Emphasize track record implicitly
- Can propose more ambitious aims if supported by extensive preliminary data
- Show how this opens new directions
## Examples of Strong Opening Paragraphs
### Example 1: Cancer Biology
"Metastatic breast cancer kills over 42,000 women annually in the United States, with median survival of only 2-3 years after diagnosis. While primary tumors are often curable, metastatic disease remains incurable due to therapy resistance and tumor heterogeneity. The emergence of drug-resistant cell populations during treatment represents the major barrier to long-term survival, yet the mechanisms governing resistance evolution remain poorly understood. Understanding how tumor heterogeneity and plasticity drive resistance could reveal new therapeutic strategies to prevent or reverse treatment failure."
### Example 2: Neuroscience
"Alzheimer's disease (AD) affects 6.7 million Americans and is projected to reach 13 million by 2050, with annual costs exceeding $355 billion. Despite decades of research focused on amyloid-β and tau pathologies, no disease-modifying therapies exist. Emerging evidence implicates synaptic dysfunction as the earliest pathological event in AD, preceding neurodegeneration by years. The molecular mechanisms linking synaptic failure to cognitive decline represent a critical therapeutic window, yet remain poorly defined. Identifying early synaptic alterations could enable intervention before irreversible neuronal loss occurs."
### Example 3: Infectious Disease
"Antimicrobial-resistant (AMR) infections cause over 2.8 million illnesses and 35,000 deaths annually in the US, with healthcare costs exceeding $4.6 billion. Carbapenem-resistant Enterobacterales (CRE) represent an urgent threat, with mortality rates exceeding 50% for bloodstream infections. Despite this crisis, only two new antibiotics targeting CRE have been approved in the past decade, both with significant limitations. Novel therapeutic approaches that bypass traditional antibiotic mechanisms are urgently needed to combat this growing threat. Targeting host-pathogen interactions rather than bacterial viability represents a promising strategy to combat AMR while reducing selection pressure for resistance."
## Revision Checklist
Before finalizing, ensure your aims page:
**Content**:
- [ ] Opens with compelling statement of importance
- [ ] Clearly defines the gap or problem
- [ ] States specific, measurable objective
- [ ] Presents testable hypothesis (or focused research questions)
- [ ] Mentions preliminary data supporting feasibility
- [ ] Includes 2-4 specific aims
- [ ] Each aim is testable and achievable
- [ ] Aims are independent but synergistic
- [ ] Expected outcomes are clearly stated
- [ ] Closes with impact and significance
**Clarity**:
- [ ] First paragraph is accessible to non-specialists
- [ ] Technical terms are defined
- [ ] Abbreviations are spelled out at first use
- [ ] Logical flow is clear
- [ ] Aim statements can stand alone
- [ ] Language is confident and active
**Format**:
- [ ] Exactly 1 page
- [ ] 0.5-inch margins
- [ ] 11-point font or larger
- [ ] Readable line spacing
- [ ] Compliant with NIH formatting requirements
- [ ] Figures (if included) are legible
**Impact**:
- [ ] Passes the "skim test"
- [ ] Would make you excited if you were a reviewer
- [ ] Clearly articulates significance
- [ ] Shows feasibility without over-selling
- [ ] Connects to health or scientific impact
## Final Thoughts
The Specific Aims page is where grants are won or lost. **Invest time in getting this right**:
- Write 10+ drafts
- Get feedback from colleagues and mentors
- Test it on people outside your field
- Read it aloud to check flow
- Let it sit, then revise with fresh eyes
- Study funded examples in your field
**Remember**: Reviewers are reading 10-20 applications. Your aims page needs to immediately communicate importance, innovation, and feasibility—and make them want to fund your work.
---
**Key Takeaway**: The perfect Specific Aims page tells a compelling story in exactly one page—establishing a significant problem, presenting an innovative and feasible solution, showing preliminary evidence of success, and articulating transformative impact. Every sentence must earn its place.