# Broader Impacts: Strategies and Best Practices
## Overview
**Broader Impacts** are one of two review criteria for NSF proposals, carrying equal weight with Intellectual Merit. Despite this, broader impacts are often treated as an afterthought—a critical mistake that costs otherwise strong proposals their funding.
**NSF Definition**: "The potential to benefit society and contribute to the achievement of specific, desired societal outcomes"
**Key Principle**: Broader impacts must be **specific, measurable, and integrated** with your research plan—not vague aspirations tacked onto the end.
## The Five Pillars of Broader Impacts
NSF evaluates broader impacts across five main areas. **You don't need to address all five**, but you should address at least one substantively with concrete activities, timelines, and assessment plans.
### 1. Advance Discovery While Promoting Teaching, Training, and Learning
**What This Means**: Integrate research and education to inspire the next generation of scientists and enhance scientific literacy.
**Effective Strategies**:
**Curriculum Development**:
- Create new courses incorporating research findings
- Develop course modules or laboratory exercises
- Design online learning materials (MOOCs, videos, interactive tools)
- Contribute to textbooks or educational resources
*Example*: "We will develop a 10-week computational biology module for undergraduate education, incorporating real datasets from this project. The module will include Jupyter notebooks with guided analysis, video tutorials, and assessment tools. Materials will be piloted at our institution (reaching 50 students annually) and made freely available through CourseSource for national adoption."
**Student Training**:
- Undergraduate research experiences
- Graduate student mentoring
- Postdoctoral training
- High school intern programs
- Research experiences for teachers (RET)
*Example*: "The project will support 3 PhD students and 6 undergraduate researchers over 5 years. Undergraduates will participate through our existing summer research program (10 weeks, $5,000 stipends) and will present findings at the annual undergraduate research symposium and regional conferences."
**Pedagogical Innovation**:
- Problem-based learning modules
- Active learning strategies
- Research-intensive courses
- Service learning projects
- Maker spaces or hands-on workshops
*Example*: "We will transform our introductory physics course (250 students/year) by implementing studio-style physics instruction based on results from this research. The new curriculum will include 3D visualization tools for electromagnetic fields, inquiry-based problem sets, and peer instruction protocols."
**Professional Development**:
- Workshops for faculty or teachers
- Training programs for early-career researchers
- Mentoring programs
- Career development resources
*Example*: "We will host annual 3-day workshops for 25 community college faculty, providing training in genome editing techniques. Participants will receive hands-on experience with CRISPR methods developed in this project, complete teaching modules for their courses, and ongoing support through a virtual learning community."
### 2. Broaden Participation of Underrepresented Groups
**What This Means**: Increase participation of groups underrepresented in STEM, including women, racial/ethnic minorities, persons with disabilities, and those from economically disadvantaged backgrounds.
**Effective Strategies**:
**Partnerships with Minority-Serving Institutions**:
- Collaborate with HBCUs (Historically Black Colleges and Universities)
- Partner with HSIs (Hispanic-Serving Institutions)
- Work with TCUs (Tribal Colleges and Universities)
- Engage with community colleges
*Example*: "We will establish formal research partnerships with 4 regional HBCUs (North Carolina A&T, Howard University, Morehouse College, and Spelman College). Each summer, 2 students from partner institutions will participate in 10-week research internships, including stipends ($6,000), housing, travel to field sites, and participation in our weekly research seminar series. A faculty liaison from each partner institution will co-mentor students and facilitate year-round engagement."
**Recruitment and Retention**:
- Targeted recruitment at conferences (SACNAS, ABRCMS, NSBE, SWE)
- Scholarship programs for underrepresented students
- Bridge programs for community college transfers
- Retention support (mentoring, peer networks, professional development)
*Example*: "We will recruit 50% of summer undergraduate researchers from groups underrepresented in computer science through partnerships with SACNAS and the National Society of Black Engineers. Participants will receive mentoring from graduate students with similar backgrounds, attend professional development workshops, and join our diversity-in-computing learning community that provides year-round support and networking."
**Culturally Relevant Engagement**:
- Research addressing community-identified needs
- Community-based participatory research
- Engagement with indigenous communities
- Bilingual materials and outreach
*Example*: "In partnership with the Navajo Nation, we will conduct participatory research on water quality in reservation communities. Community members will co-design the research questions, participate in data collection, and contribute indigenous knowledge about local hydrology. Results will be shared through community presentations in both English and Navajo, and will inform tribal water management policies."
**Addressing Systemic Barriers**:
- Flexible schedules for non-traditional students
- Childcare support for participants
- Accessible facilities and materials
- Financial support (stipends, travel, equipment)
- Mentoring networks and affinity groups
*Example*: "To support participation of students from low-income backgrounds, we will provide laptop computers, software licenses, and internet hotspots to all research participants. We will also offer flexible work schedules, remote participation options, and supplemental funding for students with childcare or eldercare responsibilities."
### 3. Enhance Infrastructure for Research and Education
**What This Means**: Build facilities, tools, databases, or networks that enable future research and education across the broader community.
**Effective Strategies**:
**Shared Research Infrastructure**:
- Multi-user instrumentation
- Core facilities
- Field stations or observatories
- Computational resources
- Cyberinfrastructure
*Example*: "We will establish a regional Cryo-Electron Microscopy facility serving 15 institutions in the Southwest. The facility will provide training and access to state-of-the-art imaging capabilities currently unavailable in the region. We will operate a user program with subsidized rates for academic users and offer annual training workshops for 50 researchers."
**Data and Software Resources**:
- Open-access databases
- Software tools and platforms
- Analysis pipelines
- Standardized protocols
- Data repositories
*Example*: "We will develop and maintain EcoDataHub, an open-source platform for ecological time-series analysis. The platform will include automated data cleaning, standardized analysis workflows, interactive visualization tools, and cloud computing integration. Software will be documented, version-controlled on GitHub, and supported through user forums and quarterly webinars. We expect 1,000+ users within 3 years based on community surveys."
**Biological or Physical Resources**:
- Living stock centers (model organisms, cell lines)
- Specimen collections
- Reagent repositories
- Seed banks or tissue collections
*Example*: "We will establish a publicly accessible repository of 500 sequenced bacterial strains isolated from extreme environments. Each strain will include full genome sequence, phenotypic characterization, and growth protocols. Materials will be available through the ATCC with metadata deposited in NCBI BioProject."
**Standards and Protocols**:
- Community standards
- Best practices guides
- Benchmarking datasets
- Quality control metrics
- Interoperability frameworks
*Example*: "Working with 20 international laboratories, we will develop and validate standardized protocols for single-cell RNA sequencing analysis. The resulting guidelines will address batch effects, quality control, normalization methods, and statistical best practices. Protocols will be published in peer-reviewed literature and deposited in protocols.io."
### 4. Broadly Disseminate to Enhance Scientific and Technological Understanding
**What This Means**: Communicate research to broader audiences including the public, K-12 students, policymakers, and stakeholders to enhance scientific literacy and informed decision-making.
**Effective Strategies**:
**K-12 Education Outreach**:
- School visits and science demonstrations
- After-school programs
- Science fairs and competitions
- Teacher professional development
- Classroom resources and lesson plans
*Example*: "We will partner with 10 local middle schools (serving 75% students from low-income families) to deliver hands-on robotics workshops. Each school will receive robot kits, and we will train teachers to lead a 12-week after-school robotics club. Students will apply concepts from this research (sensor fusion, autonomous navigation) to design robots for real-world challenges. The program will reach 200 students annually."
**Public Engagement**:
- Museum partnerships and exhibits
- Science cafés and public lectures
- Science festivals
- Citizen science projects
- Community workshops
*Example*: "We will collaborate with the Museum of Science and Industry to create a permanent interactive exhibit on climate modeling. The exhibit will allow visitors to manipulate climate variables and observe predicted outcomes using simplified versions of our models. We anticipate 500,000 annual visitors. We will also host quarterly 'Climate Science Saturday' public lectures reaching 2,000 community members annually."
**Media and Communications**:
- Blog posts and articles
- Podcasts or videos
- Social media engagement
- Press releases for major findings
- Popular science writing
*Example*: "We will produce a 6-episode podcast series exploring the intersection of artificial intelligence and creativity, featuring interviews with artists, musicians, and computer scientists. Episodes will be freely available on major platforms, with transcripts and educational materials on our website. Based on our existing podcast (15,000 downloads/episode), we expect to reach 100,000+ listeners."
**Policy Engagement**:
- Science policy fellowships
- Congressional briefings
- White papers for decision-makers
- Stakeholder workshops
- Regulatory science contributions
*Example*: "We will organize annual workshops bringing together researchers, water utilities, environmental regulators, and community advocates to discuss implications of our research for drinking water policy. Findings will be synthesized into policy briefs distributed to state and federal agencies. PI will participate in the AAAS Science and Technology Policy Fellowship to engage directly with EPA rulemaking."
**Citizen Science**:
- Community-based data collection
- Participatory research design
- Volunteer monitoring programs
- Crowdsourcing platforms
*Example*: "We will launch a citizen science program enlisting 500 volunteers across the Midwest to monitor pollinator populations using our smartphone app. Participants will receive training materials, identification guides, and regular feedback on their observations. Data will contribute directly to our research while building public understanding of pollinator ecology. Results will be visualized on an interactive public dashboard."
### 5. Benefit Society
**What This Means**: Apply research to address societal needs, improve quality of life, strengthen national security, or enhance economic competitiveness.
**Effective Strategies**:
**Health and Well-Being**:
- Clinical applications
- Public health improvements
- Healthcare accessibility
- Mental health resources
- Environmental health
*Example*: "Our diagnostic tool will reduce costs of malaria diagnosis from $10 to $0.50 per test, enabling deployment in resource-limited settings. We will partner with PATH and Médecins Sans Frontières to conduct field trials in 3 African countries and develop manufacturing partnerships for at-scale production. We project this technology could reach 10 million patients annually within 5 years."
**Economic Development**:
- Technology commercialization
- Job creation
- Industry partnerships
- Workforce development
- Startup formation
*Example*: "We will establish an industry partnership program with 5 regional manufacturing companies to transfer our advanced materials synthesis methods. Through quarterly technical workshops and on-site consultations, we will help companies integrate these processes into production lines, potentially creating 50-100 high-skill jobs over 5 years. Two graduate students will complete internships at partner companies."
**Environmental Sustainability**:
- Climate change mitigation or adaptation
- Conservation and biodiversity
- Pollution reduction
- Sustainable agriculture
- Renewable energy
*Example*: "Our soil carbon sequestration practices will be implemented on 1,000 acres of working farmland in partnership with 15 Iowa farmers. We will provide training, monitoring support, and carbon credit market access. If successful, practices could sequester 100,000 tons of CO2 equivalent annually if adopted across 10% of Midwest cropland, while increasing farmer income by $50-100/acre through carbon credits."
**National and Homeland Security**:
- Defense applications
- Cybersecurity
- Critical infrastructure protection
- Emergency response
- Intelligence capabilities
*Example*: "We will work with the Department of Homeland Security to adapt our threat detection algorithms for transportation security screening. Technology will be piloted at 3 major airports, with the goal of reducing false-positive rates by 40% while maintaining security effectiveness, decreasing passenger wait times and improving screening efficiency."
**Social and Cultural Benefits**:
- Preservation of cultural heritage
- Accessibility and inclusion
- Social justice
- Arts and humanities
- Quality of life improvements
*Example*: "Our 3D scanning and virtual reality platform will be used to digitally preserve 20 culturally significant sites threatened by climate change and development. Virtual reconstructions will be made freely available to descendant communities, schools, and the public through a web-based interface and VR experiences. We will partner with indigenous groups to ensure culturally appropriate representation."
## Best Practices for Broader Impacts
### Be Specific and Concrete
**Vague** ❌:
"This research will train the next generation of scientists."
**Specific** ✅:
"This project will support 3 PhD students, 2 postdocs, and 12 undergraduate researchers over 5 years. Undergraduates will be recruited through our partnership with the Louis Stokes Alliance for Minority Participation, with a goal of 50% participation from underrepresented groups. Students will receive training in advanced microscopy, data analysis, and scientific communication, and will present their research at the annual Emerging Researchers National Conference."
### Include Timelines and Milestones
**Vague** ❌:
"We will develop educational materials."
**Specific** ✅:
"Year 1: Develop draft curriculum modules and pilot with 50 students
Year 2: Revise based on assessment data and expand to 150 students across 3 institutions
Years 3-5: National dissemination through CourseSource, workshops at 2 professional conferences, and online repository. Target: Adoption by 20 institutions reaching 1,000 students annually by Year 5."
### Measure and Assess Impact
**Include**:
- Quantitative metrics (number of participants, downloads, users)
- Qualitative assessment (surveys, interviews, focus groups)
- Learning outcomes or behavioral changes
- Longitudinal tracking
- Comparison to baseline or control groups
**Example**:
"We will assess program effectiveness through: (1) Pre/post surveys measuring science self-efficacy using validated instruments, (2) Tracking participant persistence in STEM majors through institutional records, (3) Focus groups with participants and teachers, (4) Analysis of student work products. We expect to see a 30% increase in science self-efficacy scores and 90% retention in STEM majors among participants compared to 65% institutional baseline."
### Leverage Existing Infrastructure
**Don't reinvent the wheel**—build on existing programs and partnerships:
- Institutional programs (REU sites, AGEP, LSAMP, etc.)
- Community partnerships already established
- Shared facilities or resources
- Professional societies and organizations
**Example**:
"We will integrate with our institution's existing NSF REU site in Materials Science, adding 2 additional positions focused on our research area. This leverages established recruitment pipelines with 15 partner institutions, professional development programming, and assessment infrastructure while expanding opportunities for undergraduate researchers."
### Demonstrate Institutional Commitment
**Show that broader impacts will continue beyond grant period**:
- Institutional cost-sharing or support
- Integration into ongoing programs
- Sustainability plan
- Letters of commitment from partners
**Example**:
"The university has committed $50,000 annually in cost-share to sustain the high school outreach program beyond the grant period. The program will be integrated into our Center for STEM Education, ensuring administrative support, space, and continuity. Our partner school districts have committed teacher time and classroom access (see letters of commitment in supplementary documents)."
### Align with Research Plan
**Integration examples**:
- Students work on research questions from the proposal
- Educational materials use data generated by the research
- Outreach communicates research findings
- Community needs inform research questions
**Poor Integration** ❌:
Research on quantum computing + Unrelated marine biology outreach for middle schoolers
**Good Integration** ✅:
Research on quantum computing + Develop quantum computing curriculum modules + Summer program where students program quantum simulators + Public lectures on quantum technologies
## Common Broader Impacts Mistakes
### Mistake 1: Generic and Vague Statements
❌ "This project will train graduate students and postdocs."
❌ "Results will be broadly disseminated through publications and conferences."
❌ "We will engage in outreach activities."
These are baseline expectations, not broader impacts.
### Mistake 2: No Plan or Timeline
❌ "We hope to develop educational materials that could be used nationally."
✅ "Year 1: Develop and pilot 5 teaching modules. Year 2: Assess effectiveness and refine. Year 3: Publish in Journal of Chemical Education. Years 4-5: Disseminate through workshops at 3 national conferences and online repository. Target: Adoption by 30 institutions by Year 5."
### Mistake 3: No Assessment
❌ "We will run a summer camp for underrepresented students."
✅ "We will run a 4-week summer camp for 30 students (60% from underrepresented groups). We will assess impact through pre/post content knowledge tests, science identity surveys, and tracking of STEM course enrollment. We expect 80% of participants to enroll in advanced science courses the following year."
### Mistake 4: Unrealistic Scope
❌ "We will establish a national network of 100 schools, develop a comprehensive K-12 curriculum, create a museum exhibit, launch a nationwide citizen science program, and commercialize our technology" (with no budget or personnel allocated).
Be realistic about what you can accomplish with the resources and time available.
### Mistake 5: Poor Integration
❌ Research on plant genomics + Unrelated robotics outreach
✅ Research on plant genomics + Develop plant biology curriculum + Engage community gardens in phenotyping citizen science
### Mistake 6: Treating as Afterthought
❌ Half-page generic statement at end of proposal with no budget allocation
✅ Integrated throughout proposal, dedicated personnel (0.5 month PI time, 10% grad student, summer coordinator), allocated budget ($15K/year), detailed plan, and assessment strategy
### Mistake 7: No Track Record
If you propose extensive broader impacts activities but have no history of such work, reviewers will be skeptical.
✅ Show preliminary efforts, leverage existing programs, include collaborators with relevant expertise, cite successful prior broader impacts work
## Budgeting for Broader Impacts
**NSF expects resources allocated to broader impacts activities.**
**Typical Budget Items**:
- **Personnel**: Program coordinator, graduate students, undergraduate assistants
- **Participant support**: Stipends, travel, housing for students/teachers
- **Materials and supplies**: Educational materials, outreach equipment, workshop supplies
- **Travel**: Conference presentations of broader impacts work, site visits to partners
- **Subawards**: Payments to partnering institutions or organizations
- **Evaluation**: External evaluator for assessment
**Example Budget** (see the arithmetic check below):
- Summer program coordinator (2 months/year): $15,000/year
- Undergraduate stipends (10 students × $5,000): $50,000/year
- Materials and supplies for workshops: $5,000/year
- Travel for recruitment and partner meetings: $3,000/year
- External evaluator: $8,000/year
- **Total: $81,000/year (16% of $500K budget)**
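As a quick arithmetic check of the example above (Python used only as a calculator; the line items are exactly those listed):
```python
# Line items from the example budget above, in dollars per year
budget = {
    "Summer program coordinator (2 months/year)": 15_000,
    "Undergraduate stipends (10 students x $5,000)": 50_000,
    "Workshop materials and supplies": 5_000,
    "Recruitment and partner-meeting travel": 3_000,
    "External evaluator": 8_000,
}

total = sum(budget.values())
print(f"Total: ${total:,}/year ({total / 500_000:.0%} of a $500K budget)")
# -> Total: $81,000/year (16% of a $500K budget)
```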
## Resources for Broader Impacts
### NSF Resources
- **NSF Broader Impacts Website**: https://www.nsf.gov/od/oia/special/broaderimpacts/
- **BI Examples Repository**: https://www.cmu.edu/uro/resources%20for%20undergraduate%20research/best%20practices/broader-impacts.html
- **Broader Impacts Toolkit**: Many universities provide institutional resources
### Assessment Tools
- **STEM-OP (STEM Outreach Program Observation Protocol)**: Instrument for assessing outreach activities
- **STELAR Network**: Resources for informal STEM education
- **Evaluation frameworks**: Logic models, theory of change
### Partner Organizations
- **SACNAS**: Society for Advancement of Chicanos/Hispanics and Native Americans in Science
- **ABRCMS**: Annual Biomedical Research Conference for Minority Students
- **NSBE, SWE, AISES**: Professional societies for underrepresented groups
- **Science museums and centers**: Partner for public engagement
- **School districts and community organizations**: For K-12 outreach
---
**Key Takeaway**: Effective broader impacts are specific, measurable, assessed, integrated with the research plan, and demonstrate institutional commitment. They should be planned with the same rigor as the research itself, with dedicated resources, timelines, milestones, and evaluation strategies. Generic statements about "training students" or "disseminating results" are insufficient—NSF expects concrete plans that demonstrably benefit society.

# DARPA (Defense Advanced Research Projects Agency) Grant Writing Guidelines
## Agency Overview
**Mission**: Make pivotal investments in breakthrough technologies for national security
**Tagline**: "Creating breakthrough technologies and capabilities for national security"
**Annual Budget**: ~$4 billion
**Website**: https://www.darpa.mil
**Key Characteristics**:
- High-risk, high-reward research
- Focused on revolutionary breakthroughs, not incremental advances
- Technology transition to military and commercial applications
- Program managers with broad autonomy
- ~3-5 year programs with defined end goals
- Strong emphasis on prototypes and demonstrations
- "DARPA-hard" problems that others won't or can't tackle
**The DARPA Difference**:
- NOT basic research (that's ONR, AFOSR, ARO)
- NOT development and procurement (that's service acquisition)
- Focused on proof-of-concept to prototype stage
- Tolerates and expects failure in pursuit of breakthroughs
- Rapid transition to operational use
## DARPA Organization
### Six Technical Offices
#### 1. BTO (Biological Technologies Office)
**Focus**: Biology as technology, human-machine interfaces, synthetic biology
**Example Programs**:
- Neural interfaces and brain-computer interfaces
- Synthetic biology and living foundries
- Pandemic prevention and response
- Human performance enhancement
- Biotechnology for manufacturing
#### 2. DSO (Defense Sciences Office)
**Focus**: High-risk, high-payoff research in physical and mathematical sciences
**Example Programs**:
- Novel materials and chemistry
- Quantum technologies
- Electromagnetics and photonics
- Mathematics and algorithms
- Fundamental limits of physics
#### 3. I2O (Information Innovation Office)
**Focus**: Information advantage through computing, communications, and cyber
**Example Programs**:
- Artificial intelligence and machine learning
- Cybersecurity and cyber resilience
- Communications and networking
- Data analytics and processing
- Human-computer interaction
#### 4. MTO (Microsystems Technology Office)
**Focus**: Microelectronics, photonics, and heterogeneous microsystems
**Example Programs**:
- Advanced electronics and integrated circuits
- Photonics and optical systems
- Novel computational architectures
- RF and millimeter-wave systems
- MEMS and sensors
#### 5. STO (Strategic Technology Office)
**Focus**: Technologies for space, air, maritime, and ground systems
**Example Programs**:
- Autonomous systems (air, ground, sea, space)
- Advanced propulsion and power
- Space technologies
- Electronic warfare
- Long-range precision fires
#### 6. TTO (Tactical Technology Office)
**Focus**: Near-term technologies for ground, maritime, and expeditionary forces
**Example Programs**:
- Tactical autonomy
- Advanced weapons
- Urban operations
- Maneuver and logistics
- Special operations support
## How DARPA Works
### Program Manager-Centric Model
**Program Managers (PMs)**:
- ~100 PMs across DARPA
- Hired on 3-5 year rotations from academia, industry, government labs
- Have significant autonomy to create and run programs
- Identify "DARPA-hard" problems and solutions
- Manage portfolios of 10-20 projects
**PM Lifecycle**:
1. **Develop vision**: Identify transformative opportunity
2. **Create program**: Design research thrusts and metrics
3. **Issue BAA**: Broad Agency Announcement for proposals
4. **Select teams**: Choose performers and structure program
5. **Manage program**: Track milestones, adjust course, transition technology
6. **Transition**: Hand off successful technologies to services or industry
**Implication for Proposers**:
- PMs have the vision—your job is to execute it
- Contact PM before proposing (almost always required)
- Understand PM's technical vision and goals
- Build relationship with PM (within ethical bounds)
### The "DARPA-Hard" Test
**Three Questions Every DARPA Program Must Answer**:
1. **What are you trying to do?**
- Articulate objectives using absolutely no jargon
- Clear, specific technical goal
2. **How is it done today, and what are the limits of current practice?**
- What's the current state of the art?
- Why are current approaches insufficient?
- What fundamental barriers exist?
3. **What is new in your approach, and why do you think it will be successful?**
- What's the breakthrough insight or capability?
- Why hasn't this been done before?
- What's changed to make it possible now?
**Additional Considerations**:
- **Who cares?** (What's the national security impact?)
- **What if you're right?** (What becomes possible?)
- **What if you're wrong?** (Is the risk acceptable?)
- **What if you succeed?** (Is there a transition path?)
**DARPA Seeks**:
- **High Risk**: 50% chance of failure is acceptable
- **High Reward**: 10x improvement, not 10% improvement
- **Measurable**: Clear metrics of success
- **Transitional**: Path to operational use or commercial adoption
## Types of DARPA Solicitations
### 1. Broad Agency Announcements (BAAs)
**Most Common Mechanism**: Open solicitations for specific program areas
**Characteristics**:
- Issued by program managers for specific programs
- Describe technical objectives and research thrusts
- Multiple submission deadlines or rolling submission
- Full proposals typically 20-40 pages
- Often require abstract or white paper first
**Types of BAAs**:
**Program BAAs**: For specific named programs
- Clear technical objectives and metrics
- Defined research areas (thrusts)
- Specified deliverables and milestones
- Known PM with clear vision
**Office-Wide BAAs**: General solicitations by technical office
- Broader scope, less prescriptive
- Looking for transformative ideas
- More flexibility in approach
- May have multiple areas of interest
### 2. Small Business Innovation Research (SBIR)
**For Small Businesses**:
- **Phase I**: $150K-$250K, 6-9 months (feasibility)
- **Phase II**: $1M-$2M, 2 years (development)
- **Phase III**: Non-SBIR funds (commercialization)
### 3. Proposers Days and Special Notices
**Proposers Day**: Pre-solicitation event
- PM presents program vision and objectives
- Q&A with potential proposers
- Networking for team formation
- Often required or strongly encouraged to attend
**Special Notices**: Requests for Information (RFIs), teaming opportunities
## DARPA Proposal Structure
**Note**: Format varies by BAA. **Always follow the specific BAA instructions precisely.**
### Typical Structure
#### Volume 1: Technical and Management Proposal (20-40 pages)
**Section 1: Executive Summary** (1-2 pages)
- Overview of proposed research
- Technical approach and innovation
- Expected outcomes and deliverables
- Team qualifications
- Alignment with BAA objectives
**Section 2: Goals and Impact** (2-3 pages)
- Statement of the problem
- Importance and national security relevance
- Current state of the art and limitations
- How your work will advance the state of the art
- Impact if successful (What if true? Who cares?)
- Alignment with DARPA program goals
**Section 3: Technical Approach and Innovation** (10-20 pages)
- Detailed technical plan organized by phase or thrust
- Novel approaches and why they will work
- Technical risks and mitigation strategies
- Preliminary results or proof-of-concept data
- Technical barriers and how to overcome them
- Innovation and differentiation from existing work
**Organized by Phase** (typical):
**Phase 1 (Feasibility)**: 12-18 months
- Technical objectives and milestones
- Approach and methodology
- Expected outcomes
- Metrics for success
- Go/no-go criteria for Phase 2
**Phase 2 (Development)**: 18-24 months
- Building on Phase 1 results
- System integration and optimization
- Testing and validation
- Prototype development
- Metrics and evaluation
**Phase 3 (Demonstration)**: 12-18 months (if applicable)
- Field testing or operational demonstration
- Transition activities
- Handoff to transition partner
**Section 4: Capabilities and Resources** (2-3 pages)
- Team qualifications and expertise
- Facilities and equipment
- Relevant prior work and publications
- Subcontractor and collaborator roles
- Organizational structure
**Section 5: Statement of Work (SOW)** (3-5 pages)
- Detailed task breakdown
- Deliverables for each task
- Milestones and metrics
- Timeline (Gantt chart)
- Dependencies and critical path
- Government furnished property or information (if applicable)
**Section 6: Schedule and Milestones** (1-2 pages)
- Integrated master schedule
- Key decision points
- Deliverable schedule
- Go/no-go criteria
- Reporting and meeting schedule
**Section 7: Technology Transition Plan** (2-3 pages)
- Potential transition partners (military services, industry)
- Pathway to operational use or commercialization
- Market or operational analysis
- Transition activities during the program
- IP and licensing strategy (if applicable)
#### Volume 2: Cost Proposal (separate)
**Detailed Budget**:
- Costs by phase, task, and year
- Labor (personnel, hours, rates)
- Materials and supplies
- Equipment
- Travel
- Subcontracts
- Other direct costs
- Indirect costs (overhead, G&A)
- Fee or profit (for industry)
**Cost Narrative**:
- Justification for each cost element
- Labor categories and rates
- Basis of estimate
- Cost realism analysis
- Supporting documentation
**Supporting Documentation**:
- Cost accounting standards
- Approved indirect rate agreements
- Subcontractor quotes or cost proposals
#### Additional Volumes (if required)
**Attachments**:
- Quad charts (1-slide summary)
- Relevant publications or technical papers
- Letters of commitment from collaborators
- Facilities descriptions
- Equipment lists
## Review Criteria
### DARPA Evaluation Factors (Typical)
**Primary Criteria** (usually equal weight):
1. **Overall Scientific and Technical Merit**
- Technical soundness and feasibility
- Innovation and novelty
- Likelihood of achieving objectives
- Technical approach and methodology
- Understanding of problem and prior art
- Risk and risk mitigation
2. **Potential Contribution and Relevance to DARPA Mission**
- Alignment with program objectives
- National security impact
- Advancement over state of the art
- Potential for revolutionary breakthrough
- "What if true? Who cares?" test
3. **Cost Realism and Reasonableness**
- Budget aligned with technical plan
- Costs justified and realistic
- Value for investment
- Cost versus benefit analysis
4. **Capabilities and Related Experience**
- Team qualifications and track record
- Facilities and resources adequate
- Relevant prior work
- Ability to deliver on time and on budget
- Management approach
5. **Technology Transition**
- Pathway to operational use or market
- Transition partnerships
- Market analysis (if applicable)
- Plans for follow-on development
- IP strategy supporting transition
### The "Heilmeier Catechism"
**DARPA uses this set of questions** (created by former DARPA director George Heilmeier):
1. What are you trying to do? Articulate your objectives using absolutely no jargon.
2. How is it done today, and what are the limits of current practice?
3. What is new in your approach and why do you think it will be successful?
4. Who cares? If you succeed, what difference will it make?
5. What are the risks?
6. How much will it cost?
7. How long will it take?
8. What are the mid-term and final "exams" to check for success?
**Your proposal should clearly answer all eight questions.**
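One informal way to enforce this while drafting is a self-check keyed to the eight questions. A minimal sketch; the checklist structure is our suggestion, not a DARPA artifact:
```python
# Pre-submission self-check: one entry per Heilmeier question.
# Fill in each answer; blank entries flag gaps before internal review.
heilmeier = {
    "1. Objectives (no jargon)": "",
    "2. Current practice and its limits": "",
    "3. What is new and why it will succeed": "",
    "4. Who cares / what difference it makes": "",
    "5. Risks": "",
    "6. Cost": "",
    "7. Schedule": "",
    "8. Mid-term and final 'exams' for success": "",
}

gaps = [q for q, answer in heilmeier.items() if not answer.strip()]
print("Unanswered:", ", ".join(gaps) if gaps else "none")
```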
## DARPA Proposing Strategy
### Before Writing
**1. Contact the Program Manager**
- Email PM to introduce yourself and idea
- Request call to discuss fit with program
- Attend Proposers Day if available
- Ask clarifying questions about BAA
**2. Form a Strong Team**
- DARPA values multidisciplinary teams
- Include complementary expertise
- Mix of academia, industry, government labs
- Clearly defined roles
- Prior collaboration history (if possible)
**3. Understand the Vision**
- What is the PM trying to achieve?
- What technical barriers need to be overcome?
- What does success look like?
- What are the program metrics?
**4. Identify Transition Path**
- Who will use the technology?
- What's the path from prototype to product?
- Who are potential transition partners?
- What's the market or operational need?
### Writing the Proposal
**Lead with Impact**:
- Open with the "so what?"
- National security or economic impact
- What becomes possible if you succeed?
**Be Concrete and Specific**:
- Clear technical objectives with metrics
- Measurable milestones
- Quantitative targets (10x improvement, not "better")
- Specific deliverables
**Demonstrate Innovation**:
- What's the breakthrough?
- Why hasn't this been done before?
- What's changed to make it possible now?
- How is this different from evolutionary approaches?
**Address Risk Head-On**:
- Identify technical risks explicitly
- Explain mitigation strategies
- Show that you've thought through failure modes
- DARPA expects risk—don't hide it, manage it
**Show You Can Execute**:
- Detailed project plan with milestones
- Team with relevant track record
- Realistic schedule and budget
- Go/no-go decision points
- Management approach for complex programs
**Emphasize Transition**:
- Who will use the results?
- Path to operationalization or commercialization
- Engagement with potential users during program
- IP strategy that enables transition
### Common Mistakes
1. **Incremental Research**: Proposing 10% improvement instead of 10x
2. **Academic Focus**: Pure research without application focus
3. **No Transition Plan**: No pathway to use or commercialization
4. **Ignoring PM Vision**: Not aligned with program objectives
5. **Vague Metrics**: "Improve" or "enhance" instead of quantitative targets
6. **Underestimating Risk**: Claiming low risk (DARPA wants high risk, high reward)
7. **Weak Team**: Insufficient expertise or poorly defined roles
8. **No Differentiation**: Similar to existing efforts without clear advantage
9. **Ignoring BAA**: Not following proposal format or requirements
10. **Late Contact with PM**: Waiting until proposal due date to engage
## DARPA Contracting and Performance
### Award Types
**Procurement Contracts**: Most common for industry
- Firm Fixed Price (FFP)
- Cost Plus Fixed Fee (CPFF)
- Cost Plus Incentive Fee (CPIF)
**Grants and Cooperative Agreements**: For universities and nonprofits
- Grants: Minimal government involvement
- Cooperative Agreements: Substantial government involvement
**Other Transaction Agreements (OTAs)**: Flexible arrangements
- For research not requiring FAR compliance
- Faster, more flexible terms
- Common for consortia and partnerships
### Program Execution
**Kickoff Meeting**: Program launch with all performers
- PM presents program vision and goals
- Performers present approaches
- Technical exchange and collaboration
**Quarterly Reviews**: Progress reviews (virtual or in-person)
- Technical progress against milestones
- Challenges and solutions
- Path forward
- PM feedback and course corrections
**Annual or Phase Reviews**: Major assessment points
- Comprehensive technical review
- Go/no-go decisions
- Budget and schedule adjustments
**Site Visits**: PM and team visit performer sites
- See technical work firsthand
- Deep dive on specific areas
- Team building and collaboration
**Technical Interchange Meetings (TIMs)**: Deep dives on technical topics
- Cross-performer collaboration
- Sharing of results and approaches
- Problem-solving sessions
### Deliverables and Reporting
**Monthly Reports**: Brief progress updates
- Technical progress
- Budget status
- Issues and concerns
**Quarterly Reports**: Detailed technical reporting
- Accomplishments against milestones
- Data and results
- Upcoming activities
- Publications and IP
**Final Report**: Comprehensive program summary
- Technical achievements
- Lessons learned
- Transition activities
- Future directions
**Technical Data and Prototypes**: Specified in contract
- Software and code
- Hardware prototypes
- Data sets
- Documentation
## DARPA Culture and Expectations
### High Risk is Expected
- DARPA programs should have ~50% probability of failure
- Failure is acceptable if lessons are learned
- "Fail fast" to redirect resources
- Transparency about challenges valued
### Rapid Pivots
- PM may redirect program based on results
- Flexibility to pursue unexpected opportunities
- Willingness to stop unproductive efforts
- Adaptability is key
### Transition Focus
- Technology must have a path to use
- Engagement with transition partners during program
- Demonstrate prototypes and capabilities
- Handoff to services or industry
### Collaboration and Teaming
- Performers expected to collaborate
- Share results and insights (within IP bounds)
- Attend all program meetings
- Support overall program goals, not just own project
## Recent DARPA Priorities and Programs
### Key Technology Areas (2024-2025)
**Artificial Intelligence and Autonomy**:
- Trustworthy AI
- AI reasoning and understanding
- Human-AI teaming
- Autonomous systems across domains
**Quantum Technologies**:
- Quantum computing and algorithms
- Quantum sensing and metrology
- Quantum communications
- Post-quantum cryptography
**Biotechnology**:
- Pandemic prevention and response
- Synthetic biology
- Human performance
- Bio-manufacturing
**Microelectronics and Computing**:
- Advanced chip design and manufacturing
- Novel computing architectures
- 3D heterogeneous integration
- RF and millimeter-wave systems
**Hypersonics and Advanced Materials**:
- Hypersonic weapons and defense
- Advanced materials and manufacturing
- Thermal management
- Propulsion
**Space Technologies**:
- Space domain awareness
- On-orbit servicing and manufacturing
- Small satellite technologies
- Space-based intelligence
**Network Technologies**:
- Secure communications
- Resilient networks
- Spectrum dominance
- Cyber defense
## Tips for Competitive DARPA Proposals
### Do's
- **Contact PM early**: Before writing, discuss your idea
- **Attend Proposers Day**: Essential for understanding the program
- **Form a strong team**: Complementary expertise, clear roles
- **Be bold and ambitious**: 10x goals, not 10% improvements
- **Quantify everything**: Specific metrics and targets
- **Address transition**: Clear path to operational use
- **Identify risks explicitly**: And explain mitigation
- **Show preliminary results**: Proof of concept or feasibility
- **Follow the BAA exactly**: Format, page limits, content requirements
- **Emphasize innovation**: What's revolutionary about your approach?
### Don'ts
- **Don't propose incremental research**: DARPA wants breakthroughs
- **Don't ignore national security relevance**: "Who cares?" matters
- **Don't be vague**: Specific objectives, metrics, deliverables
- **Don't hide risk**: DARPA expects and values high-risk research
- **Don't forget transition**: Technology must have a path to use
- **Don't propose basic research**: That's for ONR, AFOSR, and ARO
- **Don't exceed page limits**: Automatic rejection
- **Don't ignore PM feedback**: They're setting the direction
- **Don't propose alone if a team is needed**: DARPA values strong teams
- **Don't submit without PM contact**: Critical to gauge fit
## Resources
- **DARPA Website**: https://www.darpa.mil
- **DARPA Opportunities**: https://www.darpa.mil/work-with-us/opportunities
- **BAA Listings**: https://sam.gov (search "DARPA")
- **DARPA Social Media**: Twitter @DARPA (PMs often announce programs)
- **SBIR/STTR**: https://www.darpa.mil/work-with-us/for-small-businesses
- **Heilmeier Catechism**: https://www.darpa.mil/about-us/timeline/heilmeier-catechism
### Key Contacts
- **DARPA Contracting**: via BAA points of contact
- **Program Managers**: Contact info in BAAs and program pages
- **SBIR/STTR Office**: sbir@darpa.mil
---
**Key Takeaway**: DARPA seeks revolutionary breakthroughs that advance national security, not incremental research. Successful proposals articulate clear, measurable objectives (answering "what if true?"), demonstrate innovative approaches to "DARPA-hard" problems, include strong multidisciplinary teams, proactively address technical risks, and provide realistic paths to transition. Early engagement with the Program Manager is essential—DARPA is a PM-driven agency where understanding the vision is critical to success.

# DOE (Department of Energy) Grant Writing Guidelines
## Agency Overview
**Mission**: Ensure America's security and prosperity by addressing energy, environmental, and nuclear challenges through transformative science and technology solutions
**Annual Budget**: ~$50 billion (includes national laboratories, energy programs, nuclear security)
**Website**: https://www.energy.gov
**Key Characteristics**:
- Focus on energy, climate, environmental, computational, and physical sciences
- Operates 17 national laboratories (largest science infrastructure in US)
- Strong emphasis on industry partnerships and commercialization
- Basic science through applied research and development
- Cost sharing often required
- National security and energy security priorities
## Major DOE Offices and Programs
### Office of Science (SC)
**Budget**: ~$8 billion (largest supporter of physical sciences research in US)
**Mission**: Deliver scientific discoveries and major scientific tools to transform our understanding of nature and advance energy, economic, and national security
**Program Offices**:
1. **Advanced Scientific Computing Research (ASCR)**
- High-performance computing
- Applied mathematics
- Computational sciences
- Exascale computing
2. **Basic Energy Sciences (BES)**
- Materials science and engineering
- Chemical sciences
- Condensed matter and materials physics
- User facilities (light sources, neutron sources)
3. **Biological and Environmental Research (BER)**
- Biological systems science
- Climate and environmental sciences
- Environmental Molecular Sciences Laboratory (EMSL)
4. **Fusion Energy Sciences (FES)**
- Plasma physics
- Fusion energy development
- ITER collaboration
5. **High Energy Physics (HEP)**
- Particle physics
- Accelerator science
- Quantum information science
6. **Nuclear Physics (NP)**
- Nuclear structure and dynamics
- Relativistic heavy ions
- Fundamental symmetries
**Funding Mechanisms**:
- **Early Career Research Program**: $750K over 5 years for early career scientists
- **Funding Opportunity Announcements (FOAs)**: Program-specific solicitations
- **Laboratory Directed Research and Development (LDRD)**: For national lab staff
### ARPA-E (Advanced Research Projects Agency-Energy)
**Mission**: Advance high-potential, high-impact energy technologies that are too early for private-sector investment
**Characteristics**:
- High-risk, high-reward transformative energy technologies
- Requires cost sharing (typically 20% for universities, more for industry)
- Emphasis on pathway to commercialization
- Strong project management and milestones
- Budget: ~$500M annually
**Program Types**:
- **Focused Programs**: Specific technology areas (announced via FOAs)
- **OPEN**: General solicitation across all energy technologies
- **SCALEUP**: Bridging from lab to market
**Typical Funding**:
- $1-10M per project
- 1-3 years duration
- Technology transition focus
### Office of Energy Efficiency and Renewable Energy (EERE)
**Mission**: Accelerate development and deployment of clean energy technologies
**Program Areas**:
- **Solar Energy Technologies Office (SETO)**
- **Wind Energy Technologies Office (WETO)**
- **Water Power Technologies Office (WPTO)**
- **Geothermal Technologies Office (GTO)**
- **Building Technologies Office (BTO)**
- **Advanced Manufacturing Office (AMO)**
- **Vehicle Technologies Office (VTO)**
- **Bioenergy Technologies Office (BETO)**
- **Hydrogen and Fuel Cell Technologies Office (HFTO)**
**Funding Mechanisms**:
- FOAs for specific technology areas
- Small Business Innovation Research (SBIR)
- Technology Commercialization Fund (TCF)
### Office of Fossil Energy and Carbon Management (FECM)
**Focus**: Carbon capture, utilization, and storage; hydrogen; critical minerals
### Office of Nuclear Energy (NE)
**Focus**: Advanced reactor technologies, nuclear fuel cycle, university programs
## DOE Proposal Structure
DOE proposal requirements vary significantly by program office and FOA. **Always read the specific FOA carefully.**
### Common Elements
#### Project Narrative (varies, typically 10-20 pages)
**Typical Structure**:
1. **Executive Summary / Abstract** (1 page)
- Project objectives and technical approach
- Expected outcomes and impact
- Team qualifications
- Alignment with DOE mission
2. **Background and Motivation** (2-3 pages)
- Current state of technology or knowledge
- Problem or opportunity
- Why DOE investment is needed
- Alignment with program goals
3. **Technical Approach and Innovation** (5-10 pages)
- Detailed technical plan
- Methodology and approach
- Innovation and novelty
- Risk assessment and mitigation
- Go/no-go decision points
- Performance metrics
4. **Impact and Energy Relevance** (1-2 pages)
- Expected technical outcomes
- Energy impact (cost, efficiency, emissions)
- Pathway to deployment or commercialization
- Economic benefits
- Timeline to market (for applied programs)
5. **Management Plan** (1-2 pages)
- Team organization and roles
- Timeline and milestones
- Risk management
- Communication and reporting
6. **Qualifications and Resources** (1-2 pages)
- Team expertise and experience
- Relevant prior work
- Facilities and equipment
- National lab or industry partners
#### Budget and Budget Justification
**Federal Cost Share**:
- Specify DOE funding requested by year
- Break down by category (labor, equipment, travel, etc.)
- Detailed justification for each item
**Cost Share** (often required):
- Specify source (cash vs. in-kind)
- Document commitment (letters from sponsors)
- Typical requirements:
- Universities: 20% (ARPA-E)
- Industry: 50% or more
- National labs: Varies
**Budget Categories**:
- Labor (personnel with hours/rates)
- Fringe benefits
- Travel
- Equipment and capital items
- Materials and supplies
- Other direct costs
- Subawards/subcontracts
- Indirect costs (F&A)
#### Biographical Sketches
**Format**: Often DOE-specific or NSF-style
- Professional preparation
- Appointments
- Relevant publications (5-10 most relevant)
- Synergistic activities
- Collaborators
#### Work Breakdown Structure (WBS)
**Often Required**: Detailed breakdown of tasks, milestones, and deliverables (a minimal example follows this list)
- Task structure aligned with budget
- Quarterly or annual milestones
- Deliverables for each task
- Responsible parties
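A minimal illustration of how such a breakdown might be organized before conversion into the FOA's required format; the task names, owners, milestones, and deliverables below are hypothetical placeholders, not a DOE template:
```python
# Hypothetical WBS fragment: tasks aligned with budget, quarterly
# milestones, deliverables, and responsible parties
wbs = {
    "Task 1: Materials synthesis": {
        "owner": "University (PI)",
        "milestones": {"Q2": "10 candidate compositions screened",
                       "Q4": "Down-select to 2 compositions"},
        "deliverables": ["Synthesis protocol report"],
    },
    "Task 2: Prototype integration": {
        "owner": "National lab partner",
        "milestones": {"Q6": "First integrated prototype",
                       "Q8": "Prototype meets efficiency target"},
        "deliverables": ["Prototype test data package"],
    },
}

for task, info in wbs.items():
    print(f"{task} ({info['owner']}): {len(info['milestones'])} milestones")
```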
#### Letters of Commitment
**Required for**:
- Cost share partners
- Collaborating institutions
- National laboratory partnerships
- Industry partners
- Access to facilities or resources
**Must Include**:
- Specific commitment (funding, personnel, equipment)
- Signed by authorized representative
- On institutional letterhead
#### Facilities and Equipment
**Describe**:
- Available facilities relevant to project
- Major equipment accessible
- Computational resources
- Unique capabilities
#### Data Management Plan (DMP)
**Increasingly Required**:
- Types of data to be generated
- Standards and formats
- Access and sharing policies
- Long-term preservation
- Compliance with DOE policies
## Review Criteria
### Office of Science (SC) General Criteria
Proposals typically evaluated on:
1. **Scientific and/or Technical Merit** (35-40%)
- Importance and relevance of research
- Appropriateness of proposed method or approach
- Scientific or technical innovation
- Clarity of objectives and expected outcomes
2. **Appropriateness of Proposed Method or Approach** (25-30%)
- Technical feasibility
- Likelihood of success
- Adequacy of project design
- Rigor of technical approach
3. **Competency of Personnel and Adequacy of Facilities** (20-25%)
- Qualifications of PI and team
- Track record in relevant areas
- Access to necessary facilities and equipment
- Institutional support
4. **Reasonableness and Appropriateness of Budget** (10-15%)
- Budget aligned with proposed work
- Appropriate allocation of resources
- Cost effectiveness
5. **Relevance to DOE Mission and Program Goals** (10-15%)
- Alignment with program priorities
- Contribution to DOE mission
- Potential impact on energy/environment
### ARPA-E Review Criteria
**ARPA-E uses concept paper → full application process**
**Concept Paper Review** (typically 3-5 pages):
- Technical innovation and impact
- Potential for transformative advance
- Relevance to energy applications
- Feasibility (team, approach)
**Full Application Review** (if invited; the weights are illustrated in the sketch after this list):
1. **Impact** (40%)
- Potential to dramatically improve energy technology
- Energy and economic impact
- Transformative vs. incremental
- Pathway to market adoption
2. **Innovation/Technical Merit** (30%)
- Novel approach or technology
- Technical rigor and feasibility
- Likelihood of meeting targets
- Risk and risk mitigation
3. **Qualifications** (20%)
- Team expertise and experience
- Resources and capabilities
- Management plan
- Track record
4. **Workplan** (10%)
- Clear milestones and go/no-go points
- Realistic timeline
- Appropriate budget
- Risk management
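To make the relative emphasis concrete, here is a toy weighted-score calculation using the stated percentages; the reviewer scores are hypothetical, and actual ARPA-E panels do not necessarily reduce reviews to a single number:
```python
# ARPA-E full-application weights as stated above; scores are made up
weights = {"Impact": 0.40, "Innovation/Technical Merit": 0.30,
           "Qualifications": 0.20, "Workplan": 0.10}
scores = {"Impact": 8, "Innovation/Technical Merit": 7,
          "Qualifications": 9, "Workplan": 6}  # 0-10 scale, hypothetical

weighted = sum(weights[c] * scores[c] for c in weights)
print(f"Weighted score: {weighted:.1f} / 10")  # Impact dominates: 7.7
```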
### Technology-to-Market (T2M) Evaluation (ARPA-E)
**Critical Component**: Path to commercialization
**Assessed**:
- Market opportunity and size
- Competitive landscape
- Barriers to adoption
- Go-to-market strategy
- Partnership and commercialization plan
- Economic viability
**Common Mistakes**:
- Underestimating time to market
- Ignoring competing technologies
- Unrealistic cost projections
- No clear adoption pathway
## DOE-Specific Considerations
### National Laboratory Collaboration
**Benefits**:
- Access to unique facilities and expertise
- Leveraging world-class capabilities
- Credibility and track record
**Mechanisms**:
- **Subcontract**: Lab is subcontractor to university/company
- **Cooperative Research and Development Agreement (CRADA)**: Partnership with industry
- **User Facility Proposal**: Access to major DOE user facilities
- **Strategic Partnership Project (SPP)**: Formal collaboration
**Process**:
- Identify appropriate lab partner early
- Contact lab scientist to discuss collaboration
- Develop work scope and budget together
- Obtain lab approval (can take 2-3 months)
- Include letter of commitment
**Major National Labs**:
- Argonne (ANL), Brookhaven (BNL), Lawrence Berkeley (LBNL)
- Oak Ridge (ORNL), Pacific Northwest (PNNL), SLAC
- Sandia (SNL), Los Alamos (LANL), Lawrence Livermore (LLNL)
- National Renewable Energy Lab (NREL), Idaho (INL), Fermilab
### User Facilities
**DOE operates 28 major user facilities** open to researchers
**Types**:
- **Light and Neutron Sources**: Synchrotron X-ray and neutron scattering facilities (APS, NSLS-II, ALS, SNS, etc.)
- **Nanoscale Science Centers**: Fabrication and characterization
- **High-Performance Computing**: Supercomputing centers (OLCF, NERSC, ALCF)
- **Genomic and Molecular Science**: JGI (genomics), EMSL (environmental molecular sciences)
- **Accelerators and Detectors**: Particle and nuclear physics facilities
**Access**:
- Submit user proposal (separate from research proposal)
- Peer-reviewed allocation of beam time or computing hours
- No cost for non-proprietary research
- Can include user facility access in grant proposals
### Cost Sharing Requirements
**Varies by Program**:
- **Office of Science**: Generally not required (except specific FOAs)
- **ARPA-E**: Required (typically 20% universities, 50%+ industry)
- **EERE**: Often required (varies by program)
- **FECM**: Often required
**Types**:
- **Cash**: Direct contribution of funds
- **In-kind**: Personnel time, equipment use, materials
- **Third-party**: Contribution from collaborator or sponsor
**Requirements** (cost-share arithmetic is sketched after this list):
- Must be documented and verifiable
- Cannot be used for other federal awards
- Must be from non-federal sources (generally)
- Need letters of commitment
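The arithmetic trips up first-time proposers because the share is typically a fraction of total project cost, not of the federal request. A minimal sketch under that assumption (confirm the convention in the specific FOA):
```python
def required_cost_share(federal_request: float, share: float) -> float:
    """Non-federal dollars needed so that cost_share / total == share."""
    total = federal_request / (1 - share)
    return total - federal_request

# Example: a university requesting $1.0M from ARPA-E at 20% cost share
# must bring $250K in non-federal funds, for a $1.25M total project
print(f"${required_cost_share(1_000_000, 0.20):,.0f}")  # -> $250,000
```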
### Technology Readiness Levels (TRLs)
**DOE uses TRL scale 1-9** for technology development programs
**TRL Definitions**:
- **TRL 1-3**: Basic research (idea → proof of concept)
- **TRL 4-6**: Development (component → system prototype)
- **TRL 7-9**: Demonstration and deployment (prototype → commercial)
**Funding by TRL**:
- **Office of Science**: TRL 1-3 (basic research)
- **ARPA-E**: TRL 2-5 (proof of concept → prototype)
- **EERE**: TRL 4-8 (development → demonstration)
**Specify in Proposal** (see the sketch after this list):
- Current TRL of technology
- Target TRL at project end
- Path from current to target
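A small lookup keyed to the bands above can keep TRL statements consistent across a proposal draft; a sketch, with descriptions that simply mirror this section:
```python
def trl_stage(trl: int) -> str:
    """Map a TRL (1-9) to the band described above."""
    if 1 <= trl <= 3:
        return "Basic research (idea -> proof of concept)"
    if 4 <= trl <= 6:
        return "Development (component -> system prototype)"
    if 7 <= trl <= 9:
        return "Demonstration and deployment (prototype -> commercial)"
    raise ValueError("TRL must be between 1 and 9")

print("Current TRL 3:", trl_stage(3))  # where the technology stands today
print("Target TRL 5:", trl_stage(5))   # where the project will take it
```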
### Intellectual Property and Data Rights
**Standard Terms**:
- Awardee generally retains IP rights
- Government retains license for government purposes
- Must report inventions to DOE
- May have data sharing requirements
**Industry Partners**:
- Negotiate IP and data rights in advance
- Protected CRADA information (5 years)
- Background IP vs. foreground IP
### Teaming and Partnerships
**Encouraged for**:
- University-national lab partnerships
- University-industry partnerships
- Multi-institutional teams
- International collaborations (with approval)
**Teaming Partner Lists**: ARPA-E and other programs often provide teaming lists or events
## Submission Process
### Finding Opportunities
**Sources**:
- **EERE Exchange**: https://eere-exchange.energy.gov
- **ARPA-E**: https://arpa-e.energy.gov (program FOAs and OPEN solicitations)
- **Office of Science FOAs**: https://science.osti.gov/grants/Funding-Opportunities
- **Grants.gov**: Federal grants database
- **FedConnect**: Subscribe to FOA announcements
### Application Systems
**Varies by Office**:
- **EERE Exchange**: EERE programs
- **PAMS (Portfolio Analysis and Management System)**: Office of Science
- **ARPA-E eXCHANGE**: ARPA-E submissions
- **Grants.gov**: Some programs
**Registration Required** (can take 2-4 weeks):
- SAM.gov (System for Award Management)
- Grants.gov
- DOE program-specific systems
### Proposal Development Timeline
**Recommended Timeline**:
- **3-6 months before deadline**: Identify FOA, assemble team, contact lab partners
- **2-3 months**: Develop technical approach, secure commitments
- **1-2 months**: Draft proposal, prepare budget
- **2-4 weeks**: Internal review, revisions
- **1 week**: Final preparation, institutional approvals
- **48 hours early**: Submit (don't wait for deadline)
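Working backward from the deadline makes these lead times concrete. A minimal sketch (the deadline and day offsets are illustrative):
```
# Back-plan milestone dates from a hypothetical FOA deadline.
from datetime import date, timedelta

deadline = date(2025, 3, 15)                    # hypothetical deadline
milestones = {                                  # days before deadline
    "Identify FOA, assemble team, contact labs": 150,
    "Technical approach and commitments done": 75,
    "Full draft and budget complete": 35,
    "Internal review finished": 14,
    "Institutional approvals secured": 7,
    "Submit (48 hours early)": 2,
}

for task, days_before in milestones.items():
    print(f"{deadline - timedelta(days=days_before)}  {task}")
```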
### Required Registrations
**Before First Submission**:
1. **SAM.gov**: System for Award Management (2-3 weeks)
2. **Grants.gov**: Account and authorization (1 week)
3. **FedConnect**: Optional, for notifications
4. **PAMS/EERE Exchange**: Program-specific (immediate)
**Institutional Requirements**:
- Authorized Organizational Representative (AOR)
- Institutional approvals
- Cost accounting systems
## Review and Award Process
### Timeline
**Varies by Program**:
- **Office of Science**: 3-6 months
- **ARPA-E**: 4-6 months (after full application invitation)
- **EERE**: 3-6 months
**Steps**:
1. Administrative compliance check
2. Peer review (external reviewers)
3. Program manager evaluation
4. Selection for award negotiation
5. Budget negotiation
6. Award issuance
### Reviewer Feedback
**Provided**:
- Reviewer comments (often anonymized)
- Strengths and weaknesses
- Scores by criterion
**Not Always Provided**: Some programs provide limited feedback
### Success Rates
**Varies Widely**:
- **Office of Science Early Career**: ~10-15%
- **ARPA-E OPEN**: ~2-5% (concept papers → awards)
- **EERE FOAs**: 10-30% (depends on program)
- **Office of Science FOAs**: 20-40% (varies)
## Writing Tips for Competitive DOE Proposals
### Do's
**Align with DOE mission** - Energy, environment, or national security relevance
**Emphasize impact** - How will this advance energy technology or science?
**Quantify outcomes** - Energy savings, efficiency gains, cost reductions
**Show pathway to deployment** - For applied programs, how will technology reach market?
**Leverage DOE capabilities** - National labs, user facilities, unique resources
**Include strong management plan** - Milestones, go/no-go, risk mitigation
**Demonstrate team qualifications** - Track record in relevant area
**Be specific about innovation** - What's new and why it matters
**Address technology readiness** - Current TRL and path forward
**Secure cost share commitments** - If required, get letters early
### Don'ts
**Don't ignore FOA requirements** - Each FOA is different, read carefully
**Don't underestimate timeline** - Allow time for registrations and approvals
**Don't forget cost share** - If required, must be documented
**Don't overlook lab partnerships** - Can strengthen proposal significantly
**Don't be vague about impact** - Need quantitative energy/economic metrics
**Don't ignore commercialization** - For applied programs, market path is critical
**Don't submit without institutional approval** - Need AOR sign-off
**Don't wait for deadline** - Systems crash, submit 48 hours early
**Don't propose basic science to ARPA-E** - Or applied research to Office of Science
**Don't forget TRL discussion** - Important for technology programs
### Common Mistakes
1. **Wrong Program**: Proposing to inappropriate office or program
2. **Insufficient Energy Relevance**: Not clearly tied to DOE mission
3. **Weak Commercialization Plan**: For ARPA-E and EERE, lack of market strategy
4. **Unrealistic Milestones**: Overly optimistic timelines
5. **Poor Budget Justification**: Budget doesn't align with technical plan
6. **Missing Cost Share**: If required, not documented properly
7. **Weak Team**: Insufficient expertise or track record
8. **Ignoring Competing Technologies**: Not addressing competitive landscape
## Recent DOE Priorities (2024-2025)
### Key Focus Areas
- **Clean Energy Transition**: Renewable energy, storage, grid modernization
- **Carbon Management**: Carbon capture, utilization, storage, removal
- **Critical Materials**: Supply chain security, recycling, substitutes
- **Advanced Manufacturing**: Energy-efficient processes, sustainable materials
- **Quantum Information Science**: Computing, sensing, communications
- **Fusion Energy**: Accelerating fusion development
- **Hydrogen Economy**: Production, storage, utilization
- **Nuclear Energy**: Advanced reactors, microreactors, fuel cycle
- **Climate Adaptation**: Climate modeling, resilience, impacts
- **Energy Equity**: Environmental justice, workforce development
### Major Initiatives
- **Energy Earthshots**: Ambitious R&D goals (Hydrogen Shot, Long Duration Storage, Carbon Negative, etc.)
- **Bipartisan Infrastructure Law**: $62B for DOE programs
- **Inflation Reduction Act**: Clean energy tax credits and programs
- **CHIPS and Science Act**: Microelectronics, quantum, clean energy manufacturing
## Resources
- **DOE Office of Science**: https://science.osti.gov
- **ARPA-E**: https://arpa-e.energy.gov
- **EERE**: https://www.energy.gov/eere
- **DOE National Laboratories**: https://www.energy.gov/national-laboratories
- **EERE Exchange**: https://eere-exchange.energy.gov
- **Grants.gov**: https://www.grants.gov
- **SAM.gov**: https://sam.gov
---
**Key Takeaway**: DOE proposals require strong alignment with energy and national security missions, clear pathway to impact (especially for applied programs), and often benefit from partnerships with national laboratories or industry. Cost sharing, technology readiness levels, and commercialization strategies are critical considerations for competitive proposals.

# NIH (National Institutes of Health) Grant Writing Guidelines
## Agency Overview
**Mission**: To seek fundamental knowledge about the nature and behavior of living systems and to apply that knowledge to enhance health, lengthen life, and reduce illness and disability
**Annual Budget**: ~$47 billion (largest biomedical research funder globally)
**Website**: https://www.nih.gov
**Key Characteristics**:
- 27 Institutes and Centers (ICs), each with specific research focus
- Supports biomedical and behavioral research
- Strong emphasis on rigor, reproducibility, and translation
- Clinical trials and human subjects research
- Patient-oriented and population health research
## NIH Institutes and Centers (Major ICs)
- **NCI** - National Cancer Institute
- **NHLBI** - National Heart, Lung, and Blood Institute
- **NIDDK** - National Institute of Diabetes and Digestive and Kidney Diseases
- **NIAID** - National Institute of Allergy and Infectious Diseases
- **NIGMS** - National Institute of General Medical Sciences
- **NINDS** - National Institute of Neurological Disorders and Stroke
- **NIMH** - National Institute of Mental Health
- **NICHD** - National Institute of Child Health and Human Development
- **NEI** - National Eye Institute
- **NIEHS** - National Institute of Environmental Health Sciences
- **NIA** - National Institute on Aging
- **NIAAA** - National Institute on Alcohol Abuse and Alcoholism
- **NIDA** - National Institute on Drug Abuse
- **NHGRI** - National Human Genome Research Institute
- **NCCIH** - National Center for Complementary and Integrative Health
**Plus**: NIBIB, NIDCD, NIDCR, NINR, FIC, NLM, and others
## Core Review Criteria
NIH proposals are evaluated using **scored criteria** (1-9 scale, 1 = exceptional, 9 = poor) and **additional review considerations** (not scored but discussed).
### Scored Criteria (Overall Impact Score)
#### 1. Significance
**Definition**: Does the project address an important problem or critical barrier to progress?
**Key Questions**:
- Will the project improve scientific knowledge, technical capability, or clinical practice?
- How will successful completion move the field forward?
- Does it address important scientific question or health need?
- Is there a clear rationale based on literature or preliminary data?
**What Reviewers Look For**:
- Clear statement of the problem and its importance
- Evidence that solving this problem will advance the field
- Strong conceptual framework
- Potential for broad impact (not just narrow niche)
- Alignment with NIH and Institute mission
**Writing Strategy**:
- Open with compelling statement of health burden or knowledge gap
- Cite epidemiological data, morbidity/mortality statistics
- Show that current approaches are insufficient
- Demonstrate how your work will make a difference
- Connect to clinical or translational outcomes when possible
#### 2. Investigator(s)
**Definition**: Are the investigators appropriately trained and well-suited to carry out this work?
**Key Questions**:
- Do they have appropriate expertise and track record?
- Is the proposed leadership approach appropriate for the project?
- Do they have prior experience in the research area?
- For Early Stage Investigators (ESI), is appropriate mentoring/support available?
**What Reviewers Look For**:
- Publications in the relevant area
- Preliminary data demonstrating capability
- Productivity and consistency
- Appropriate team composition
- For new investigators: strong mentorship and institutional support
- Career trajectory aligned with proposed work
**Writing Strategy**:
- Highlight most relevant publications (not total number)
- Show progression and focus in research program
- Demonstrate that you have necessary skills
- If new area, show collaborations or training
- For multi-PI, clearly define complementary roles
- Show stability and institutional commitment
#### 3. Innovation
**Definition**: Does the application challenge existing paradigms or develop new methodologies, technologies, or interventions?
**Key Questions**:
- Does the project employ novel concepts, approaches, or methodologies?
- Are the aims original and innovative?
- Does it challenge existing paradigms or address an innovative hypothesis?
- Does it refine, improve, or develop new instrumentation or methods?
**What Reviewers Look For**:
- Departure from standard approaches
- Novel application of methods to new problems
- Development of new technologies or tools
- Paradigm-shifting concepts
- Creative experimental design
- NOT just new to you, but new to the field
**Writing Strategy**:
- Explicitly state what is innovative
- Contrast with existing approaches and limitations
- Explain why innovation is necessary
- Provide preliminary data supporting feasibility
- Balance novelty with achievability
- Avoid over-claiming (incremental work ≠ transformative)
#### 4. Approach
**Definition**: Are the overall strategy, methodology, and analyses well-reasoned, appropriate, and rigorous?
**Key Questions**:
- Are the research design and methods appropriate for the proposed aims?
- Are potential problems, alternative strategies, and benchmarks for success presented?
- Is the timeline reasonable and is there adequate statistical power?
- Are the data management and analysis plans appropriate?
- Is rigor and transparency evident in the experimental design?
**What Reviewers Look For**:
- Detailed, specific methodology
- Appropriate experimental design (controls, replicates, randomization, blinding)
- Statistical justification (power calculations, sample size)
- Potential pitfalls identified with alternatives
- Feasibility demonstrated with preliminary data
- Logical flow from aims through methods to expected outcomes
- Rigor and reproducibility measures
**Writing Strategy**:
- Provide sufficient detail to judge feasibility
- Use subheadings for organization
- Include flowcharts or diagrams
- Address authentication of key biological resources
- Discuss biological variables (sex, age, etc.)
- Identify potential problems proactively
- Provide contingency plans
- Show that timeline is realistic
- Include preliminary data throughout
#### 5. Environment
**Definition**: Will the scientific environment contribute to the probability of success?
**Key Questions**:
- Do the proposed studies benefit from unique features of the scientific environment?
- Are the institutional support, equipment, and resources available?
- Are collaborative arrangements and contributions from colleagues appropriate?
- Is the environment conducive to the proposed research?
**What Reviewers Look For**:
- Access to necessary facilities (core facilities, equipment, patient populations)
- Institutional commitment and support
- Collaborative networks
- Track record of institutional productivity
- Training environment (for training grants)
- Sufficient space and resources
**Writing Strategy**:
- Highlight unique institutional resources
- Describe relevant core facilities with capabilities
- Show institutional investment in your research area
- Include letters documenting access to resources
- Describe collaborative environment
- For clinical research, show access to patient populations
### Additional Review Considerations (Not Scored)
These factors are discussed but do not contribute to the numerical score:
#### Protection of Human Subjects
- IRB approval status and process
- Risks to subjects justified by potential benefits
- Protections against risks adequate
- Informed consent process appropriate
- Data and safety monitoring plan (for trials)
- Inclusion of women, minorities, and children (see below)
#### Inclusion of Women, Minorities, and Children
- Adequate plan for inclusion of all groups
- Justification if any group excluded
- Statistical power adequate to detect differences
- Outreach and recruitment plans appropriate
#### Vertebrate Animals
- IACUC approval status
- Proposed procedures appropriate and humane
- Minimization of discomfort, distress, pain
- Euthanasia method appropriate
- Justification of species and numbers
#### Biohazards
- Appropriate safeguards and containment
- Training and expertise adequate
#### Resubmission (A1 applications)
- Are concerns from previous review adequately addressed?
- Has the application been substantially improved?
#### Budget and Period of Support
- Is budget reasonable for proposed work?
- Is timeline appropriate?
#### Resource Sharing Plans
- Data sharing plan adequate
- Model organism sharing plan (if applicable)
- Genomic data sharing plan (if applicable)
## Proposal Structure and Page Limits
### Specific Aims (1 page)
**Most important page of the entire application.** Reviewers often make initial impressions based on this page alone.
**Structure** (see detailed template in `specific_aims_guide.md`):
**Opening Paragraph** (3-5 sentences):
- Long-term goal of your research program
- Health burden or knowledge gap
- Critical need that motivates the work
**Objective and Central Hypothesis** (1 paragraph):
- Objective of THIS grant
- Central hypothesis or research question
- Rationale (brief mention of preliminary data)
**Specific Aims** (2-4 aims):
- Each aim: 1 paragraph (half page max)
- Aim statement (1-2 sentences, starts with action verb)
- Working hypothesis or research question
- Rationale (why this aim, what preliminary data supports it)
- Approach summary (brief methods)
- Expected outcomes and interpretation
**Payoff Paragraph** (closing):
- Expected outcomes of the overall project
- How findings will advance the field
- Positive impact on health (if relevant)
- Next steps or future directions
**Critical Rules**:
- Exactly 1 page (0.5-inch margins, 11-point Arial or similar)
- Must stand alone (reviewers read this first)
- Clear, specific aims that are testable
- Aims should be independent but synergistic
- Avoid jargon (panel members may not be in your subfield)
- Every sentence must earn its place
### Research Strategy (12 pages for R01)
**Section A: Significance** (typically 2-3 pages)
**Purpose**: Convince reviewers the problem is important and worth solving
**Content**:
- State the problem and its importance (health burden, knowledge gap)
- Review current state of knowledge (focused literature review)
- Identify limitations of current approaches
- Explain conceptual advance your work will provide
- Describe potential impact on the field or health outcomes
- Explain alignment with NIH mission and Institute priorities
**Writing Tips**:
- Start broad (importance of the problem) then narrow (specific gap)
- Use epidemiological data (prevalence, mortality, costs)
- Cite key literature systematically
- Identify the specific barrier or gap your work addresses
- End with how your work will advance the field
**Section B: Innovation** (typically 1-2 pages)
**Purpose**: Articulate what is novel and transformative
**Content**:
- Describe innovative elements of the proposed research
- Explain novel concepts, approaches, or methodologies
- Contrast with existing approaches and their limitations
- Explain why innovation is necessary (not just different)
- Demonstrate that innovation is achievable (preliminary data)
**Writing Tips**:
- Be explicit about what is innovative (don't assume it's obvious)
- Distinguish incremental from transformative advances
- Provide evidence that novel approach can work
- Don't confuse "new to me" with "new to the field"
- Avoid over-claiming
**Section C: Approach** (typically 8-10 pages)
**Purpose**: Provide detailed research plan demonstrating feasibility
**Organization** (for each Specific Aim):
**Aim [Number]: [Aim Title]**
**Rationale and Preliminary Data**:
- Why this aim is important
- Preliminary results supporting feasibility
- Key figures and data
**Research Design**:
- Overall experimental design
- Subject/sample populations and numbers
- Randomization, blinding, controls
- Timeline for this aim
**Methods** (organized by sub-aim or experiment):
- Detailed procedures and protocols
- Materials, reagents, equipment
- Data collection procedures
- Biological variables considered
**Data Analysis**:
- Statistical approaches
- Sample size justification and power calculations
- How results will be interpreted
**Expected Outcomes**:
- What you expect to find
- How results will be interpreted
- Alternative outcomes and what they would mean
**Potential Pitfalls and Alternative Approaches**:
- What could go wrong (be proactive)
- Contingency plans
- Alternative strategies if initial approach doesn't work
**Timeline**:
- Sequence of activities for this aim
- Estimated completion time
**Writing Tips**:
- Use consistent organization across aims
- Include subheadings for clarity
- Integrate preliminary data throughout (not just at beginning)
- Provide figures, flowcharts, and tables
- Address rigor and reproducibility explicitly
- Justify choice of methods and approaches
- Be specific about numbers, timelines, and analysis
- Show that you've thought through the research process
**Rigor and Reproducibility** (addressed throughout Approach):
NIH requires explicit discussion of:
- **Scientific rigor in experimental design**: Controls, replicates, blinding, randomization
- **Authentication of key biological resources**: Cell lines, antibodies, organisms
- **Consideration of biological variables**: Sex, age, strain, etc.
- **Statistical power**: Adequate sample sizes
- **Transparency**: Data management, protocols, reporting
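For the statistical-power requirement, reviewers expect a prospective sample-size justification. A minimal sketch using `statsmodels` (the effect size, alpha, and power targets are placeholders to be justified from preliminary data):
```
# Prospective sample size for a two-group comparison (placeholder values).
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.8,        # expected Cohen's d, from preliminary data
    alpha=0.05,             # two-sided type I error rate
    power=0.90,             # target statistical power
    alternative="two-sided",
)
print(f"~{n_per_group:.0f} subjects per group")   # ~34 per group
```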
### Bibliography (no page limit)
- Include all references cited
- Use consistent format (PubMed citations preferred)
- Include DOI or PMID when available
### Protection of Human Subjects or Vertebrate Animals (varies)
**Human Subjects Section**:
- Risks to subjects
- Protection against risks
- Potential benefits
- Importance of knowledge to be gained
- Inclusion of women and minorities
- Inclusion of children
- Data and safety monitoring
**Vertebrate Animals Section**:
- Justification of species and numbers
- Minimization of pain and distress
- Euthanasia method
## Key NIH Application Types
### R01 - Research Project Grant
**Description**: Standard NIH grant mechanism for established investigators
**Characteristics**:
- **Budget**: Modular (up to $250K direct costs/year) or detailed budget
- **Duration**: Typically 3-5 years
- **Eligibility**: Any eligible institution
- **Preliminary data**: Usually required (shows feasibility)
- **Page limits**: 12 pages Research Strategy
**Typical Timeline**:
- Prepare: 2-6 months
- Review: ~9 months from submission
- Earliest start: 9-12 months after submission
**Success Rate**: ~20% overall (varies by Institute)
**When to Apply**: When you have preliminary data and clear research direction
### R21 - Exploratory/Developmental Research Grant
**Description**: Encourages new exploratory and developmental research
**Characteristics**:
- **Budget**: Up to $275K total (direct costs) over 2 years
- **Duration**: Maximum 2 years
- **Preliminary data**: Not required (though can strengthen)
- **Page limits**: 6 pages Research Strategy
- **No-cost extensions**: Not allowed
**Purpose**:
- Pilot or feasibility studies
- Testing new methods or technologies
- Secondary analysis of existing data
- Exploratory clinical studies
**When to Apply**: When you need pilot data before R01, or for high-risk ideas
### R03 - Small Grant Program
**Description**: Small-scale research projects
**Characteristics**:
- **Budget**: Up to $50K/year direct costs (up to $100K total)
- **Duration**: Maximum 2 years
- **Page limits**: 6 pages Research Strategy
**Purpose**: Limited scope projects, pilot studies, secondary data analysis
### K Awards - Career Development Awards
**Purpose**: Support career development of researchers
**Major K Award Types**:
**K99/R00 - Pathway to Independence**:
- Two phases: K99 (mentored, 1-2 years) → R00 (independent, up to 3 years)
- For postdocs transitioning to independence
- Provides protected time and research support
- Competitive (~15% funded)
**K08 - Mentored Clinical Scientist Award**:
- For clinicians (MD, DO, DDS, etc.)
- 3-5 years protected time for research training
- Requires mentoring team
- Up to $100K direct costs/year
**K23 - Mentored Patient-Oriented Research Career Development Award**:
- For patient-oriented research
- Similar structure to K08
**All K Awards Require**:
- Career development plan
- Research plan (6-12 pages)
- Mentoring plan and letters from mentors
- Training plan
- Institutional commitment (75% protected time typically)
### Other Common Mechanisms
**R15 (AREA)**: For primarily undergraduate institutions
**P01**: Multi-project program project grants (large collaborative)
**U01**: Cooperative agreement (NIH involvement in conduct)
**R34**: Clinical trial planning grant
**DP1/DP2**: NIH Director's Pioneer/New Innovator Awards (special)
## Budget Preparation
### Modular Budgets (R01s up to $250K direct/year)
**Characteristics**:
- Requested in $25K increments (modules)
- Maximum 10 modules ($250K) per year
- Detailed budget not required
- Budget justification: Narrative (Personnel, Consortium, Other)
- Years 2-5: Brief justification if >$125K or increase >25%
**Personnel Justification**:
- List all personnel with roles, effort (% calendar months)
- Typical: PI (2-3 months = 16-25%), postdoc (12 months), grad student, tech
- Justify effort for each person
- Note: Salary cap applies (~$221,900 for 2024)
**Consortium/Contractual Costs**:
- F&A typically limited to 8% of total costs for subcontracts
**Other Costs**:
- Describe significant equipment, animals, patient costs, etc.
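The module arithmetic itself is simple: round requested direct costs up to the next $25K increment, capped at 10 modules. A minimal sketch (figures hypothetical):
```
# Convert requested direct costs into NIH modules ($25K increments).
import math

MODULE = 25_000
requested_direct = 212_000                     # hypothetical year-1 need

modules = math.ceil(requested_direct / MODULE)
if modules > 10:
    raise ValueError("Over $250K/year direct: detailed budget required")
print(f"Request {modules} modules = ${modules * MODULE:,}/year")  # 9 -> $225,000
```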
### Detailed Budgets (>$250K direct/year)
**Required Sections**:
- Personnel (with individual salary details)
- Equipment (≥$5,000 per item)
- Travel (domestic and foreign)
- Participant/Trainee Support Costs
- Other Direct Costs (materials, supplies, publications, consultants)
- Consortium/Contractual Costs (with detailed sub-budgets)
- Total Direct Costs
- Indirect Costs (F&A)
**Budget Justification**:
- Detailed narrative for each category
- Justify need for each item/person
- Explain calculations
### NIH Salary Cap
**Annual Update**: NIH sets maximum salary for grants
- 2024 Level: ~$221,900 (Executive Level II)
- Applies to all personnel
- Fringe benefits calculated on capped salary
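In practice, the cap limits what can be charged to the grant, not what the institution pays. A minimal sketch of the calculation (the base salary is hypothetical; the cap changes annually):
```
# Salary chargeable to an NIH grant under the cap.
SALARY_CAP = 221_900            # 2024 Executive Level II
base_salary = 275_000           # hypothetical institutional base salary
effort = 3 / 12                 # 3 calendar months = 25% effort

capped_base = min(base_salary, SALARY_CAP)
chargeable = capped_base * effort                          # $55,475
institution_covers = (base_salary - capped_base) * effort  # $13,275
print(f"Charge to grant: ${chargeable:,.0f}; "
      f"institution covers ${institution_covers:,.0f}")
```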
### Allowable Costs
**Generally Allowed**:
- Salaries and wages
- Fringe benefits
- Equipment
- Supplies (consumables <$5,000)
- Travel (domestic and international)
- Consultant services
- Consortium/subaward costs
- Animal purchase and care
- Patient care costs (clinical trials)
- Alterations and renovations (with prior approval)
- Publication costs
**Generally Not Allowed** (without special justification):
- Office equipment (computers, printers, furniture)
- Administrative costs
- Tuition (except for K awards and training grants)
## Application Submission
### Deadlines
**Standard Dates** (most programs):
- February 5
- June 5
- October 5
**AIDS-Related Research**:
- January 7
- May 7
- September 7
**K Awards and Fellowship**: Different dates, typically 3 times/year
**Submission Time**: 5:00 PM local time of applicant organization
### Submission Systems
**eRA Commons**: Required for NIH submission
- Create account through institution
- Assign roles (PI, authorized organizational representative)
**ASSIST (Application Submission System & Interface for Submission Tracking)**:
- NIH's electronic submission system
- Create application, upload documents, submit
**Grants.gov**: Alternative submission route (not recommended)
### Just-in-Time Information
**After initial review** (if in fundable range), NIH requests:
- Other Support (updated)
- IRB/IACUC approval (or documentation that approval will be obtained)
- Vertebrate Animals/Human Subjects training certifications
**Timing**: Usually 6-9 months after submission
## Review Process
### Timeline
**Total Time**: ~9 months from submission to funding decision
**Stages**:
1. **Submission**: Deadline (Month 0)
2. **Referral**: Assignment to IC and study section (Month 1)
3. **Review**: Study section meeting (Months 3-4)
4. **Council**: Advisory council review (Months 6-7)
5. **Funding Decision**: Program officer and IC (Months 7-9)
### Study Sections
**Types**:
- **Standing Study Sections**: Permanent panels meeting 3x/year
- **Special Emphasis Panels (SEPs)**: Ad hoc panels for specific RFAs or topics
- **Scientific Review Groups (SRGs)**: Chartered study sections
**Process**:
- 3 assigned reviewers per application (prepare written critiques)
- ~15-25 applications discussed per study section
- ~50-100 applications assigned to each study section
**Participants**:
- Scientific Review Officer (SRO): NIH staff, manages process
- Reviewers: External scientists with expertise
- Grants management specialist
- Program officer (sometimes attends, doesn't vote)
### Scoring
**Preliminary Scoring** (before meeting):
- All panel members score 1-9 (1 = exceptional, 9 = poor)
- Applications in lower half typically "triaged" (not discussed)
- Top ~50% discussed at meeting
**Discussion** (at study section meeting):
- Assigned reviewers present their assessments
- Panel discusses strengths and weaknesses
- Open discussion among all panel members
- Questions about rigor, innovation, feasibility
**Final Scoring** (after discussion):
- All panel members score 1-9
- Scores averaged and multiplied by 10
- **Final Impact Score**: 10-90 (lower is better)
- 10-20: Exceptional
- 21-30: Outstanding
- 31-40: Excellent (often fundable)
- 41-50: Very good (may be fundable)
- 51+: Less competitive
**Individual Criterion Scores**: Also scored 1-9
- Significance
- Investigator(s)
- Innovation
- Approach
- Environment
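The arithmetic behind the final score is straightforward. A minimal sketch (the votes are hypothetical):
```
# Final Impact Score: mean of all panel members' 1-9 scores, times 10,
# rounded to the nearest integer.
panel_scores = [2, 3, 2, 3, 3, 3, 4, 3]      # hypothetical votes

impact_score = round(sum(panel_scores) / len(panel_scores) * 10)
print(impact_score)   # 29 -> "Outstanding" band (21-30) in the scale above
```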
### Percentile Ranking
**After all study sections meet**, applications are percentile-ranked
- Based on Impact Score relative to applications reviewed by the same study section over the current and two preceding rounds
- Percentile typically matters more than the raw Impact Score for funding decisions
- Lower percentile = better (1st percentile = top 1%)
**Example**: An application at the 15th percentile might be:
- Funded at NIGMS (payline ~23rd percentile)
- Not funded at NHLBI (payline ~11th percentile)
- Outcome depends on each IC's payline and available funding
### Summary Statement
**Received**: ~30 days after study section meeting
**Contents**:
- Overall Impact/Priority Score and Percentile
- Individual criterion scores
- Resume and Summary of Discussion
- Detailed critiques from 3 assigned reviewers
- Additional comments from other panel members
- Human Subjects, Animals, Biohazards reviews
**Interpreting**:
- Focus on consistent themes across reviewers
- Identify major vs. minor criticisms
- Note what reviewers found strong
- Use for resubmission planning
## Resubmission (A1 Applications)
### NIH Resubmission Policy
**One Resubmission Allowed**: Can resubmit once (A1) after initial review (A0)
- After A1 review, cannot resubmit again
- Must submit new application if A1 not funded
**No Limits on New Applications**: Can submit completely new application anytime
### Introduction to Resubmission (1 page)
**Required Section**: Separate 1-page introduction responding to previous review
**Structure**:
- **Header**: "INTRODUCTION TO RESUBMISSION"
- **Summary of Criticisms**: Brief overview of major criticisms
- **Response to Criticisms**: Point-by-point response with page references
- **Use bullet points** for clarity
**Example Format**:
```
INTRODUCTION TO RESUBMISSION
The previous review raised the following concerns:
1. Inadequate preliminary data demonstrating feasibility of Aim 2
2. Statistical power insufficient for Aim 3
3. Lack of detail about quality control procedures
We have addressed these concerns as follows:
1. Preliminary data for Aim 2 (Response, p. 8-9; Research Strategy, p. 18-20)
• Generated pilot data showing [specific result]
• Optimized protocol achieving [specific outcome]
• New Figure 3 demonstrates feasibility
2. Statistical power for Aim 3 (Research Strategy, p. 24-25)
• Increased sample size from n=15 to n=25 per group
• Updated power calculations show >90% power
• Budget adjusted accordingly
3. Quality control procedures (Research Strategy, p. 12, 19, 26)
• Added detailed QC protocols for each method
• Implemented validation criteria and acceptance thresholds
• Described authentication of key reagents
```
**Tips**:
- Be respectful and professional (avoid defensiveness)
- Address every major criticism explicitly
- Indicate where changes are in revised application
- Show substantial revision, not minor tweaks
- Acknowledge valid criticisms and explain how addressed
- If disagree with criticism, explain politely with evidence
### Resubmission Strategy
**Decision Tree**:
**Impact Score ≤40 (Percentile ≤20)**: Strong application, likely competitive
- Address specific criticisms
- Strengthen weak areas
- Add preliminary data if criticized
- Consider minor scope adjustments
**Impact Score 41-50 (Percentile 21-40)**: Moderate application, needs improvement
- Substantial revision needed
- May need new preliminary data
- Consider revising aims if criticized
- Strengthen innovation or significance
- May want to wait for new data before resubmitting
**Impact Score ≥51 (Percentile ≥41)**: Weak application, major revision needed
- Consider whether resubmission is worthwhile
- May be better to develop new application
- If resubmitting: major restructuring likely needed
- Gather substantial new preliminary data
- Consider changing scope or aims
**Common Resubmission Improvements**:
1. **Add preliminary data**: Especially for Aim 2 or 3 if criticized
2. **Clarify methods**: Provide more detail, address technical concerns
3. **Increase rigor**: Better controls, larger n, statistical justification
4. **Revise specific aims**: If fundamentally flawed
5. **Add collaborators**: If expertise questioned
6. **Strengthen significance**: Better literature review, clearer impact
7. **Refocus innovation**: Clarify what's novel and why it matters
**Timing**:
- Can resubmit at any standard deadline within 37 months of the original (A0) submission
- Use time wisely to generate new data
- Don't rush resubmission with minor changes
## NIH Funding Trends and Priorities (2024-2025)
### Current Priorities
- **Health Disparities and Health Equity**: Addressing disparities in disease burden
- **Alzheimer's Disease and Dementia**: Prevention, treatment, care
- **Substance Use and Mental Health**: Opioid crisis, addiction, mental health
- **Infectious Diseases**: Pandemic preparedness, antimicrobial resistance, vaccines
- **Cancer**: Cancer Moonshot initiatives
- **BRAIN Initiative**: Understanding the brain
- **All of Us Research Program**: Precision medicine
- **Climate Change and Health**: Environmental impacts on health
- **Artificial Intelligence**: AI for biomedical research and healthcare
### Success Rates by Career Stage
**Overall**: ~20% (varies by IC and mechanism)
**Established Investigators**: ~23%
**Early Stage Investigators (ESI)**: ~27% (higher due to ESI policy)
- ESI: Within 10 years of final degree, no prior R01-equivalent
**New Investigators**: ~24%
- New: No prior R01-equivalent (regardless of time since degree)
**Multiple PI**: ~18% (slightly lower than single PI)
### Paylines
**Varies by IC**: Each Institute sets own funding priorities
**Example Paylines (FY2023)**:
- NIGMS: ~23rd percentile
- NCI: ~12th percentile (highly competitive)
- NHLBI: ~11th percentile
- NIAID: ~15th percentile
- NIMH: ~12th percentile
**ESI Boost**: Most ICs fund ESIs at higher percentile than established investigators
**Check IC Websites**: Paylines and funding policies updated annually
## Tips for Competitive NIH Applications
### Do's
**Start with Specific Aims page** - Most important page, revise extensively
**Include substantial preliminary data** - Demonstrate feasibility (esp. for R01)
**Be explicit about innovation** - Don't assume reviewers will recognize it
**Address rigor and reproducibility** - Controls, power, authentication, variables
**Provide detailed methods** - Enough detail to assess feasibility
**Identify pitfalls proactively** - Show you've thought through challenges
**Use figures and diagrams** - Clarify complex ideas, show preliminary data
**Connect to health** - NIH mission is health-related
**Write clearly** - Panel members may not be in your exact subfield
**Get external review** - Mock review from colleagues and mentors
### Don'ts
**Don't exceed page limits** - Automatic rejection
**Don't be vague about methods** - "Standard protocols" is insufficient
**Don't ignore sample size** - Power calculations required
**Don't overpromise** - Be realistic about what's achievable
**Don't forget human subjects/animals sections** - Common mistake
**Don't submit without preliminary data** - For R01, this rarely succeeds
**Don't assume reviewers know your work** - Provide context
**Don't ignore sex as biological variable** - NIH policy requires consideration
**Don't submit at deadline** - Technical issues happen frequently
**Don't resubmit without substantial changes** - Minor revisions rarely succeed
## NIH Resources
- **NIH Homepage**: https://www.nih.gov
- **NIH RePORTER (funded grants)**: https://reporter.nih.gov
- **Grants & Funding**: https://grants.nih.gov
- **eRA Commons**: https://commons.era.nih.gov
- **ASSIST**: https://public.era.nih.gov/assist
- **Application Forms and Instructions**: https://grants.nih.gov/grants/how-to-apply-application-guide.html
- **NIH Data Sharing Policy**: https://sharing.nih.gov
- **Rigor and Reproducibility**: https://grants.nih.gov/reproducibility/index.htm
---
**Key Takeaway**: NIH applications succeed through clear articulation of an important health-related problem, preliminary data demonstrating feasibility, detailed rigorous approach, and innovative methods. The Specific Aims page is the most critical component—invest time in crafting a compelling narrative that immediately conveys significance and feasibility.

# NSF (National Science Foundation) Grant Writing Guidelines
## Agency Overview
**Mission**: To promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense
**Annual Budget**: ~$9-10 billion
**Website**: https://www.nsf.gov
**Key Characteristics**:
- Supports all fields of fundamental science and engineering (except medical sciences)
- Emphasis on education and workforce development
- Strong commitment to diversity, equity, and inclusion
- Promotes open science and data sharing
- Collaborative research across institutions encouraged
## NSF Directorates and Offices
1. **BIO** - Biological Sciences
2. **CISE** - Computer and Information Science and Engineering
3. **EDU** - STEM Education (formerly EHR, Education and Human Resources)
4. **ENG** - Engineering
5. **GEO** - Geosciences
6. **MPS** - Mathematical and Physical Sciences
7. **SBE** - Social, Behavioral, and Economic Sciences
8. **TIP** - Technology, Innovation, and Partnerships (established 2022; NSF's newest directorate)
9. **OPP** - Office of Polar Programs
10. **OISE** - Office of International Science and Engineering
## Core Review Criteria
NSF uses two equally weighted criteria for all proposals:
### Intellectual Merit
**Definition**: The potential to advance knowledge
**Evaluation Questions**:
- How important is the proposed activity to advancing knowledge and understanding within its own field or across different fields?
- How well-qualified is the proposer (individual or team) to conduct the project?
- To what extent does the proposed activity suggest and explore creative, original, or potentially transformative concepts?
- How well-conceived and organized is the proposed activity?
- Is there sufficient access to resources?
**Writing Strategy**:
- Lead with the research question and its importance
- Demonstrate deep knowledge of the field
- Articulate the knowledge gap clearly
- Present innovative approach to address the gap
- Show preliminary results or proof-of-concept
- Demonstrate team qualifications
- Present feasible, well-organized plan
### Broader Impacts
**Definition**: The potential to benefit society and contribute to the achievement of specific, desired societal outcomes
**Evaluation Questions**:
- What is the potential for the proposed activity to:
- Benefit society or advance desired societal outcomes?
- Broaden participation of underrepresented groups?
- Enhance infrastructure for research and education?
- Enhance scientific and technological understanding?
- Foster partnerships between academia, industry, and others?
**Critical Point**: Broader Impacts are NOT an afterthought. They carry equal weight with Intellectual Merit and must be substantive, specific, and measurable.
**Five Pillars of Broader Impacts** (address at least one substantively):
1. **Advance discovery and understanding while promoting teaching, training, and learning**
- Integrate research into courses
- Develop new curriculum materials
- Train undergraduate, graduate, and postdoctoral researchers
- Provide research experiences for students
- Create educational resources (videos, software, databases)
- Offer workshops or training programs
*Example*: "We will develop a 10-module online course on computational genomics, incorporating data from this project, to be offered to 500+ students annually across 15 partner institutions. Course materials will be open-access and include Jupyter notebooks for hands-on analysis."
2. **Broaden participation of underrepresented groups (in STEM)**
- Partner with minority-serving institutions (HBCUs, HSIs, TCUs)
- Recruit students from underrepresented groups
- Provide mentoring and support programs
- Address systemic barriers to participation
- Create inclusive research environments
- Engage underrepresented communities in research
*Example*: "We will establish a summer research program for 8 undergraduates annually from 4 partner HBCUs, providing stipends, housing, and year-round mentoring. Program will include professional development workshops and pathways to graduate school."
3. **Enhance infrastructure for research and education**
- Develop shared instrumentation or facilities
- Create cyberinfrastructure, software, or databases
- Build collaborative networks
- Establish living stock centers or repositories
- Develop standards or protocols
- Create open-source tools
*Example*: "We will develop and maintain an open-source software platform for analyzing spatial transcriptomics data, with comprehensive documentation, tutorials, and user support forum. Software will be deposited on GitHub and indexed in bio.tools."
4. **Disseminate to enhance scientific and technological understanding**
- Public outreach and science communication
- Engagement with K-12 students and teachers
- Museum exhibits or science festivals
- Media engagement (podcasts, videos, articles)
- Policy briefs for decision-makers
- Community science projects
*Example*: "We will partner with the City Science Museum to create a hands-on exhibit on AI and climate modeling, reaching 50,000+ annual visitors. Exhibit will include interactive simulations and bilingual materials. We will also host quarterly 'Science Saturdays' for local K-12 students."
5. **Benefit society**
- Economic development and competitiveness
- Health and quality of life improvements
- Environmental sustainability
- National security
- Societal well-being
- Workforce development
*Example*: "Our drought prediction models will be integrated into USDA's decision support system, benefiting 15,000+ farmers in the Southwest. We will work with extension agents to provide training and accessible interfaces for non-technical users."
**Common Broader Impacts Mistakes**:
- ❌ Vague statements: "We will train graduate students" (everyone does this)
- ❌ No plan: Aspirational goals without concrete activities
- ❌ No metrics: No way to assess success
- ❌ Tacked on: Not integrated with research plan
- ❌ Unrealistic: Grand claims without resources or expertise
- ✅ Specific and measurable: Clear activities, timelines, and assessment
## Proposal Sections and Page Limits
### Project Summary (1 page)
**Required Structure** (NSF mandates three labeled sections):
**Overview** (first paragraph):
- Research question and approach in accessible language
- Suitable for public dissemination
**Intellectual Merit**:
- Potential to advance knowledge
- Innovative aspects
- Qualifications of team
**Broader Impacts**:
- Societal benefits and specific activities
- How success will be measured
**Formatting**: Must use section headings exactly as shown above
### Project Description (15 pages for most programs)
**No required structure, but typical organization**:
1. **Introduction / Background** (1-2 pages)
- Research question and significance
- Current state of knowledge
- Knowledge gaps
- Preliminary results (if applicable)
2. **Research Objectives** (0.5-1 page)
- Specific, measurable goals
- Hypotheses or research questions
3. **Research Plan / Methodology** (8-10 pages)
- Detailed approach for each objective
- Methods and techniques
- Timeline and milestones
- Expected outcomes
- Potential challenges and alternatives
4. **Broader Impacts** (1-2 pages)
- Can be integrated throughout OR separate section
- Specific activities and timelines
- Assessment and evaluation plan
5. **Results from Prior NSF Support** (if applicable, up to 5 pages)
- Required if PI or co-PI has had NSF award in past 5 years
- Intellectual merit of prior work
- Broader impacts of prior work
- Publications and products
**Formatting Requirements**:
- Font: 11-point or larger (Times Roman, Arial, Palatino, Computer Modern)
- Margins: 1 inch all sides
- Line spacing: No more than 6 lines per inch
- Page size: 8.5 x 11 inches
- No smaller fonts in figures (must be legible)
### References Cited (no page limit)
- Each reference must include:
- Names of all authors
- Article and journal title
- Volume, page numbers, year
- DOI if available
- Use consistent format (doesn't have to match specific style)
- Sufficient information for reviewers to locate references
### Biographical Sketch (3 pages max per person)
**Required NSF Format** (as of 2023 PAPPG):
**Section A: Professional Preparation**
- Undergraduate, graduate, postdoctoral institutions
- Majors and degrees with years
**Section B: Appointments and Positions**
- Last 5 positions, current first
**Section C: Products** (up to 5 most relevant to proposal)
- Publications, datasets, software, patents, etc.
- Can include products in preparation
**Section D: Synergistic Activities** (up to 5)
- Service, teaching, mentoring, outreach
- Demonstrates broader engagement beyond research
### Current and Pending Support (no page limit)
- All current and pending support for PI and co-PIs
- Include project/proposal title, source, award amount, dates
- Describe overlap with proposed project (if any)
- Must be updated until award/decline
### Facilities, Equipment, and Other Resources (no page limit)
- Describe available facilities (labs, computational, libraries)
- Major equipment accessible to project
- Other resources (personnel, core facilities, partnerships)
- Demonstrate institutional commitment
### Data Management and Sharing Plan (2 pages max)
**Required for all proposals** (as of 2023 PAPPG)
**Must address**:
1. **Types of data**: What data will be generated?
2. **Standards**: Formats, metadata, standards for data and metadata
3. **Access**: How and when will data be shared?
4. **Reuse**: Who can access and under what conditions?
5. **Repository**: Where will data be archived long-term?
6. **Protection**: Privacy, confidentiality, intellectual property considerations
**NSF Expectations**:
- Data should be made publicly available in a timely manner
- Use discipline-specific repositories when available
- Justify any restrictions on data sharing
- Plan for data preservation beyond project period
### Postdoctoral Researcher Mentoring Plan (1 page max)
**Required if funding postdocs**
**Must address**:
- Career development objectives
- Mentoring activities (research, teaching, professional skills)
- Metrics for success
- Mentoring plan should be specific, not generic
## Special NSF Proposal Types
### CAREER (Faculty Early Career Development Program)
**Eligibility**: Untenured faculty holding a doctorate in a tenure-track (or equivalent) assistant professor position who have not previously received a CAREER award
**Requirements**:
- Integration of research and education
- Demonstrate potential for leadership
- Department chair letter required
- 5-year project plan
- Typical budget: $400,000-$500,000 total over 5 years (minimum varies by directorate)
**Key Elements**:
- Ambitious research plan
- Innovative educational component
- Strong integration (not just parallel tracks)
- Path to independence and leadership
- Institutional commitment
**Review Criteria**: Same two criteria (Intellectual Merit, Broader Impacts) but with emphasis on:
- Integration of research and education
- Innovative educational component
- Potential for leadership in field
**Common CAREER Mistakes**:
- Education component feels tacked on
- Overly ambitious research plan
- Weak integration between research and education
- Generic mentoring or teaching plans
- Insufficient preliminary data
### Collaborative Research
**Structure**: Multiple proposals submitted separately from different institutions, reviewed as a single project
**Requirements**:
- Lead institution designated
- All proposals must have identical titles (except institution name)
- Project descriptions should be substantially similar
- Clear division of labor
- Coordination plan
**Budget**: Each institution submits own budget for their portion
**Review**: Reviewed together as single integrated project
**Benefits**: Brings together complementary expertise and resources
### RAPID (Rapid Response Research)
**Purpose**: Support time-sensitive research opportunities
**Examples**:
- Natural disasters
- Disease outbreaks
- Unique astronomical events
- Rare opportunities for data collection
**Requirements**:
- Urgent need justification
- Up to $200,000
- Up to 1 year duration
- Simplified review process (program officer discretion)
- No preliminary data required
**Submission**: Contact program officer first, then submit proposal
### EAGER (Early-concept Grants for Exploratory Research)
**Purpose**: Support exploratory work on untested, but potentially transformative, ideas
**Requirements**:
- High-risk, high-reward research
- Radically different approaches
- Up to $300,000
- Up to 2 years
- Program officer approval required before submission
- No panel review (program officer decision)
**Key**: Must be truly exploratory and high-risk, not incremental
## Budget Considerations
### Allowable Costs
**Personnel**:
- Senior personnel: Up to 2 months (summer salary) for 9-month faculty
- Postdoctoral scholars: Full salary and benefits
- Graduate students: Stipend plus tuition remission (tuition is typically budgeted as an other direct cost and excluded from indirect cost calculations)
- Undergraduate students: Hourly or stipend
- Technical and administrative staff
**Fringe Benefits**: Follow institutional rates
**Equipment**: Items ≥$5,000 per unit
- Must be justified
- Shared equipment requires letters from collaborators
**Travel**:
- Domestic and international scientific meetings
- Collaboration and fieldwork
- Justification required
**Participant Support Costs**: For workshops, training, conferences
- Stipends, travel, subsistence for participants
- Not subject to indirect costs
**Other Direct Costs**:
- Publication costs
- Consulting services
- Computer services
- Materials and supplies
- Subawards to collaborating institutions
**Indirect Costs (F&A)**: Institutional negotiated rate applies to modified total direct costs (MTDC)
- MTDC excludes: equipment, participant support costs, and the portion of each subaward above its first $25K
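The MTDC exclusions matter for the bottom line. A minimal sketch of an indirect-cost calculation (figures and rate are hypothetical):
```
# Indirect (F&A) costs apply to Modified Total Direct Costs (MTDC):
# equipment, participant support, and each subaward's amount above its
# first $25K are excluded from the base.
direct = {
    "personnel_and_fringe": 180_000,
    "equipment": 40_000,            # excluded from MTDC
    "participant_support": 15_000,  # excluded from MTDC
    "subaward": 100_000,            # only the first $25K counts
    "materials_travel_other": 30_000,
}
FA_RATE = 0.55                      # hypothetical negotiated rate

mtdc = (direct["personnel_and_fringe"]
        + direct["materials_travel_other"]
        + min(direct["subaward"], 25_000))
indirect = mtdc * FA_RATE
total_request = sum(direct.values()) + indirect
print(f"MTDC ${mtdc:,}; indirect ${indirect:,.0f}; total ${total_request:,.0f}")
```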
### Cost Sharing
**NSF Policy**: Voluntary committed cost sharing is prohibited; include cost sharing only when a solicitation explicitly requires it
**Exceptions**: Some programs require cost sharing (check program solicitation)
**When Included**: Must be documented, verifiable, auditable, and necessary for project
## Submission and Review Process
### Submission Deadlines
**Varies by program**:
- Some programs have specific deadlines (e.g., twice per year)
- Some programs accept proposals anytime (check with program officer)
- CAREER: July deadline (directorate-specific)
**Submission Windows**: NSF deadlines are typically 5 PM submitter's local time
### Submission Portal
**Research.gov** or **Grants.gov**: NSF accepts both
**Process**:
1. Institutional authorization required
2. Upload all required documents
3. Verify PDF compilation
4. Submit (aim for 48 hours early)
5. Receive confirmation and proposal number
### Review Process
**Timeline**: Typically 6 months from submission to decision
**Steps**:
1. **Administrative Review**: NSF checks compliance (1-2 weeks)
2. **Program Officer Assignment**: Assigned to appropriate program (1-2 weeks)
3. **Reviewer Selection**: Panel and/or ad hoc reviewers identified (2-4 weeks)
4. **Review**: Reviewers assess proposals (4-8 weeks)
5. **Panel Discussion**: Panel meets (virtual or in-person) to discuss proposals (1 week)
6. **Program Officer Recommendation**: Based on reviews and panel discussion (2-4 weeks)
7. **Division/Directorate Approval**: Final decision (2-4 weeks)
**Review Formats**:
- **Panel Review**: 10-20 proposals discussed at panel meeting
- **Ad hoc Review**: External reviewers submit written reviews
- **Hybrid**: Combination of panel and ad hoc reviews
**Number of Reviewers**: Typically 3-5 reviewers per proposal
### Review Outcomes
**Possible Decisions**:
- **Funded**: Congratulations! Award forthcoming
- **Declined**: Not recommended for funding
- **Returned Without Review**: Non-compliant with requirements
**Feedback**: Panel summary and individual reviews provided regardless of outcome
**Success Rates**: Vary by program, typically 15-30%
## Communicating with Program Officers
### When to Contact
**Appropriate**:
- Before submission: Discuss fit with program, feasibility of idea
- After reviews: Discuss feedback, resubmission strategy
- During project: Report significant changes, request no-cost extensions
**How to Contact**:
- Email program officer (contact info in program solicitation)
- Request 15-30 minute phone call
- Prepare concise summary of research idea (1 page)
### What to Ask
**Good Questions**:
- Is my research appropriate for this program?
- Are there upcoming solicitations or special initiatives?
- What are key areas of emphasis for the program?
- Is the scope and budget appropriate?
- After reviews: What are key issues to address in resubmission?
**Avoid**:
- Asking for guarantee of funding
- Arguing with review outcome
- Inappropriate requests for information about reviewers
## Resubmission Strategy
### NSF Resubmission Policies
**No Formal Resubmission Category**: NSF treats resubmissions as new proposals
**Can Resubmit**:
- To same program (after addressing reviews)
- To different program (if better fit)
- After substantial revision
**No Introduction Section**: Unlike NIH, NSF doesn't have formal resubmission response
**Strategy**:
- Carefully review panel summary and individual reviews
- Address all major criticisms
- Strengthen weak areas (prelim data, broader impacts, methods)
- Consider discussing with program officer
- May want to wait for next funding cycle to gather more data
**Tracking**: Resubmitted proposals are sometimes assigned to the same reviewers as the original submission
## Recent NSF Policy Updates
### 2023-2024 Changes
1. **Data Management and Sharing Plan**: Now required for all proposals (2 pages max)
2. **Biographical Sketch Format**: Updated to include "Products" instead of "Publications"
3. **Open Science**: Increased emphasis on open-access publications and data
4. **Plan for Dissemination**: Some programs require explicit dissemination plans
5. **Mentoring Plans**: Enhanced requirements for postdoc mentoring plans
### NSF Priorities (2024-2025)
- **Climate and Clean Energy**: Climate change mitigation and adaptation
- **Quantum Information Science**: Quantum computing, sensing, networking
- **AI and Machine Learning**: Trustworthy AI, AI for science
- **Biotechnology**: Synthetic biology, bioengineering
- **Microelectronics**: Semiconductor research and workforce
- **STEM Education**: Broadening participation, innovative pedagogy
- **Convergence Accelerators**: Use-inspired research with pathway to impact
## NSF Big Ideas and Special Initiatives
### NSF "Big Ideas"
1. **Harnessing the Data Revolution (HDR)**
2. **The Future of Work at the Human-Technology Frontier**
3. **Navigating the New Arctic**
4. **Windows on the Universe**
5. **The Quantum Leap**
6. **Understanding the Rules of Life**
7. **Mid-scale Research Infrastructure**
### Major NSF Initiatives
- **National AI Research Institutes**: $20M over 5 years per institute
- **Science and Technology Centers (STCs)**: Large-scale collaborative centers
- **Engineering Research Centers (ERCs)**: Engineering innovation ecosystems
- **Materials Research Science and Engineering Centers (MRSECs)**: Materials research
- **NSF Graduate Research Fellowship Program (GRFP)**: Student fellowships
## Tips for Competitive NSF Proposals
### Do's
**Start with specific aims/objectives** - Crystal clear research goals
**Make broader impacts substantive** - Specific activities, not platitudes
**Use figures effectively** - Conceptual diagrams, preliminary data, timelines
**Be realistic about scope** - Achievable within 3-5 years
**Address both review criteria explicitly** - Don't make reviewers search
**Get external feedback** - Mock review before submission
**Follow formatting requirements exactly** - Auto-rejection for non-compliance
**Explain jargon and acronyms** - Panel members may not be in your subfield
**Integrate research and education** - Show natural connections
**Demonstrate team qualifications** - Track record in proposed area
### Don'ts
**Don't exceed page limits** - Automatic return without review
**Don't use smaller fonts in figures** - Must be legible
**Don't make broader impacts generic** - "Train students" is not enough
**Don't ignore prior NSF support** - Must report if you've had NSF funding
**Don't be overly ambitious** - Reviewers will see through unrealistic plans
**Don't skip data management plan** - Required for all proposals
**Don't forget biosketches for all personnel** - Common mistake
**Don't submit at deadline** - Technical issues happen
**Don't ignore program solicitation** - Requirements vary by program
**Don't assume reviewers know your work** - Provide context
## Resources and Links
- **NSF Homepage**: https://www.nsf.gov
- **Award Search**: https://www.nsf.gov/awardsearch/
- **Proposal & Award Policies & Procedures Guide (PAPPG)**: https://www.nsf.gov/publications/pub_summ.jsp?ods_key=pappg
- **FastLane**: https://www.fastlane.nsf.gov/ (retired; proposal functions have moved to Research.gov)
- **Research.gov**: https://www.research.gov/
- **Broader Impacts Resources**: https://www.nsf.gov/od/oia/special/broaderimpacts/
- **NSF Funding Statistics**: https://www.nsf.gov/statistics/
---
**Key Takeaway**: NSF values both scientific excellence (Intellectual Merit) and societal benefit (Broader Impacts) equally. Successful proposals demonstrate innovative, feasible research that advances knowledge while contributing to education, diversity, infrastructure, or societal well-being in specific, measurable ways.

# NIH Specific Aims Page: The Complete Guide
## Overview
The **Specific Aims page** is the most important page of your entire NIH grant application. It's the first thing reviewers read, often determines their initial impression, and may be the only page read by some panel members before scoring begins.
**Length**: Exactly 1 page
**Margins**: At least 0.5 inches (all sides)
**Font**: 11-point Arial, Helvetica, or similar (no smaller)
**Line spacing**: No more than six lines of text per vertical inch
**Purpose**:
- Communicate your research vision clearly and compellingly
- Establish significance and innovation
- Demonstrate feasibility
- Show that you can accomplish meaningful work in the proposed timeframe
- Make reviewers excited to fund your work
## Anatomy of a Specific Aims Page
### Essential Components (in order)
1. **Opening Hook** (2-4 sentences)
2. **Gap/Problem Statement** (2-4 sentences)
3. **Long-Term Goal** (1 sentence)
4. **Objective** (1-2 sentences)
5. **Central Hypothesis** (1 sentence) [or Research Questions]
6. **Rationale** (2-3 sentences with preliminary data mention)
7. **Specific Aims** (2-4 aims, ~½ page total)
8. **Expected Outcomes and Impact** (2-4 sentences)
## Detailed Structure
### Opening Paragraph: The Hook
**Purpose**: Establish importance and grab attention
**What to include**:
- Broad context (disease burden, biological importance, technological need)
- Epidemiological data or statistics that establish scale
- Why this problem matters for health or science
- Create urgency
**Length**: 2-4 sentences
**Writing tips**:
- Start strong with compelling statement
- Use concrete numbers (prevalence, mortality, costs)
- Avoid jargon in first sentence
- Make it accessible to non-specialists on panel
**Examples**:
*Clinical Example*:
"Pancreatic ductal adenocarcinoma (PDAC) is the third leading cause of cancer death in the United States, with a devastating 5-year survival rate of only 11%. Despite decades of research, therapeutic options remain limited, and most patients present with advanced, unresectable disease. The lack of effective early detection methods and targeted therapies represents a critical unmet medical need affecting over 62,000 Americans diagnosed annually."
*Basic Science Example*:
"Mitochondrial dysfunction is a hallmark of aging and age-related diseases, yet the mechanisms linking mitochondrial decline to cellular senescence remain poorly understood. Emerging evidence suggests that mitochondrial-nuclear communication pathways play a central role in longevity determination across species, from yeast to mammals. Understanding how cells sense and respond to mitochondrial stress could reveal new therapeutic targets for age-related diseases affecting millions worldwide."
### Second Paragraph: Gap and Context
**Purpose**: Define what's known, what's unknown, and why it matters
**What to include**:
- Current state of knowledge (brief literature context)
- Specific gap or barrier to progress
- Why this gap is critical to address
- Why current approaches are insufficient
**Length**: 3-5 sentences
**Structure**:
1. What we know (1-2 sentences)
2. What we don't know / what's limiting progress (1-2 sentences)
3. Why this gap matters (1 sentence)
**Examples**:
"Prior studies have identified numerous genetic mutations associated with PDAC development, including KRAS, TP53, SMAD4, and CDKN2A. However, the tumor microenvironment (TME), comprising immune cells, fibroblasts, and extracellular matrix, is increasingly recognized as a critical determinant of therapeutic resistance. Current models fail to recapitulate the complex TME architecture and cell-cell interactions that drive therapy resistance in vivo, limiting our ability to develop effective treatments. Understanding how the TME protects tumor cells from chemotherapy is essential for designing combination therapies that overcome resistance."
### Third Paragraph: Long-Term Goal, Objective, Hypothesis, Rationale
**Purpose**: Set up your specific approach and justification
**Structure**:
**Long-Term Goal** (1 sentence):
- Your overarching research program direction
- Broader than this specific proposal
- Provides context for this work
*Example*: "The long-term goal of our research is to elucidate the molecular mechanisms by which the tumor microenvironment promotes therapeutic resistance in pancreatic cancer."
**Objective** (1-2 sentences):
- Specific objective of THIS grant
- What you will accomplish in 3-5 years
- More focused than long-term goal
*Example*: "The objective of this application is to define the role of cancer-associated fibroblasts (CAFs) in mediating gemcitabine resistance and to develop combination therapies targeting CAF-tumor interactions."
**Central Hypothesis** (1 sentence):
- Testable prediction
- Should unify the specific aims
- Based on preliminary data or logical reasoning
- Clear and specific
*Example*: "Our central hypothesis is that CAF-secreted factors activate protective autophagy in tumor cells, conferring resistance to gemcitabine, and that dual inhibition of CAF signaling and autophagy will restore drug sensitivity."
**Alternative: Research Questions** (if hypothesis-testing isn't appropriate):
- 2-3 focused questions
- Should correspond to specific aims
*Example*: "This project will address the following questions: (1) What factors secreted by CAFs promote tumor cell survival during chemotherapy? (2) How do tumor cells integrate CAF signals to activate protective responses? (3) Can targeting CAF-tumor interactions enhance therapeutic efficacy in preclinical models?"
**Rationale** (2-3 sentences):
- Why you think the hypothesis is true
- Mention key preliminary data (very briefly)
- Logical basis for your approach
- Why this approach will work
*Example*: "This hypothesis is based on our preliminary data showing that CAF-conditioned medium protects tumor cells from gemcitabine-induced apoptosis by 60% (Fig. 1), and that this protection is blocked by autophagy inhibitors (Fig. 2). Proteomic analysis of CAF secretomes identified 15 candidate factors enriched in drug-resistant contexts (Table 1). These findings suggest a targetable pathway linking CAF signaling to tumor cell survival that could be exploited therapeutically."
### Specific Aims (Main Section)
**How many aims**: 2-4 aims (3 is most common for an R01)
- **Too few (1)**: Insufficient work; the project appears risky
- **Just right (2-3)**: Focused, achievable, synergistic
- **Too many (4+)**: Usually overly ambitious and unlikely to be completed in the project period
**Structure for each aim**:
1. **Aim Statement** (1-2 sentences, bold or underlined)
2. **Rationale and Background** (1-3 sentences)
3. **Working Hypothesis** (1 sentence, if applicable)
4. **Approach Summary** (2-4 sentences)
5. **Expected Outcomes and Interpretation** (1-2 sentences)
**Length per aim**: ~4-6 sentences (¼ to ⅓ page)
**Relationships between aims**:
- **Independent**: Failure of one aim doesn't doom the others
- **Synergistic**: Aims build on each other or address complementary questions
- **Progressive**: Aim 1 enables Aim 2, Aim 2 enables Aim 3 (be careful—creates risk)
#### Example Aim Structure:
**Aim 1: Identify CAF-secreted factors that mediate gemcitabine resistance.**
*Rationale*: CAF-conditioned medium confers significant protection against gemcitabine (Fig. 1), suggesting secreted factors are responsible. We have identified 15 candidate proteins enriched in CAF secretomes from resistant versus sensitive contexts (Table 1).
*Working Hypothesis*: CAFs secrete specific growth factors and cytokines (including IL-6, CXCL12, and HGF) that activate pro-survival pathways in tumor cells.
*Approach*: We will (1) validate candidate factors using neutralizing antibodies in co-culture assays, (2) measure activation of downstream signaling pathways (STAT3, PI3K/AKT, MAPK) in tumor cells, and (3) perform CRISPR screens in CAFs to identify factors required for resistance phenotype. We will use patient-derived CAFs and tumor cells to ensure clinical relevance.
*Expected Outcomes*: We expect to identify 3-5 CAF-secreted factors sufficient and necessary for gemcitabine resistance, and define their signaling mechanisms. These will serve as therapeutic targets for Aims 2-3.
---
**Aim 2: Determine the mechanisms by which CAF signals activate protective autophagy in tumor cells.**
*Rationale*: Our data show that CAF-mediated resistance requires autophagy (Fig. 2), but the signaling pathways linking CAF factors to autophagy activation remain unknown.
*Working Hypothesis*: CAF-secreted factors activate mTOR-independent autophagy through AMPK and ULK1 phosphorylation.
*Approach*: We will (1) measure autophagy flux in tumor cells exposed to CAF factors using LC3 turnover assays and electron microscopy, (2) define signaling pathways using phosphoproteomic analysis and pharmacologic inhibitors, and (3) validate pathways using genetic knockdowns (shRNA/CRISPR) of key nodes. Studies will be performed in 2D and 3D co-culture systems.
*Expected Outcomes*: We will define the signaling cascade from CAF factors to autophagy activation, identifying druggable nodes for combination therapy. Results will inform Aim 3 therapeutic strategies.
---
**Aim 3: Evaluate combination therapies targeting CAF-tumor interactions in preclinical models.**
*Rationale*: Single-agent therapies targeting CAFs or autophagy have shown limited efficacy clinically, suggesting combination approaches are needed.
*Working Hypothesis*: Dual inhibition of CAF signaling and autophagy will synergistically restore gemcitabine sensitivity in vivo.
*Approach*: Using patient-derived xenograft (PDX) models and genetically engineered mouse models (GEMM) of PDAC, we will test combinations of (1) gemcitabine + CAF pathway inhibitors identified in Aim 1, (2) gemcitabine + autophagy inhibitors, and (3) triple combinations. We will assess tumor growth, survival, and mechanism (IHC, RNA-seq) in n=10-15 mice per group.
*Expected Outcomes*: We expect combination therapies will reduce tumor growth by ≥60% compared to gemcitabine alone, with synergistic effects. The most effective regimen will be advanced toward clinical translation through an investigator-initiated trial (we have IND-enabling resources available at our institution).
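A note on the group sizes above: a statement like "n=10-15 mice per group" carries more weight when reviewers can see it follows from a power calculation. Below is a minimal sketch of such a calculation in Python, assuming a two-sample t-test and an illustrative standardized effect size; the specific numbers are assumptions for demonstration, not values taken from the example aim.

```python
# Minimal power-analysis sketch for justifying animal group sizes.
# Assumes a two-sample t-test; effect size, alpha, and power are illustrative.
from statsmodels.stats.power import TTestIndPower

# Cohen's d: expected difference in mean tumor volume divided by pooled SD.
# A large expected treatment effect with moderate variability might give d ~ 1.2.
effect_size = 1.2   # assumed standardized effect (illustrative)
alpha = 0.05        # two-sided type I error rate
power = 0.8         # desired probability of detecting the effect

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=effect_size,
                                   alpha=alpha, power=power)
print(f"Required mice per group: {n_per_group:.1f}")  # ~12 per group
```

A one-line summary of this calculation in the Vertebrate Animals or Approach section signals rigor without consuming aims-page space.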
### Closing Paragraph: Impact and Significance
**Purpose**: Leave reviewers with enthusiasm and clear understanding of importance
**What to include**:
- Expected outcomes of the overall project
- How findings will advance the field
- Positive impact on health or science
- Next steps or future directions
- Why this matters
**Length**: 2-4 sentences
**Writing tips**:
- Be confident but not arrogant
- Connect back to opening (full circle)
- Emphasize transformative potential
- Avoid over-promising
**Examples**:
"The proposed research is significant because it will define a novel mechanism of chemotherapy resistance in pancreatic cancer and identify new therapeutic targets to overcome this resistance. Results will provide mechanistic insights into CAF-tumor interactions that drive drug resistance, immediately applicable to clinical trial design. We expect findings will enable rational design of combination therapies that improve outcomes for PDAC patients, who currently have few effective treatment options. This work will establish new paradigms for targeting the tumor microenvironment in solid cancers."
## Writing Principles
### Clarity and Accessibility
**Write for a mixed audience**:
- Some panel members will be experts in your area
- Others will be in related but not identical fields
- Program officers and council members will read it
- Some reviewers will only read this page before scoring
**Strategies**:
- Define technical terms at first use
- Explain abbreviations (except very common ones)
- Use clear, direct language
- Avoid excessive jargon
- Make logical flow obvious
### Confidence Without Arrogance
**Confident** ✅:
- "Our preliminary data demonstrate..."
- "We have established a robust model system..."
- "This approach will elucidate..."
**Arrogant** ❌:
- "We are uniquely qualified..."
- "Only our lab can do this..."
- "This will revolutionize the field..."
**Tentative** ❌:
- "We hope to..."
- "We will try to..."
- "It is possible that..."
### Active and Specific
**Aim statements should**:
- Start with action verbs (Determine, Identify, Elucidate, Define, Characterize, Validate, Develop)
- Be specific and testable
- Indicate what will be learned
**Weak Aim** ❌:
"Aim 1: Study the role of protein X in disease Y"
**Strong Aim** ✅:
"Aim 1: Determine how protein X phosphorylation regulates disease Y progression using genetic and pharmacologic approaches"
### Show Feasibility
**Throughout the aims page**:
- Mention preliminary data (figures, tables)
- Reference established methods
- Show you have necessary resources
- Demonstrate expertise
- Indicate prior success
**Don't**:
- Relegate all preliminary data to Research Strategy
- Make it seem like you're starting from scratch
- Propose overly ambitious aims without support
## Common Mistakes
### Mistake 1: Too Much Background
❌ Half page of background before getting to aims
✅ Focused background that motivates your specific approach
The aims page is NOT a mini review article. Provide only enough background to establish importance and gaps.
### Mistake 2: Vague Objectives
❌ "We will study the mechanisms of disease X"
❌ "We will investigate the role of protein Y"
✅ "We will identify the phosphorylation sites on protein Y that regulate its interaction with Z using mass spectrometry and mutagenesis"
### Mistake 3: Overly Ambitious Scope
❌ Four aims, each of which could be a separate R01
❌ Proposing to solve multiple major questions in the field
❌ "Boil the ocean" approach
✅ Focused aims that are clearly achievable in 3-5 years
### Mistake 4: Dependent Aims
❌ Aim 2 and Aim 3 both require Aim 1 to succeed
✅ Aims are synergistic but independent (failure of one doesn't doom the others)
### Mistake 5: No Preliminary Data Mentioned
❌ Seems like a fishing expedition
❌ Reviewers wonder if it's feasible
✅ Brief mentions of preliminary data throughout (refer to figures)
### Mistake 6: Weak Impact Statement
❌ "This will advance our understanding of X"
❌ "Results will be published and presented"
✅ "This will identify new therapeutic targets for disease X, affecting 500,000 patients annually, and provide the foundation for investigator-initiated clinical trials"
### Mistake 7: Jargon-Heavy First Paragraph
❌ Opening sentence full of abbreviations and specialized terminology
❌ Assumes all reviewers are experts in your subfield
✅ Opening that's comprehensible to broad scientific audience
### Mistake 8: No Clear Hypothesis
❌ Just listing aims without unifying framework
❌ Purely descriptive aims
✅ Clear, testable hypothesis that unifies the aims
### Mistake 9: Forgetting Page Limits
❌ Using 1.1 pages (will be deleted or rejected)
❌ Tiny fonts to cram in more content (violations)
✅ Exactly 1 page with compliant formatting
### Mistake 10: Not Telling a Story
❌ Disconnected aims that feel like 3 separate projects
❌ No logical flow or coherence
✅ Unified narrative with aims building on each other
## Advanced Tips
### Use Visual Elements
**Figures on Specific Aims Page**:
- NIH allows figures on aims page
- Can be very effective to show key preliminary data
- Must be legible (font size requirements apply)
- Don't let figure crowd out text
- Typical: 1 small figure or panel showing most critical data
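If you prepare figures in Python, a few global settings help keep a small aims-page inset legible at final print size. A minimal matplotlib sketch follows; the filename, panel dimensions, and plotted values are illustrative assumptions, not requirements.

```python
# Minimal matplotlib sketch for a small, legible aims-page figure.
# Sizes and data are illustrative; check your agency's legibility rules.
import matplotlib.pyplot as plt

plt.rcParams.update({
    "font.size": 9,          # keep all text readable at print size
    "axes.labelsize": 9,
    "xtick.labelsize": 8,
    "ytick.labelsize": 8,
    "legend.fontsize": 8,
})

hours = [0, 24, 48, 72]
fig, ax = plt.subplots(figsize=(3.25, 2.0))  # sized as a single-column inset
ax.plot(hours, [100, 60, 28, 12], marker="o", label="Condition A")
ax.plot(hours, [100, 82, 55, 41], marker="s", label="Condition B")
ax.set_xlabel("Time (h)")
ax.set_ylabel("Viability (%)")
ax.legend(frameon=False)
fig.tight_layout()
fig.savefig("aims_fig1.png", dpi=300)  # 300 dpi keeps text crisp in the PDF
```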
**Tables**:
- Can summarize preliminary data compactly
- Show patient characteristics, gene lists, etc.
- Must be readable
### Strategic Use of Bold/Italics
**Appropriate**:
- Bold aim statements to make them stand out
- Italicize gene names (standard convention)
- Underline key points (sparingly)
**Avoid**:
- Excessive formatting that looks cluttered
- All caps (looks like shouting)
- Colors (may not print/display correctly)
### The "Skim Test"
**Your aims page should pass the skim test**:
- Someone reading just aim statements should understand the project
- Bold aim statements that can be read independently
- Each paragraph has clear topic sentence
- Logical flow is apparent even when skimming
**Exercise**: Ask a colleague to read only the bold/underlined text: can they understand the project?
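If the draft lives in a Markdown or plain-text file, the skim test can be partially automated. A minimal sketch, assuming aim statements are marked in bold with `**...**`; the filename `aims.md` is hypothetical.

```python
# Skim-test helper: print only the bold spans (e.g., aim statements) from a
# Markdown draft, so you can judge whether they tell the story on their own.
# Assumes bold text uses **...**; "aims.md" is a hypothetical filename.
import re
from pathlib import Path

draft = Path("aims.md").read_text(encoding="utf-8")
bold_spans = re.findall(r"\*\*(.+?)\*\*", draft, flags=re.DOTALL)

for span in bold_spans:
    print("-", " ".join(span.split()))  # collapse line breaks inside a span
```

If the printed lines alone don't convey problem, approach, and payoff, the aim statements need work.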
### Tailoring to Career Stage
**Early Stage Investigators**:
- Show you've thought through challenges
- Demonstrate strong mentorship and institutional support
- Emphasize innovation while ensuring feasibility
- Don't over-promise
**Established Investigators**:
- Show how this extends your research program
- Emphasize track record implicitly
- Can propose more ambitious aims if supported by extensive preliminary data
- Show how this opens new directions
## Examples of Strong Opening Paragraphs
### Example 1: Cancer Biology
"Metastatic breast cancer kills over 42,000 women annually in the United States, with median survival of only 2-3 years after diagnosis. While primary tumors are often curable, metastatic disease remains incurable due to therapy resistance and tumor heterogeneity. The emergence of drug-resistant cell populations during treatment represents the major barrier to long-term survival, yet the mechanisms governing resistance evolution remain poorly understood. Understanding how tumor heterogeneity and plasticity drive resistance could reveal new therapeutic strategies to prevent or reverse treatment failure."
### Example 2: Neuroscience
"Alzheimer's disease (AD) affects 6.7 million Americans and is projected to reach 13 million by 2050, with annual costs exceeding $355 billion. Despite decades of research focused on amyloid-β and tau pathologies, no disease-modifying therapies exist. Emerging evidence implicates synaptic dysfunction as the earliest pathological event in AD, preceding neurodegeneration by years. The molecular mechanisms linking synaptic failure to cognitive decline represent a critical therapeutic window, yet remain poorly defined. Identifying early synaptic alterations could enable intervention before irreversible neuronal loss occurs."
### Example 3: Infectious Disease
"Antimicrobial-resistant (AMR) infections cause over 2.8 million illnesses and 35,000 deaths annually in the US, with healthcare costs exceeding $4.6 billion. Carbapenem-resistant Enterobacterales (CRE) represent an urgent threat, with mortality rates exceeding 50% for bloodstream infections. Despite this crisis, only two new antibiotics targeting CRE have been approved in the past decade, both with significant limitations. Novel therapeutic approaches that bypass traditional antibiotic mechanisms are urgently needed to combat this growing threat. Targeting host-pathogen interactions rather than bacterial viability represents a promising strategy to combat AMR while reducing selection pressure for resistance."
## Revision Checklist
Before finalizing, ensure your aims page:
**Content**:
- [ ] Opens with compelling statement of importance
- [ ] Clearly defines the gap or problem
- [ ] States specific, measurable objective
- [ ] Presents testable hypothesis (or focused research questions)
- [ ] Mentions preliminary data supporting feasibility
- [ ] Includes 2-4 specific aims
- [ ] Each aim is testable and achievable
- [ ] Aims are independent but synergistic
- [ ] Expected outcomes are clearly stated
- [ ] Closes with impact and significance
**Clarity**:
- [ ] First paragraph is accessible to non-specialists
- [ ] Technical terms are defined
- [ ] Abbreviations are spelled out at first use
- [ ] Logical flow is clear
- [ ] Aim statements can stand alone
- [ ] Language is confident and active
**Format**:
- [ ] Exactly 1 page (a quick automated check is sketched after this checklist)
- [ ] 0.5-inch margins
- [ ] 11-point font or larger
- [ ] Readable line spacing
- [ ] Compliant with NIH formatting requirements
- [ ] Figures (if included) are legible
**Impact**:
- [ ] Passes the "skim test"
- [ ] Would make you excited if you were a reviewer
- [ ] Clearly articulates significance
- [ ] Shows feasibility without over-selling
- [ ] Connects to health or scientific impact
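Some of the Format items above can be checked automatically on the exported PDF. A minimal sketch using the `pypdf` library follows; the filename is hypothetical, and it verifies only the page count plus a rough text-density warning, so font size and margins still need a manual look.

```python
# Pre-submission sanity check on the exported Specific Aims PDF.
# Verifies the 1-page limit and warns about unusually dense text.
# "specific_aims.pdf" is a hypothetical filename; install with `pip install pypdf`.
from pypdf import PdfReader

reader = PdfReader("specific_aims.pdf")
num_pages = len(reader.pages)
assert num_pages == 1, f"Expected exactly 1 page, found {num_pages}"

text = reader.pages[0].extract_text() or ""
word_count = len(text.split())
print(f"Page count OK; roughly {word_count} words on the page.")
if word_count > 900:  # rough single-page ceiling at 11 pt (an assumption)
    print("Warning: text density looks high; double-check font size and spacing.")
```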
## Final Thoughts
The Specific Aims page is where grants are won or lost. **Invest time in getting this right**:
- Write 10+ drafts
- Get feedback from colleagues and mentors
- Test it on people outside your field
- Read it aloud to check flow
- Let it sit, then revise with fresh eyes
- Study funded examples in your field
**Remember**: Reviewers are reading 10-20 applications. Your aims page needs to immediately communicate importance, innovation, and feasibility—and make them want to fund your work.
---
**Key Takeaway**: The perfect Specific Aims page tells a compelling story in exactly one page—establishing a significant problem, presenting an innovative and feasible solution, showing preliminary evidence of success, and articulating transformative impact. Every sentence must earn its place.