# Critical Questions by Document Type
This reference provides type-specific questions for Third Pass critical analysis. Use these to supplement the universal questions in SKILL.md.
## Tech Blog
**Core Analysis:**
- What problem does this solve, and for whom?
- What are the trade-offs of this solution vs alternatives?
- What's the assumed technical environment (stack, scale, team size)?
- How would this approach fail or scale poorly?
**Design & Implementation:**
- Why this approach instead of simpler alternatives?
- What are the hidden costs (complexity, maintenance, performance)?
- What edge cases or error scenarios are not addressed?
- How testable/debuggable is this approach?
**Applicability:**
- What prerequisites or constraints does this assume?
- How well does this generalize to other contexts?
- What would need to change for my use case?
- Are there better-suited alternatives for my situation?
**Evidence Quality:**
- Are benchmarks/metrics provided? Are they representative?
- Is there production experience or just prototypes?
- What's missing from the evaluation?
## Retrospective
**Context & Validity:**
- What was the specific context (team, org, timeline, constraints)?
- How much of the success was due to the approach vs the context or luck?
- Would this work in a different setting? Which aspects are transferable?
**Lessons & Generalization:**
- Are lessons backed by specific evidence, or are they anecdotal?
- What's the sample size (one project, multiple iterations)?
- Are there alternative explanations for outcomes?
- What would have happened with different decisions?
**Missing Perspectives:**
- What didn't work that isn't mentioned?
- Who else was involved? What's their perspective?
- What conflicts or tensions were glossed over?
- What failed attempts preceded success?
**Temporal Factors:**
- When was this? Are lessons still valid?
- How has the landscape changed since then?
- What would be different if done today?
## Technical Documentation
**Design & Rationale:**
- What design philosophy underlies this approach?
- Why these abstractions vs alternatives?
- What trade-offs were made (flexibility vs simplicity, etc.)?
- What's explicitly not supported, and why?
**Completeness & Accuracy:**
- What common use cases are not documented?
- Are there undocumented edge cases or gotchas?
- Is error handling clearly explained?
- What's the migration/upgrade story?
**Context & Constraints:**
- What assumptions does it make about usage patterns?
- What are its scale and performance characteristics?
- What dependencies or prerequisites does it have?
- What's the intended audience's expertise level?
**Alternatives & Ecosystem:**
- How does this compare to competing approaches?
- What problems does this NOT solve well?
- When should you use something else?
## Personal Writing
**Argument Structure:**
- What's the core thesis or claim?
- Is the claim clearly stated or implicit?
- What evidence supports each point?
- Are there logical gaps or leaps?
**Assumptions & Biases:**
- What's taken for granted without justification?
- What perspectives or counterarguments are missing?
- Are there hidden biases in the framing?
- What's the assumed audience knowledge?
**Clarity & Coherence:**
- Is the argument easy to follow?
- Do examples actually support the points?
- Are terms used consistently?
- Is the conclusion justified by the body?
**Rigor & Evidence:**
- Are claims supported by evidence, or merely asserted?
- Are sources credible and relevant?
- Are alternative explanations considered?
- What would strengthen this argument?
**Practical Application:**
- Is this actionable or purely theoretical?
- Who benefits from these ideas?
- What's needed to implement these suggestions?
- What could go wrong in practice?
## Academic Paper
**Research Design:**
- Are research questions clearly stated?
- Is the methodology appropriate for the research questions?
- Are there uncontrolled confounding variables?
- Is the sample adequately sized and representative?
- Could results be explained differently?
**Assumptions & Validity:**
- What theoretical assumptions underlie this work?
- Do the measurements validly capture the intended constructs?
- Are there threats to internal/external validity?
- How replicable is this research?
**Related Work:**
- Is related work comprehensive and fair?
- What relevant prior work is missing?
- How does this advance beyond existing work?
- Are comparisons appropriate and fair?
**Results & Interpretation:**
- Do conclusions follow from results?
- Are alternative interpretations possible?
- Does statistical significance translate into practical significance?
- What are limitations and boundary conditions?
**Contribution & Impact:**
- What's genuinely novel here?
- Is the contribution incremental or significant?
- What future research does this enable?
- What are real-world applications?
## Cross-Cutting Questions
These apply to all document types:
**Authority & Credibility:**
- What's the author's expertise in this area?
- What potential conflicts of interest exist?
- Is the tone confident or overconfident?
**Temporal Context:**
- When was this written?
- What was the state of the field then?
- What's changed since publication?
**Rhetorical Choices:**
- What's emphasized vs downplayed?
- What language choices reveal assumptions?
- What's the intended audience and purpose?
**Actionability:**
- What can I do with this information?
- What would I need to verify before using this?
- What risks come with applying this?