Initial commit

Zhongwei Li
2025-11-29 18:13:57 +08:00
commit 66fc217a0c
4 changed files with 452 additions and 0 deletions

.claude-plugin/plugin.json Normal file (+12)

@@ -0,0 +1,12 @@
{
"name": "quality-validator",
"description": "An easy way to force agent to think again if it's statement that the \"Job is done and production ready\" is actually done - usually it's not. Thanks to this command you don't have to check after the agent if they did their job.",
"version": "1.0.0",
"author": {
"name": "ClaudeForge Community",
"url": "https://github.com/claudeforge/marketplace"
},
"commands": [
"./commands"
]
}

README.md Normal file (+3)

@@ -0,0 +1,3 @@
# quality-validator
An easy way to force the agent to think again about whether its claim that the "Job is done and production ready" is actually true - usually it isn't. Thanks to this command you don't have to check the agent's work yourself.

commands/double-check.md Normal file (+392)

@@ -0,0 +1,392 @@
---
description: Comprehensive quality validation command that forces rigorous self-review before marking work complete. Ensures production-ready deliverables through multi-angle analysis, completeness verification, and quality gate enforcement.
author: ClaudeForge Community
version: 2.0.0
tools: ['analysis', 'testing', 'validation']
capabilities: ['quality-assurance', 'completeness-checking', 'multi-perspective-analysis']
---
# Quality Validator - Production Readiness Verification
## Overview
The Quality Validator command provides a structured, comprehensive approach to verifying that work is truly complete and production-ready. It prevents premature completion claims by enforcing rigorous self-review across multiple dimensions, ensuring all deliverables meet professional standards before being marked as done.
## Core Purpose
In software development, declaring something "done" prematurely leads to:
- Technical debt accumulation
- Production bugs and incidents
- Rework and wasted time
- Loss of stakeholder trust
- Missed requirements
This command combats these issues by enforcing a mandatory quality gate that validates completeness across multiple angles before allowing completion.
## Validation Framework
### Phase 1: Scope Definition & Understanding
Before validating, clearly define what "complete" means for this specific task:
**Questions to Answer:**
1. What was the original request or requirement?
2. What were the explicit deliverables?
3. What were the implicit expectations?
4. What are the acceptance criteria?
5. Who is the audience or stakeholder?
6. What is the use case or application?
**Define Completeness:**
```
Complete = All Requirements Met + Quality Standards Satisfied + No Known Issues + Documented + Tested
```
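Read the plus signs above as logical AND rather than arithmetic: the work is complete only when every condition holds. A minimal Python sketch of that rule, with illustrative field names that are not part of the command itself:
```python
from dataclasses import dataclass

@dataclass
class CompletenessCheck:
    """Illustrative record of the five conditions; the field names are assumptions."""
    all_requirements_met: bool
    quality_standards_satisfied: bool
    no_known_issues: bool
    documented: bool
    tested: bool

    def is_complete(self) -> bool:
        # "Complete" is a conjunction: a single failing condition means not done.
        return all([
            self.all_requirements_met,
            self.quality_standards_satisfied,
            self.no_known_issues,
            self.documented,
            self.tested,
        ])
```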
**Document Interpretation:**
- Original request: [Restate user's request]
- Explicit deliverables: [List specific items to deliver]
- Implicit requirements: [Inferred expectations based on context]
- Success criteria: [How will success be measured?]
- Out of scope: [What is explicitly NOT included]
### Phase 2: Multi-Angle Analysis
Approach the validation from multiple perspectives to ensure comprehensive coverage:
#### Angle 1: Functional Completeness
**Question:** Does it do what it's supposed to do?
**Validation checklist:**
- [ ] All specified features implemented
- [ ] Core functionality works as designed
- [ ] Edge cases handled appropriately
- [ ] Error conditions managed gracefully
- [ ] User workflows complete end-to-end
- [ ] Integration points functional
- [ ] Data flows work correctly
#### Angle 2: Technical Quality
**Question:** Is it built well?
**Validation checklist:**
- [ ] Code follows project conventions and style guides
- [ ] Architecture is sound and scalable
- [ ] No code smells or anti-patterns
- [ ] Proper error handling and logging
- [ ] Security best practices followed
- [ ] Performance is acceptable
- [ ] No unnecessary complexity
#### Angle 3: Testing & Validation
**Question:** Has it been properly tested?
**Validation checklist:**
- [ ] Unit tests written and passing
- [ ] Integration tests cover key flows
- [ ] Manual testing completed
- [ ] Edge cases tested
- [ ] Error scenarios validated
- [ ] Performance tested under load
- [ ] Security testing completed (if applicable)
#### Angle 4: Documentation
**Question:** Can others understand and use it?
**Validation checklist:**
- [ ] Code comments explain "why", not just "what"
- [ ] API documentation complete
- [ ] User documentation written
- [ ] Setup/deployment instructions provided
- [ ] Known limitations documented
- [ ] Troubleshooting guide included
- [ ] Examples provided
#### Angle 5: User Experience
**Question:** Is it usable and intuitive?
**Validation checklist:**
- [ ] User interface is clear and intuitive
- [ ] Error messages are helpful
- [ ] Loading states and feedback provided
- [ ] Responsive and performant
- [ ] Accessible (a11y considerations)
- [ ] Consistent with design system/patterns
#### Angle 6: Operations & Maintainability
**Question:** Can it be deployed, monitored, and maintained?
**Validation checklist:**
- [ ] Deployment process documented
- [ ] Configuration externalized
- [ ] Monitoring and alerting configured
- [ ] Logs provide useful debugging information
- [ ] Rollback procedure defined
- [ ] Dependencies documented
- [ ] Upgrade/migration path clear
### Phase 3: Deep Dive Review
After multi-angle analysis, perform detailed review:
**Code Review (if applicable):**
1. Review every file changed
2. Check for hardcoded values or magic numbers
3. Verify proper resource cleanup (connections, files, etc.)
4. Ensure thread safety where needed
5. Validate input sanitization and validation
6. Check for potential memory leaks
7. Review database queries for efficiency
**Logic Review:**
1. Walk through each user flow step-by-step
2. Consider alternative paths and scenarios
3. Verify business rules are correctly implemented
4. Check calculations and algorithms
5. Validate state management
6. Review error handling paths
**Integration Review:**
1. Verify API contracts are respected
2. Check external service dependencies
3. Validate data transformations
4. Review authentication and authorization
5. Test cross-component communication
6. Verify backward compatibility
### Phase 4: Quality Gates
Each quality gate must pass before proceeding:
#### Gate 1: Functionality ✓
- All features work as specified
- No critical bugs
- Edge cases handled
- Performance acceptable
#### Gate 2: Code Quality ✓
- Linting passes
- Type checking passes (if applicable)
- Code review standards met
- No security vulnerabilities
#### Gate 3: Testing ✓
- All tests pass
- Coverage targets met
- Integration tests successful
- Manual testing completed
#### Gate 4: Documentation ✓
- Code documented
- User docs written
- API docs complete
- README updated
#### Gate 5: Production Readiness ✓
- Deployable
- Monitorable
- Rollback plan exists
- Runbook created
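The five gates above can be treated as named checks that must all pass; a single failure blocks completion. A minimal sketch under that reading - the check implementations are placeholders the caller would supply (for example wrappers around lint runs, test runs, and doc review), not part of the command:
```python
from typing import Callable, Dict

# Hypothetical gate registry: gate name -> zero-argument check returning True on pass.
QualityGates = Dict[str, Callable[[], bool]]

def evaluate_gates(gates: QualityGates) -> Dict[str, bool]:
    """Run every gate and record its result."""
    return {name: check() for name, check in gates.items()}

def all_gates_pass(results: Dict[str, bool]) -> bool:
    # Completion is blocked unless every gate passed.
    return all(results.values())
```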
### Phase 5: Risk Assessment
Identify and assess risks before declaring complete:
**Risk Categories:**
1. **Technical Risks**
- Scalability concerns
- Performance bottlenecks
- Security vulnerabilities
- Technical debt introduced
2. **Business Risks**
- Incomplete requirements
- User experience issues
- Data loss possibilities
- Compliance concerns
3. **Operational Risks**
- Deployment complexity
- Monitoring gaps
- Support burden
- Maintenance difficulty
**Risk Matrix:**
```
For each identified risk:
- Likelihood: Low / Medium / High
- Impact: Low / Medium / High
- Mitigation: Planned action or acceptance
- Owner: Who is responsible
```
**Blocker Assessment:**
- Are there any high-likelihood, high-impact risks?
- If yes, work is NOT complete until mitigated
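The blocker rule can be stated precisely: any risk rated High for both likelihood and impact blocks completion until it is mitigated. A small sketch of the matrix and that rule (the `Risk` fields mirror the template above; the type itself is illustrative):
```python
from dataclasses import dataclass
from typing import List

@dataclass
class Risk:
    description: str
    likelihood: str  # "Low" | "Medium" | "High"
    impact: str      # "Low" | "Medium" | "High"
    mitigation: str  # planned action or explicit acceptance
    owner: str

def blocking_risks(risks: List[Risk]) -> List[Risk]:
    """High-likelihood, high-impact risks block completion until mitigated."""
    return [r for r in risks if r.likelihood == "High" and r.impact == "High"]

def risks_allow_completion(risks: List[Risk]) -> bool:
    return not blocking_risks(risks)
```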
### Phase 6: Completeness Verification
Final checklist before declaring done:
**Requirement Traceability:**
- [ ] Every stated requirement has been addressed
- [ ] No requirements were overlooked or forgotten
- [ ] All acceptance criteria are met
- [ ] Stakeholder expectations will be satisfied
**Quality Standards:**
- [ ] Meets or exceeds quality bar for this project
- [ ] No known defects (or acceptable defects documented)
- [ ] Performance within acceptable range
- [ ] Security standards met
**Production Readiness:**
- [ ] Can be deployed to production safely
- [ ] Will not cause production incidents
- [ ] Can be monitored effectively
- [ ] Can be rolled back if needed
- [ ] Support team is prepared
**Knowledge Transfer:**
- [ ] Documentation is complete and accurate
- [ ] Team members understand the changes
- [ ] Support procedures documented
- [ ] Runbook created for operations
## Validation Scoring
Rate each dimension on a scale of 1-10:
```
Functional Completeness: __/10
Technical Quality: __/10
Testing Coverage: __/10
Documentation: __/10
User Experience: __/10
Production Readiness: __/10
TOTAL: __/60
Scoring:
54-60: Excellent - Production ready
48-53: Good - Minor improvements recommended
42-47: Fair - Significant gaps, not ready
<42: Poor - Major work required
```
**Minimum threshold:** 48/60 required to declare complete
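The bands and the 48/60 threshold translate directly into a small calculation. A sketch assuming each of the six dimensions is scored 1-10 (function and variable names are illustrative):
```python
DIMENSIONS = [
    "Functional Completeness", "Technical Quality", "Testing Coverage",
    "Documentation", "User Experience", "Production Readiness",
]

def rate(scores: dict) -> tuple:
    """Sum the six 1-10 dimension scores and map the total to the rating bands."""
    total = sum(scores[d] for d in DIMENSIONS)
    if total >= 54:
        band = "Excellent - Production ready"
    elif total >= 48:
        band = "Good - Minor improvements recommended"
    elif total >= 42:
        band = "Fair - Significant gaps, not ready"
    else:
        band = "Poor - Major work required"
    is_complete = total >= 48  # minimum threshold to declare the work complete
    return total, band, is_complete
```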
## Common Pitfalls to Avoid
### Pitfall 1: "It Works on My Machine"
**Problem:** Untested in production-like environment
**Validation:** Test in staging/pre-prod environment
### Pitfall 2: "I'll Document It Later"
**Problem:** Documentation never happens
**Validation:** Documentation is required for completion
### Pitfall 3: "That's an Edge Case"
**Problem:** Edge cases cause production bugs
**Validation:** All reasonable edge cases must be handled
### Pitfall 4: "Tests Are Optional"
**Problem:** Regressions and hidden bugs
**Validation:** Testing is mandatory, not optional
### Pitfall 5: "Good Enough"
**Problem:** Technical debt and quality degradation
**Validation:** Must meet quality standards, not just "work"
## Command Execution
When invoked with `/double-check $ARGUMENTS`:
1. **Parse Context:** Understand what work was done (from arguments or recent activity)
2. **Define Complete:** Explicitly state what "complete" means for this task
3. **Multi-Angle Analysis:** Evaluate from all 6 angles (functional, technical, testing, docs, UX, ops)
4. **Quality Gates:** Verify each gate passes
5. **Risk Assessment:** Identify and assess risks
6. **Completeness Score:** Calculate overall completeness rating
7. **Gap Analysis:** List what's missing or incomplete
8. **Recommendations:** Provide specific actions to reach true completion
9. **Decision:** Is it actually complete? Yes/No with justification
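Taken together, the nine steps end in a single yes/no decision backed by evidence. A rough sketch of what that result might look like as data - a structural illustration only, not the command's actual output format:
```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ValidationReport:
    """Illustrative shape of a /double-check result; all field names are assumptions."""
    task: str
    definition_of_complete: str
    gate_results: Dict[str, bool] = field(default_factory=dict)
    score: int = 0                                   # out of 60
    gaps: List[str] = field(default_factory=list)
    recommendations: List[str] = field(default_factory=list)

    @property
    def decision(self) -> str:
        ok = all(self.gate_results.values()) and self.score >= 48 and not self.gaps
        return "Complete" if ok else "Not complete - address the listed gaps first"
```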
## Best Practices
### When to Use
- Before marking any task as "done"
- After completing a feature or bug fix
- Before creating a pull request
- Before deploying to production
- When a stakeholder asks "is it ready?"
### How to Use Effectively
1. Be honest and critical in self-assessment
2. Don't skip dimensions that seem "obvious"
3. Consider perspectives of different stakeholders
4. Document gaps even if they seem small
5. Use it as a continuous improvement tool
### Integration with Workflow
```
Typical Flow:
1. Implement feature/fix
2. Run local tests
3. Initial self-review
4. Run /double-check command
5. Address identified gaps
6. Run /double-check again
7. Create pull request
8. Request peer review
```
## Metrics & Success Indicators
**Quality Metrics:**
- Reduction in post-deployment bugs
- Fewer production incidents
- Decreased rework time
- Higher stakeholder satisfaction
- Faster code review cycles
**Success Indicators:**
- Passing score on first /double-check run: Good preparation
- Multiple gaps identified: Tool is working as intended
- Repeated use leads to fewer gaps: Skill improvement
- Team adoption increases: Cultural shift toward quality
## Command Arguments
**Usage:** `/double-check [task-description] [--strict] [--quick]`
**Flags:**
- `--strict`: Apply higher standards (55/60 minimum score)
- `--quick`: Fast validation focusing on critical dimensions only
- No flags: Standard comprehensive validation
**Examples:**
```bash
/double-check # Validate recent work
/double-check user authentication feature # Validate specific feature
/double-check --strict # Apply stricter criteria
/double-check --quick bug fix # Quick validation of bug fix
```
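In effect, the flags only change how high the bar is and how much of the framework runs. A sketch of that behaviour under the stated rules (48/60 by default, 55/60 with `--strict`); the `--quick` subset of "critical dimensions" is an assumption, since the command text does not name them:
```python
ALL_DIMENSIONS = [
    "Functional Completeness", "Technical Quality", "Testing Coverage",
    "Documentation", "User Experience", "Production Readiness",
]

def validation_settings(strict: bool = False, quick: bool = False) -> dict:
    """Map the /double-check flags to a score threshold and review scope."""
    settings = {
        "min_score": 55 if strict else 48,   # --strict raises the bar from 48/60 to 55/60
        "dimensions": list(ALL_DIMENSIONS),  # full six-dimension review by default
    }
    if quick:
        # --quick: critical dimensions only (illustrative subset).
        settings["dimensions"] = [
            "Functional Completeness", "Testing Coverage", "Production Readiness",
        ]
    return settings
```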
## Summary
The Quality Validator command enforces a culture of excellence by making "done" mean truly done. It systematically validates completeness across functional, technical, testing, documentation, user experience, and operational dimensions. By preventing premature completion and catching issues early, it reduces technical debt, improves quality, and builds stakeholder trust.
Use this command liberally - it's better to discover gaps before committing than to discover them in production.
**Remember:** Professional work isn't just functional - it's complete, tested, documented, and production-ready. This command helps you get there every time.
$ARGUMENTS

plugin.lock.json Normal file (+45)

@@ -0,0 +1,45 @@
{
"$schema": "internal://schemas/plugin.lock.v1.json",
"pluginId": "gh:claudeforge/marketplace:plugins/commands/quality-validator",
"normalized": {
"repo": null,
"ref": "refs/tags/v20251128.0",
"commit": "fe0b741ac50fa99d8eb677b7504f36312b5556f4",
"treeHash": "2049ca1e576144ad4e57087977303dfb6db55e3b18a8a485e0e2a5e9159bf468",
"generatedAt": "2025-11-28T10:15:36.143323Z",
"toolVersion": "publish_plugins.py@0.2.0"
},
"origin": {
"remote": "git@github.com:zhongweili/42plugin-data.git",
"branch": "master",
"commit": "aa1497ed0949fd50e99e70d6324a29c5b34f9390",
"repoRoot": "/Users/zhongweili/projects/openmind/42plugin-data"
},
"manifest": {
"name": "quality-validator",
"description": "An easy way to force agent to think again if it's statement that the \"Job is done and production ready\" is actually done - usually it's not. Thanks to this command you don't have to check after the agent if they did their job.",
"version": "1.0.0"
},
"content": {
"files": [
{
"path": "README.md",
"sha256": "22c14c3bc0b9e75990b66ae44b88d77eb9c55df7892460b649efcdb60ae4578a"
},
{
"path": ".claude-plugin/plugin.json",
"sha256": "fbc802d1f53d3c2b3bc5ae53637aed5d6b0044a380203455003532dce9e1de79"
},
{
"path": "commands/double-check.md",
"sha256": "20e8b017fc6550ce8bfe0c9adc9fb104253fef4d938b686bc4f47878481d3692"
}
],
"dirSha256": "2049ca1e576144ad4e57087977303dfb6db55e3b18a8a485e0e2a5e9159bf468"
},
"security": {
"scannedAt": null,
"scannerVersion": null,
"flags": []
}
}